CN109615026A - Discrimination projection method and pattern recognition device based on sparse regularization - Google Patents

Discrimination projection method and pattern recognition device based on sparse regularization

Info

Publication number
CN109615026A
CN109615026A
Authority
CN
China
Prior art keywords
sparse
sample
differentiation
matrix
indicate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811625721.6A
Other languages
Chinese (zh)
Other versions
CN109615026B (en)
Inventor
袁森 (Yuan Sen)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC Information Science Research Institute
Original Assignee
CETC Information Science Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC Information Science Research Institute filed Critical CETC Information Science Research Institute
Priority to CN201811625721.6A priority Critical patent/CN109615026B/en
Publication of CN109615026A publication Critical patent/CN109615026A/en
Application granted granted Critical
Publication of CN109615026B publication Critical patent/CN109615026B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2136 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on sparsity criteria, e.g. with an overcomplete basis

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

A discrimination projection method based on sparse regularization, characterized by comprising the following steps: step 1) constructing a concatenated dictionary and learning the sparse representation structure; step 2) preserving the sparse representation structure; step 3) learning the local and non-local structure of the data; step 4) performing the sparse-regularized discrimination projection. The invention fully exploits a concatenated dictionary to learn the sparse representation of the data, avoiding the solution of L1-norm problems and greatly reducing computational complexity; the geometric and topological structure of the data is fully taken into account by maximizing the non-local scatter and minimizing the local scatter.

Description

Discrimination projection method and pattern recognition device based on sparse regularization
Technical field
The invention discloses a feature extraction method, sparse-regularized discrimination projection, belonging to the technical fields of biometric feature extraction and pattern recognition. It involves the learning of sparse representations, the construction of local and non-local structures, and the optimization of an objective function, and can be used for image recognition, data mining, and data clustering.
Background technique
Most current nonlinear feature extraction methods face the problem of an artificially defined neighborhood graph, and the choice of the neighborhood parameter directly affects the quality of the extracted features. To date, no simple and effective criterion exists for determining the neighborhood parameter of such algorithms. The emergence of sparse representation neatly avoids this selection problem, since it can adaptively capture the neighbor relationships of the data. In recent years, scholars have introduced sparse representation into pattern recognition to address problems such as feature extraction, feature selection, classification, clustering, target detection, and information fusion.
Sparse representation and compressed sensing constitute a new signal representation and acquisition framework proposed by Donoho et al., which has sparked extensive research interest in information science. Sparse representation expresses a signal as a sparse linear combination of the basis signals in a dictionary; the coefficient vector of this combination is called the sparse coefficient vector. Suppose a sparse signal x ∈ R^m in the original high-dimensional space is measured through an observation matrix Φ ∈ R^{l×m}, yielding the observation y = Φx ∈ R^l in a low-dimensional space, where l ≪ m. In statistics, the well-known Lasso (least absolute shrinkage and selection operator) is likewise built on sparse representation: by constraining the L1 norm of the regression coefficient vector to be no larger than a given constant while minimizing the sum of squared representation errors, a sparse model is obtained.
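To make the Lasso concrete, the following minimal sketch (not part of the patent; names, step size, and data are illustrative) solves the penalized form min_s 0.5·||y − Φs||² + λ||s||_1 with the standard iterative soft-thresholding algorithm (ISTA):

```python
import numpy as np

def ista_lasso(Phi, y, lam=0.05, n_iter=500):
    """Iterative soft-thresholding for  min_s 0.5*||y - Phi s||^2 + lam*||s||_1."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    s = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = s - Phi.T @ (Phi @ s - y) / L    # gradient step on the smooth part
        s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return s

rng = np.random.default_rng(0)
Phi = rng.standard_normal((20, 50))          # l = 20 observations, m = 50 atoms
s_true = np.zeros(50)
s_true[[3, 17, 40]] = [1.5, -2.0, 1.0]       # a 3-sparse signal
y = Phi @ s_true
s_hat = ista_lasso(Phi, y)
```

The soft-thresholding step is exactly the L1-norm constraint at work: it drives small coefficients to zero, producing the sparse model the paragraph above describes.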
Given its highly accurate modeling ability, sparse learning has developed into a very powerful tool for image processing and pattern recognition. In the spirit of sparse learning, Aharon et al. proposed a new sparse representation method, K-SVD, which iterates between two steps until convergence: (1) sparse coding of the samples based on the current dictionary, and (2) updating the atoms in the dictionary to better fit the sample data. Elad et al. generalized K-SVD to the image denoising problem and obtained very good denoising results. Mairal et al. combined non-local ideas with sparse learning to propose a non-local sparse model, which was successfully applied to problems such as image denoising. Feng et al. proposed a method that jointly learns the projection matrix for feature extraction and the discriminative dictionary in sparse representation, and demonstrated its validity in face recognition experiments.
PCA is the most typical dimensionality reduction method; to guarantee the sparsity of its projection vectors, scholars have proposed sparse principal component analysis (Sparse PCA, SPCA) and non-negative sparse principal component analysis (Nonnegative SPCA, NSPCA). To preserve sparse reconstruction weights, Cheng et al. proposed sparse neighborhood preserving embedding (Sparse neighborhood preserving projection, SNPE). Qiao et al. combined sparse representation with manifold learning and proposed an unsupervised feature extraction method based on sparse learning, Sparsity Preserving Projections (SPP). SPP obtains the sparse reconstruction relationships of the data through an objective function based on L1 regularization, and achieves dimensionality reduction by preserving these relationships. On the one hand, SPP automatically captures the neighborhood relationships of the data; on the other hand, even without sample label information, the projections obtained by SPP still contain a certain degree of discriminant information. As a very typical sparse learning algorithm, however, SPP also has obvious disadvantages.
Referring to Fig. 1, Sparsity Preserving Projections (SPP) aims to achieve dimensionality reduction by preserving the sparse representation structure of the data; the specific flow is shown in Fig. 1. Given a training sample set X = [x_1, x_2, ..., x_N] ∈ R^{D×N}, where D denotes the feature dimension and N the number of samples, SPP first learns the sparse coefficient vector s_i of each sample x_i by solving the following L1-norm minimization problem:

min ||s_i||_1
s.t. x_i = X s_i, 1 = 1^T s_i    (1.1)

where ||·||_1 denotes the L1 norm (the sum of absolute values) and 1 denotes the all-ones vector. Once the sparse coefficient vectors s_i (i = 1, 2, ..., N) of all samples have been learned via formula (1.1), the sparse reconstruction weight matrix S can be defined as:

S = [s_1, s_2, ..., s_N]    (1.2)

Finally, based on the weight matrix S, the objective function of SPP can be defined as:

min_w w^T X (I - S - S^T + S^T S) X^T w   s.t.  w^T X X^T w = 1    (1.3)

The optimal projection vector w is obtained as the eigenvector corresponding to the smallest eigenvalue of the generalized eigenvalue problem:

X (I - S - S^T + S^T S) X^T w = λ X X^T w    (1.4)
From the computation of the SPP method, two obvious disadvantages can be found:
(1) The L1-norm minimization problem must be solved N times to obtain the sparse coefficient vectors of all samples, so the computational complexity is high and real-time requirements are hard to meet in practical applications.
(2) The local and non-local structural information of the data is ignored, so it is difficult to learn the most discriminative feature representation.
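For concreteness, problem (1.1) for one sample can be posed as a linear program by splitting s into its positive and negative parts; the sketch below (illustrative, not from the patent; the dictionary excludes the sample's own column so it does not reconstruct itself) solves it with SciPy and shows why repeating it N times is expensive:

```python
import numpy as np
from scipy.optimize import linprog

def spp_sparse_coeffs(X, i):
    """Solve problem (1.1) for sample i:  min ||s||_1  s.t.  x_i = X_-i s,
    1 = 1^T s, where X_-i is X with column i removed."""
    x = X[:, i]
    Xr = np.delete(X, i, axis=1)
    M = Xr.shape[1]
    # Split s = u - v with u, v >= 0; then ||s||_1 = sum(u) + sum(v).
    c = np.ones(2 * M)
    A_eq = np.vstack([np.hstack([Xr, -Xr]),                    # x_i = X_-i s
                      np.concatenate([np.ones(M), -np.ones(M)])])  # 1 = 1^T s
    b_eq = np.concatenate([x, [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    if res.status != 0:
        raise RuntimeError("L1 problem infeasible for this sample")
    return res.x[:M] - res.x[M:]

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 8))   # D = 3 features, N = 8 samples
s0 = spp_sparse_coeffs(X, 0)      # one of the N solves SPP must repeat
```

Each call is a full linear program; SPP repeats it once per sample, which is the complexity issue disadvantage (1) points at.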
Summary of the invention
Aiming at the high computational complexity of current sparse representation algorithms and their failure to consider the geometric structure of the data itself, the present invention proposes a discrimination projection method based on sparse regularization. Its technical solution comprises the following steps:
1) construct a concatenated dictionary and learn the sparse representation structure;
2) preserve the sparse representation structure;
3) learn the local and non-local structure of the data;
4) perform the sparse-regularized discrimination projection.
The invention further discloses a pattern recognition device that applies the discrimination projection method based on sparse regularization.
Brief description of the drawings
Fig. 1 is a flow chart of the prior-art sparsity preserving projection method.
Fig. 2 is a flow chart of the discrimination projection method based on sparse regularization according to the present invention.
Detailed description of the embodiments
A discrimination projection method based on sparse regularization comprises the following steps:
1) Constructing the concatenated dictionary and learning the sparse representation structure
Given a training sample set X = [x_1, x_2, ..., x_N], where each x_i is a D-dimensional vector, the samples are first rearranged by class using the label information: X = [X_1, X_2, ..., X_C], where C is the number of classes and X_i ∈ R^{D×N_i} is the data matrix formed by all samples of the i-th class. For convenience, each class of samples is centered, that is,

X̃_i = X_i - μ_i 1^T    (1.5)

where μ_i denotes the mean of all samples of the i-th class and N_i is the number of samples in the i-th class.
Then PCA decomposition is performed on each centered class matrix X̃_i. Specifically, the eigenvectors d of the class covariance matrix are computed:

Φ_i d = λ d,  with Φ_i = (1/N_i) X̃_i X̃_i^T    (1.6)

where Φ_i denotes the covariance matrix of X̃_i and d denotes an eigenvector of this covariance matrix. To guarantee the completeness of the information, m_i principal components (that is, m_i eigenvectors; usually m_i = N_i) are selected for the i-th class to build the dictionary D_i = [d_1, d_2, ..., d_{m_i}]. Therefore, a sample x in the i-th class can be represented as:

x = D s    (1.7)
where D is the concatenated dictionary obtained by the class-wise PCA decompositions and is composed of all D_i (i = 1, 2, ..., C), and s denotes the sparse coefficient vector of x based on the concatenated dictionary D (only the entries associated with D_i are non-zero).
By the above procedure, each sample corresponds to one sparse coefficient vector. From formula (1.7), and owing to the orthogonality of the columns of D_i, the non-zero block of the sparse coefficient vector of any sample x in the i-th class can be obtained quickly by a matrix-vector multiplication, i.e.,

s^(i) = D_i^T x    (1.8)
It is easy to see that step 1) learns the sparse representation of the data quickly through simple matrix computations, avoiding the L1-norm optimization problem and thus greatly reducing the computational complexity.
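A minimal NumPy sketch of step 1) (illustrative; variable names are ours, and m_i is taken as N_i − 1 here so that each centered class matrix has full column rank, a detail the patent leaves open):

```python
import numpy as np

def build_concat_dictionary(classes):
    """Step 1): class-wise PCA sub-dictionaries D_i and their concatenation D.
    `classes` is a list of D x N_i sample matrices, one per class."""
    dicts = []
    for Xi in classes:
        Xc = Xi - Xi.mean(axis=1, keepdims=True)   # centering, formula (1.5)
        # Left singular vectors of Xc = eigenvectors of its covariance (1.6).
        U, _, _ = np.linalg.svd(Xc, full_matrices=False)
        mi = Xi.shape[1] - 1                       # rank of a centered matrix
        dicts.append(U[:, :mi])                    # D_i = [d_1, ..., d_mi]
    return dicts, np.hstack(dicts)

rng = np.random.default_rng(2)
classes = [rng.standard_normal((10, 5)) for _ in range(3)]   # 3 classes, D = 10
dicts, Dcat = build_concat_dictionary(classes)

# Formula (1.8): the coefficients of a (centered) class-i sample come from a
# single matrix-vector product with the orthonormal sub-dictionary D_i.
x = classes[0][:, 0] - classes[0].mean(axis=1)
s_i = dicts[0].T @ x
```

Compared with the per-sample linear program of SPP, the whole representation step is a handful of SVDs plus matrix-vector products.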
2) Preserving the sparse representation structure
The sparse representation structure of the data reveals the local discriminant information of the training samples well. To obtain a low-dimensional representation of the data and realize dimensionality reduction, we wish to preserve this sparse representation structure. Therefore, the following objective function is defined, and the optimal projection is found by minimizing the reconstruction error so as to retain the sparsity structure:

min_w Σ_{i=1}^{N} ||w^T x_i - w^T D s_i||^2    (1.9)

where s_i denotes the sparse coefficient vector of sample x_i and w is the optimal projection vector. By algebraic manipulation, formula (1.9) can be rewritten as:

min_w w^T (X X^T - X S^T D^T - D S X^T + D S S^T D^T) w    (1.10)
where S = [s_1, s_2, ..., s_N] denotes the matrix formed by the sparse coefficient vectors of all samples.
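The rewriting from (1.9) to (1.10) can be checked numerically; in this illustrative sketch (our names, random stand-in data), both sides agree:

```python
import numpy as np

rng = np.random.default_rng(3)
D_feat, N, M = 6, 10, 8
X = rng.standard_normal((D_feat, N))     # sample matrix
Dict = rng.standard_normal((D_feat, M))  # stand-in for the concatenated dictionary D
S = rng.standard_normal((M, N))          # stand-in coefficient matrix [s_1, ..., s_N]
w = rng.standard_normal(D_feat)

# Formula (1.9): summed reconstruction error after projection by w.
lhs = sum((w @ X[:, i] - w @ Dict @ S[:, i]) ** 2 for i in range(N))

# Formula (1.10): the same quantity written as a single quadratic form.
Q = X @ X.T - X @ S.T @ Dict.T - Dict @ S @ X.T + Dict @ S @ S.T @ Dict.T
rhs = w @ Q @ w
```

Writing the sum as one quadratic form is what later lets the whole objective collapse into a single generalized eigenvalue problem.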
3) Learning the local and non-local structure of the data
To capture the local and non-local properties of the data as a whole, two scatter matrices are defined: the local scatter and the non-local scatter. The local scatter matrix is expressed as:

S_L = (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} H_ij (x_i - x_j)(x_i - x_j)^T    (1.11)

where x_i is a data sample and the weight H_ij is defined as:

H_ij = 1 if x_j ∈ O(K, x_i);  H_ij = 0 otherwise    (1.12)

Here O(K, x_i) denotes the set of K nearest neighbors of sample x_i; the range of K is 1 to N_i - 1, and the most suitable value can be chosen experimentally. Formula (1.12) means: if x_j belongs to the K nearest neighbors of x_i, the two points are considered connected by an edge and H_ij = 1; otherwise they are considered unconnected and H_ij = 0.

Accordingly, the non-local scatter matrix can be expressed as:

S_N = (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} (1 - H_ij)(x_i - x_j)(x_i - x_j)^T    (1.13)
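A sketch of formulas (1.11)-(1.13) (illustrative; the 1/2 factor and the symmetrized K-NN graph are our reading of the standard construction, which the patent's garbled equations do not spell out). By construction the two matrices partition every pairwise difference, so S_L + S_N equals the total scatter:

```python
import numpy as np

def local_nonlocal_scatter(X, K):
    """Local scatter S_L (1.11) and non-local scatter S_N (1.13) built from
    the K-nearest-neighbor weights H of formula (1.12).  X is D x N."""
    D, N = X.shape
    G = X.T @ X
    sq = np.diag(G)
    dist2 = sq[:, None] + sq[None, :] - 2 * G     # pairwise squared distances
    H = np.zeros((N, N))
    for i in range(N):
        order = [j for j in np.argsort(dist2[i]) if j != i]
        H[i, order[:K]] = 1.0                     # K nearest neighbors of x_i
    H = np.maximum(H, H.T)                        # symmetrize the graph
    S_L = np.zeros((D, D))
    S_N = np.zeros((D, D))
    for i in range(N):
        for j in range(N):
            d = (X[:, i] - X[:, j])[:, None]
            if H[i, j]:
                S_L += 0.5 * d @ d.T              # neighbor pair -> local
            else:
                S_N += 0.5 * d @ d.T              # distant pair -> non-local
    return S_L, S_N

rng = np.random.default_rng(4)
X = rng.standard_normal((5, 12))
S_L, S_N = local_nonlocal_scatter(X, K=3)
```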
4) Sparse-regularized discrimination projection
The aim of the discrimination projection method based on sparse regularization is to find a set of optimal projection vectors that, on the one hand, preserve the sparse representation structure of the data and, on the other hand, maximize the non-local scatter while minimizing the local scatter. In view of this, the objective function of the sparse-regularized discrimination projection can be defined as:

max_w [w^T S_N w] / [α w^T S_L w + (1-α) Σ_{i=1}^{N} ||w^T x_i - w^T D s_i||^2]    (1.14)

where α (0 < α < 1) is a balance parameter whose value adjusts the trade-off between the two measures in the denominator, and S_N and S_L are the non-local and local scatter matrices defined above.
Substituting formulas (1.10), (1.11), and (1.13) into formula (1.14) gives:

max_w [w^T S_N w] / [w^T Ψ w]    (1.15)

where

Ψ = α S_L + (1-α)(X X^T - X S^T D^T - D S X^T + D S S^T D^T)

The optimization problem (1.15) can finally be converted into the following generalized eigenvalue problem:

S_N w = λ Ψ w    (1.16)
Therefore, the optimal projection matrix W = [w_1, w_2, ..., w_d] is composed of the eigenvectors corresponding to the d largest eigenvalues obtained by solving the generalized eigenvalue problem above.
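Step 4) then reduces to one generalized symmetric eigenproblem. A sketch with SciPy (illustrative; the toy scatter matrices stand in for (1.11)/(1.13), and the small ridge that keeps Ψ positive definite is our addition, not discussed in the patent):

```python
import numpy as np
from scipy.linalg import eigh

def discriminant_projection(S_N, S_L, X, Dict, S, alpha=0.5, d=2, ridge=1e-8):
    """Solve the generalized eigenproblem S_N w = lambda * Psi w of formula
    (1.16) and return W = [w_1, ..., w_d] for the d largest eigenvalues."""
    recon = (X @ X.T - X @ S.T @ Dict.T - Dict @ S @ X.T
             + Dict @ S @ S.T @ Dict.T)           # quadratic form of (1.10)
    Psi = alpha * S_L + (1 - alpha) * recon
    Psi = Psi + ridge * np.eye(Psi.shape[0])      # keep Psi positive definite
    evals, evecs = eigh(S_N, Psi)                 # ascending eigenvalues
    return evecs[:, ::-1][:, :d]                  # largest eigenvalues first

rng = np.random.default_rng(5)
X = rng.standard_normal((5, 12))
Dict = rng.standard_normal((5, 8))                # stand-in concatenated dictionary
S = rng.standard_normal((8, 12))                  # stand-in coefficient matrix
A = rng.standard_normal((5, 5)); SN_ = A @ A.T    # toy non-local scatter
B = rng.standard_normal((5, 5)); SL_ = B @ B.T    # toy local scatter
W = discriminant_projection(SN_, SL_, X, Dict, S, alpha=0.5, d=2)
```

`scipy.linalg.eigh(a, b)` normalizes each eigenvector v so that vᵀΨv = 1, which is why the returned columns are Ψ-orthonormal.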
The overall computation of the algorithm of the invention considers the local structure of the data twice: once in constructing the neighborhood graph by means of K nearest neighbors, as in formula (1.12), and once in learning the sparse representation, see step 1).
In step 3), only the local or only the non-local structure of the data may be considered, or other structures may be substituted; the chosen structure is finally brought into the objective function and optimized.
The ratio form of objective function (1.15) of the invention can also be transformed into a difference form to achieve the same purpose.
The invention is summarized as follows:
1. The invention makes full use of a concatenated dictionary to learn the sparse representation of the data, avoiding the solution of L1-norm problems and greatly reducing the computational complexity.
2. Relative to the sparsity preserving projection algorithm, the invention not only considers the sparse representation structure of the data but also uses the label information of the data.
3. By minimizing the local structure and maximizing the non-local structure, the invention takes the geometric properties of the original data into account.
4. The overall computation of the algorithm considers the local structure of the data twice: once in constructing the neighborhood graph by K nearest neighbors, as in formula (1.12), and once in learning the sparse representation, see step 1).
The invention further discloses a pattern recognition device that applies the discrimination projection method based on sparse regularization.
Many specific details are set forth in the above description for a full understanding of the invention. However, the above description covers only preferred embodiments of the invention, and the invention can be implemented in many ways other than those described here; the invention is therefore not limited by the specific implementations disclosed above. Any person skilled in the art may, without departing from the technical solution of the invention, make many possible changes and modifications to the technical solution of the invention using the methods and technical content disclosed above, or modify it into equivalent embodiments of equivalent variation. Any simple amendments, equivalent changes, and modifications made to the above embodiments according to the technical essence of the invention, without departing from the content of the technical solution of the invention, remain within the scope of protection of the technical solution of the invention.

Claims (6)

1. A discrimination projection method based on sparse regularization, characterized by comprising the following steps:
step 1) constructing a concatenated dictionary and learning the sparse representation structure;
step 2) preserving the sparse representation structure;
step 3) learning the local and non-local structure of the data;
step 4) performing the sparse-regularized discrimination projection.
2. The discrimination projection method based on sparse regularization according to claim 1, characterized in that step 1) further comprises the following content:
given a training sample set X = [x_1, x_2, ..., x_N], where each x_i is a D-dimensional vector, the samples are rearranged by class using the label information: X = [X_1, X_2, ..., X_C], where C is the number of classes and X_i ∈ R^{D×N_i} is the data matrix formed by all samples of the i-th class; for convenience, each class of samples is centered, that is,

X̃_i = X_i - μ_i 1^T    (1.5)

where μ_i denotes the mean of all samples of the i-th class and N_i is the number of samples in the i-th class;
then PCA decomposition is performed on each centered class matrix X̃_i, specifically:

Φ_i d = λ d,  with Φ_i = (1/N_i) X̃_i X̃_i^T    (1.6)

where Φ_i denotes the covariance matrix of X̃_i and d denotes an eigenvector of this covariance matrix; to guarantee the integrity of the information, m_i principal components are selected for the i-th class of samples to construct the dictionary D_i = [d_1, d_2, ..., d_{m_i}]; therefore, any sample x in the i-th class can be represented as:

x = D s    (1.7)

where D is the concatenated dictionary obtained by the PCA decompositions and is composed of all D_i (i = 1, 2, ..., C), and s denotes the sparse coefficient vector based on the concatenated dictionary D;
by the above procedure, each sample corresponds to one sparse coefficient vector; from formula (1.7), and owing to the orthogonality of the columns of D_i, the non-zero block of the sparse coefficient vector of any sample x in the i-th class can be obtained quickly by a matrix-vector multiplication, that is,

s^(i) = D_i^T x    (1.8)
3. The discrimination projection method based on sparse regularization according to claim 2, characterized in that step 2) further comprises the following content:
to obtain a low-dimensional representation of the data and realize dimensionality reduction, the sparse representation structure of the data needs to be preserved; therefore, the following objective function is defined, and the optimal projection is found by minimizing the reconstruction error so as to retain the sparsity structure:

min_w Σ_{i=1}^{N} ||w^T x_i - w^T D s_i||^2    (1.9)

where s_i denotes the sparse coefficient vector of sample x_i; by algebraic manipulation, formula (1.9) can be rewritten as:

min_w w^T (X X^T - X S^T D^T - D S X^T + D S S^T D^T) w    (1.10)

where S = [s_1, s_2, ..., s_N] denotes the matrix formed by the sparse coefficient vectors of all samples.
4. The discrimination projection method based on sparse regularization according to claim 3, characterized in that step 3) further comprises the following content:
to learn the spatial geometric characteristics of the data, two scatter matrices are defined: the local scatter and the non-local scatter; the local scatter matrix is defined as:

S_L = (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} H_ij (x_i - x_j)(x_i - x_j)^T    (1.11)

where x_i denotes a sample and the weight H_ij is defined as:

H_ij = 1 if x_j ∈ O(K, x_i);  H_ij = 0 otherwise    (1.12)

where O(K, x_i) denotes the set of K nearest neighbors of sample x_i; formula (1.12) means that if x_j belongs to the K nearest neighbors of x_i, the two points are considered connected by an edge and H_ij = 1, otherwise they are considered unconnected and H_ij = 0;
similarly, the non-local scatter matrix can be defined as:

S_N = (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} (1 - H_ij)(x_i - x_j)(x_i - x_j)^T    (1.13)
5. The discrimination projection method based on sparse regularization according to claim 4, characterized in that step 4) further comprises the following content:
the aim of the sparse-regularized discrimination projection is to find a set of optimal projection vectors that, on the one hand, preserve the sparse representation structure of the data and, on the other hand, maximize the ratio of the non-local scatter to the local scatter; in view of this, the objective function of the sparse-regularized discrimination projection can be defined as:

max_w [w^T S_N w] / [α w^T S_L w + (1-α) Σ_{i=1}^{N} ||w^T x_i - w^T D s_i||^2]    (1.14)

where α (0 < α < 1) is a balance parameter whose value adjusts the trade-off between the two measures in the denominator, and S_N and S_L are the non-local and local scatter matrices defined in step 3);
substituting formulas (1.10), (1.11), and (1.13) into formula (1.14) gives:

max_w [w^T S_N w] / [w^T Ψ w]    (1.15)

where

Ψ = α S_L + (1-α)(X X^T - X S^T D^T - D S X^T + D S S^T D^T)

the optimization problem (1.15) can finally be converted into the following generalized eigenvalue problem:

S_N w = λ Ψ w    (1.16)

therefore, the optimal projection matrix W = [w_1, w_2, ..., w_d] is composed of the eigenvectors corresponding to the d largest eigenvalues obtained by solving the generalized eigenvalue problem above.
6. A pattern recognition device, characterized by comprising the discrimination projection method based on sparse regularization according to any one of claims 1-5.
CN201811625721.6A 2018-12-28 2018-12-28 Discrimination projection method based on sparse regularization and image recognition device Active CN109615026B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811625721.6A CN109615026B (en) 2018-12-28 2018-12-28 Discrimination projection method based on sparse regularization and image recognition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811625721.6A CN109615026B (en) 2018-12-28 2018-12-28 Discrimination projection method based on sparse regularization and image recognition device

Publications (2)

Publication Number Publication Date
CN109615026A true CN109615026A (en) 2019-04-12
CN109615026B CN109615026B (en) 2020-11-17

Family

ID=66011879

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811625721.6A Active CN109615026B (en) 2018-12-28 2018-12-28 Discrimination projection method based on sparse regularization and image recognition device

Country Status (1)

Country Link
CN (1) CN109615026B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110648276A (en) * 2019-09-25 2020-01-03 重庆大学 High-dimensional image data dimension reduction method based on manifold mapping and dictionary learning
CN110852345A (en) * 2019-09-29 2020-02-28 国网浙江省电力有限公司宁波供电公司 Image classification method

Citations (6)

Publication number Priority date Publication date Assignee Title
CN103544507A (en) * 2013-10-15 2014-01-29 中国矿业大学 Method for reducing dimensions of hyper-spectral data on basis of pairwise constraint discriminate analysis and non-negative sparse divergence
CN106066994A (en) * 2016-05-24 2016-11-02 北京工业大学 A kind of face identification method of the rarefaction representation differentiated based on Fisher
CN106815599A (en) * 2016-12-16 2017-06-09 合肥工业大学 A kind of identification sparse coding dictionary learning method general in image classification
CN108647690A (en) * 2017-10-17 2018-10-12 南京工程学院 The sparse holding projecting method of differentiation for unconstrained recognition of face
US20180306882A1 (en) * 2017-04-24 2018-10-25 Cedars-Sinai Medical Center Low-rank tensor imaging for multidimensional cardiovascular mri
US10162794B1 (en) * 2018-03-07 2018-12-25 Apprente, Inc. Hierarchical machine learning system for lifelong learning

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN103544507A (en) * 2013-10-15 2014-01-29 中国矿业大学 Method for reducing dimensions of hyper-spectral data on basis of pairwise constraint discriminate analysis and non-negative sparse divergence
CN106066994A (en) * 2016-05-24 2016-11-02 北京工业大学 A kind of face identification method of the rarefaction representation differentiated based on Fisher
CN106815599A (en) * 2016-12-16 2017-06-09 合肥工业大学 A kind of identification sparse coding dictionary learning method general in image classification
US20180306882A1 (en) * 2017-04-24 2018-10-25 Cedars-Sinai Medical Center Low-rank tensor imaging for multidimensional cardiovascular mri
CN108647690A (en) * 2017-10-17 2018-10-12 南京工程学院 The sparse holding projecting method of differentiation for unconstrained recognition of face
US10162794B1 (en) * 2018-03-07 2018-12-25 Apprente, Inc. Hierarchical machine learning system for lifelong learning

Non-Patent Citations (1)

Title
袁森: "《张量局部保持投影算法研究及应用》", 《中国优秀硕士学位论文全文数据库 信息科技辑》 *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN110648276A (en) * 2019-09-25 2020-01-03 重庆大学 High-dimensional image data dimension reduction method based on manifold mapping and dictionary learning
CN110852345A (en) * 2019-09-29 2020-02-28 国网浙江省电力有限公司宁波供电公司 Image classification method
CN110852345B (en) * 2019-09-29 2023-06-09 国网浙江省电力有限公司宁波供电公司 Image classification method

Also Published As

Publication number Publication date
CN109615026B (en) 2020-11-17

Similar Documents

Publication Publication Date Title
CN111860612B (en) Unsupervised hyperspectral image hidden low-rank projection learning feature extraction method
CN107133496B (en) Gene feature extraction method based on manifold learning and closed-loop deep convolution double-network model
WO2022041678A1 (en) Remote sensing image feature extraction method employing tensor collaborative graph-based discriminant analysis
CN107590515B (en) Hyperspectral image classification method of self-encoder based on entropy rate superpixel segmentation
CN106023065A (en) Tensor hyperspectral image spectrum-space dimensionality reduction method based on deep convolutional neural network
CN107085716A (en) Across the visual angle gait recognition method of confrontation network is generated based on multitask
CN113723255B (en) Hyperspectral image classification method and storage medium
CN109766858A (en) Three-dimensional convolution neural network hyperspectral image classification method combined with bilateral filtering
CN111639719B (en) Footprint image retrieval method based on space-time motion and feature fusion
CN106846322B (en) The SAR image segmentation method learnt based on curve wave filter and convolutional coding structure
CN105335975B (en) Polarization SAR image segmentation method based on low-rank decomposition and statistics with histogram
CN106683102A (en) SAR image segmentation method based on ridgelet filters and convolution structure model
CN109712150A (en) Optical microwave image co-registration method for reconstructing and device based on rarefaction representation
CN110889865A (en) Video target tracking method based on local weighted sparse feature selection
CN111274905A (en) AlexNet and SVM combined satellite remote sensing image land use change detection method
CN103295031A (en) Image object counting method based on regular risk minimization
CN106157330A (en) A kind of visual tracking method based on target associating display model
CN107610028A (en) A kind of atmosphere pollution on-line monitoring system based on wireless cloud Sensor Network
CN111680579B (en) Remote sensing image classification method for self-adaptive weight multi-view measurement learning
CN111967537A (en) SAR target classification method based on two-way capsule network
CN109615026A (en) A kind of differentiation projecting method and pattern recognition device based on Sparse rules
CN114937173A (en) Hyperspectral image rapid classification method based on dynamic graph convolution network
CN111242028A (en) Remote sensing image ground object segmentation method based on U-Net
CN106528679A (en) Time series analysis method based on multilinear autoregression model
Zhou et al. Optimisation of Gaussian mixture model for satellite image classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant