CN104008383A - Hyperspectral image characteristic extraction algorithm based on manifold learning linearization - Google Patents

Hyperspectral image characteristic extraction algorithm based on manifold learning linearization

Info

Publication number
CN104008383A
CN104008383A
Authority
CN
China
Prior art keywords
matrix
manifold learning
algorithm
dimensionality reduction
linearization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410286545.3A
Other languages
Chinese (zh)
Other versions
CN104008383B (en
Inventor
张淼
赖镇洲
刘攀
沈毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN201410286545.3A priority Critical patent/CN104008383B/en
Publication of CN104008383A publication Critical patent/CN104008383A/en
Application granted granted Critical
Publication of CN104008383B publication Critical patent/CN104008383B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a hyperspectral image feature extraction algorithm based on manifold learning linearization, belonging to the technical field of hyperspectral image data processing and application. An improved manifold learning linearization algorithm overcomes the shortcoming that manifold learning algorithms have no generalization ability. The method comprises the following steps: first, compute a preliminary dimensionality reduction result and a Laplacian matrix; second, construct the constant-term matrix and the coefficient matrix of a matrix equation; third, compute the feature transformation matrix; and fourth, compute the final dimensionality reduction result from the feature transformation matrix. To address the fact that the global-linear-mapping hypothesis of the LPP, NPE, and LLTSA linearized manifold learning algorithms is often invalid, a penalty term for deviating from the original manifold learning result is added to the original cost function, the constraint term in the original objective function is removed, and solving for the optimal feature transformation matrix is converted into solving a matrix equation. The algorithm is suitable for hyperspectral image feature extraction.

Description

Hyperspectral image feature extraction algorithm based on manifold learning linearization
Technical field
The invention belongs to the technical field of hyperspectral image data processing and application, and relates to a hyperspectral image feature extraction algorithm, specifically a hyperspectral image feature extraction algorithm based on manifold learning linearization.
Background technology
A hyperspectral image is a data cube carrying an enormous amount of information: each pixel corresponds to a spectral curve spanning hundreds of bands, which makes it possible to study the relationship between materials and their spectra. However, hyperspectral data suffers from redundancy and the curse of dimensionality, so there is a pressing need to eliminate this redundancy. The redundancy mainly arises from correlation between hyperspectral bands, and dimensionality reduction is therefore an important preprocessing method. Linear dimensionality reduction algorithms such as PCA (Principal Component Analysis) and LDA (Linear Discriminant Analysis) are simple to implement, but hyperspectral images have nonlinear characteristics, and manifold learning algorithms can better uncover the nonlinear structure of hyperspectral data and improve data analysis. Classical manifold learning algorithms include LE (Laplacian Eigenmaps), LLE (Locally Linear Embedding), and LTSA (Local Tangent Space Alignment), all of which can serve as feature extraction algorithms for hyperspectral images.
However, classical manifold learning algorithms such as LE have no generalization ability: when new hyperspectral sample data arrives, the dimensionality reduction result for the new samples can only be obtained by combining the new and original samples and re-learning over the whole set. When the number of new samples is small compared with the number of original samples, repeating such a large-scale computation greatly increases the time complexity of the algorithm. From the standpoint of hyperspectral classification, a manifold learning algorithm must possess generalization ability, because in many cases the training data and the test data to be classified cannot be learned together; the test data must instead be mapped into the low-dimensional feature space by a generalization algorithm, otherwise new hyperspectral data cannot be classified in that space.
Although linear dimensionality reduction methods such as PCA do not handle nonlinear data well, they do yield a global mapping function and therefore possess generalization ability. For this reason many researchers have linearized manifold learning to remedy its lack of generalization ability. Typical examples are the linearizations of LE, LLE, and LTSA, which yield LPP (Locality Preserving Projections), NPE (Neighborhood Preserving Embedding), and LLTSA (Linear Local Tangent Space Alignment), respectively.
Manifold learning algorithms share a unified framework. For the spectral-decomposition-based manifold learning algorithms such as LE, LLE, and LTSA, the optimal dimensionality reduction result Y* can be obtained by solving the optimization problem below:
Y* = arg min_Y tr(Y L Y^T)   s.t.   Y B Y^T = I   (1),
where tr(·) is the matrix trace operator, L is the Laplacian matrix, and B is the constraint matrix. LPP, NPE, and LLTSA assume that a global linear mapping exists in the dimensionality reduction process:
Y = V^T X   (2),
which turns the optimization problem in formula (1) into:
V* = arg min_V tr(V^T X L X^T V)   s.t.   V^T X B X^T V = I   (3).
Once the optimal linear mapping matrix V* is obtained, any sample, new or old, can be mapped into the reduced space through formula (2). But the hypothesis of formula (2) is often invalid: to obtain a global linear mapping, these algorithms in fact compromise the locality-preserving property of the original algorithms.
Summary of the invention
Addressing the lack of generalization ability of manifold learning algorithms, the present invention proposes an improved manifold learning linearization algorithm and applies it to feature extraction for hyperspectral images.
The object of the invention is achieved through the following technical solutions:
To keep the linearization process from sacrificing, as far as possible, the original manifold learning algorithm's ability to preserve local characteristics, the present invention adds to the original cost function a penalty term for deviating from the original manifold learning result, turning the optimization problem in formula (3) into:
min_V tr(V^T X L X^T V) + α ||Y_L − V^T X||_F^2   (4),
where Y_L is a dimensionality reduction result that has already been learned and α is the penalty coefficient; Y_L can be obtained by LE, LLE, LTSA, or any other well-performing manifold learning algorithm that lacks generalization ability. Compared with formula (3), formula (4) no longer needs the constraint term V^T X B X^T V = I, which widens the search region of the optimization problem and can achieve a better dimensionality reduction effect than traditional dimensionality reduction algorithms. This generalization algorithm, which solves for a linearization mapping matrix V given a known Y_L, is a global linear regression algorithm. The original manifold learning algorithms all obtain the optimal dimensionality reduction result by eigendecomposition, whereas the improved method proposed by the present invention obtains it by solving a matrix equation.
The solution procedure for the optimization problem in (4) is given here:
g(V) = tr(V^T X L X^T V) + α ||Y_L − V^T X||_F^2
     = tr(V^T X L X^T V) + α tr{[Y_L − V^T X][Y_L − V^T X]^T}
     = tr(V^T X (αI + L) X^T V) + α tr(Y_L Y_L^T) − α tr(Y_L X^T V) − α tr(V^T X Y_L^T)   (5)
Differentiating the cost function g(V) with respect to V and setting the derivative to zero gives:
∂g(V)/∂V = 2[X A X^T] V − 2α X Y_L^T = 0   (6),
where A = αI + L.
So the optimization problem in formula (4) becomes the problem of solving a matrix equation:
[X A X^T] V = α X Y_L^T   (7).
This is equivalent to the system of linear equations below:
vec(V) = (X A X^T ⊗ I_d)^{-1} vec(α X Y_L^T) = [(X A X^T)^{-1} ⊗ I_d] vec(α X Y_L^T)   (8).
So we have:
V = avec{[(X A X^T)^{-1} ⊗ I_d] vec(B), d}   (9).
Formula (9) is a function expression, where vec(·) is the matrix-to-vector operator: for the D × d matrix B, vec(B) is a Dd-dimensional vector, and avec(·) is the vector-to-matrix operator. For a D × d matrix, the rule of vec(·) is to take each row of the matrix as a d-dimensional column vector and stack these row vectors in row order into a single Dd-dimensional vector; avec(·) is its inverse operation, converting a Dd-dimensional column vector back into a D × d matrix.
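The row-wise vec(·)/avec(·) pair described above can be sketched with NumPy reshapes. This is a minimal illustration only; the operator names `vec` and `avec` follow the text's notation, and the C-order reshape is assumed to match the row-wise stacking rule described:

```python
import numpy as np

def vec(M):
    """Row-wise vectorization: stack the rows of a D x d matrix
    into a single (D*d)-dimensional vector, as described in the text."""
    return M.reshape(-1)          # C-order flattening walks the matrix row by row

def avec(v, d):
    """Inverse operation: reshape a (D*d)-vector back into a D x d matrix."""
    return v.reshape(-1, d)

M = np.arange(6.0).reshape(3, 2)  # a small D=3, d=2 example
assert np.array_equal(avec(vec(M), 2), M)  # avec undoes vec
```

The round-trip property `avec(vec(M), d) == M` is exactly the inverse relation the text states.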
Further simplification of (9) gives:
V = (X A X^T)^{-1} B   (10).
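As a numerical check on the derivation, a small NumPy sketch (all matrices randomly generated, purely for illustration) can confirm that the direct solution of formula (10) coincides with the Kronecker-product route of formulas (8)-(9), using the row-wise vec convention described above:

```python
import numpy as np

rng = np.random.default_rng(0)
D, d, N, alpha = 5, 2, 20, 1.0
X = rng.standard_normal((D, N))
Y_L = rng.standard_normal((d, N))
L = rng.standard_normal((N, N))
L = L @ L.T                               # symmetric PSD stand-in for the Laplacian

A = alpha * np.eye(N) + L
B = alpha * X @ Y_L.T                     # constant-term matrix, D x d
C = X @ A @ X.T                           # coefficient matrix, D x D

# Direct solution of the matrix equation (10): V = C^{-1} B
V_direct = np.linalg.solve(C, B)

# Vectorized route of formulas (8)-(9): vec(V) = (C^{-1} ⊗ I_d) vec(B), row-wise vec
Cinv_kron = np.kron(np.linalg.inv(C), np.eye(d))
V_kron = (Cinv_kron @ B.reshape(-1)).reshape(D, d)

assert np.allclose(V_direct, V_kron)      # both routes agree
```

This also illustrates why the simplification to (10) matters: the direct solve works with a D × D system, whereas the Kronecker route builds a Dd × Dd matrix.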
The hyperspectral image feature extraction algorithm based on manifold learning linearization provided by the invention has the following concrete steps:
Step 1: Compute the preliminary dimensionality reduction result and the Laplacian matrix.
Given a hyperspectral data set X, where X is a D × N matrix, D is the data dimension, N is the number of samples, and d is the reduced dimension, run one of the manifold learning algorithms LE, LLE, or LTSA to obtain its preliminary dimensionality reduction result Y_L and Laplacian matrix L, where Y_L is a d × N matrix and L is an N × N matrix.
Step 2: Build the matrix equation's constant-term matrix and coefficient matrix.
Linearized manifold learning methods such as LPP, NPE, and LLTSA usually obtain the optimal feature transformation matrix by eigendecomposition involving the N × N Laplacian matrix L, which is computationally expensive. The innovation of this step is to convert the solution for the optimal feature transformation matrix into the solution of a matrix equation.
The matrix equation is as shown in formula (7); the construction formulas for the constant-term matrix and the coefficient matrix are given here:
1) Build the D × d constant-term matrix B of the matrix equation:
B = α X Y_L^T,
where α is a positive penalty coefficient, which by default can be set to 1.
2) Build the D × D coefficient matrix C of the matrix equation:
C = X(αI + L)X^T,
where I is the N × N identity matrix.
Step 3: Compute the feature transformation matrix.
Solving a matrix equation usually involves vectorizing matrices, converting vectors back into matrices, and computing Kronecker products; these operations make the solution both time-consuming and memory-hungry. The innovation of this step is to reduce the solution of the matrix equation to a matrix inversion followed by a matrix multiplication, which is easy to implement and has low algorithmic complexity.
1) Invert the coefficient matrix C to obtain the matrix H:
H = C^{-1}.
2) Multiply H by the constant-term matrix B to obtain the feature transformation matrix V:
V = HB.
Step 4: Compute the final dimensionality reduction result from the feature transformation matrix:
Y = V^T X,
where Y is the final dimensionality reduction result.
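Steps 2-4 above admit a compact NumPy sketch. The data, preliminary embedding, and Laplacian below are random stand-ins purely for illustration; in practice Y_L and L would come from LE, LLE, or LTSA as described in step 1. One deliberate deviation: `np.linalg.solve` is used in place of the explicit inversion H = C^{-1}, which is numerically preferable while computing the same V:

```python
import numpy as np

def manifold_linearization(X, Y_L, L, alpha=1.0):
    """Steps 2-4 of the described method: build B and C, solve for the
    feature transformation matrix V, and map the data to the reduced space.
    X: D x N data, Y_L: d x N preliminary embedding, L: N x N Laplacian."""
    N = X.shape[1]
    B = alpha * X @ Y_L.T                   # step 2.1: constant-term matrix (D x d)
    C = X @ (alpha * np.eye(N) + L) @ X.T   # step 2.2: coefficient matrix (D x D)
    V = np.linalg.solve(C, B)               # step 3: V = C^{-1} B without forming C^{-1}
    Y = V.T @ X                             # step 4: final reduction of the training data
    return V, Y

# Toy stand-ins (the patent uses LLE's Y_L and L; random data is for illustration only)
rng = np.random.default_rng(1)
D, d, N = 10, 3, 50
X = rng.standard_normal((D, N))
Y_L = rng.standard_normal((d, N))
W = rng.standard_normal((N, N)) * 0.01
L = (np.eye(N) - W).T @ (np.eye(N) - W)     # LLE-style Laplacian, symmetric PSD

V, Y = manifold_linearization(X, Y_L, L)
X_new = rng.standard_normal((D, 5))         # generalization: new samples map through V
Y_new = V.T @ X_new
```

The last two lines show the point of the whole construction: once V is learned, new samples are reduced by a single matrix multiplication, with no re-learning.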
The beneficial effects of the present invention are:
1. The algorithm is a global linear regression algorithm. Any manifold learning algorithm that lacks generalization ability gains generalization ability after being linearized by this method, and can achieve a better feature extraction effect than traditional dimensionality reduction algorithms.
2. Addressing the fact that the global-linear-mapping hypothesis of the LPP, NPE, and LLTSA linearized manifold learning algorithms is often invalid, a penalty term for deviating from the original manifold learning result is added to the original cost function and the constraint term in the original objective function is discarded. This expands the search region of the optimization problem and can achieve a better dimensionality reduction effect than traditional dimensionality reduction algorithms.
Brief description of the drawings
Fig. 1 is the step flow chart of the present invention;
Fig. 2 is the feature scatter diagram of the hyperspectral image data in the present invention;
Fig. 3 is the classification accuracy of the hyperspectral image after feature extraction according to the present invention.
Embodiment
The technical scheme of the present invention is further described below in conjunction with the accompanying drawings, but the invention is not limited thereto: any modification or equivalent replacement of the technical scheme of the present invention that does not depart from its spirit and scope shall be encompassed in the protection scope of the present invention.
In its first step the present invention needs an existing manifold learning algorithm to obtain the Laplacian matrix and the preliminary dimensionality reduction result; here the LLE algorithm is taken as the example manifold learning algorithm for the first step, after which the algorithm proposed by the invention performs feature extraction on the hyperspectral image. The experiment uses the IND PINE hyperspectral image, photographed over farmland in Indiana, USA, and obtained from the Kennedy Space Center of the United States. The image contains 16 different crops; its spatial resolution is 20 × 20 m², each pixel has 224 bands covering the spectral range 0.2–2.4 μm, and the spectral resolution is 10 nm.
From the IND PINE hyperspectral image, 1500 pixels are randomly selected as training samples, and 1500 pixels are randomly selected from the remaining samples as test samples. In this embodiment the present invention learns from the training samples to obtain the feature transformation matrix, then uses the feature transformation matrix to extract features from the test samples, and classifies the test samples with a KNN classifier. To verify the validity of the feature extraction of the present invention, the NPE algorithm, the classical linearization of LLE, is used as a comparison on the same training data, test data, and classifier.
As shown in Fig. 1, the concrete steps of feature extraction with the present invention are as follows:
Step 1: Obtain the dimensionality reduction result Y_L and the Laplacian matrix L by the LLE manifold learning algorithm.
1) Input the training set, in which each sample's data dimension is 224 and the number of samples is 1500; set the neighborhood size to 20; set the reduced dimension to 30.
2) Find the neighborhood set X_i.
For each sample x_i in the training set, where i is the index of x_i in the training sample set X, i = 1, 2, ..., 1500, sort the samples by Euclidean distance and search the training set for the 20 samples nearest to x_i; these form the neighborhood set X_i of sample x_i.
3) Build the reconstruction coefficient matrix W:
W_ij = T(x_j, X_i) / S(x_j, X_i) if x_j ∈ X_i, and W_ij = 0 otherwise, for i, j = 1, 2, ..., 1500,
where W_ij is the reconstruction coefficient of x_j for x_i, and x_j is the sample with index j in the training sample set X. T(x_j, X_i) is computed by the formula below:
T(x_j, X_i) = Σ_{l=1}^{20} P_{l, index(x_j, X_i)},
where index(x_j, X_i) is the index of x_j within X_i, so T is the sum of all elements of column index(x_j, X_i) of the matrix P. The matrix P is the inverse of the local covariance matrix of the neighborhood set X_i, computed by the formula below:
P = [(X_i − x_i)^T (X_i − x_i)]^{-1},
and S is the sum of all elements of P, computed by the formula below:
S(x_j, X_i) = Σ_{l=1}^{20} Σ_{m=1}^{20} P_{l,m}.
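The weight construction above can be sketched in NumPy. Two assumptions are made for illustration: Euclidean nearest neighbors are found by brute force, and a small regularizer is added to the local covariance before inversion for numerical stability (the text inverts it directly, which can fail when the neighborhood size exceeds the data dimension):

```python
import numpy as np

def lle_weights(X, k=20, reg=1e-3):
    """LLE reconstruction weights as described: for each sample x_i, invert the
    local covariance of its k-neighborhood X_i to get P, then set
    W_ij = T(x_j, X_i) / S, i.e. (column sum of P at x_j's index) / (sum of P).
    X: D x N data matrix; reg is an assumed regularizer, not in the text."""
    D, N = X.shape
    W = np.zeros((N, N))
    for i in range(N):
        dists = np.linalg.norm(X - X[:, [i]], axis=0)
        nbrs = np.argsort(dists)[1:k + 1]            # k nearest, excluding x_i itself
        Xi = X[:, nbrs]
        G = (Xi - X[:, [i]]).T @ (Xi - X[:, [i]])    # local covariance, k x k
        P = np.linalg.inv(G + reg * np.trace(G) * np.eye(k))
        S = P.sum()                                  # sum of all elements of P
        W[i, nbrs] = P.sum(axis=0) / S               # column sums of P, normalized
    return W

rng = np.random.default_rng(4)
X = rng.standard_normal((5, 40))                     # toy D=5, N=40 data
W = lle_weights(X, k=10)
assert np.allclose(W.sum(axis=1), 1.0)               # each row sums to 1 by construction
```

Since T(x_j, X_i) sums one column of P and S sums all of P, each row of W sums to 1 automatically, which is the standard LLE sum-to-one reconstruction constraint.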
4) build Laplacian Matrix:
L=(I-W) T(I-W),
where I is the N × N identity matrix.
5) Perform the eigendecomposition:
L z = λz.
6) Obtain the preliminary dimensionality reduction result:
Y_L = [z_2, z_3, ..., z_{d+1}]^T,
where z_ii is the eigenvector of L corresponding to its ii-th smallest eigenvalue, ii = 2, ..., d + 1.
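Sub-steps 4)-6) can be sketched as follows, assuming the reconstruction weight matrix W has already been built; the toy row-stochastic W below stands in for the LLE weights and is for illustration only:

```python
import numpy as np

def preliminary_embedding(W, d):
    """Build L = (I - W)^T (I - W) from the reconstruction weights W (N x N),
    then take the eigenvectors of the 2nd through (d+1)-th smallest eigenvalues
    of L as the d x N preliminary embedding Y_L, as described in the text."""
    N = W.shape[0]
    M = np.eye(N) - W
    L = M.T @ M                             # symmetric PSD Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)    # eigenvalues in ascending order
    Y_L = eigvecs[:, 1:d + 1].T             # skip the trivial constant eigenvector
    return Y_L, L

# Toy row-stochastic weights: each sample reconstructed from the others
rng = np.random.default_rng(2)
W = rng.random((30, 30))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)
Y_L, L = preliminary_embedding(W, d=2)
```

Because each row of W sums to 1, L has the constant vector as an eigenvector with eigenvalue 0, which is why the smallest eigenvalue is skipped when forming Y_L.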
Step 2: Build the matrix equation's constant-term matrix B and coefficient matrix C.
1) Set the positive penalty coefficient α; here α = 1.
2) Compute the constant-term matrix B of the matrix equation by the formula below:
B = α X Y_L^T.
3) Compute the coefficient matrix C of the matrix equation by the formula below:
C = X(αI + L)X^T.
Step 3: Compute the feature transformation matrix V.
1) Invert the coefficient matrix C to obtain the matrix H:
H = C^{-1}.
2) Multiply H by the constant-term matrix B to obtain the feature transformation matrix V:
V = HB.
Step 4: Compute the final dimensionality reduction result from the feature transformation matrix.
1) Compute the final dimensionality reduction result Y of the training set by the formula below:
Y = V^T X.
2) Compute the final dimensionality reduction result Ỹ of the test set by the formula below:
Ỹ = V^T X̃,
where X̃ is the test sample set.
The final dimensionality reduction results of the test set and the training set are shown in Fig. 2. As can be seen from the figure, the generalized result for the new samples is basically consistent with the dimensionality reduction result of the test samples, indicating that the proposed feature extraction algorithm generalizes well.
Step 5: Classify the test samples with the KNN classification algorithm.
1) Compute the Euclidean distance between each test sample in the final dimensionality reduction result Ỹ of the test sample set and the final dimensionality reduction result Y of all training samples.
2) Take the 5 nearest training samples as the neighbors of the test sample.
3) Classify the test sample according to the majority class among these 5 neighbors.
4) Compute the classification accuracy.
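Step 5 can be sketched as a plain NumPy nearest-neighbor majority vote (k = 5 as above; the two well-separated toy clusters are for illustration only, not the IND PINE data):

```python
import numpy as np

def knn_classify(Y_train, labels, Y_test, k=5):
    """Steps 5.1-5.3 as described: Euclidean distances in the reduced space,
    take the k nearest training samples, and vote by majority class.
    Y_train, Y_test: d x N arrays of reduced features; labels: length-N array."""
    preds = []
    for y in Y_test.T:                                   # one test sample per column
        dists = np.linalg.norm(Y_train.T - y, axis=1)    # distances to all training samples
        nearest = labels[np.argsort(dists)[:k]]          # labels of the k nearest
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])            # majority vote
    return np.array(preds)

# Tiny illustration: two well-separated clusters in a 2-D reduced space
rng = np.random.default_rng(3)
Y_train = np.hstack([rng.normal(0, 0.1, (2, 20)), rng.normal(5, 0.1, (2, 20))])
labels = np.array([0] * 20 + [1] * 20)
Y_test = np.array([[0.0, 5.0], [0.0, 5.0]])              # one sample near each cluster
preds = knn_classify(Y_train, labels, Y_test)
accuracy = np.mean(preds == np.array([0, 1]))            # step 5.4: classification accuracy
```

In the patent's experiment this classifier runs on Ỹ (test features) against Y (training features) produced by step 4.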
The classification accuracy is shown in Fig. 3. As can be seen from Fig. 3, classifying with KNN after generalizing the hyperspectral image by the proposed feature extraction algorithm yields higher overall classification accuracy: the accuracy of the LLE-GLR algorithm is clearly higher than that of the LLE and NPE algorithms, which shows that the feature extraction algorithm proposed in the present invention helps improve the classification accuracy of hyperspectral images.
The above analysis shows that the hyperspectral image feature extraction algorithm based on manifold learning linearization proposed by the present invention indeed improves upon existing manifold learning linearization algorithms. The proposed feature extraction algorithm effectively resolves the inability of existing manifold learning to learn new samples while also improving the classification accuracy of hyperspectral images, and thus has great practical engineering value.

Claims (1)

1. A hyperspectral image feature extraction algorithm based on manifold learning linearization, characterized in that the steps of the hyperspectral image feature extraction algorithm are as follows:
One: Given a hyperspectral data set X, obtain the preliminary dimensionality reduction result Y_L and the Laplacian matrix L by a manifold learning algorithm, where X is a D × N matrix, D is the data dimension, N is the number of samples, Y_L is a d × N matrix, L is an N × N matrix, and d is the reduced dimension;
Two: Build the matrix equation's constant-term matrix B and coefficient matrix C:
1) Build the D × d constant-term matrix B of the matrix equation:
B = α X Y_L^T,
where α is a positive penalty coefficient;
2) Build the D × D coefficient matrix C of the matrix equation:
C = X(αI + L)X^T,
where I is the N × N identity matrix;
Three: Compute the feature transformation matrix:
1) Invert the coefficient matrix C of the matrix equation to obtain the matrix H:
H = C^{-1};
2) Multiply H by the constant-term matrix B to obtain the feature transformation matrix V:
V = HB;
Four: Compute the final dimensionality reduction result from the feature transformation matrix:
Y = V^T X,
where Y is the final dimensionality reduction result.
CN201410286545.3A 2014-06-24 2014-06-24 Hyperspectral image feature extraction method based on manifold learning linearization Active CN104008383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410286545.3A CN104008383B (en) 2014-06-24 2014-06-24 Hyperspectral image feature extraction method based on manifold learning linearization


Publications (2)

Publication Number Publication Date
CN104008383A true CN104008383A (en) 2014-08-27
CN104008383B CN104008383B (en) 2017-03-08

Family

ID=51369032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410286545.3A Active CN104008383B (en) 2014-06-24 2014-06-24 Based on manifold learning linearizing high spectrum image feature extracting method

Country Status (1)

Country Link
CN (1) CN104008383B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107203779A (en) * 2017-05-11 2017-09-26 中国科学院西安光学精密机械研究所 Hyperspectral dimensionality reduction method based on spatial-spectral information maintenance
CN110222631A (en) * 2019-06-04 2019-09-10 电子科技大学 Based on deblocking it is parallel cut space arrangement SAR image target recognition method
CN110781974A (en) * 2019-10-31 2020-02-11 上海融军科技有限公司 Dimension reduction method and system for hyperspectral image
CN112016366A (en) * 2019-05-31 2020-12-01 北京车和家信息技术有限公司 Obstacle positioning method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060165284A1 (en) * 2005-01-25 2006-07-27 Shmuel Aharon Manifold learning for discriminating pixels in multi-channel images, with application to image/volume/video segmentation and clustering
CN103136736A (en) * 2013-03-19 2013-06-05 哈尔滨工业大学 Hyperspectral remote sensing data non-linear dimension descending method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHI HE ET AL.: "Hyperspectral image classification with multivariate empirical mode decomposition-based features", 《2014 IEEE INTERNATIONAL INSTRUMENTATION AND MEASUREMENT TECHNOLOGY CONFERENCE》 *
普晗晔 et al.: "New hyperspectral image dimensionality reduction algorithm based on manifold learning" (基于流形学习的新高光谱图像降维算法), 《红外与激光工程》 (Infrared and Laser Engineering) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107203779A (en) * 2017-05-11 2017-09-26 中国科学院西安光学精密机械研究所 Hyperspectral dimensionality reduction method based on spatial-spectral information maintenance
CN112016366A (en) * 2019-05-31 2020-12-01 北京车和家信息技术有限公司 Obstacle positioning method and device
CN110222631A (en) * 2019-06-04 2019-09-10 电子科技大学 Based on deblocking it is parallel cut space arrangement SAR image target recognition method
CN110222631B (en) * 2019-06-04 2022-03-15 电子科技大学 Data block parallel based cut space arrangement SAR image target identification method
CN110781974A (en) * 2019-10-31 2020-02-11 上海融军科技有限公司 Dimension reduction method and system for hyperspectral image

Also Published As

Publication number Publication date
CN104008383B (en) 2017-03-08

Similar Documents

Publication Publication Date Title
Ghandorh et al. Semantic segmentation and edge detection—Approach to road detection in very high resolution satellite images
Shihavuddin et al. Image-based coral reef classification and thematic mapping
Arabameri et al. Modeling spatial flood using novel ensemble artificial intelligence approaches in northern Iran
Jie et al. Combined multi-layer feature fusion and edge detection method for distributed photovoltaic power station identification
Zhang et al. Spatial–spectral feature refinement for hyperspectral image classification based on attention-dense 3D-2D-CNN
CN114694039B (en) Remote sensing hyperspectral and laser radar image fusion classification method and device
Ma et al. Local feature search network for building and water segmentation of remote sensing image
Pérez-Benito et al. Smoothing vs. sharpening of colour images: Together or separated
Zhou et al. ECA-mobilenetv3 (large)+ SegNet model for binary sugarcane classification of remotely sensed images
Yousefi et al. Image classification and land cover mapping using sentinel-2 imagery: optimization of SVM parameters
CN104008383A (en) Hyperspectral image characteristic extraction algorithm based on manifold learning linearization
Deb et al. LS-Net: A convolutional neural network for leaf segmentation of rosette plants
Ding et al. Multi-level attention interactive network for cloud and snow detection segmentation
Khoshnoodmotlagh et al. Transboundary basins need more attention: Anthropogenic impacts on land cover changes in aras river basin, monitoring and prediction
Yang et al. MRA-SNet: Siamese networks of multiscale residual and attention for change detection in high-resolution remote sensing images
Chaudhary et al. Satellite imagery analysis for road segmentation using U-Net architecture
Deng et al. Rahc_gan: A data augmentation method for tomato leaf disease recognition
Jin et al. Real-time fire smoke detection method combining a self-attention mechanism and radial multi-scale feature connection
Jing et al. Remote sensing change detection based on unsupervised multi-attention slow feature analysis
Zhang et al. Hyper-LGNet: Coupling local and global features for hyperspectral image classification
Bao et al. Application of transformer models to landslide susceptibility mapping
Wu et al. A distributed fusion framework of multispectral and panchromatic images based on residual network
Huang et al. Adaptive-Attention Completing Network for Remote Sensing Image
CN104050482B (en) A kind of manifold learning generalization algorithm based on local linear smoothing
Liu et al. Image semantic segmentation use multiple-threshold probabilistic R-CNN with feature fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant