CN107153839A - Hyperspectral image dimensionality reduction processing method - Google Patents

Hyperspectral image dimensionality reduction processing method

Info

Publication number
CN107153839A
CN107153839A (application CN201710260721.XA)
Authority
CN
China
Prior art keywords
data
neighborhood
algorithm
dimensionality reduction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710260721.XA
Other languages
Chinese (zh)
Inventor
郑泽忠
付垚
俞振璐
卢雨风
李江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201710260721.XA
Publication of CN107153839A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods

Abstract

The invention discloses a hyperspectral image dimensionality reduction processing method, belonging to the field of image processing, and in particular to improving data accuracy after dimensionality reduction. To address the shortcomings of the isometric mapping (Isomap) algorithm and the locally linear embedding (LLE) algorithm in neighborhood selection, the L1 norm, with its sparsity-inducing property, is introduced to improve both algorithms. First, to address the "short-circuit" problem in the neighborhood connection of the conventional Isomap algorithm, an algorithm combining the L1 and L2 norms is used to optimize neighborhood selection. Second, to address the limited neighborhood selection range of the LLE algorithm, the L1 norm is introduced so that the algorithm adaptively selects neighbor points within a larger neighborhood range. Finally, the label information of the existing data set is used to build a neighborhood relation graph obtained by supervised learning, further strengthening the neighborhood selection effect.

Description

Hyperspectral image dimensionality reduction processing method
Technical field
The invention belongs to the field of image processing, and in particular to improving data accuracy after dimensionality reduction.
Background art
A hyperspectral image generally has hundreds of bands and massive data redundancy, which makes the data difficult to use. Most manifold learning algorithms perform eigenvector analysis on an N × N data similarity matrix, where N is the number of data points; the complexity of this analysis is at least O(N²). For an ordinary computer, such large-scale computation and storage is very inconvenient. We therefore introduce manifold learning methods to process hyperspectral images.
By projection type, manifold learning divides into: (1) linear algorithms, the widely used large-scale manifold learning algorithms, represented by the principal component analysis (Principal Component Analysis, PCA) algorithm widely applied in statistics; (2) nonlinear algorithms, which mainly build mapping relations based on a neighborhood graph (Graph Map), represented by the isometric mapping (Isometric Mapping, Isomap) algorithm, which preserves the distances between points.
Principal component analysis finds, in the least-squares sense, the linear transformation from the natural coordinate system of the data to its principal-component coordinate system, so as to obtain the optimal representation of the original data in the transformed coordinate system. PCA preserves the primary structure of the data by retaining the linear combinations with the largest variance, but the algorithm does not involve the internal structural composition of the data, so it may perform poorly on nonlinear data sets.
The Isomap algorithm preserves the intrinsic geometric structure of the data points as far as possible; the algorithm itself is efficient and its principle is simple and direct. It suits hyperspectral remote sensing data, whose nonlinear characteristics are strong, and is often used in the dimensionality reduction of hyperspectral images. However, estimating geodesic distances by shortest paths in the neighborhood relation graph raises the following problems: (1) it is sensitive to noise; (2) computing the geodesic distance matrix is cumbersome and hard to realize. Therefore, in the construction of the neighborhood graph, we introduce the L1 norm to improve the algorithm.
Summary of the invention
In view of the above problems, we propose a new hyperspectral image dimensionality reduction framework and, considering the characteristics of hyperspectral images, propose improvements to the heavy computation and long running time of the isometric mapping algorithm.
The present invention mainly addresses the shortcomings of the Isomap algorithm and the locally linear embedding algorithm in neighborhood selection by introducing the L1 norm, with its sparsity-inducing property, to improve both algorithms. First, to address the "short-circuit" problem in the neighborhood connection of the conventional Isomap algorithm, an algorithm combining the L1 and L2 norms is used to optimize neighborhood selection. Second, to address the limited neighborhood selection range of the locally linear embedding (LLE) algorithm, the L1 norm is introduced so that the algorithm adaptively selects neighbor points within a larger neighborhood range. Finally, the label information of the existing data set is used to build a neighborhood relation graph obtained by supervised learning, further strengthening the neighborhood selection effect. The technical solution of the present invention is therefore a hyperspectral image dimensionality reduction processing method, which comprises:
Step 1: normalize the original hyperspectral image, and sample the normalized image data with a statistical sampling method to obtain a data subset;
Step 2: apply the Isomap dimensionality reduction method to the data subset obtained in step 1 to perform manifold learning dimensionality reduction, obtaining a manifold skeleton;
Step 3: embed the data not sampled in step 1 into the manifold skeleton obtained in step 2 using the locally linear embedding algorithm, completing the hyperspectral image dimensionality reduction;
Step 4: classify the data before and after dimensionality reduction by different manifold learning algorithms using the k nearest neighbor classification method; after the classification results are obtained, evaluate the dimensionality reduction result using the confusion matrix, user's accuracy, overall accuracy, and kappa coefficient, as sketched below.
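As a reading aid, the following is a minimal Python sketch of the four steps, assuming NumPy and scikit-learn. The file names, landmark count, neighbor count, and target dimension are illustrative assumptions, not values fixed by this description; scikit-learn's Isomap.transform is used here as a stand-in for the LLE embedding of step 3, which is detailed later.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import cohen_kappa_score

# Step 1: band-wise min-max normalization, then landmark sampling.
X = np.load("salinas_a.npy")          # (n_pixels, n_bands); hypothetical file name
y = np.load("salinas_a_labels.npy")   # per-pixel class labels; hypothetical file name
X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
rng = np.random.default_rng(0)
landmarks = rng.choice(len(X), size=2000, replace=False)
rest = np.setdiff1d(np.arange(len(X)), landmarks)

# Step 2: Isomap on the landmarks yields the manifold skeleton.
iso = Isomap(n_neighbors=12, n_components=10)
Y = np.empty((len(X), 10))
Y[landmarks] = iso.fit_transform(X[landmarks])

# Step 3: embed the unsampled points into the skeleton. The patent uses
# LLE here (sketched later); Isomap's out-of-sample transform is a stand-in.
Y[rest] = iso.transform(X[rest])

# Step 4: KNN classification on the reduced data, evaluated with
# overall accuracy and the kappa coefficient.
knn = KNeighborsClassifier(n_neighbors=5).fit(Y[landmarks], y[landmarks])
pred = knn.predict(Y[rest])
print("OA:", (pred == y[rest]).mean(), "kappa:", cohen_kappa_score(y[rest], pred))
```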
Further, in the Isomap dimensionality reduction method of step 2, the neighborhood relation graph is built using the following formula:

$$E(x_i) = \min_{w_i \ge 0} \frac{1}{2}\left\| X^{\{i\}} w_i - x_i \right\|_2^2 + \lambda \left\| w_i \right\|_1$$

where E(x_i) gives the computed neighborhood points, w_i is the weight vector of the high-dimensional data point, X^{\{i\}} is the set of the k nearest neighbor points around the sample point x_i, and λ is the tuning coefficient.
In the hyperspectral image dimensionality reduction processing method of the present invention, dimensionality reduction is performed on a small amount of sampled data to obtain a manifold skeleton, and the remaining data are then embedded into the manifold skeleton. This greatly reduces the amount of computation and shortens the computing time.
Brief description of the drawings
Fig. 1 is the flow chart of hyperspectral manifold learning dimensionality reduction.
Fig. 2 is the analysis of the dimensionality reduction effect on the Salinas-A data set before and after the Isomap algorithm improvement.
Fig. 3 shows projections of the original Salinas-A data: (a) two-dimensional projection, (b) three-dimensional projection.
Fig. 4 shows projections of the Salinas-A data after L1-Isomap dimensionality reduction: (a) two-dimensional projection, (b) three-dimensional projection.
Embodiment
Step 1. Obtain a subset of the original data set with the statistical sampling method, to serve as landmarks (Landmark).
In this step, the number of data points used in subsequent processing is reduced by sampling, so the amount of computation in subsequent processes is greatly reduced. The points obtained by sampling are called landmarks, and they are the primary participants in the subsequent dimensionality reduction. Ideally, the sampled landmarks should form the smallest subset that retains the geometric structure of the original data. The step after sampling is dimensionality reduction; in order to express the original manifold structure accurately, the landmarks used for manifold skeleton learning must be selected carefully from the original data set. A sketch of one possible sampling scheme follows.
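The description does not fix a particular statistical sampling scheme; the following sketch assumes stratified random sampling over the class labels, one plausible reading of the "statistical sampling method", so that each ground-object class stays proportionally represented among the landmarks.

```python
import numpy as np

def sample_landmarks(labels, fraction=0.1, seed=0):
    """Stratified random sampling: keep `fraction` of each class as landmarks."""
    rng = np.random.default_rng(seed)
    idx = []
    for c in np.unique(labels):
        members = np.flatnonzero(labels == c)
        n = max(1, int(round(fraction * len(members))))
        idx.append(rng.choice(members, size=n, replace=False))
    return np.sort(np.concatenate(idx))
```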
Step 2. Perform manifold learning dimensionality reduction on the obtained landmarks to form the manifold skeleton.
The manifold learned from the sampled points is regarded as an approximation of the manifold that would be learned from the whole data set. The manifold learning dimensionality reduction method can thus be applied to the smaller data set extracted by sampling. We call the manifold learned from the sampled data the manifold skeleton.
In Isomap (the isometric mapping algorithm), choosing an overly large neighborhood should be avoided as far as possible. In the neighborhood relation graph construction step of the original algorithm, the neighborhood is chosen as the k closest points under the Euclidean distance, and the choice of k is crucial to the construction of the graph. If k is too large, the probability of producing "short-circuit" edges increases substantially; if k is too small, the graph built may not be sufficiently complete. Moreover, a k value that is accurate in one region may not apply in another. We therefore invented a neighborhood selection algorithm based on the L1 norm to handle this problem.
Assume the data set X is the high-dimensional data set over which the neighborhood relation graph is to be built, and x_i is a target sample point to be connected to its neighborhood. The point is reconstructed with the following cost function:

$$\min_{w_i} \frac{1}{2}\left\| X w_i - x_i \right\|_2^2 + \lambda \left\| w_i \right\|_1 \quad \text{subject to} \quad \left\| w_i \right\| > 0$$

where the first term is the reconstruction error and the second term exploits the sparsity of the L1 norm. In the formula, the tuning coefficient λ controls the weight of each term. The constraint is added to guarantee that at least one weight is non-zero. Any sample point x_j in X with a non-zero weight w_ij is regarded as a computed neighbor.
That is, every point in X is potentially a neighbor of x_i. Because the number of sample points is generally very large, the computational overhead is a major problem. If the L1-norm neighborhood is selected from within the L2-norm neighborhood, the risk of errors in neighborhood selection is also reduced. Limiting the number of potential neighbors therefore solves the overhead problem present in the computation. The neighborhood is first restricted to the vicinity of the k nearest neighbors, which guarantees that the neighborhood points are not too far from the sample point and also accelerates the L1 optimization. With this consideration, the formula can be rewritten as:

$$E(x_i) = \min_{w_i \ge 0} \frac{1}{2}\left\| X^{\{i\}} w_i - x_i \right\|_2^2 + \lambda \left\| w_i \right\|_1$$

where X^{\{i\}} is the set of the k nearest neighbors of the sample point x_i. The whole neighborhood relation graph is built by finding the neighborhood points of all samples in the data set X.
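A sketch of this restricted L1 selection, assuming scikit-learn: the candidates are the k nearest points under the Euclidean (L2) distance, and a non-negative Lasso, whose alpha plays the role of λ (up to scikit-learn's 1/n scaling of the squared error), keeps only the candidates that receive non-zero weight. The parameter values are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.neighbors import NearestNeighbors

def l1_neighbors(X, i, k=30, lam=0.01):
    """Return the indices of the L1-selected neighbors of sample x_i."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, nbrs = nn.kneighbors(X[i:i + 1])
    cand = nbrs[0][1:]                      # k nearest candidates, excluding x_i itself
    # Solve min_{w >= 0} 1/2 ||X^{i} w - x_i||_2^2 + lam ||w||_1
    lasso = Lasso(alpha=lam, positive=True, fit_intercept=False, max_iter=5000)
    lasso.fit(X[cand].T, X[i])              # columns of X^{i} are the candidate neighbors
    return cand[lasso.coef_ > 1e-8]         # non-zero weight => neighbor
```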
After the neighborhood graph is built, Isomap computes the similarity matrix D_G = d_G(i, j), where d_G(i, j) is the geodesic distance between data points i and j in the relation graph G. The geodesic distance can be computed quickly and effectively with Dijkstra's algorithm:

$$d_{ij} = \begin{cases} \left\| x_i - x_j \right\|, & x_j \in N_i \ \text{or} \ x_i \in N_j \\ \min_k \left( d_{ik} + d_{kj} \right), & \text{otherwise} \end{cases}$$

where d_ij is the geodesic distance between points x_i and x_j, N_i is the set of the k neighbor points of x_i, N_j is the set of the k neighbor points of x_j, and k is the number of nearest neighbors used in the neighborhood graph construction.
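A sketch of the geodesic distance matrix D_G, assuming SciPy and scikit-learn: the k-nearest-neighbor graph carries Euclidean edge weights, and Dijkstra's algorithm supplies all-pairs shortest paths.

```python
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

def geodesic_distance_matrix(X, k=12):
    """All-pairs geodesic distances over the k-NN graph of X."""
    G = kneighbors_graph(X, n_neighbors=k, mode="distance")  # sparse, Euclidean weights
    # method="D" selects Dijkstra; directed=False symmetrizes the k-NN relation
    return shortest_path(G, method="D", directed=False)
```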
After the similarity matrix D_G is built, the Isomap algorithm performs dimensionality reduction with the classical MDS technique by minimizing the following cost function:

$$E = \left\| \tau(D_G) - \tau(D_Y) \right\|_{L^2}$$

where D_G is the similarity matrix, D_Y is the matrix of Euclidean distances d_Y(i, j) = ||y_i − y_j||, ||·||_{L²} is the L2 matrix norm, and τ(·) is the second-order transform of the geodesic distances, converting them into inner-product form:

$$\tau(D) = -\frac{HSH}{2}$$

where

$$S = \left( d(i, j) \right)^2, \qquad H_{ij} = \delta(i, j) - \frac{1}{n}$$

S is the squared-distance matrix, H is the centering matrix, and n is the number of elements in the matrix. d(i, j) is the distance between x_i and x_j, and δ(i, j) is the Kronecker delta. By setting the first d′ eigenvectors of the matrix τ(D_G) as the coordinates y_i, the cost function E is minimized, and the residual error of the data decreases in order of these eigenvalues. When the number of eigenvectors reaches the dimension of the underlying manifold of the data, the residual error falls to its minimum as the number of eigenvectors increases. The result of this nonlinear dimensionality reduction is a low-dimensional representation of the data in d′ dimensions.
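A sketch of this MDS step under the stated definitions: τ double-centers the squared geodesic distances, and the top d′ eigenpairs of τ(D_G) give the low-dimensional coordinates.

```python
import numpy as np

def classical_mds(D_G, d=10):
    """Coordinates y_i from the geodesic distance matrix via tau(D) = -HSH/2."""
    n = len(D_G)
    S = D_G ** 2                                # S = (d(i, j))^2
    H = np.eye(n) - np.ones((n, n)) / n         # H_ij = delta(i, j) - 1/n
    tau = -0.5 * H @ S @ H                      # inner-product (Gram) form
    vals, vecs = np.linalg.eigh(tau)            # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:d]            # d' largest eigenvalues
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))
```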
Step 3. Embed the remaining data into the manifold skeleton using the locally linear embedding algorithm, thereby completing the hyperspectral image dimensionality reduction.
After the manifold skeleton is learned, the remaining data points are embedded using the LLE algorithm. The LLE algorithm reduces to three steps (a sketch follows the list):
(1) determine the k neighbor points of each point to be reconstructed;
(2) compute, from these neighbor points, the local reconstruction coefficient matrix used for the next reconstruction step;
(3) reconstruct the point from the sample point's neighbor points and the corresponding local reconstruction coefficient matrix.
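A sketch of these three steps for one unsampled point, under the usual LLE formulation: the reconstruction weights are solved from the local Gram matrix (with a small regularizer for numerical stability, an implementation assumption) and reused on the landmarks' low-dimensional coordinates.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def lle_embed_point(x, X_land, Y_land, k=12, reg=1e-3):
    """Embed x into the skeleton spanned by (X_land, Y_land)."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_land)
    _, nb = nn.kneighbors(x.reshape(1, -1))
    nb = nb[0]                                  # (1) k neighbors among the landmarks
    Z = X_land[nb] - x                          # neighbors centered on x
    G = Z @ Z.T                                 # local Gram matrix
    G += np.eye(k) * reg * np.trace(G)          # regularization for a stable solve
    w = np.linalg.solve(G, np.ones(k))          # (2) reconstruction coefficients
    w /= w.sum()                                # weights constrained to sum to one
    return w @ Y_land[nb]                       # (3) reconstruct in the low dimension
```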
Step 4. Classify the data before and after dimensionality reduction by the different manifold learning algorithms using the existing k nearest neighbor (KNN) classification method, and verify the classification effect.
The basic idea of the KNN algorithm is to compute, with a distance function, the distance between the sample x to be classified and each sample in the training category sets, and to treat this distance as an estimate of similarity. The k points with the shortest distance to the sample to be classified are selected; these points are taken as the k nearest neighbors of x, and the category to which the largest number of these nearest neighbors belong is finally taken as the category of x.
The algorithm divides into the following steps:
Step 1. Perform learning and training on the known samples through their sets of feature items;
Step 2. When a new sample arrives, represent it as a point by its feature vector;
Step 3. Select the k sample points in the training set with the greatest similarity to (i.e., the shortest distance from) the new sample;
Step 4. Among the k nearest neighbors of the new sample point, compute in turn the weight of each class, with the following formula:

$$p(x, C_j) = \sum_{d_i \in \mathrm{KNN}(x)} Sim(x, d_i)\, y(d_i, C_j)$$

where x is the feature vector of the new sample point, Sim(·) is the similarity measure, consistent with the calculation in step 3, and y(d_i, C_j) is the category attribute function: if d_i belongs to class C_j, the function value is 1; otherwise it is 0.
Step 5. Compute the weight of each category and assign the data point to the category with the largest weight. A sketch of this classifier follows.
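A sketch of this classifier; the description leaves Sim() open, so cosine similarity is assumed here, an assumption used for both the neighbor ranking of step 3 and the class weights of step 4.

```python
import numpy as np

def knn_classify(x, X_train, y_train, k=5):
    """Assign x to the class with the largest summed similarity among its k neighbors."""
    sims = (X_train @ x) / (np.linalg.norm(X_train, axis=1) * np.linalg.norm(x) + 1e-12)
    top = np.argsort(sims)[::-1][:k]            # step 3: k most similar training samples
    classes = np.unique(y_train[top])
    # step 4: p(x, C_j) = sum over neighbors d_i of Sim(x, d_i) * y(d_i, C_j)
    scores = [sims[top][y_train[top] == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]      # step 5: class with the largest weight
```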
Confusion matrix (Confusion Matrix):
The confusion matrix is often used to compare the classification result with the actual ground-object information; the class information before and after classification can be compared through the matrix. It compares the actual class of each pixel with the class assigned by the classifier. Each column of the confusion matrix represents the number of true pixels of a ground-object class in the reference information, and each row corresponds to the number of pixels assigned to the corresponding class in the classified image, obtained by counting pixels.
Overall accuracy (Overall Accuracy) is calculated as:

$$OA = \frac{\sum_{i} x_{ii}}{N}$$

Overall accuracy is the proportion, among all N pixels in the image, of pixels whose classification agrees with the actual class; the actual class of a pixel is constrained by the actual ground-object image or the actual region of interest of the ground objects. The correctly classified pixels lie on the diagonal of the confusion matrix, where x_ii is the number of pixels correctly classified into actual class i, and the total N equals the sum of pixels over all actual ground-object classes.
Kappa coefficient (Kappa Coefficient):
The Kappa coefficient is used for the consistency checks faced in data analysis: judging whether different models or analysis methods agree in their predictions, whether the analyzed result differs from the actual result, and so on. In remote sensing image classification, the Kappa coefficient measures the degree of agreement between two input images and evaluates the output image. The Kappa coefficient is calculated as:

$$K = \frac{\sum_i P_{ii} - \sum_i P_{i\cdot}\, P_{\cdot i}}{1 - \sum_i P_{i\cdot}\, P_{\cdot i}}$$

where
P_ii = the proportion of pixels assigned to class i in both images;
P_i· = the proportion of pixels assigned to class i in the reference image;
P·i = the proportion of pixels assigned to class i in the non-reference image.
If the two input images are completely consistent (no change), the Kappa coefficient is 1. If the two images are entirely inconsistent, the Kappa coefficient is −1. If the judged result of the input images is due to chance, the Kappa coefficient equals 0. When the Kappa coefficient is greater than 0, the result is of value, and the larger the coefficient, the better the result. Kappa < 0.4 indicates very poor consistency; Kappa ≥ 0.75 indicates that the consistency of the two images is satisfactory.
Producer's accuracy (Producer's Accuracy): producer's accuracy, also called mapping accuracy, is the proportion of pixels of a class that are correctly classified among the total number of actual pixels of that class. With the confusion matrix laid out as above, it is calculated as:

$$PA_j = \frac{x_{jj}}{\sum_i x_{ij}}$$

where the denominator is the column total for reference class j.
User's accuracy (User's Accuracy): the ratio of the number of pixels truly belonging to a class to the total number of pixels the classifier assigned to that class over the whole image, calculated as:

$$UA_i = \frac{x_{ii}}{\sum_j x_{ij}}$$

where the denominator is the row total for classified class i. A sketch computing these evaluation measures follows.
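A sketch computing the four evaluation measures from a confusion matrix laid out as above (rows = classified image, columns = reference classes), assuming NumPy.

```python
import numpy as np

def evaluate(C):
    """C[i, j] counts pixels classified as class i whose reference class is j."""
    C = np.asarray(C, dtype=float)
    N = C.sum()
    overall = np.trace(C) / N                   # overall accuracy
    p_row = C.sum(axis=1) / N                   # P_i.  (classified-image marginals)
    p_col = C.sum(axis=0) / N                   # P.i   (reference marginals)
    p_e = (p_row * p_col).sum()                 # chance agreement
    kappa = (overall - p_e) / (1.0 - p_e)       # kappa coefficient
    producers = np.diag(C) / C.sum(axis=0)      # per-class producer's accuracy
    users = np.diag(C) / C.sum(axis=1)          # per-class user's accuracy
    return overall, kappa, producers, users
```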
Table 1 compares the dimensionality reduction accuracies on the Salinas-A data set after the unmarked points are removed.
Table 2 is the classification confusion matrix of the Salinas-A data set after the unmarked points are removed.
In Table 1 and Table 2, O, B, C, L4, L5, L6 and L7 denote different ground-object classes.
Table 1. Comparison of overall dimensionality reduction accuracy on the Salinas-A data set
Table 2. Classification confusion matrix of the Salinas-A data set after L1-Isomap dimensionality reduction, unmarked points removed

Claims (2)

1. A hyperspectral image dimensionality reduction processing method, the method comprising:
Step 1: normalize the original hyperspectral image, and sample the normalized image data with a statistical sampling method to obtain a data subset;
Step 2: apply the Isomap dimensionality reduction method to the data subset obtained in step 1 to perform manifold learning dimensionality reduction, obtaining a manifold skeleton;
Step 3: embed the data not sampled in step 1 into the manifold skeleton obtained in step 2 using the locally linear embedding algorithm, completing the hyperspectral image dimensionality reduction;
Step 4: classify the data before and after dimensionality reduction by different manifold learning algorithms using the k nearest neighbor classification method; after the classification results are obtained, evaluate the dimensionality reduction result using the confusion matrix, user's accuracy, overall accuracy, and kappa coefficient.
2. The hyperspectral image dimensionality reduction processing method of claim 1, characterized in that, in the Isomap dimensionality reduction method of step 2, the neighborhood relation graph is built using the following formula:
$$E(x_i) = \min_{w_i \ge 0} \frac{1}{2}\left\| X^{\{i\}} w_i - x_i \right\|_2^2 + \lambda \left\| w_i \right\|_1$$
where E(x_i) gives the computed neighborhood points, w_i is the weight vector of the high-dimensional data point, X^{\{i\}} is the set of the k nearest neighbor points around the sample x_i, and λ is the tuning coefficient.
CN201710260721.XA 2017-04-20 2017-04-20 Hyperspectral image dimensionality reduction processing method Pending CN107153839A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710260721.XA CN107153839A (en) 2017-04-20 2017-04-20 Hyperspectral image dimensionality reduction processing method


Publications (1)

Publication Number Publication Date
CN107153839A true CN107153839A (en) 2017-09-12

Family

ID=59793994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710260721.XA Pending CN107153839A (en) Hyperspectral image dimensionality reduction processing method

Country Status (1)

Country Link
CN (1) CN107153839A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130129256A1 (en) * 2011-11-22 2013-05-23 Raytheon Company Spectral image dimensionality reduction system and method
CN102903116A (en) * 2012-10-20 2013-01-30 复旦大学 Manifold dimension reduction method of hyperspectral images based on image block distance
CN103413151A (en) * 2013-07-22 2013-11-27 西安电子科技大学 Hyperspectral image classification method based on image regular low-rank expression dimensionality reduction

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
卢雨风 (Lu Yufeng): "Research on Manifold Learning Algorithms for Hyperspectral Images", China Masters' Theses Full-text Database, Information Science and Technology Series *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108876715A (en) * 2018-05-24 2018-11-23 海南大学 A kind of robust bilateral 2D linear discriminant analysis dimension-reduction algorithm
CN108876715B (en) * 2018-05-24 2021-06-01 海南大学 Image data robust bilateral 2D linear discriminant analysis dimension reduction method
CN110619370A (en) * 2019-09-23 2019-12-27 云南电网有限责任公司电力科学研究院 Hyperspectral image super-pixel local linear embedding dimension reduction method
CN111191617A (en) * 2020-01-02 2020-05-22 武汉大学 Remote sensing scene classification method based on hierarchical structure
CN111191617B (en) * 2020-01-02 2022-02-01 武汉大学 Remote sensing scene classification method based on hierarchical structure
CN112257807A (en) * 2020-11-02 2021-01-22 曲阜师范大学 Dimension reduction method and system based on self-adaptive optimization linear neighborhood set selection
CN112257807B (en) * 2020-11-02 2022-05-27 曲阜师范大学 Dimension reduction method and system based on self-adaptive optimization linear neighborhood set selection
CN117173496A (en) * 2023-09-20 2023-12-05 重庆大学 High-dimensional data dimension reduction method and system for maintaining one-dimensional topological characteristics
CN117173496B (en) * 2023-09-20 2024-04-02 重庆大学 High-dimensional data dimension reduction method and system for maintaining one-dimensional topological characteristics

Similar Documents

Publication Publication Date Title
CN110443143B (en) Multi-branch convolutional neural network fused remote sensing image scene classification method
CN107153839A (en) A kind of high-spectrum image dimensionality reduction processing method
McIver et al. Estimating pixel-scale land cover classification confidence using nonparametric machine learning methods
CN104820841B (en) Hyperspectral classification method based on low order mutual information and spectrum context waveband selection
CN104866871B (en) Hyperspectral image classification method based on projection structure sparse coding
CN113095409B (en) Hyperspectral image classification method based on attention mechanism and weight sharing
CN108182449A (en) A kind of hyperspectral image classification method
CN107563442A (en) Hyperspectral image classification method based on sparse low-rank regular graph qualified insertion
CN112115967B (en) Image increment learning method based on data protection
CN112017192B (en) Glandular cell image segmentation method and glandular cell image segmentation system based on improved U-Net network
CN109446894A (en) The multispectral image change detecting method clustered based on probabilistic segmentation and Gaussian Mixture
CN111860124B (en) Remote sensing image classification method based on space spectrum capsule generation countermeasure network
CN110751072B (en) Double-person interactive identification method based on knowledge embedded graph convolution network
Tang et al. A multiple-point spatially weighted k-NN method for object-based classification
CN106529563A (en) High-spectral band selection method based on double-graph sparse non-negative matrix factorization
Limper et al. Mesh Saliency Analysis via Local Curvature Entropy.
CN104881684A (en) Stereo image quality objective evaluate method
CN111325134B (en) Remote sensing image change detection method based on cross-layer connection convolutional neural network
CN115311502A (en) Remote sensing image small sample scene classification method based on multi-scale double-flow architecture
CN115861837A (en) Arable land identification method based on Ares-UNet + + network and related device
CN114419406A (en) Image change detection method, training method, device and computer equipment
CN116977750B (en) Construction method and classification method of land covering scene classification model
CN108280474A (en) A kind of food recognition methods based on neural network
CN116258877A (en) Land utilization scene similarity change detection method, device, medium and equipment
CN115953330A (en) Texture optimization method, device, equipment and storage medium for virtual scene image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20170912)