CN103383775A - Method for evaluating remote-sensing image fusion effect - Google Patents

Method for evaluating remote-sensing image fusion effect

Info

Publication number
CN103383775A
CN103383775A CN2013102723674A CN201310272367A CN103383775B
Authority
CN
China
Prior art keywords
image
fusion
remote sensing
accuracy
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013102723674A
Other languages
Chinese (zh)
Other versions
CN103383775B (en)
Inventor
董张玉
王宗明
刘殿伟
任春颖
汤旭光
贾明明
丁智
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeast Institute of Geography and Agroecology of CAS
Original Assignee
Northeast Institute of Geography and Agroecology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeast Institute of Geography and Agroecology of CAS filed Critical Northeast Institute of Geography and Agroecology of CAS
Priority to CN201310272367.4A priority Critical patent/CN103383775B/en
Publication of CN103383775A publication Critical patent/CN103383775A/en
Application granted granted Critical
Publication of CN103383775B publication Critical patent/CN103383775B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract


Figure 201310272367

The invention relates to remote sensing image effect evaluation, and in particular to a method for evaluating the effect of remote sensing image fusion. The invention aims to solve the problems of existing fusion effect evaluation methods: assessment by visual effect is subject to large human interference and easily produces errors, while objective mathematical-statistical analysis has no unified standard for selecting indicators, making a comprehensive evaluation of image quality difficult. Step 1: preprocess the original images to be fused. Step 2: fuse the preprocessed multispectral image with the panchromatic image. Step 3: perform object-oriented segmentation on the images before and after fusion. Step 4: classify the segmented remote sensing images using classification rules. Step 5: evaluate the accuracy of the classification results from the three angles of producer's accuracy, user's accuracy, and the Kappa coefficient, and compare the classification accuracy of the images before and after fusion to evaluate the fusion effect. The invention applies to the technical field of remote sensing image processing.


Description

A method for evaluating remote sensing image fusion effect
Technical field
The present invention relates to remote sensing image effect evaluation, and in particular to a method for evaluating the effect of remote sensing image fusion. It belongs to the technical field of remote sensing image processing.
Background technology
With the rapid development of modern remote sensing and its related technologies, many algorithms for remote sensing image fusion have been proposed, but no unified theoretical system has formed, and the effect of the various fusion methods is difficult to determine quantitatively. Two viewpoints currently exist on fusion quality assessment. One holds that since images are ultimately viewed by people, evaluation should be based on visual effect, taking subjective visual judgment as the basis and formulating standards from statistics. The other holds that subjective interpretation is incomplete and one-sided and cannot withstand re-examination, because the result may differ when observation conditions change; it therefore advocates objective evaluation methods that are unaffected by the evaluator. Both approaches have defects, so a completely new angle for assessing remote sensing image fusion is urgently needed. Classification is one of the most important and most basic applications of remote sensing images. Evaluating the fusion result from the angle of classification accuracy therefore makes the evaluation result directly application-oriented.
Traditionally, remote sensing image fusion effect can be evaluated from the angles of subjective visual effect and objective mathematical-statistical analysis, but both have defects: visual assessment involves large human interference and easily produces errors, while objective statistical analysis has no unified standard for selecting indicators, making a comprehensive evaluation of image quality difficult. Aiming at this application demand, the present invention proposes evaluating remote sensing image fusion from the angle of image classification accuracy, so that the fusion result is directly application-oriented.
Summary of the invention
The present invention aims to solve the problems that existing remote sensing image fusion effect evaluation methods involve large human interference in visual assessment, easily producing errors, and lack a unified standard for indicator selection in objective mathematical-statistical analysis, making a comprehensive evaluation of image quality difficult. It therefore provides a method for evaluating remote sensing image fusion effect.
The remote sensing image fusion effect evaluation method mainly comprises the following steps:
Step 1: preprocess the original images to be fused;
Step 2: fuse the preprocessed multispectral image with the panchromatic image to obtain the fused result image;
Step 3: perform object-oriented segmentation on the images before and after fusion to obtain segmented remote sensing images;
Step 4: classify the segmented remote sensing images using classification rules to obtain classification results for the images before and after fusion;
Step 5: evaluate the accuracy of the classification results from the three angles of producer's accuracy, user's accuracy, and the Kappa coefficient, and compare the classification accuracy of the images before and after fusion to evaluate the fusion effect.
Effect of the invention:
The new remote sensing image fusion effect evaluation method proposed by the present invention overcomes the deficiencies of existing evaluation methods. It realizes fusion effect evaluation through classification accuracy: the remote sensing images before and after fusion are classified, and the effect is assessed by comparing their classification accuracies, so that the evaluation result is directly application-oriented.
The proposed method compensates for the deficiencies of traditional evaluation methods and makes the evaluation result directly application-oriented. With the new evaluation method, classification accuracy can be improved by 3%, and the evaluation result is objective.
Description of drawings
Fig. 1 is the flow chart of the present invention;
Fig. 2 is the SPOT multispectral image in the embodiment;
Fig. 3 is the SPOT panchromatic image in the embodiment;
Fig. 4 is the fusion result image in the embodiment;
Fig. 5 is the classification result of the original multispectral image in the embodiment; blue denotes transport land, red denotes residential areas, dark green denotes dry farmland, yellow denotes reservoirs/ponds, and green denotes paddy fields;
Fig. 6 is the classification result of the fused image in the embodiment, with the same colour legend as Fig. 5.
Embodiment
Embodiment 1: the remote sensing image fusion effect evaluation method of this embodiment mainly comprises the following steps:
Step 1: preprocess the original images to be fused;
Step 2: fuse the preprocessed multispectral image with the panchromatic image to obtain the fused result image;
Step 3: perform object-oriented segmentation on the images before and after fusion to obtain segmented remote sensing images;
Step 4: classify the segmented remote sensing images using classification rules to obtain classification results for the images before and after fusion;
Step 5: evaluate the accuracy of the classification results from the three angles of producer's accuracy, user's accuracy, and the Kappa coefficient, and compare the classification accuracy of the images before and after fusion to evaluate the fusion effect.
Effect of this embodiment:
The new remote sensing image fusion effect evaluation method proposed in this embodiment overcomes the deficiencies of existing evaluation methods. It realizes fusion effect evaluation through classification accuracy: the remote sensing images before and after fusion are classified, and the effect is assessed by comparing their classification accuracies, so that the evaluation result is directly application-oriented.
This embodiment compensates for the deficiencies of traditional evaluation methods and makes the evaluation result directly application-oriented. With the new evaluation method, classification accuracy can be improved by 3%, and the evaluation result is objective.
Embodiment 2: this embodiment differs from Embodiment 1 in that the preprocessing of the original images to be fused in Step 1 comprises the following steps:
Geometrically register the original multispectral image using the quadratic polynomial method, so that it remains geometrically consistent with the panchromatic image;
Resample the multispectral image using nearest-neighbour interpolation, so that the pixel sizes of the images to be fused are consistent;
On the basis of the above processing, crop out the test-area image to obtain the original multispectral and panchromatic images of the same area to be fused. The other steps and parameters are identical to Embodiment 1.
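The resampling step above can be sketched as follows. This is a minimal pure-Python illustration of nearest-neighbour interpolation; the `resample_nearest` helper and the scale factor are illustrative assumptions, not part of the patent, which only names the interpolation method:

```python
def resample_nearest(image, scale):
    """Resample a 2-D pixel grid by `scale` using nearest-neighbour lookup,
    so the multispectral pixel size matches the panchromatic pixel size."""
    rows, cols = len(image), len(image[0])
    out_rows, out_cols = int(rows * scale), int(cols * scale)
    return [
        [image[min(rows - 1, int(r / scale))][min(cols - 1, int(c / scale))]
         for c in range(out_cols)]
        for r in range(out_rows)
    ]

# Doubling a 2x2 multispectral grid to match a 4x4 panchromatic grid:
ms = [[10, 20],
      [30, 40]]
print(resample_nearest(ms, 2))
# → [[10, 10, 20, 20], [10, 10, 20, 20], [30, 30, 40, 40], [30, 30, 40, 40]]
```

Each output pixel simply copies the value of the nearest source pixel, which is why nearest-neighbour resampling preserves the original radiometric values, a common reason for choosing it before fusion.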
Embodiment 3: this embodiment differs from Embodiment 1 or 2 in the fusion processing of Step 2: the principle of fusing the multispectral image with the panchromatic image is to fuse the preprocessed multispectral and panchromatic images using a chosen fusion method to obtain the fused result image. The other steps and parameters are identical to Embodiment 1 or 2.
Embodiment 4: this embodiment differs from one of Embodiments 1 to 3 in the object-oriented segmentation of Step 3. The principle of object-oriented segmentation is the technique and process of selecting texture, mean, colour, or shape features as required, dividing the image into feature regions, and extracting regions of interest from them, so that pixels with the same or similar characteristic properties fall within the same region while pixels with clearly different characteristics fall in different regions; each region then serves as a unit for subsequent image classification. The other steps and parameters are identical to one of Embodiments 1 to 3.
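The segmentation principle above (similar pixels grouped into one region, each region a classification unit) can be sketched as a simple region-growing pass over a single feature band. The tolerance parameter `tol` and the 4-connectivity are illustrative assumptions; the patent does not specify a particular segmentation algorithm:

```python
def segment(band, tol):
    """Label 4-connected regions whose neighbouring pixel values differ
    by at most `tol`; returns a grid of integer region labels."""
    rows, cols = len(band), len(band[0])
    labels = [[None] * cols for _ in range(rows)]
    region = 0
    for sr in range(rows):
        for sc in range(cols):
            if labels[sr][sc] is not None:
                continue
            region += 1                       # start a new region at this seed
            labels[sr][sc] = region
            stack = [(sr, sc)]
            while stack:                      # flood fill from the seed pixel
                r, c = stack.pop()
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                    if (0 <= nr < rows and 0 <= nc < cols
                            and labels[nr][nc] is None
                            and abs(band[nr][nc] - band[r][c]) <= tol):
                        labels[nr][nc] = region
                        stack.append((nr, nc))
    return labels

# Two homogeneous areas separated by a sharp edge form two regions:
band = [[10, 11, 50],
        [10, 12, 52]]
print(segment(band, 5))
# → [[1, 1, 2], [1, 1, 2]]
```

Each labelled region would then be described by object-level features (mean, texture, shape) and classified as a whole in Step 4.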
Embodiment 5: this embodiment differs from one of Embodiments 1 to 4 in the classification rules used in Step 4. The rules mainly comprise the following aspects: first, NDVI (the normalized difference vegetation index) combined with a shape index distinguishes cultivated land, after which a wetland index separates dry farmland from paddy fields; second, a brightness index extracts residential areas and transport land, after which the shape index separates the two; finally, the wetland index and the brightness index together extract reservoirs/ponds. The other steps and parameters are identical to one of Embodiments 1 to 4.
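The hierarchy of rules can be sketched as a decision cascade. The threshold values below are hypothetical placeholders, since the patent names the indices and the rule order but not the cut-offs; only the structure of the cascade reflects the text:

```python
def classify(obj):
    """Classify a segmented object from its index values.
    Thresholds are illustrative; the patent specifies only the rule order."""
    # Rule 1: NDVI plus shape index selects cultivated land,
    # then the wetland index splits paddy field from dry farmland.
    if obj["ndvi"] > 0.3 and obj["shape"] > 0.5:
        return "paddy field" if obj["wetland"] > 0.4 else "dry farmland"
    # Rule 2: brightness selects built-up surfaces,
    # then the shape index splits residential from transport land.
    if obj["brightness"] > 0.6:
        return "residential" if obj["shape"] > 0.5 else "transport land"
    # Rule 3: wetland index plus low brightness selects reservoirs/ponds.
    if obj["wetland"] > 0.4 and obj["brightness"] < 0.3:
        return "reservoir/pond"
    return "unclassified"

print(classify({"ndvi": 0.5, "shape": 0.7, "wetland": 0.6, "brightness": 0.2}))
# → paddy field
```

In practice the thresholds would be tuned per scene; the point of the cascade is that each index is only consulted once the earlier rules have narrowed the candidate classes.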
Embodiment 6: this embodiment differs from one of Embodiments 1 to 5 in the accuracy evaluation of Step 5, which evaluates the classification results from the three angles of producer's accuracy, user's accuracy, and Kappa analysis, and compares the classification accuracy of the images before and after fusion to evaluate the fusion effect. Producer's accuracy is the class total minus the omission errors, divided by the class total; user's accuracy is the class total minus the commission errors, divided by the class total. The Kappa index uses a discrete multivariate technique that considers all elements of the error matrix; it is a measure of the agreement or accuracy between two maps, computed as:
Kappa = \frac{N \sum_{i=1}^{r} x_{ii} - \sum_{i=1}^{r} (x_{i+} \cdot x_{+i})}{N^{2} - \sum_{i=1}^{r} (x_{i+} \cdot x_{+i})}
where r is the total number of columns in the error matrix (i.e., the total number of classes); x_{ii} is the number of pixels in row i, column i of the error matrix (i.e., the number of correctly classified pixels); x_{i+} and x_{+i} are the total numbers of pixels in row i and in column i, respectively; and N is the total number of pixels used for accuracy evaluation. The other steps and parameters are identical to one of Embodiments 1 to 5.
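Under the definitions above, all three accuracy measures can be computed directly from the error (confusion) matrix. A minimal sketch follows; the two-class example matrix is invented for illustration and the row/column orientation (rows = classified, columns = reference) is an assumption:

```python
def accuracy_metrics(matrix):
    """Per-class producer's and user's accuracy plus the Kappa coefficient
    from an r x r error matrix (rows = classified, columns = reference)."""
    r = len(matrix)
    N = sum(sum(row) for row in matrix)                     # total pixels
    col = [sum(matrix[j][i] for j in range(r)) for i in range(r)]  # x_+i
    row = [sum(matrix[i]) for i in range(r)]                       # x_i+
    producer = [matrix[i][i] / col[i] for i in range(r)]  # correct / reference total
    user = [matrix[i][i] / row[i] for i in range(r)]      # correct / classified total
    diag = sum(matrix[i][i] for i in range(r))            # sum of x_ii
    chance = sum(row[i] * col[i] for i in range(r))       # sum of x_i+ * x_+i
    kappa = (N * diag - chance) / (N * N - chance)
    return producer, user, kappa

# Invented two-class example:
p, u, k = accuracy_metrics([[50, 10], [5, 35]])
print(round(k, 2))  # → 0.69
```

The `kappa` line is a direct transcription of the formula above; a perfectly diagonal matrix yields Kappa = 1, and agreement no better than chance yields Kappa near 0.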
Embodiment 7: this embodiment differs from one of Embodiments 1 to 6 in how the comparison of classification accuracy before and after fusion in Step 5 realizes the fusion effect evaluation: the effect of remote sensing image fusion is determined by analyzing the classification accuracy of the images before and after fusion; the higher the classification accuracy of the fused image, the higher the image quality, and conversely the lower. This realizes the remote sensing image fusion effect assessment. The other steps and parameters are identical to one of Embodiments 1 to 6.
Embodiment
(1) Input the original SPOT multispectral and panchromatic images; geometrically correct and resample the SPOT multispectral image so that its pixels correspond to those of the panchromatic image, then crop the processed SPOT multispectral and panchromatic images to be fused, as shown in Figs. 2 and 3;
(2) Fuse the preprocessed SPOT multispectral and panchromatic images using the PCA-transform fusion method to obtain the fusion result image, as shown in Fig. 4;
(3) Select a segmentation scale of 5 and segment the SPOT images before and after fusion to obtain the segmented images;
(4) Classify the segmented images using the classification rules to obtain the classification results for the SPOT multispectral images before and after fusion, as shown in Figs. 5 and 6;
(5) Evaluate the classification accuracy from the three angles of producer's accuracy, user's accuracy, and Kappa analysis. The statistics show that producer's accuracy, user's accuracy, and the Kappa coefficient for the original SPOT multispectral image classification are 84%, 86%, and 0.82 respectively; for the classification of the PCA-fusion result image they are 90%, 91%, and 0.89 respectively. The classification accuracy of the fused image is therefore significantly improved, indicating that the PCA-transform fusion method performs well and meets the application demand.
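The comparison in step (5) amounts to checking that every accuracy measure improved after fusion. A minimal sketch using the figures reported above; the `fusion_improved` helper and the all-metrics-must-improve criterion are illustrative assumptions:

```python
def fusion_improved(before, after):
    """Judge the fusion effective when every accuracy measure increases."""
    return all(b < a for b, a in zip(before, after))

# (producer's accuracy, user's accuracy, Kappa) reported in the embodiment:
original_ms = (0.84, 0.86, 0.82)   # original SPOT multispectral classification
pca_fused = (0.90, 0.91, 0.89)     # PCA-transform fusion result classification
print(fusion_improved(original_ms, pca_fused))  # → True
```

A more nuanced rule could weight the metrics or require a minimum margin of improvement, but the patent only calls for a before/after comparison.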

Claims (7)

1. A method for evaluating remote sensing image fusion effect, characterized in that the method mainly comprises the following steps:
Step 1: preprocess the original images to be fused;
Step 2: fuse the preprocessed multispectral image with the panchromatic image to obtain the fused result image;
Step 3: perform object-oriented segmentation on the images before and after fusion to obtain segmented remote sensing images;
Step 4: classify the segmented remote sensing images using classification rules to obtain classification results for the images before and after fusion;
Step 5: evaluate the accuracy of the classification results from the three angles of producer's accuracy, user's accuracy, and the Kappa coefficient, and compare the classification accuracy of the images before and after fusion to evaluate the fusion effect.
2. The method according to claim 1, characterized in that the preprocessing of the original images to be fused in Step 1 comprises the following steps:
geometrically registering the original multispectral image using the quadratic polynomial method, so that it remains geometrically consistent with the panchromatic image;
resampling the multispectral image using nearest-neighbour interpolation, so that the pixel sizes of the images to be fused are consistent;
on the basis of the above processing, cropping out the test-area image to obtain the original multispectral and panchromatic images of the same area to be fused.
3. The method according to claim 1, characterized in that in the fusion processing of Step 2 the principle of fusing the multispectral image with the panchromatic image is to fuse the preprocessed multispectral and panchromatic images using a chosen fusion method.
4. The method according to claim 1, characterized in that in the object-oriented segmentation of Step 3 the principle of segmentation is to select texture, mean, colour, or shape features as required, divide the image into feature regions, and extract regions of interest from them, so that pixels with the same or similar characteristic properties fall within the same region while pixels with clearly different characteristics fall in different regions, each region serving as a unit for subsequent image classification.
5. The method according to claim 1, characterized in that the classification rules used in Step 4 mainly comprise the following aspects: first, NDVI combined with a shape index distinguishes cultivated land, after which a wetland index separates dry farmland from paddy fields; second, a brightness index extracts residential areas and transport land, after which the shape index separates the two; finally, the wetland index and the brightness index together extract reservoirs/ponds.
6. The method according to claim 1, characterized in that in the accuracy evaluation of Step 5, producer's accuracy is the class total minus the omission errors divided by the class total; user's accuracy is the class total minus the commission errors divided by the class total; and the Kappa index uses a discrete multivariate technique that considers all elements of the error matrix and measures the agreement or accuracy between two maps, calculated as:
Kappa = \frac{N \sum_{i=1}^{r} x_{ii} - \sum_{i=1}^{r} (x_{i+} \cdot x_{+i})}{N^{2} - \sum_{i=1}^{r} (x_{i+} \cdot x_{+i})}
where r is the total number of columns in the error matrix (i.e., the total number of classes); x_{ii} is the number of pixels in row i, column i of the error matrix (i.e., the number of correctly classified pixels); x_{i+} and x_{+i} are the total numbers of pixels in row i and in column i, respectively; and N is the total number of pixels used for accuracy evaluation.
7. The method according to claim 6, characterized in that the comparison of classification accuracy before and after fusion in Step 5 realizes the fusion effect evaluation: the effect of remote sensing image fusion is determined by analyzing the classification accuracy of the images before and after fusion; the higher the classification accuracy of the fused image, the higher the image quality, and conversely the lower, thereby realizing remote sensing image fusion effect evaluation.
CN201310272367.4A 2013-07-02 2013-07-02 A kind of Remote Sensing Image Fusion effect evaluation method Expired - Fee Related CN103383775B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310272367.4A CN103383775B (en) 2013-07-02 2013-07-02 A kind of Remote Sensing Image Fusion effect evaluation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310272367.4A CN103383775B (en) 2013-07-02 2013-07-02 A kind of Remote Sensing Image Fusion effect evaluation method

Publications (2)

Publication Number Publication Date
CN103383775A true CN103383775A (en) 2013-11-06
CN103383775B CN103383775B (en) 2016-08-10

Family

ID=49491558

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310272367.4A Expired - Fee Related CN103383775B (en) 2013-07-02 2013-07-02 A kind of Remote Sensing Image Fusion effect evaluation method

Country Status (1)

Country Link
CN (1) CN103383775B (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916436A (en) * 2010-08-30 2010-12-15 武汉大学 A Multi-Scale Spatial Projection Remote Sensing Image Fusion Method
CN102005037A (en) * 2010-11-12 2011-04-06 湖南大学 Multimodality image fusion method combining multi-scale bilateral filtering and direction filtering


Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794481B (en) * 2015-03-25 2018-01-19 北京师范大学 Merge the sorting technique and device of panchromatic image and multispectral image
CN104794481A (en) * 2015-03-25 2015-07-22 北京师范大学 Fused panchromatic image and multispectral image classifying method and device
CN104867154A (en) * 2015-05-29 2015-08-26 华中科技大学 Remote-sensing image quality evaluation method based on gradient
CN105430378A (en) * 2015-11-26 2016-03-23 航天恒星科技有限公司 An image quality evaluation system and method
CN106023111A (en) * 2016-05-23 2016-10-12 中国科学院深圳先进技术研究院 Image fusion quality evaluating method and system
CN110310246B (en) * 2019-07-05 2023-04-11 广西壮族自治区基础地理信息中心 Sugarcane planting area remote sensing information extraction method based on three-linear array image
CN110310246A (en) * 2019-07-05 2019-10-08 广西壮族自治区基础地理信息中心 A kind of cane -growing region remote sensing information extracting method based on three-line imagery
CN111681207A (en) * 2020-05-09 2020-09-18 宁波大学 A method for quality evaluation of remote sensing image fusion
CN111681207B (en) * 2020-05-09 2023-10-27 四维高景卫星遥感有限公司 Remote sensing image fusion quality evaluation method
CN112506104A (en) * 2020-12-07 2021-03-16 沈阳工程学院 Method and system for supervising wetland resources and ecological environment
CN112633155A (en) * 2020-12-22 2021-04-09 生态环境部卫星环境应用中心 Natural conservation place human activity change detection method based on multi-scale feature fusion
CN112633155B (en) * 2020-12-22 2021-08-31 生态环境部卫星环境应用中心 Natural conservation place human activity change detection method based on multi-scale feature fusion
CN113191440A (en) * 2021-05-12 2021-07-30 济南大学 Remote sensing image instance classification method, system, terminal and storage medium
CN113850850A (en) * 2021-09-23 2021-12-28 武汉九天高分遥感技术有限公司 Method for rapidly fusing high resolution No. 6 high resolution camera image and wide camera image
CN114937038A (en) * 2022-07-21 2022-08-23 北京数慧时空信息技术有限公司 Remote sensing image quality evaluation method oriented to usability
CN114937038B (en) * 2022-07-21 2022-09-20 北京数慧时空信息技术有限公司 Usability-oriented remote sensing image quality evaluation method

Also Published As

Publication number Publication date
CN103383775B (en) 2016-08-10


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160810

Termination date: 20200702