CN117274763B - Remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis - Google Patents


Publication number
CN117274763B
CN117274763B (granted from application CN202311548781.3A)
Authority
CN
China
Prior art keywords
fusion
coefficient
remote sensing
band
coefficients
Prior art date
Legal status: Active
Application number
CN202311548781.3A
Other languages
Chinese (zh)
Other versions
CN117274763A (en)
Inventor
余顺超
刘超群
王森
顾祝军
何颖清
王行汉
邹显勇
王晓刚
Current Assignee
Pearl River Hydraulic Research Institute of PRWRC
Original Assignee
Pearl River Hydraulic Research Institute of PRWRC
Priority date
Filing date
Publication date
Application filed by Pearl River Hydraulic Research Institute of PRWRC filed Critical Pearl River Hydraulic Research Institute of PRWRC
Priority to CN202311548781.3A priority Critical patent/CN117274763B/en
Publication of CN117274763A publication Critical patent/CN117274763A/en
Application granted granted Critical
Publication of CN117274763B publication Critical patent/CN117274763B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77: Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; blind source separation
    • G06V10/80: Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806: Fusion of extracted features
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; scene-specific elements
    • G06V20/10: Terrestrial scenes

Abstract

The invention discloses a remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis. The method comprises the following steps: spatially registering the panchromatic remote sensing image and the multispectral remote sensing image, and simulating a low-resolution panchromatic image from the multispectral remote sensing image to serve as an intermediate band; selecting any band of the multispectral remote sensing image as the target band and the panchromatic remote sensing image as the source band; determining the balance point of the fusion parameter vector space and the target correlation coefficient according to the two correlation coefficients of the linear fusion model; keeping the feature extraction coefficient of the balance point unchanged, and determining the feature fusion coefficient according to the target correlation coefficient; calculating the fusion result of the target band from the feature extraction coefficient of the balance point and the determined feature fusion coefficient; and merging and storing the fusion results obtained for the several bands of the multispectral remote sensing image. The method significantly improves the fidelity of the fusion result to both the spectral information of the multispectral image and the spatial detail information of the high-resolution image.

Description

Remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis
Technical Field
The invention belongs to the technical field of remote sensing image fusion enhancement, and particularly relates to a remote sensing image space-spectrum fusion method, a system, computer equipment and a computer readable storage medium based on balance point analysis.
Background
With the development of multi-platform, multi-sensor, all-weather, multi-temporal and multi-resolution remote sensing technologies, the variety of remote sensing images available to various industries is increasingly rich. This abundance not only provides flexible choices for remote sensing image application research, but also poses challenges for application preprocessing research such as remote sensing image selection, synthesis, correction and enhancement.
Different remote sensing image data differ in basic image characteristics such as spatial resolution, temporal resolution and spectral resolution, and have different application performance and potential in different fields. Traditional remote sensing image processing focuses on enhancing general image characteristics such as color, texture and hierarchy within a single image; remote sensing image fusion processing focuses on integrating the basic features of different images into a new remote sensing image, fully exploiting the comprehensive application potential of remote sensing imagery and improving its application performance. Over the past two decades, as a new direction of remote sensing image processing, remote sensing image fusion technology has developed considerably and achieved a series of new results.
Fusion methods based on color space theory include the CMYK, Lab, IHS (HSB) and HSV methods, among which the IHS method is classical. Based on mathematical statistical analysis and arithmetic operations, the ratio method, difference method, weighted superposition, multiplicative amplification and mixed arithmetic methods have been developed; classical among these are the Brovey and CN fusion methods. Fusion methods based on signal analysis include high-pass filtering, principal component analysis (PCA), Fourier transform (FFT), wavelet transform, Gram-Schmidt transform, curvelet transform (Curvelet), contourlet transform (Contourlet), ridgelet transform (Ridgelet), bandlet transform (Bandlet), wedgelet transform (Wedgelet) and beamlet transform (Beamlet), with PCA and GS being the classical fusion methods. All the classical methods have been integrated into commercial remote sensing image processing software.
In terms of fusion principle, methods such as IHS, PCA and GS belong to component substitution; in terms of fusion expression, they belong to the linear combination fusion model, and the fusion result is essentially a linear combination of the multispectral band, the panchromatic band and an intermediate band (a simulated low-resolution panchromatic band) (Hu Jiawei et al., Remote sensing image fusion method based on AIHS and its application research, Information Technology and Network Security, 2018, No. 3, p. 61-64; Xiao et al., Progress and challenges of multisource space-spectrum remote sensing image fusion methods, Journal of Image and Graphics, Vol. 25, 2020, No. 5, p. 851-860).
For remote sensing image fusion, the basic requirements are to preserve the richness and clarity of geometric spatial information such as texture, hierarchy and detail in the fused multispectral image, and to keep the fine spectral characteristics and color information stable between the images before and after fusion. The quality of the images before and after fusion can be quantitatively evaluated in the following respects:
First, the richness and brightness of image colors before and after fusion can be measured by band statistics, namely the maximum, minimum, mean and standard deviation, and by inter-band indexes such as the correlation coefficient and covariance. Second, the richness of the overall information before and after fusion can be measured by indexes such as the information entropy of the image bands. Third, the clarity of hierarchy (edges) and detail (texture) before and after fusion can be measured by indexes such as the gradient and average gradient. Fourth, the consistency and inheritance of image information before and after fusion, i.e., the degree to which the fused image inherits the spatial and spectral information of the original images, can be reflected by the correlation of the fused multispectral image with the panchromatic band and with the original multispectral bands: high correlation with the panchromatic image indicates that the spatial geometric detail of the panchromatic image has been well injected, while high correlation with the original multispectral band indicates that the fine spectral information of the multispectral image has been well preserved. Image fusion quality can thus be evaluated.
By comparing the differences in these indexes before and after fusion, the directions of change of the spectral (gray level and tone) information, edge (hierarchy and difference) information and texture (detail) information can be analyzed, and the quality of the fused image judged.
For space-spectrum fusion, two basic indexes reflect geometric clarity and spectral fidelity: the correlation coefficient between the fusion result and the original multispectral data, and the correlation coefficient between the fusion result and the panchromatic band. It is generally accepted that the higher the correlation coefficient between the fusion result and the multispectral data, the better the spectral characteristics are preserved and the larger the information entropy; the higher the correlation coefficient between the fusion result and the panchromatic band, the larger the average gradient ratio, the higher the geometric clarity, and the better the image detail is retained.
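The indexes named above (correlation coefficient, information entropy, average gradient) can be sketched with numpy; the function names and definitions below are ours, not part of the patent, and are intended only as a minimal illustration:

```python
import numpy as np

def correlation_coefficient(a, b):
    # Pearson correlation coefficient between two bands
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def information_entropy(band, bins=256):
    # Shannon entropy (bits) of the band's gray-level histogram
    hist, _ = np.histogram(band, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def average_gradient(band):
    # mean magnitude of local gray-level differences: a clarity proxy
    g = band.astype(float)
    gx = np.diff(g, axis=1)[:-1, :]
    gy = np.diff(g, axis=0)[:, :-1]
    return float(np.mean(np.sqrt((gx**2 + gy**2) / 2.0)))
```

These three measures correspond, respectively, to the spectral-consistency, information-richness and clarity aspects of the evaluation described above.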
For a linear combination fusion model, the two correlation coefficients can be evaluated before the fusion calculation, and these two basic indexes can be used to determine the fusion parameters of the model according to the particular fusion objective. Notably, for the linear combination fusion model there is a seesaw relationship between the correlation coefficient of the fusion result with the multispectral data (the multispectral correlation coefficient for short) and the correlation coefficient of the fusion result with the panchromatic band (the panchromatic correlation coefficient for short): a higher multispectral correlation coefficient is generally obtained at the expense of the panchromatic correlation coefficient, and a higher panchromatic correlation coefficient at the expense of the multispectral correlation coefficient. IHS, PCA, GS and similar methods are each characterized by constructing appropriate fusion parameters to maintain a balance between the two.
Disclosure of Invention
In order to remarkably improve the spectrum information fidelity capability of the multispectral image of the fusion result and the space detail information of the high-resolution image, the invention provides a remote sensing image space-spectrum fusion method, a remote sensing image space-spectrum fusion system, a remote sensing image space-spectrum fusion computer device and a computer readable storage medium based on balance point analysis.
The first object of the invention is to provide a remote sensing image space-spectrum fusion method based on balance point analysis.
The second object of the invention is to provide a remote sensing image space-spectrum fusion system based on balance point analysis.
A third object of the present invention is to provide a computer device.
A fourth object of the present invention is to provide a computer-readable storage medium.
The first object of the present invention can be achieved by adopting the following technical scheme:
a remote sensing image space-spectrum fusion method based on balance point analysis, the method comprising:
performing spatial registration on the panchromatic remote sensing image and the multispectral remote sensing image, and simulating a low-resolution panchromatic image from the multispectral remote sensing image; selecting any band of the multispectral remote sensing image as the fusion target band, taking the panchromatic remote sensing image as the fusion source band, and taking the low-resolution panchromatic image as the intermediate band;
determining the balance point of the fusion parameter vector space and the target correlation coefficient according to the two correlation coefficients of the linear combination fusion model; the two correlation coefficients being the correlation coefficient of the fusion result with the fusion target band and the correlation coefficient of the fusion result with the fusion source band, and the fusion parameter vector space consisting of the undetermined feature extraction coefficient and the undetermined feature fusion coefficient of the linear combination fusion model;
keeping the feature extraction coefficient of the balance point unchanged, and determining the feature fusion coefficient according to the target correlation coefficient;
taking the feature extraction coefficient of the balance point as the determined feature extraction coefficient, and calculating the fusion result of the fusion target band according to the determined feature extraction coefficient and the determined feature fusion coefficient;
and taking each band of the multispectral remote sensing image in turn as the fusion target band, correspondingly obtaining a plurality of fusion results, and merging and storing the fusion results.
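The sequence of steps can be sketched end to end on synthetic arrays. This is a minimal illustration of our own: the simulated low-resolution panchromatic image is taken here as the band mean, and the scale-invariance coefficients stand in for the balance-point coefficients; all names and data are assumptions, not part of the patent:

```python
import numpy as np

def corr(a, b):
    # Pearson correlation between two bands
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def fuse(T, S, I, k_i, k_e):
    # linear combination fusion in standardized form; the output keeps
    # the target band's mean and variance
    t = (T - T.mean()) / T.std()
    s = (S - S.mean()) / S.std()
    i = (I - I.mean()) / I.std()
    f = t + k_e * (s - k_i * i)
    return T.mean() + (T.std() / f.std()) * f

rng = np.random.default_rng(0)
pan = rng.random((64, 64))                        # registered panchromatic band
ms = 0.7 * pan + 0.3 * rng.random((4, 64, 64))    # four correlated MS bands
inter = ms.mean(axis=0)                           # simulated low-res pan = intermediate band

fused = []
for band in ms:                                   # each MS band becomes the fusion target in turn
    r_ts, r_ti = corr(band, pan), corr(band, inter)
    k_i, k_e = r_ti / r_ts, r_ts                  # scale-invariance coefficients, standing in
                                                  # for the balance-point coefficients
    fused.append(fuse(band, pan, inter, k_i, k_e))
result = np.stack(fused)                          # merged multi-band fusion result
```

Because every multispectral band is fused independently against the same source and intermediate bands, the loop runs once per band with no limit on the band count.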
Further, the determining the balance point of the fusion parameter vector space according to the two correlation coefficients of the linear combination fusion model comprises:
obtaining the extremum condition curve of the fusion parameter vector space through extremum analysis of either of the two correlation coefficients, in terms of the undetermined feature extraction coefficient and the undetermined feature fusion coefficient;
obtaining the equivalence condition curve of the fusion parameter vector space through equivalence analysis of the two correlation coefficients, in terms of the undetermined feature extraction coefficient and the undetermined feature fusion coefficient;
and determining the balance point of the fusion parameter vector space as the intersection of the extremum condition curve and the equivalence condition curve.
Further, the extremum condition curve is:
k_E = (B − A·r(TS)) / (C·r(TS) − A·B)
wherein:
A = r(TS) − k_I·r(TI), B = 1 − k_I·r(SI), C = 1 − 2·k_I·r(SI) + k_I²
where k_I, k_E are respectively the undetermined feature extraction coefficient and the undetermined feature fusion coefficient, which are mutually independent; S, T, I are respectively the fusion source band, the fusion target band and the intermediate band; r(TS), r(TI), r(SI) are respectively the correlation coefficients of T with S, of T with I, and of S with I.
The equivalence condition curve is:
k_E = (1 − r(TS)) / [(1 − r(TS)) + k_I·(r(TI) − r(SI))]
The determining the balance point of the fusion parameter vector space by the intersection of the extremum condition curve and the equivalence condition curve comprises:
intersecting the extremum condition curve with the equivalence condition curve to obtain the quadratic equation in k_I:
[(1 − r(TS))·(r(TS) − r(TI)·r(SI)) − (r(TS)·r(TI) − r(SI))·(r(TI) − r(SI))]·k_I² − 2·(1 − r(TS))·(r(TS)·r(TI) − r(SI))·k_I − (1 − r(TS))²·(1 + r(TS)) = 0
The balance point is obtained as follows: k_Ib is the admissible root of the above quadratic equation (the root for which the corresponding k_E is finite and positive), and
k_Eb = (1 − r(TS)) / [(1 − r(TS)) + k_Ib·(r(TI) − r(SI))]
where k_Ib, k_Eb are respectively the feature extraction coefficient and the feature fusion coefficient at the balance point.
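One consistent reading of the linear combination model standardizes the three bands (t, s, i with zero mean and unit variance) and takes f = t + k_E·(s − k_I·i); under that reading, which is an assumption of ours, the intersection of the extremum and equivalence condition curves reduces to a quadratic in k_I that can be solved numerically. The sketch and its names are illustrative:

```python
import numpy as np

def balance_point(r_ts, r_ti, r_si):
    # Intersect the extremum condition curve with the equivalence
    # condition curve for the standardized model f = t + k_E*(s - k_I*i).
    # The intersection reduces to a quadratic A*kI^2 - B*kI - C = 0.
    A = (1 - r_ts) * (r_ts - r_ti * r_si) - (r_ts * r_ti - r_si) * (r_ti - r_si)
    B = 2 * (1 - r_ts) * (r_ts * r_ti - r_si)
    C = (1 - r_ts) ** 2 * (1 + r_ts)
    for x in np.roots([A, -B, -C]):
        if abs(x.imag) < 1e-9:
            denom = (1 - r_ts) + x.real * (r_ti - r_si)
            if abs(denom) > 1e-12 and (1 - r_ts) / denom > 0:
                # kI at the balance point, and kE from the equivalence curve
                return float(x.real), float((1 - r_ts) / denom)
    raise ValueError("no admissible balance point for these correlations")
```

For example, with r(TS) = 0.8, r(TI) = 0.7, r(SI) = 0.9 the admissible root lies near k_Ib ≈ 0.628 under this reading.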
Further, for a given feature extraction coefficient k_I, the extremum condition curve is obtained from ∂r(FS)/∂k_E = 0, where r(FS) is the correlation coefficient of the fusion result F with the fusion source band S.
Further, the equivalence condition curve is obtained from r(FT) = r(FS), where r(FT), r(FS) are respectively the correlation coefficients of the fusion result F with the fusion target band T and of the fusion result F with the fusion source band S.
Further, the process of determining the target correlation coefficient is as follows:
when k_I = 0 and k_E = 1, the two correlation coefficients are equal:
r_maxb(FT) = r_maxb(FS) = √[(1 + r(TS)) / 2]
for different intermediate bands I, under the scale-invariance assumption k_I = r(TI)/r(TS), k_E = r(TS), the two correlation coefficients are:
r_sift(FT) = (1 + r(TS)² − r(TI)²) / √D
r_sift(FS) = (2·r(TS) − r(TI)·r(SI)) / √D
wherein D = 1 + 3·r(TS)² − r(TI)² − 2·r(TS)·r(TI)·r(SI);
when the intermediate band is fully correlated with the fusion source band, i.e. r(SI) = 1, the correlation coefficient of the fusion target band with the intermediate band equals the correlation coefficient of the fusion target band with the fusion source band, i.e. r(TI) = r(TS), and under the scale-invariance assumption the two correlation coefficients take extreme values:
r_siftp(FT) = 1;
r_siftp(FS) = r(TS);
the target correlation coefficient is taken as:
r_O(FT) = max(r_maxb(FT), r_sift(FT));
where k_I, k_E are respectively the undetermined feature extraction coefficient and the undetermined feature fusion coefficient, which are mutually independent; F, S, T, I are respectively the fusion result, the fusion source band, the fusion target band and the intermediate band; r(TS), r(TI), r(SI) are respectively the correlation coefficients of T with S, of T with I, and of S with I; r_maxb(FT), r_maxb(FS) are the maximum values of the correlation coefficients of F with T and of F with S; r_siftp(FT), r_siftp(FS) are the correlation coefficients of F with T and of F with S when the intermediate band is fully correlated with the source band.
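Under the same standardized reading of the model (an assumption of ours), the target-correlation selection can be sketched as a small helper; the function name and formulas as written here are illustrative:

```python
import math

def target_correlation(r_ts, r_ti, r_si):
    # r_maxb(FT): the equal correlation coefficient at k_I = 0, k_E = 1
    r_maxb_ft = math.sqrt((1 + r_ts) / 2)
    # r_sift(FT): correlation under the scale-invariance assumption
    # k_I = r(TI)/r(TS), k_E = r(TS)
    d = 1 + 3 * r_ts**2 - r_ti**2 - 2 * r_ts * r_ti * r_si
    r_sift_ft = (1 + r_ts**2 - r_ti**2) / math.sqrt(d)
    # target correlation coefficient r_O(FT)
    return max(r_maxb_ft, r_sift_ft)
```

Note that when the intermediate band is fully correlated with the source band (r(SI) = 1, r(TI) = r(TS)), r_sift(FT) reaches 1, so the target correlation coefficient is 1 in that limit.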
Further, the feature fusion coefficient is determined from the relation that the correlation coefficient of the fusion result with the fusion target band equals the target correlation coefficient, i.e. r(FT) = r_O(FT).
Further, the calculating the fusion result of the fusion target band according to the determined feature extraction coefficient and the determined feature fusion coefficient comprises:
according to the determined feature extraction coefficient and the determined feature fusion coefficient, the expression of the linear combination fusion model is:
ƒ = (T − μ_T)/σ_T + k_Er·[(S − μ_S)/σ_S − k_Ib·(I − μ_I)/σ_I]
F = μ_T + k_T·ƒ, with k_T = σ_T/σ_ƒ
where S, T, I are respectively the fusion source band, the fusion target band and the intermediate band; μ_S, μ_T, μ_I are the means of S, T, I; σ_S, σ_T, σ_I are respectively the mean square errors of S, T, I; k_Ib, k_Er are the determined feature extraction coefficient and the determined feature fusion coefficient; ƒ is the fusion result of the mean-filtered images; σ_ƒ is the mean square error of ƒ; k_T is the feature matching coefficient; and F is the fusion result.
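Under a standardized reading of this expression (an assumption of ours: the bands are standardized, and the result is rescaled by k_T = σ_T/σ_ƒ and shifted by μ_T), the fused band keeps the target band's mean and variance by construction. A minimal numpy sketch, with an illustrative function name:

```python
import numpy as np

def fuse_band(T, S, I, k_ib, k_er):
    # f: linear combination of the standardized mean-filtered bands
    t = (T - T.mean()) / T.std()
    s = (S - S.mean()) / S.std()
    i = (I - I.mean()) / I.std()
    f = t + k_er * (s - k_ib * i)
    k_t = T.std() / f.std()      # feature matching coefficient k_T = sigma_T / sigma_f
    return T.mean() + k_t * f    # F = mu_T + k_T * f
```

The rescaling by k_T is what makes k_T a dependent parameter: once k_Ib and k_Er are fixed, σ_ƒ and hence k_T follow.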
The second object of the invention can be achieved by adopting the following technical scheme:
a remote sensing image space-spectrum fusion system based on balance point analysis, the system comprising:
the selecting module is used for carrying out spatial registration on the full-color remote sensing image and the multispectral remote sensing image, and simulating a low-resolution full-color image by utilizing the multispectral remote sensing image; selecting any wave band in the multispectral remote sensing image as a fusion target wave band, taking the panchromatic remote sensing image as a fusion source wave band, and taking the panchromatic image with low resolution as an intermediate wave band;
the first determining module is used for determining a balance point and a target correlation coefficient of the fusion parameter vector space according to the two correlation coefficients of the linear combination fusion model; the two correlation coefficients are the correlation coefficient of the fusion result and the fusion target band and the correlation coefficient of the fusion result and the fusion source band, and the fusion parameter vector space consists of undetermined feature extraction coefficients and undetermined feature fusion coefficients in the linear combination fusion model;
The second determining module is used for keeping the characteristic extraction coefficient in the balance point unchanged and determining a characteristic fusion coefficient according to the target correlation coefficient;
the calculation module is used for taking the characteristic extraction coefficient in the balance point as a determined characteristic extraction coefficient and calculating a fusion result of the fusion target wave band according to the determined characteristic extraction coefficient and the determined characteristic fusion coefficient;
and the merging and storing module is used for taking each band of the multispectral remote sensing image in turn as the fusion target band, correspondingly obtaining a plurality of fusion results, and merging and storing the fusion results.
The third object of the present invention can be achieved by adopting the following technical scheme:
the computer equipment comprises a processor and a memory for storing a program executable by the processor, wherein the remote sensing image space-spectrum fusion method is realized when the processor executes the program stored by the memory.
The fourth object of the present invention can be achieved by adopting the following technical scheme:
a computer readable storage medium storing a program which, when executed by a processor, implements the above-described remote sensing image space-spectrum fusion method.
Compared with the prior art, the invention has the following beneficial effects:
1. According to the characteristic that the panchromatic correlation coefficient and the multispectral correlation coefficient of the linear combination fusion model can be calculated in advance, the invention determines reference values of the two key parameters of the linear combination fusion model, the feature extraction coefficient and the feature fusion coefficient, based on balance point analysis; then, taking a given multispectral correlation coefficient as the target, it keeps the feature extraction coefficient unchanged and performs a relaxation calculation of the feature fusion coefficient, from which the space-spectrum fusion result is calculated. Compared with the original image, the spatial information of ground features in the fused image, such as geometric details, texture, edges and hierarchy, is greatly enriched; compared with the original multispectral image, the spectral characteristics and color rendering of ground features such as water bodies, vegetation, bare surfaces and buildings remain stable in the fused multispectral image;
2. The invention performs space-spectrum fusion based on the linear combination of the fusion target band, the fusion source band and the intermediate band, and, once the intermediate band is determined, fuses any data pair consisting of a multispectral band and the panchromatic band. The fusion calculation is repeated in the same manner for as many data pairs as the multispectral bands form with the panchromatic band, finally yielding the fusion results of all bands. The invention therefore places no limitation on the number of multispectral bands, and is open in this respect.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to the structures shown in these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of a remote sensing image space-spectrum fusion method based on balance point analysis in embodiment 1 of the present invention;
FIG. 2 is a detailed flowchart of the remote sensing image space-spectrum fusion method based on balance point analysis in embodiment 2 of the present invention;
FIG. 3 is a full-color image (0.8 m resolution) of example 3 of the present invention;
FIG. 4 is a diagram of a standard pseudo-color composite image (3.2 m resolution) according to example 3 of the present invention;
FIG. 5 is a true color composite image (3.2 m resolution) of example 3 of the present invention;
FIG. 6 is a diagram of an image of the intermediate band I (3.2 m resolution) of embodiment 3 of the present invention;
FIG. 7 is a true color composite plot (0.5 m resolution) of the fused multispectral R ' G ' B ' band of example 3 of the present invention;
FIG. 8 is a standard pseudo-color synthesis plot (0.5 m resolution) of the fused multispectral N ' R ' G ' band of example 3 of the present invention;
fig. 9 is a structural block diagram of a remote sensing image space-spectrum fusion system based on balance point analysis in embodiment 4 of the present invention;
fig. 10 is a block diagram showing the structure of a computer device according to embodiment 5 of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments, and all other embodiments obtained by those skilled in the art without making any inventive effort based on the embodiments of the present invention are within the scope of protection of the present invention. It should be understood that the description of the specific embodiments is intended for purposes of illustration only and is not intended to limit the scope of the present application.
Example 1:
This embodiment performs space-spectrum information fusion of a panchromatic band image with high spatial resolution and multispectral band images with rich spectral information. It carries out balance point analysis of spatial detail preservation and spectral feature preservation for the linear combination fusion model of remote sensing space-spectrum data, and determines the feature extraction coefficient, the feature fusion coefficient and the fusion result of the linear combination fusion model under a given multispectral correlation coefficient. The linear combination fusion model has two independent fusion parameters, the feature extraction coefficient and the feature fusion coefficient, which together form the vector space of the model parameters. For specific fusion target, fusion source and intermediate bands, given a feature extraction coefficient, one characteristic curve of the vector space, the extremum condition curve, can be found through extremum analysis of the correlation coefficient; another characteristic curve, the equivalence condition curve, can be found through equivalence analysis of the two correlation coefficients; and the intersection of the two characteristic curves is the characteristic point of the fusion parameter vector space, the balance point. Taking the coordinates of the balance point as reference values, a multispectral correlation coefficient larger than the panchromatic correlation coefficient is given as the target correlation coefficient, and a relaxation calculation of the feature fusion coefficient component is performed while keeping the feature extraction coefficient component of the balance point unchanged. This yields a set of fusion parameters of the linear combination fusion model and thus a space-spectrum fusion scheme based on balance point analysis.
As shown in fig. 1, the remote sensing image space-spectrum fusion method based on balance point analysis provided in this embodiment includes the following steps:
s101, acquiring a full-color remote sensing image and a multispectral remote sensing image.
Obtain the panchromatic band P of the remote sensing image and the multispectral bands M_i of the remote sensing image (near-infrared N, red R, green G, blue B, etc.); this embodiment does not limit the number of multispectral bands.
S102, performing spatial registration on the full-color remote sensing image and the multispectral remote sensing image, and constructing a low-resolution full-color image by using the multispectral remote sensing image and taking the full-color image as an intermediate band.
Spatially register the panchromatic band image and the multispectral band images so that the geometric positions of the same ground object are consistent in both; construct a simulated low-resolution panchromatic image from the multispectral band images to serve as the fusion intermediate band I; resample the panchromatic band P, the intermediate band I and the multispectral bands M_i to the high spatial resolution image grid, and combine the three into one file.
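This step can be sketched as follows; the spectral-response weights and the nearest-neighbour resampling are illustrative assumptions of ours, not values or procedures from the patent:

```python
import numpy as np

# hypothetical weights approximating the pan sensor's spectral response
# over (N, R, G, B); real weights depend on the sensor and are often
# fitted by regression against a degraded pan image
weights = np.array([0.1, 0.3, 0.3, 0.3])

def simulate_lowres_pan(ms_bands, w=weights):
    # weighted band average: (4, h, w) MS stack -> (h, w) simulated pan
    w = w / w.sum()
    return np.tensordot(w, ms_bands, axes=1)

def upsample(band, factor):
    # nearest-neighbour resampling onto the pan grid (a stand-in for
    # proper interpolation during registration/resampling)
    return np.kron(band, np.ones((factor, factor)))

rng = np.random.default_rng(2)
ms = rng.random((4, 16, 16))        # registered MS bands at low resolution
I_low = simulate_lowres_pan(ms)     # simulated low-resolution panchromatic image
I = upsample(I_low, 4)              # intermediate band I on the pan grid
stack = np.stack([upsample(b, 4) for b in ms] + [I])   # combined multi-band file analogue
```

In practice the resampling would use bilinear or cubic interpolation and the stack would be written to a georeferenced file; the arrays here only mirror the data flow.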
S103, determining a balance point according to two correlation coefficients of the linear combination fusion model.
Further, step S103 specifically includes:
(1) General expression of the linear combination fusion model.
Let the fusion target band of the space-spectrum fusion be T, the fusion source band be S, the fusion intermediate band be I, and the fusion result be F. The fusion results of IHS, PCA, GS and similar methods can be abstracted into a unified linear combination fusion model:
ƒ = (T − μ_T)/σ_T + k_E·[(S − μ_S)/σ_S − k_I·(I − μ_I)/σ_I]
F = μ_M + k_M·ƒ
where μ_T, μ_S, μ_I are respectively the means of the fusion target band T, the fusion source band S and the fusion intermediate band I; T − μ_T, S − μ_S, I − μ_I are respectively the mean-filtered images of T, S, I; σ_T, σ_S, σ_I are the mean square errors of T, S, I; ƒ is the fusion result of the mean-filtered images; and F is the space-spectrum fusion result.
k_I is the feature extraction coefficient and k_E is the feature fusion coefficient; the two are mutually independent and form the fusion parameter vector space of the linear combination fusion model.
k_M = σ_M/σ_ƒ is the feature matching coefficient between the mean-filtered fusion result and the multispectral band. Once k_I and k_E are determined, σ_ƒ and hence k_M are determined, so k_M is not an independent fusion parameter. σ_M, μ_M are respectively the mean square error and mean of the multispectral band, and σ_ƒ is the mean square error of ƒ.
(2) The equilibrium point is determined.
According to the fusion expression, the correlation coefficient between the fusion result and the fusion target band is as follows:
the correlation coefficient between the fusion result and the fusion source band is as follows:
where r(F,T) and r(F,S) are the correlation coefficients of the fusion result F with the fusion target band T and with the fusion source band S, respectively; r(T,S), r(T,I), r(S,I) are the correlation coefficients between the fusion target band T and the fusion source band S, between T and the intermediate band I, and between S and the intermediate band I, respectively.
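The pairwise band correlation coefficients r(T,S), r(T,I) and r(S,I) that drive the balance-point analysis are ordinary Pearson coefficients computed over the band images; a minimal sketch (the function name is ours):

```python
import numpy as np

def band_correlation(a, b):
    """Pearson correlation coefficient between two band images."""
    a = np.asarray(a, dtype=np.float64).ravel()
    b = np.asarray(b, dtype=np.float64).ravel()
    # np.corrcoef returns the 2x2 correlation matrix of the two series.
    return float(np.corrcoef(a, b)[0, 1])
```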
For a given feature extraction coefficient k_I, the characteristic curve of the fusion parameters on which r(F,S) attains an extremum (the extremum condition curve) is obtained as follows:
wherein:
From r(F,T) = r(F,S), the characteristic curve of the fusion parameters on which the two correlation coefficients are equal (the equivalence condition curve) can be obtained as follows:
wherein:
The coordinates of the balance point are determined by the intersection of the extremum condition curve and the equivalence condition curve, namely:
Solving gives:
wherein:
k_Ib is the determined feature extraction coefficient.
S104, determining a target correlation coefficient according to the two correlation coefficients of the linear combination fusion model.
Let T be a multispectral band and S the panchromatic band. For an arbitrary intermediate band I, when k_I = 0 and k_E = 1, the two correlation coefficients are equal and take the value:
This value is the maximum of the equal correlation coefficients over all k_I ≥ 0 under the equivalence condition with k_E ≥ 0. Taking this correlation coefficient as the target correlation coefficient between the multispectral band and the fusion result for a specific intermediate band I is a reasonable choice.
For different intermediate bands I, under the scale-invariance assumption k_I = r(T,I)/r(T,S) and k_E = r(T,S), the two correlation coefficients are as follows:
The fusion result under the scale-invariance assumption automatically achieves a relative balance between the panchromatic and multispectral correlation coefficients; its multispectral correlation coefficient is also a reasonable target correlation coefficient.
When the simulated low-resolution panchromatic band is fully correlated with the panchromatic band, i.e. r(S,I) = 1, the correlation coefficient of the multispectral band with the intermediate band equals that with the panchromatic band, i.e. r(T,I) = r(T,S), and the two correlation coefficients attain extreme values under the scale-invariance assumption:
r_siftp(F,T) = 1;
r_siftp(F,S) = r(T,S);
Since a simulated low-resolution panchromatic image generally cannot achieve this effect, the multispectral correlation coefficient tends to be less than 1, the panchromatic correlation coefficient tends to be higher than the multispectral one, and the correlation coefficient r(T,S) of the multispectral and panchromatic bands is also generally smaller than these extreme values.
Therefore the target correlation coefficient is taken as the larger of the two multispectral correlation coefficients, namely:
r_O(F,T) = max(r_maxb(F,T), r_sift(F,T));
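The coefficient choices above reduce to two small formulas: the scale-invariance coefficients k_I = r(T,I)/r(T,S) and k_E = r(T,S), and the target correlation coefficient as the larger of the two candidates. A sketch (function names are ours):

```python
def scale_invariant_coefficients(r_ts, r_ti):
    """Coefficients under the scale-invariance assumption:
    k_I = r(T,I)/r(T,S), k_E = r(T,S)."""
    return r_ti / r_ts, r_ts

def target_correlation(r_maxb_ft, r_sift_ft):
    """Target correlation coefficient r_O(F,T): the larger of the
    two candidate multispectral correlation coefficients."""
    return max(r_maxb_ft, r_sift_ft)
```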
S105, keep the feature extraction coefficient at the balance point unchanged and determine the feature fusion coefficient according to the target correlation coefficient.
Here T is assumed to be a multispectral band and S the panchromatic band, and the target correlation coefficient is the value of the multispectral correlation coefficient. With the balance-point feature extraction coefficient k_Ib held unchanged, the corresponding relaxed feature fusion coefficient k_Er is obtained.
That is, setting r(F,T) = r_O(F,T):
Solving yields:
wherein:
The obtained k_Er is the determined feature fusion coefficient.
S106, take the feature extraction coefficient at the balance point as the determined feature extraction coefficient, and calculate the fusion result of the fusion target band from the determined feature extraction coefficient and feature fusion coefficient.
Substituting k_Ib and k_Er into the fusion expression gives the fusion result:
S107, take each band of the multispectral remote sensing image in turn as the fusion target band, obtain the corresponding fusion results, and merge and store them. The method provided by this embodiment is suitable for fusing the panchromatic and multispectral bands of remote sensing images to enhance the spatial resolution and geometric texture information of the multispectral image; while greatly improving spatial resolution and spatial geometric detail, it keeps the spectral characteristics and color information of the fused image consistent with those of the original multispectral image.
Example 2:
As shown in fig. 2, this embodiment further illustrates the method provided in embodiment 1 and specifically includes the following steps:
step one and step two are the same as steps S101 and S102 in embodiment 1.
Step three, calculate the mean μ, mean square error σ, covariance matrix Cov and correlation coefficient matrix r of each band of the fusion image, whose expressions are:
where i = 1, 2, …, n; j = 1, 2, …, n; n is the number of multispectral bands; R and C are the numbers of rows and columns of the fusion image; and p and q are the row and column indices of the band-image pixels.
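Step three can be sketched with NumPy on an (n, R, C) band stack; population (biased) statistics are assumed, since the formulas average over all R×C pixels:

```python
import numpy as np

def band_statistics(stack):
    """Per-band mean and mean square error, plus the covariance and
    correlation-coefficient matrices, of an (n, R, C) band stack."""
    n = stack.shape[0]
    flat = stack.reshape(n, -1).astype(np.float64)
    mu = flat.mean(axis=1)
    sigma = flat.std(axis=1)          # population mean square error per band
    cov = np.cov(flat, bias=True)     # (n, n) covariance matrix
    r = np.corrcoef(flat)             # (n, n) correlation coefficient matrix
    return mu, sigma, cov, r
```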
Step four, among the multispectral bands M_i and the panchromatic band P, select a multispectral band M_i as the fusion target band T and the panchromatic band P as the fusion source band S, and calculate the feature extraction coefficient and feature fusion coefficient at the balance point of the multispectral correlation-coefficient extremum curve, given by the following formulas:
wherein:
where r(T,S), r(S,I) and r(T,I) are the correlation coefficients between the fusion target band T and the fusion source band S, between the fusion source band S and the intermediate band I, and between the fusion target band T and the intermediate band I, respectively.
Step five, determining a target value of a fusion result and a multispectral correlation coefficient:
Under the intermediate band I, the maximum value reached when the multispectral correlation coefficient of the fusion result equals the panchromatic correlation coefficient is:
Under the scale-invariance assumption k_I = r(T,I)/r(T,S) and k_E = r(T,S), the two correlation coefficients of the fusion result are as follows:
The target correlation coefficient is taken as the larger of the two multispectral correlation coefficients, namely:
r_O(F,T) = max(r_maxb(F,T), r_sift(F,T));
Step six, determine the feature fusion coefficient corresponding to the balance-point feature extraction coefficient under the multispectral correlation-coefficient target:
When r(F,T) = r_O(F,T), the feature fusion coefficient corresponding to the feature extraction coefficient k_Ib is:
wherein:
Step seven, calculate the fusion result of the multispectral and panchromatic images.
k Ib k Er The mean filtering fusion result of the corresponding multispectral image and the full-color image is as follows:
Since in this embodiment T is a multispectral band serving as the fusion target band, the matching coefficient is:
The fusion result of the multispectral and panchromatic images is:
where μ_S, μ_T, μ_I are the means of S, T and I, respectively; σ_S, σ_T, σ_I are the mean square errors of S, T and I; μ_M and σ_M are the mean and mean square error of the multispectral image T; and σ_ƒ is the mean square error of the mean-filtering fusion result.
In this embodiment, the expression of the fusion result has a definite physical meaning, is concise and clear, and is quick and efficient to compute.
Each multispectral band M_i is taken in turn as the fusion target image T and space-spectrum fusion is computed band by band, giving the fusion results of all bands with the panchromatic band.
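The band-by-band loop can be sketched as follows. The patent's exact fusion expression is given as an equation image, so this sketch assumes an illustrative form of the linear combination model, ƒ = (T − μ_T) + k_E((S − μ_S) − k_I(I − μ_I)), rescaled by a matching coefficient k_M = σ_T/σ_ƒ so the result keeps the target band's mean and mean square error; all function names are ours.

```python
import numpy as np

def fuse_band(T, S, I, k_I, k_E):
    """Fuse one target band T with source band S and intermediate band I
    under the assumed linear combination form described in the lead-in."""
    T = np.asarray(T, np.float64)
    S = np.asarray(S, np.float64)
    I = np.asarray(I, np.float64)
    # Mean-filtered linear combination, then restore the target band's
    # mean and mean square error via the matching coefficient.
    f = (T - T.mean()) + k_E * ((S - S.mean()) - k_I * (I - I.mean()))
    k_M = T.std() / f.std()
    return T.mean() + k_M * f

def fuse_all_bands(ms_bands, S, I, coeffs):
    """Take each multispectral band M_i in turn as the fusion target T."""
    return [fuse_band(T, S, I, k_I, k_E)
            for T, (k_I, k_E) in zip(ms_bands, coeffs)]
```

By construction the output of `fuse_band` has exactly the mean and mean square error of the target band, matching the spectral-preservation claim in the text.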
Step eight, composite and store the multispectral fusion image result.
Example 3:
To achieve remote sensing image space-spectrum fusion based on correlation analysis, this embodiment is implemented mainly with the ENVI remote sensing image processing software and further illustrates embodiment 2 using a satellite remote sensing image with panchromatic (P), blue (B), green (G), red (R) and near-infrared (N) bands.
(1) Input the remote sensing image.
A GF-2 remote sensing image (taken from gf2_pms1_e112_8_n23_1_20191207_l1a0004450945) with panchromatic (P), blue (B), green (G), red (R) and near-infrared (N) bands is opened. Figs. 3, 4 and 5 show the panchromatic band image (0.8 m resolution), the multispectral standard false-color composite, and the true-color composite (3.2 m resolution), respectively (displayed with ENVI's default 1% stretch).
(2) Construct the intermediate band I in ENVI from two multispectral band images, using the band-math expression b = (1.0*b1 + b2)/2, where b1 and b2 are the near-infrared (N) and green (G) bands respectively. The computed intermediate band I used for fusing the target image is shown in fig. 6. Using ENVI, the panchromatic band P, intermediate band I and multispectral bands N, R, G, B are resampled according to the high-spatial-resolution fusion source image and combined into one image file.
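The ENVI band math above has a direct NumPy equivalent (the function name is ours):

```python
import numpy as np

def intermediate_band(nir, green):
    """Equivalent of the ENVI band math b = (1.0*b1 + b2)/2,
    with b1 = near-infrared (N) and b2 = green (G)."""
    return (1.0 * np.asarray(nir, np.float64)
            + np.asarray(green, np.float64)) / 2.0
```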
(3) Calculate the mean μ and standard deviation σ of the panchromatic band P, intermediate band I and multispectral bands N, R, G, B, and the statistical parameters of the image features. The basic statistical parameters of each band image are given in Tables 1 and 2.
TABLE 1 mean and mean square error statistics parameters table for each band of images
TABLE 2 statistics of correlation coefficient matrix for each band of images
(4) Select the multispectral bands N, R, G, B as fusion target bands T and the panchromatic band P as fusion source band S; calculate the feature extraction coefficient k_Ib and feature fusion coefficient k_Eb at the balance point of the multispectral correlation-coefficient extremum curve, the multispectral correlation-coefficient target value r_O(T,F), the relaxed feature fusion coefficient k_Er and the feature matching coefficient k_M. The results are given in Table 3.
TABLE 3 fusion parameter calculation results Table for TIS data combinations
(5) Calculate the fusion results of the multispectral and panchromatic images.
(a) The fusion operation expression corresponding to the near infrared band is as follows:
uint(0.916254730481121*((b1-350.242687) + 0.828430060964442*(-0.731524532878029*(b2-377.696015) - 0.881452368468189*0.772509011122974*(b3-402.2575))) + 350.242687 + 0.5), where b1 is the near-infrared band (N), b2 the panchromatic band P, and b3 the intermediate band I; the fused near-infrared band (NF) image is obtained from this calculation.
(b) The fusion operation expression corresponding to the red wave band is as follows:
uint(1.14977249383767*((b1-396.465098) + 0.549750391920787*(-1.34227068803092*(b2-377.696015) - 1.223721141760203*1.4174729011351*(b3-402.2575))) + 396.465098 + 0.5), where b1 is the red band (R), b2 the panchromatic band P, and b3 the intermediate band I; the fused red band (RF) image is obtained from this calculation.
(c) The fusion operation expression corresponding to the green wave band is as follows:
uint(1.18490496969993*((b1-454.272313) + 0.53357541455675*(1.24309022033366*(b2-377.696015) - 1.27789227039292*(b3-402.2575))) + 454.272313 + 0.5), where b1 is the green band (G), b2 the panchromatic band P, and b3 the intermediate band I; the fused green band (GF) image is obtained from this calculation.
(d) The fusion operation expression corresponding to the blue band is as follows:
uint(1.02031170027321*((b1-590.942422) + 0.752939162134971*(1.22402771793525*(b2-377.696015) - 1.009197751592*1.29260523669537*(b3-402.2575))) + 590.942422 + 0.5), where b1 is the blue band (B), b2 the panchromatic band P, and b3 the intermediate band I; the fused blue band (BF) image is obtained from this calculation.
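The four band-math expressions share one template, so they can be evaluated by a single helper; the parameter names are ours, and the +0.5 before the unsigned-integer cast implements round-to-nearest as in ENVI's uint(). Plugging in the near-infrared coefficients listed in expression (a) reproduces that calculation:

```python
import numpy as np

def apply_fusion_expression(b1, b2, b3, k_m, mu_t, k_e, c_s, mu_s, c_i, mu_i):
    """Evaluate uint(k_m*((b1-mu_t) + k_e*(c_s*(b2-mu_s) - c_i*(b3-mu_i)))
    + mu_t + 0.5): the common template of the four expressions above."""
    f = (k_m * ((b1 - mu_t) + k_e * (c_s * (b2 - mu_s) - c_i * (b3 - mu_i)))
         + mu_t + 0.5)
    # Clip to the unsigned range before truncating, as uint() would.
    return np.clip(f, 0, np.iinfo(np.uint16).max).astype(np.uint16)

# Near-infrared coefficients from expression (a):
NIR_PARAMS = dict(k_m=0.916254730481121, mu_t=350.242687,
                  k_e=0.828430060964442, c_s=-0.731524532878029,
                  mu_s=377.696015,
                  c_i=0.881452368468189 * 0.772509011122974,
                  mu_i=402.2575)
```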
The fused red (RF), green (GF) and blue (BF) bands, composited through the red, green and blue channels, form the true-color image shown in FIG. 7; the standard false-color image composited from the fused near-infrared (NF), red (RF) and green (GF) bands through the red, green and blue channels is shown in FIG. 8.
The original multispectral remote sensing image is rich in spectral information: the color information of ground objects such as water, vegetation, buildings, bare rock and bare soil is abundant, and the spectral and color differences between different ground objects are obvious. However, its spatial resolution is low and it lacks ground-object texture and detail, so ground-object types and positions are identified inaccurately in remote sensing analysis, limiting the application of the multispectral image. After fusion, the spatial resolution of the multispectral image is greatly improved and its spatial information greatly enriched; the geometric texture, spatial detail, edge definition and layering of ground objects on the image are comprehensively improved, while the spectral characteristics and color display of the original multispectral image remain stable, greatly raising the information richness and overall quality of the remote sensing image.
In this embodiment, band statistics are computed for three remote sensing images: the original GF-2 multispectral image, the multispectral image fused by the method of the present invention, and the multispectral image fused by the Gram-Schmidt method; the band statistical characteristic parameters are compared in Table 4 (on the last page of the specification). The data show that, compared with the original multispectral image, the spatial resolution of the image fused by the present method is improved from 3.2 m to 0.8 m, greatly increasing spatial precision; the information consistency indexes of the fused image with the original multispectral and panchromatic bands are essentially unchanged; and although the information entropy of the multispectral bands decreases slightly, the gradient information of each band is greatly enhanced, so the fused multispectral image carries richer ground-object spatial information, which benefits applications such as ground-object recognition, interpretation and analysis. Compared with the multispectral image fused by the ENVI Gram-Schmidt method, the image fused by the present method has lower inter-band correlation coefficients, less inter-band information redundancy and a better data structure; the information entropy and gradient indexes of each band are essentially the same, so the overall performance of the two fusion methods cannot be ranked.
TABLE 4 Comparison of band statistical characteristic parameters of the original multispectral image, the fusion image of the present method, and the GS fusion image
It will be understood that all or part of the steps in the method for implementing the above embodiments may be implemented by a program for instructing relevant hardware, and the corresponding program may be stored in a computer readable storage medium.
It should be noted that although the method operations of the above embodiments are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that particular order or that all illustrated operations be performed in order to achieve desirable results. Rather, the depicted steps may change the order of execution. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform.
Example 4:
As shown in fig. 9, this embodiment provides a remote sensing image space-spectrum fusion system based on balance point analysis, comprising a selection module 901, a first determination module 902, a second determination module 903, a calculation module 904, and a merging and storing module 905, wherein:
the selecting module 901 is used for performing spatial registration on the full-color remote sensing image and the multispectral remote sensing image, and simulating a low-resolution full-color image by utilizing the multispectral remote sensing image; selecting any wave band in the multispectral remote sensing image as a fusion target wave band, taking the panchromatic remote sensing image as a fusion source wave band, and taking the panchromatic image with low resolution as an intermediate wave band;
A first determining module 902, configured to determine a balance point and a target correlation coefficient of the fusion parameter vector space according to two correlation coefficients of the linear combination fusion model; the two correlation coefficients are the correlation coefficient of the fusion result and the fusion target band and the correlation coefficient of the fusion result and the fusion source band, and the fusion parameter vector space consists of undetermined feature extraction coefficients and undetermined feature fusion coefficients in the linear combination fusion model;
a second determining module 903, configured to keep the feature extraction coefficient in the balance point unchanged, determine a feature fusion coefficient according to the target correlation coefficient;
the calculating module 904 is configured to calculate a fusion result of the fusion target band according to the determined feature extraction coefficient and the determined feature fusion coefficient by using the feature extraction coefficient in the balance point as the determined feature extraction coefficient;
the merging and storing module 905 is configured to use any band of the multispectral remote sensing image as a merging target band, obtain a plurality of merging results correspondingly, and merge and store the merging results.
For the specific implementation of each module in this embodiment, refer to embodiment 1 above; it is not repeated here. It should be noted that the division into the above functional modules in the system of this embodiment is only an example; in practical applications, the functions may be allocated to different functional modules as required, i.e. the internal structure may be divided into different functional modules to perform all or part of the functions described above.
Example 5:
This embodiment provides a computer device, which may be a computer. As shown in fig. 10, it comprises a processor 1002, a memory, an input device 1003, a display 1004 and a network interface 1005 connected through a system bus 1001. The processor provides computing and control capability; the memory includes a non-volatile storage medium 1006 and an internal memory 1007, where the non-volatile storage medium 1006 stores an operating system, a computer program and a database, and the internal memory 1007 provides an environment for running the operating system and computer program in the non-volatile storage medium. When the processor 1002 executes the computer program stored in the memory, the remote sensing image space-spectrum fusion method of embodiment 1 is implemented as follows:
performing spatial registration on the panchromatic remote sensing image and the multispectral remote sensing image, and simulating a low-resolution panchromatic image by using the multispectral remote sensing image; selecting any wave band in the multispectral remote sensing image as a fusion target wave band, taking the panchromatic remote sensing image as a fusion source wave band, and taking the panchromatic image with low resolution as an intermediate wave band;
according to the two correlation coefficients of the linear combination fusion model, determining a balance point and a target correlation coefficient of a fusion parameter vector space; the two correlation coefficients are the correlation coefficient of the fusion result and the fusion target band and the correlation coefficient of the fusion result and the fusion source band, and the fusion parameter vector space consists of undetermined feature extraction coefficients and undetermined feature fusion coefficients in the linear combination fusion model;
Maintaining the feature extraction coefficient in the balance point unchanged, and determining a feature fusion coefficient according to the target correlation coefficient;
taking the characteristic extraction coefficient in the balance point as a determined characteristic extraction coefficient, and calculating a fusion result of the fusion target wave band according to the determined characteristic extraction coefficient and the determined characteristic fusion coefficient;
and taking any wave band in the multispectral remote sensing image as a fusion target wave band, correspondingly obtaining a plurality of fusion results, and merging and storing the fusion results.
Example 6:
the present embodiment provides a computer readable storage medium storing a computer program, where the computer program is executed by a processor to implement the remote sensing image space-spectrum fusion method of the foregoing embodiment 1, as follows:
performing spatial registration on the panchromatic remote sensing image and the multispectral remote sensing image, and simulating a low-resolution panchromatic image by using the multispectral remote sensing image; selecting any wave band in the multispectral remote sensing image as a fusion target wave band, taking the panchromatic remote sensing image as a fusion source wave band, and taking the panchromatic image with low resolution as an intermediate wave band;
according to the two correlation coefficients of the linear combination fusion model, determining a balance point and a target correlation coefficient of a fusion parameter vector space; the two correlation coefficients are the correlation coefficient of the fusion result and the fusion target band and the correlation coefficient of the fusion result and the fusion source band, and the fusion parameter vector space consists of undetermined feature extraction coefficients and undetermined feature fusion coefficients in the linear combination fusion model;
Maintaining the feature extraction coefficient in the balance point unchanged, and determining a feature fusion coefficient according to the target correlation coefficient;
taking the characteristic extraction coefficient in the balance point as a determined characteristic extraction coefficient, and calculating a fusion result of the fusion target wave band according to the determined characteristic extraction coefficient and the determined characteristic fusion coefficient;
and taking any wave band in the multispectral remote sensing image as a fusion target wave band, correspondingly obtaining a plurality of fusion results, and merging and storing the fusion results.
The computer readable storage medium of the present embodiment may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In summary, the remote sensing image space-spectrum fusion method, system, device and medium based on balance point analysis provided by the present invention are aimed mainly at remote sensing images with a panchromatic band and multispectral bands such as near-infrared, red, green and blue. First, the panchromatic and multispectral band images are spatially registered so that the geometric spatial positions of the same ground object in the two images coincide. Then a simulated low-resolution panchromatic image is constructed from the multispectral band images and used as the fused intermediate band I; the panchromatic band P, the intermediate band I and the multispectral bands M_i are resampled to high spatial resolution, the three images combined into one file, and the mean μ, mean square error σ and correlation coefficient matrix r of each band of the fusion image computed. Finally, the fusion target and source bands are selected; the feature extraction coefficient, feature fusion coefficient and multispectral correlation-coefficient target value corresponding to the balance point of the multispectral correlation-coefficient extremum curve are determined by analysis; the feature fusion and feature matching coefficients under the correlation-coefficient target value are computed; and each multispectral, panchromatic and intermediate band data combination is fused according to the fusion scheme to obtain the fused multispectral and panchromatic image.
The method is suitable for fusing panchromatic and multispectral bands. It realizes space-spectrum fusion of remote sensing images based on balance point analysis, enhances the spatial geometry, texture, edge and layering information of ground objects in the multispectral image, and improves image definition and spatial resolution, while highly preserving the stability of the spectral characteristics and color display of the various ground objects in the original multispectral image. The technique has a solid theoretical basis, clear physical meaning, wide applicability, simple operation and high efficiency. The fused image has bright color, rich information and stable spectral information, and is easy to classify both visually and automatically, which strongly promotes the popularization and application of domestic high-resolution images across industries at home and abroad, especially against the background of the current rapid development of high-resolution satellite remote sensing.
The above-mentioned embodiments are only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any equivalent substitution or modification made by a person skilled in the art according to the technical solution and inventive concept of the present invention, within the scope disclosed by the present patent, falls within the protection scope of the present invention.

Claims (9)

1. The remote sensing image space-spectrum fusion method based on balance point analysis is characterized by comprising the following steps of:
performing spatial registration on the panchromatic remote sensing image and the multispectral remote sensing image, and simulating a low-resolution panchromatic image by using the multispectral remote sensing image; selecting any wave band in the multispectral remote sensing image as a fusion target wave band, taking the panchromatic remote sensing image as a fusion source wave band, and taking the panchromatic image with low resolution as an intermediate wave band;
according to the two correlation coefficients of the linear combination fusion model, determining a balance point and a target correlation coefficient of a fusion parameter vector space; the two correlation coefficients are respectively the correlation coefficient of the fusion result and the fusion target band and the correlation coefficient of the fusion result and the fusion source band, and the fusion parameter vector space consists of undetermined feature extraction coefficients and undetermined feature fusion coefficients in the linear combination fusion model;
Maintaining the feature extraction coefficient in the balance point unchanged, and determining a feature fusion coefficient according to the target correlation coefficient;
taking the characteristic extraction coefficient in the balance point as a determined characteristic extraction coefficient, and calculating a fusion result of the fusion target wave band according to the determined characteristic extraction coefficient and the determined characteristic fusion coefficient;
taking any wave band in the multispectral remote sensing image as a fusion target wave band, correspondingly obtaining a plurality of fusion results, and merging and storing the fusion results;
the expression of the linear combination fusion model is as follows:
where S, T, I are the fusion source band, fusion target band and intermediate band, respectively; μ_S, μ_T, μ_I are the means of S, T and I; σ_S, σ_T, σ_I are the mean square errors of S, T and I; k_I and k_E are the undetermined feature extraction coefficient and the undetermined feature fusion coefficient, respectively, and are mutually independent; ƒ is the fusion result of the mean-filtered images;
wherein, the determining the balance point of the fusion parameter vector space according to the two correlation coefficients of the linear combination fusion model comprises:
according to the undetermined feature extraction coefficient and the undetermined feature fusion coefficient, obtaining an extremum condition curve of a fusion parameter vector space through extremum analysis of any one of the two correlation coefficients;
According to the undetermined feature extraction coefficient and the undetermined feature fusion coefficient, obtaining an equivalent condition curve of a fusion parameter vector space through equivalent analysis of two correlation coefficients;
and determining the balance point of the fusion parameter vector space by the intersection point of the extreme value condition curve and the equivalent condition curve.
2. The remote sensing image space-spectrum fusion method according to claim 1, wherein the extremum condition curve is:
wherein:
where r(T,S), r(T,I) and r(S,I) are the correlation coefficients of T with S, T with I, and S with I, respectively;
the equivalence condition curve is K_D;
wherein:
determining the balance point of the fusion parameter vector space from the intersection point of the extremum condition curve and the equivalence condition curve comprises:
intersecting the extremum condition curve with the equivalence condition curve to obtain:
the balance point is then obtained as:
wherein:
where k_Ib and k_Eb are respectively the feature extraction coefficient and the feature fusion coefficient at the balance point.
3. The method of claim 2, wherein, for a given feature extraction coefficient k_I, the extremum condition curve is obtained from the following; where r(F,S) is the correlation coefficient between the fusion result F and the fusion source band S.
4. The method of claim 2, wherein the equivalence condition curve is obtained from r(F,T) = r(F,S), where r(F,T) and r(F,S) are respectively the correlation coefficients of the fusion result F with the fusion target band T and with the fusion source band S.
5. The remote sensing image space-spectrum fusion method according to claim 1, wherein the target correlation coefficient is determined as follows:
when k_I = 0 and k_E = 1, the two correlation coefficients are equal, namely:
for different intermediate bands I, under the scale-invariance assumption k_I = r(T,I)/r(T,S), k_E = r(T,S), the expressions of the correlation coefficient r_sift(F,T) between the fusion result and the fusion target band and of the correlation coefficient r_sift(F,S) between the fusion result and the fusion source band are as follows:
when the intermediate band is fully correlated with the fusion source band, i.e. r(S,I) = 1, the correlation coefficient of the fusion target band with the intermediate band equals that of the fusion target band with the fusion source band, i.e. r(T,I) = r(T,S); the extreme values of r_sift(F,T) and r_sift(F,S) under the scale-invariance assumption are then respectively:
r_siftp(F,T) = 1;
r_siftp(F,S) = r(T,S);
the target correlation coefficient is taken as:
r_O(F,T) = max(r_maxb(F,T), r_sift(F,T));
where F is the fusion result; r(T,S), r(T,I) and r(S,I) are respectively the correlation coefficients of T with S, of T with I, and of S with I; r_maxb(F,T) and r_maxb(F,S) are respectively the maxima of the correlation coefficients of F with T and of F with S; r_siftp(F,T) and r_siftp(F,S) are the correlation coefficients of F with T and of F with S under the scale-invariance assumption k_I = r(T,I)/r(T,S), k_E = r(T,S).
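The limiting behaviour in claim 5 can be checked numerically: when the intermediate band is fully correlated with the source band (forced here by taking I identical to S), the scale-invariance choice k_I = r(T,I)/r(T,S), k_E = r(T,S) makes the injected detail term vanish, so r_sift(F,T) = 1 and r_sift(F,S) = r(T,S). The sketch assumes the illustrative model f = z_T + k_E·(z_S − k_I·z_I); it is a plausibility check, not the patent's derivation:

```python
import numpy as np

def corr(a, b):
    """Pearson correlation coefficient."""
    return np.corrcoef(a, b)[0, 1]

def sift_correlations(S, T, I):
    """r(F,T) and r(F,S) under the scale-invariance assumption
    k_I = r(T,I)/r(T,S), k_E = r(T,S) (illustrative model form)."""
    z = lambda x: (x - x.mean()) / x.std()
    k_I = corr(T, I) / corr(T, S)
    k_E = corr(T, S)
    f = z(T) + k_E * (z(S) - k_I * z(I))
    return corr(f, T), corr(f, S)
```

With I = S, the detail coefficient k_E − k_E·k_I = r(T,S) − r(T,I) is zero, so f reduces to z_T, reproducing the claimed extreme values r_siftp(F,T) = 1 and r_siftp(F,S) = r(T,S).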
6. The remote sensing image space-spectrum fusion method according to any one of claims 1-5, wherein the feature fusion coefficient is determined from the relation that the correlation coefficient between the fusion result and the fusion target band is equal to the target correlation coefficient.
7. The remote sensing image space-spectrum fusion method according to any one of claims 1-5, wherein calculating the fusion result of the fusion target band according to the determined feature extraction coefficient and the determined feature fusion coefficient comprises:
according to the determined feature extraction coefficient and the determined feature fusion coefficient, the expression of the linear combination fusion model is as follows:
wherein,
where k_Ib and k_Er are respectively the determined feature extraction coefficient and the determined feature fusion coefficient, σ_ƒ is the standard deviation of ƒ, k_T is the feature matching coefficient, and F is the fusion result.
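Claim 7's final expression (an image in the original) appears to rescale the intermediate result ƒ using its standard deviation σ_ƒ and a feature matching coefficient k_T. A hypothetical reading, assuming a simple statistics-matching step back to the target band T:

```python
import numpy as np

def match_to_target(f, T, k_T=1.0):
    """Hypothetical final matching step (assumed form): shift and scale
    the intermediate fusion result f so its statistics match the fusion
    target band T, weighted by a feature matching coefficient k_T."""
    return T.mean() + k_T * T.std() * (f - f.mean()) / f.std()
```

With k_T = 1 the output F has exactly the mean and standard deviation of T, which would preserve the target band's radiometry in the fused product.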
8. A remote sensing image space-spectrum fusion system based on balance point analysis, the system comprising:
the selection module, configured to spatially register the panchromatic remote sensing image and the multispectral remote sensing image, and to simulate a low-resolution panchromatic image using the multispectral remote sensing image; and to select any band of the multispectral remote sensing image as the fusion target band, take the panchromatic remote sensing image as the fusion source band, and take the low-resolution panchromatic image as the intermediate band;
the first determination module, configured to determine the balance point and the target correlation coefficient of the fusion parameter vector space according to the two correlation coefficients of the linear combination fusion model, the two correlation coefficients being respectively the correlation coefficient between the fusion result and the fusion target band and the correlation coefficient between the fusion result and the fusion source band, and the fusion parameter vector space consisting of the undetermined feature extraction coefficient and the undetermined feature fusion coefficient of the linear combination fusion model;
the second determination module, configured to keep the feature extraction coefficient at the balance point unchanged and determine the feature fusion coefficient according to the target correlation coefficient;
the calculation module, configured to take the feature extraction coefficient at the balance point as the determined feature extraction coefficient, and to calculate a fusion result of the fusion target band according to the determined feature extraction coefficient and the determined feature fusion coefficient;
the merging and storage module, configured to take each band of the multispectral remote sensing image in turn as the fusion target band, correspondingly obtain a plurality of fusion results, and merge and store the plurality of fusion results;
the expression of the linear combination fusion model is as follows:
where S, T and I are respectively the fusion source band, the fusion target band and the intermediate band; μ_S, μ_T and μ_I are respectively the means of S, T and I; σ_S, σ_T and σ_I are respectively the standard deviations of S, T and I; k_I and k_E are respectively the undetermined feature extraction coefficient and the undetermined feature fusion coefficient, and are mutually independent; ƒ is the fusion result of the mean-filtered image;
wherein determining the balance point of the fusion parameter vector space according to the two correlation coefficients of the linear combination fusion model comprises:
obtaining an extremum condition curve of the fusion parameter vector space through extremum analysis of either one of the two correlation coefficients, according to the undetermined feature extraction coefficient and the undetermined feature fusion coefficient;
obtaining an equivalence condition curve of the fusion parameter vector space through equivalence analysis of the two correlation coefficients, according to the undetermined feature extraction coefficient and the undetermined feature fusion coefficient;
and determining the balance point of the fusion parameter vector space as the intersection point of the extremum condition curve and the equivalence condition curve.
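The claimed modules can be strung together as a minimal end-to-end sketch. Everything below is an assumption for illustration: the bands are taken as already registered and upsampled to a common grid, the low-resolution panchromatic image is simulated as the per-pixel mean of the multispectral bands, and fixed (k_I, k_E) values stand in for the balance-point and target-correlation analysis:

```python
import numpy as np

def fuse_multispectral(pan, ms_bands, k_I=1.0, k_E=0.5):
    """Minimal pipeline sketch (assumed model and coefficients).

    pan      : panchromatic image (fusion source band), 2-D array
    ms_bands : co-registered multispectral bands (fusion targets)
    Returns a stacked array with one fused band per input band.
    """
    z = lambda x: (x - x.mean()) / x.std()
    I = np.mean(ms_bands, axis=0)   # simulated low-res panchromatic (intermediate band)
    fused = []
    for T in ms_bands:              # each band in turn as fusion target
        f = z(T) + k_E * (z(pan) - k_I * z(I))
        # match the result back to the target band's statistics
        fused.append(T.mean() + T.std() * (f - f.mean()) / f.std())
    return np.stack(fused)          # merged multiband result
```

The merge-and-store step is reduced here to stacking the per-band results into one array; a real system would write them back to a georeferenced raster.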
9. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the remote sensing image space-spectrum fusion method of any one of claims 1-7.
CN202311548781.3A 2023-11-21 2023-11-21 Remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis Active CN117274763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311548781.3A CN117274763B (en) 2023-11-21 2023-11-21 Remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311548781.3A CN117274763B (en) 2023-11-21 2023-11-21 Remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis

Publications (2)

Publication Number Publication Date
CN117274763A CN117274763A (en) 2023-12-22
CN117274763B true CN117274763B (en) 2024-04-05

Family

ID=89216359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311548781.3A Active CN117274763B (en) 2023-11-21 2023-11-21 Remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis

Country Status (1)

Country Link
CN (1) CN117274763B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916435A (en) * 2010-08-30 2010-12-15 武汉大学 Method for fusing multi-scale spectrum projection remote sensing images
CN102013093A (en) * 2010-12-02 2011-04-13 南京大学 High resolution remote sensing image segmentation method based on Gram-Schmidt fusion and locally excitatory globally inhibitory oscillator networks (LEGION)
CN102609929A (en) * 2012-01-12 2012-07-25 河南大学 Self-adaptive independent-information remote sensing image fusion method
CN105096286A (en) * 2015-06-30 2015-11-25 中国石油天然气股份有限公司 Method and device for fusing remote sensing image
CN110415199A (en) * 2019-07-26 2019-11-05 河海大学 Multi-spectral remote sensing image fusion method and device based on residual error study
CN110533600A (en) * 2019-07-10 2019-12-03 宁波大学 A high-fidelity generalized space-spectrum fusion method for same-source/hetero-source remote sensing images
CN110853026A (en) * 2019-11-16 2020-02-28 四创科技有限公司 Remote sensing image change detection method integrating deep learning and region segmentation
CN115082582A (en) * 2022-06-09 2022-09-20 珠江水利委员会珠江水利科学研究院 True color simulation method, system, equipment and medium for satellite remote sensing data
CN115423853A (en) * 2022-07-29 2022-12-02 荣耀终端有限公司 Image registration method and device
CN115527123A (en) * 2022-10-21 2022-12-27 河北省科学院地理科学研究所 Land cover remote sensing monitoring method based on multi-source feature fusion
CN116229272A (en) * 2023-03-14 2023-06-06 中国人民解放军陆军军事交通学院镇江校区 High-precision remote sensing image detection method and system based on representative point representation
CN116309227A (en) * 2023-03-23 2023-06-23 北京理工大学 Remote sensing image fusion method based on residual error network and spatial attention mechanism
CN117058053A (en) * 2023-07-18 2023-11-14 珠江水利委员会珠江水利科学研究院 IHS space-spectrum fusion method, system, equipment and medium based on mean value filtering

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050094887A1 (en) * 2003-11-05 2005-05-05 Cakir Halil I. Methods, systems and computer program products for fusion of high spatial resolution imagery with lower spatial resolution imagery using correspondence analysis
US9977978B2 (en) * 2011-11-14 2018-05-22 San Diego State University Research Foundation Image station matching, preprocessing, spatial registration and change detection with multi-temporal remotely-sensed imagery
CN110503620B (en) * 2019-07-31 2023-01-06 茂莱(南京)仪器有限公司 Image fusion method based on Fourier spectrum extraction
CN111325165B (en) * 2020-02-26 2023-05-05 中南大学 Urban remote sensing image scene classification method considering spatial relationship information


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yi Liu; Min Chang; Jie Xu. High-Resolution Remote Sensing Image Information Extraction and Target Recognition Based on Multiple Information Fusion. IEEE Access, 2020-07-01, Vol. 8, pp. 121486-121500. *
Liu Chaoqun et al. Applied research on micro-siting of wind farms based on RS and GIS. Building Energy Efficiency, Vol. 46, No. 329, pp. 108-112. *

Also Published As

Publication number Publication date
CN117274763A (en) 2023-12-22

Similar Documents

Publication Publication Date Title
CN110363215B (en) Method for converting SAR image into optical image based on generating type countermeasure network
Yuan et al. Remote sensing image segmentation by combining spectral and texture features
CN110070518B (en) Hyperspectral image super-resolution mapping method based on dual-path support
CN109934154B (en) Remote sensing image change detection method and detection device
Peddle et al. Large area forest classification and biophysical parameter estimation using the 5-Scale canopy reflectance model in Multiple-Forward-Mode
JP2014002738A (en) Device and method for refining and decomposing material composition of mixed pixels in remote sensing image
CN105096286A (en) Method and device for fusing remote sensing image
CN112819737A (en) Remote sensing image fusion method of multi-scale attention depth convolution network based on 3D convolution
CN113744136A (en) Image super-resolution reconstruction method and system based on channel constraint multi-feature fusion
CN111680579B (en) Remote sensing image classification method for self-adaptive weight multi-view measurement learning
CN114863173B (en) Self-mutual-attention hyperspectral image classification method for land resource audit
CN106097252A (en) High spectrum image superpixel segmentation method based on figure Graph model
CN115082582B (en) True color simulation method, system, equipment and medium for satellite remote sensing data
CN115760814A (en) Remote sensing image fusion method and system based on double-coupling deep neural network
CN113570536A (en) Panchromatic and multispectral image real-time fusion method based on CPU and GPU cooperative processing
CN112949738A (en) Multi-class unbalanced hyperspectral image classification method based on EECNN algorithm
CN117058053B (en) IHS space-spectrum fusion method, system, equipment and medium based on mean value filtering
CN115272093A (en) Hyperspectral image unmixing method based on spatial structure information constraint
Li et al. Spatial-temporal super-resolution land cover mapping with a local spatial-temporal dependence model
Liu et al. Circle-net: An unsupervised lightweight-attention cyclic network for hyperspectral and multispectral image fusion
CN117274763B (en) Remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis
Wang et al. Subpixel land cover mapping based on dual processing paths for hyperspectral image
Rajesh Effective morphological transformation and sub-pixel classification of clustered images
CN117197625B (en) Remote sensing image space-spectrum fusion method, system, equipment and medium based on correlation analysis
CN117253125B (en) Space-spectrum mutual injection image fusion method, system and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant