CN117078563A - Panchromatic sharpening method and system for hyperspectral images of the Qimingxing-1 satellite - Google Patents
- Publication number
- CN117078563A CN117078563A CN202311334228.XA CN202311334228A CN117078563A CN 117078563 A CN117078563 A CN 117078563A CN 202311334228 A CN202311334228 A CN 202311334228A CN 117078563 A CN117078563 A CN 117078563A
- Authority
- CN
- China
- Prior art keywords
- image
- hyperspectral
- fusion
- full
- resolution
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4023—Decimation- or insertion-based scaling, e.g. pixel or line decimation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4053—Super resolution, i.e. output image resolution higher than sensor resolution
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10036—Multispectral image; Hyperspectral image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20192—Edge enhancement; Edge preservation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/10—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The application discloses a method and a system for panchromatic sharpening of hyperspectral images of the Qimingxing-1 satellite, comprising the following steps: downsampling the panchromatic data with the Indusion method to obtain a low-resolution panchromatic image; reconstructing a low-resolution fusion image from the original hyperspectral image and the downsampled panchromatic image with the GIHS-TV method; upsampling the fusion image to obtain a high-resolution hyperspectral image; finally, reconstructing this image and the original panchromatic image with the GIHS-TV method to obtain a high-resolution hyperspectral fusion image rich in spatial detail; and evaluating the visual effect and objective image indices of the fusion result. The application addresses the spectral fidelity problem of panchromatic sharpening while preserving rich spatial detail, thereby broadening the application scenarios of Qimingxing-1 hyperspectral data.
Description
Technical Field
The application relates to an optimized panchromatic sharpening method, in particular to a total-variation-based panchromatic sharpening method for Qimingxing-1 hyperspectral images under a multi-resolution analysis framework, and belongs to the fields of satellite image processing and computer vision.
Background
Qimingxing-1 is a hyperspectral and night-light multimode, in-orbit programmable micro-nano remote sensing satellite. Its hyperspectral data contain the spectral information of ground features in the visible and near-infrared range, but their low spatial resolution makes it difficult to meet remote sensing tasks that require high spatial resolution. A panchromatic image contains more spatial information than a hyperspectral image and can serve as an effective data source for supplementing the spatial detail of the hyperspectral image. Panchromatic sharpening refers to generating a hyperspectral image with high spatial resolution and little spectral distortion from a hyperspectral image and a panchromatic image of the same region.
Multi-resolution analysis is an important class of panchromatic sharpening methods; it has strong spectral fidelity and can maintain multi-scale spatial characteristics. Its main problems are: 1) spatial fidelity is limited, as edge information is generally not well enhanced; 2) compared with component-substitution methods, it tolerates misregistration poorly and is computationally time-consuming.
Disclosure of Invention
To address the poor spatial fidelity of multi-resolution analysis panchromatic sharpening methods and thereby enrich the application scenarios of the data, the application provides a panchromatic sharpening method that solves for the fusion image in an optimization manner under a multi-resolution analysis framework, based on the strong constraint capability of the L1-norm total variation and making full use of the framework's ability to enhance information at different scales.
The application provides a panchromatic sharpening method that performs multi-scale fusion by optimization under a multi-resolution analysis framework, based on the strong constraint that the L1-norm total variation imposes on gray-level and edge information and on the ability of the Indusion method to preserve edges and spatial positions during up- and downsampling, aiming to improve the poor spatial fidelity of multi-resolution analysis methods.
Multi-resolution analysis panchromatic sharpening of remote sensing data is prone to spatial distortion. The application innovatively addresses this problem from an optimization viewpoint, introducing the L1-norm total variation to constrain the spatial information of the fusion image. In the multi-scale fusion process, the GIHS-TV method is used to reconstruct the fusion image at each scale; the result is then upsampled for fusion reconstruction at the next higher resolution, until the multi-scale fusion reconstruction is complete. At any scale, the objective function of the GIHS-TV method is constructed from the following two considerations: (1) changing the brightness distribution of the spatial component introduces spectral distortion, so a norm constraint is placed directly between the spatial component of the fusion image and that of the original hyperspectral image to preserve the brightness distribution; (2) the new spatial component needs to draw abundant spatial detail from the panchromatic image to supplement the fusion image, and gradients characterize spatial details such as edges, so a norm constraint is placed between the gradient of the new spatial component and that of the panchromatic image. Together, these constraints are expected to yield high spatial and high spectral fidelity at the same time.
In summary, the application provides a panchromatic sharpening method for Qimingxing-1 satellite images based on the L1-norm total variation under a multi-resolution analysis framework, combining the strong constraint capability of the L1-norm total variation with the good preservation of edge information and spatial positioning by the Indusion method during up- and downsampling. It is used for Qimingxing-1 data enhancement and aims to fully exploit the spectral information of the hyperspectral image and the spatial detail of the panchromatic image to generate a hyperspectral image of high spatial resolution, improving scene visual perception and supporting subsequent remote sensing tasks such as semantic segmentation, target recognition, and classification.
According to a first aspect of an embodiment of the present application, there is provided a method for panchromatic sharpening of Qimingxing-1 hyperspectral images, including: downsampling the panchromatic image $PAN$ to obtain a low-resolution panchromatic image $PAN_1$; reconstructing a low-resolution fusion image $F_1$ from the original hyperspectral image $HS$ and the downsampled panchromatic image $PAN_1$ using the GIHS-TV method; upsampling the low-resolution fusion image $F_1$ to obtain a high-resolution hyperspectral image $HS_2$; and reconstructing a high-resolution hyperspectral fusion image $F_2$ from the hyperspectral image $HS_2$ and the panchromatic image $PAN$ using the GIHS-TV method.
According to a second aspect of an embodiment of the present application, there is provided a system for panchromatic sharpening of Qimingxing-1 hyperspectral images, comprising: a downsampling module configured to downsample the panchromatic image $PAN$ into a low-resolution panchromatic image $PAN_1$; a first fusion module configured to reconstruct a low-resolution fusion image $F_1$ from the original hyperspectral image $HS$ and the downsampled panchromatic image $PAN_1$ using the GIHS-TV method; an upsampling module configured to upsample the low-resolution fusion image $F_1$ into a high-resolution hyperspectral image $HS_2$; and a second fusion module configured to reconstruct a high-resolution hyperspectral fusion image $F_2$ from the hyperspectral image $HS_2$ and the panchromatic image $PAN$ using the GIHS-TV method.
According to a third aspect of an embodiment of the present application, there is provided a computer including: a processor; and a memory containing one or more computer program modules, wherein the one or more computer program modules are stored in the memory and configured to be executed by the processor, the modules comprising instructions for implementing the method for panchromatic sharpening of Qimingxing-1 hyperspectral images.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium storing non-transitory computer-readable instructions that, when executed by a computer, implement the method for panchromatic sharpening of Qimingxing-1 hyperspectral images.
Drawings
FIG. 1 is a flowchart of the total-variation-based panchromatic sharpening method for Qimingxing-1 hyperspectral images under a multi-resolution analysis framework according to an embodiment of the present application.
FIG. 2 shows a hyperspectral image, a panchromatic image, and their fusion image according to an embodiment of the present application.
Detailed Description
Qimingxing-1 hyperspectral data reflect the reflectance spectra of ground features in the visible and near-infrared bands, but their spatial resolution is insufficient for remote sensing tasks requiring high spatial resolution, such as fine classification and geometric feature extraction. Using a panchromatic image with abundant detail to enhance their spatial information (panchromatic sharpening) yields a fusion image rich in both spatial and spectral information and expands the application fields of Qimingxing-1 data. FIG. 1 shows a flowchart of the total-variation-based panchromatic sharpening method for Qimingxing-1 hyperspectral images under a multi-resolution analysis framework. FIG. 2 shows a hyperspectral image, a panchromatic image, and their fusion image. The method of FIG. 1 is described below in four steps.
Step 1: downsample the panchromatic image $PAN$ with the Indusion method to obtain a low-resolution panchromatic image $PAN_1$.
Specifically, downsampling uses the 9-tap CDF low-pass analysis filter of the Indusion method, whose coefficient vector may be $[0.026749, -0.016864, -0.078223, 0.266864, 0.602949, 0.266864, -0.078223, -0.016864, 0.026749]^{T}$.
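As an illustrative sketch (not part of the patent text), the filter-and-decimate downsampling of Step 1 can be written in Python; the function names, the separable application of the listed 9 coefficients, and the symmetric border extension are assumptions:

```python
import numpy as np

# Analysis low-pass filter coefficients as listed in the text (CDF family).
CDF97_LO = np.array([0.026749, -0.016864, -0.078223, 0.266864,
                     0.602949, 0.266864, -0.078223, -0.016864, 0.026749])

def downsample2(img, h=CDF97_LO):
    """Separable low-pass filtering followed by dyadic decimation.

    `img` is a 2-D panchromatic image; the filter is applied along both
    axes with symmetric border extension, then every second sample is
    kept, halving each spatial dimension.
    """
    def filt_rows(a):
        pad = len(h) // 2
        ap = np.pad(a, ((0, 0), (pad, pad)), mode="symmetric")
        out = np.empty_like(a, dtype=float)
        for i in range(a.shape[1]):
            out[:, i] = ap[:, i:i + len(h)] @ h
        return out

    low = filt_rows(filt_rows(img.astype(float)).T).T
    return low[::2, ::2]
```

Since the listed coefficients sum to approximately 1, a constant image passes through the filter essentially unchanged before decimation.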
Step 2: reconstruct the low-resolution fusion image $F_1$ from the original hyperspectral image $HS$ and the downsampled panchromatic image $PAN_1$ using the GIHS-TV method.
Step 2.1: construct the objective function of the GIHS-TV method at the low-resolution scale:

$$\min_{I_1'}\ \|I_1' - I_1\|_1 + \lambda\,\|\nabla I_1' - \nabla PAN_1\|_1 \qquad (1)$$
where $I_1$ and $I_1'$ are the spatial (intensity) components of the original hyperspectral image and of the fusion image, respectively. The objective function keeps their gray-level distributions as consistent as possible, so this term serves as the fidelity term. For $I_1$, the spatial component under the IHS transform can be chosen, computed as in (2):

$$I_1 = \frac{1}{N}\sum_{k=1}^{N} HS_k \qquad (2)$$

where $k$ indexes the bands, $N$ is the total number of bands, and $HS_k$ is the $k$-th band of the original hyperspectral image.
The new spatial component $I_1'$ should be consistent with the panchromatic image: the spatial component of the fusion image is expected to transfer as much gradient information as possible from the panchromatic image, so this term serves as the regularization term.
Step 2.2: solve (1) with the IRN-TV iterative method. By the variable substitution $\mathrm{Diff} = I_1' - PAN_1$, the problem becomes an L1-norm total-variation minimization in $\mathrm{Diff}$:

$$\min_{\mathrm{Diff}}\ \|\mathrm{Diff} - (I_1 - PAN_1)\|_1 + \lambda\,\mathrm{TV}(\mathrm{Diff}) \qquad (3)$$

which is solved for $\mathrm{Diff}$; the solution procedure is shown in Table 1.
Table 1. Solution procedure of the GIHS-TV fusion algorithm.
The new spatial component is then recovered by (4):

$$I_1' = \mathrm{Diff} + PAN_1 \qquad (4)$$
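The IRN-TV procedure of Table 1 is not reproduced in this text, so the following Python sketch is a hedged stand-in rather than the patented algorithm: it solves the generic problem $\min_u \|u-b\|_1 + \lambda\,\mathrm{TV}(u)$ by iteratively reweighted norms (a quadratic majorization with a conjugate-gradient inner solve), where $b = I_1 - PAN_1$ would play the role of the data term. All names, the $\varepsilon$ smoothing, the cold-start with unit weights, and the iteration counts are assumptions:

```python
import numpy as np

def grad(u):
    # Forward differences, Neumann boundary (last difference set to zero).
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(gx, gy):
    # Negative adjoint of grad, so that <grad u, p> = -<u, div p>.
    d = np.zeros_like(gx)
    d[0, :] += gx[0, :]
    d[1:-1, :] += gx[1:-1, :] - gx[:-2, :]
    d[-1, :] -= gx[-2, :]
    d[:, 0] += gy[:, 0]
    d[:, 1:-1] += gy[:, 1:-1] - gy[:, :-2]
    d[:, -1] -= gy[:, -2]
    return d

def irn_tv(b, lam=0.5, iters=10, eps=1e-3, cg_iters=60):
    """Sketch of IRN for min_u ||u - b||_1 + lam * TV(u)."""
    u = b.astype(float).copy()
    for it in range(iters):
        if it == 0:
            # Cold start with unit weights, i.e. a plain ridge smoothing.
            wf = np.ones_like(u)
            wr = np.ones_like(u)
        else:
            wf = 1.0 / np.maximum(np.abs(u - b), eps)     # fidelity weights
            gx, gy = grad(u)
            wr = 1.0 / np.maximum(np.hypot(gx, gy), eps)  # TV weights
        def A(x):
            gx_, gy_ = grad(x)
            return wf * x - lam * div(wr * gx_, wr * gy_)
        # Conjugate gradients on the SPD system (Wf + lam*D^T Wr D) u = Wf b.
        x = u.copy()
        r = wf * b - A(x)
        p = r.copy()
        rs = (r * r).sum()
        for _ in range(cg_iters):
            Ap = A(p)
            alpha = rs / max((p * Ap).sum(), 1e-30)
            x += alpha * p
            r -= alpha * Ap
            rs_new = (r * r).sum()
            if rs_new < 1e-12:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        u = x
    return u
```

Each reweighting step majorizes the L1 terms by weighted quadratics, so the total-variation of the iterate decreases relative to a noisy input.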
Step 2.3: perform detail injection on each band of the original hyperspectral image with the GIHS-TV method to obtain the fusion image, completing the low-resolution fusion. First, compute the residual image:

$$\delta_1 = I_1' - I_1 \qquad (5)$$

Then inject the detail into the hyperspectral image:

$$F_{1,k} = HS_k + g_k\,\delta_1 \qquad (6)$$

where $F_{1,k}$ and $HS_k$ denote the $k$-th band of the fusion image and of the original hyperspectral image, respectively, and the gain $g_k$ is set to 1 for every $k$.
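Equations (2), (5), and (6) amount to a band-wise detail injection, which can be sketched as follows; the function name and the array layout (bands first) are assumptions:

```python
import numpy as np

def gihs_detail_injection(hs, i_new, gains=None):
    """Band-wise GIHS detail injection for a cube `hs` of shape (N, H, W).

    The intensity is the band average (Eq. 2), the residual is the new
    spatial component minus that intensity (Eq. 5), and each band gets
    the residual scaled by its gain g_k, here 1 by default (Eq. 6).
    """
    hs = hs.astype(float)
    intensity = hs.mean(axis=0)           # I = (1/N) * sum_k HS_k
    delta = i_new - intensity             # residual image
    if gains is None:
        gains = np.ones(hs.shape[0])
    return hs + gains[:, None, None] * delta
```

Note that with unit gains the intensity of the fused cube equals the new spatial component exactly, which is the defining property of the GIHS scheme.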
Step 3: upsample the low-resolution fusion image $F_1$ with the Indusion method to obtain the high-resolution hyperspectral image $HS_2$.
Specifically, upsampling uses the 7-tap CDF low-pass synthesis filter of the Indusion method, whose coefficient vector may be set to $[-0.091271, -0.057543, 0.591271, 1.115085, 0.591271, -0.057543, -0.091271]^{T}$.
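A matching sketch of the zero-insertion upsampling with the listed 7-tap filter (again illustrative, not the patent's exact procedure); the periodic border extension is an assumption chosen so that constant images are preserved, since both polyphase branches of the filter sum to approximately 1:

```python
import numpy as np

# Synthesis low-pass filter coefficients as listed in the text.
CDF7_LO = np.array([-0.091271, -0.057543, 0.591271, 1.115085,
                    0.591271, -0.057543, -0.091271])

def upsample2(img, h=CDF7_LO):
    """Dyadic zero-insertion followed by separable low-pass filtering."""
    def up_rows(a):
        z = np.zeros((a.shape[0], 2 * a.shape[1]))
        z[:, ::2] = a                       # insert zeros between samples
        pad = len(h) // 2
        zp = np.pad(z, ((0, 0), (pad, pad)), mode="wrap")
        out = np.empty_like(z)
        for i in range(z.shape[1]):
            out[:, i] = zp[:, i:i + len(h)] @ h
        return out

    return up_rows(up_rows(img.astype(float)).T).T
```

Applying `upsample2` band by band to the cube $F_1$ would produce the high-resolution cube $HS_2$ of Step 3.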
Step 4: reconstruct the high-resolution hyperspectral fusion image $F_2$ from the hyperspectral image $HS_2$ and the panchromatic image $PAN$ using the GIHS-TV method.
Step 4.1: solve the objective function of the GIHS-TV method at the high-resolution scale to obtain the new spatial component. The objective function at this scale is expressed as:

$$\min_{I_2'}\ \|I_2' - I_2\|_1 + \lambda\,\|\nabla I_2' - \nabla PAN\|_1 \qquad (7)$$

where $I_2$ and $I_2'$ are the spatial components of the hyperspectral image $HS_2$ and of the hyperspectral fusion image $F_2$, respectively. The objective function keeps their gray-level distributions consistent, so this term is the fidelity term; $I_2$ is computed as the spatial component under the IHS transform:

$$I_2 = \frac{1}{N}\sum_{k=1}^{N} HS_{2,k} \qquad (8)$$

where $N$ is the total number of bands and $HS_{2,k}$ is the $k$-th band of the hyperspectral image $HS_2$.
The gradient of the spatial component $I_2'$ is kept consistent with the gradient of the panchromatic image $PAN$; this term serves as the regularization term.
Solve with the IRN-TV iterative method: by the variable substitution $\mathrm{Diff} = I_2' - PAN$, the problem becomes an L1-norm total-variation minimization in $\mathrm{Diff}$, which is solved for $\mathrm{Diff}$.
The spatial component is then recovered via $I_2' = \mathrm{Diff} + PAN$.
Step 4.2: similarly, apply the detail-injection form of GIHS-TV at the high-resolution scale to solve for the final fusion image $F_2$. Compute the residual image $\delta_2 = I_2' - I_2 = PAN + \mathrm{Diff} - I_2$.
Based on the residual image $\delta_2$, perform detail injection on each band of the hyperspectral image $HS_2$ with the GIHS-TV method to obtain the hyperspectral fusion image $F_2$:

$$F_{2,k} = HS_{2,k} + g_k\,\delta_2 \qquad (9)$$

where $k$ indexes the bands and the gain $g_k$ is set to 1 for every $k$.
The application also provides an embodiment of a panchromatic sharpening system for Qimingxing-1 hyperspectral images, comprising a downsampling module, a first fusion module, an upsampling module, and a second fusion module. The downsampling module is configured to downsample the panchromatic image $PAN$ into a low-resolution panchromatic image $PAN_1$. The first fusion module is configured to reconstruct a low-resolution fusion image $F_1$ from the original hyperspectral image $HS$ and $PAN_1$ using the GIHS-TV method. The upsampling module is configured to upsample $F_1$ into a high-resolution hyperspectral image $HS_2$. The second fusion module is configured to reconstruct a high-resolution hyperspectral fusion image $F_2$ from $HS_2$ and $PAN$ using the GIHS-TV method. For the detailed implementation of each module, refer to the total-variation-based panchromatic sharpening method under the multi-resolution analysis framework described above.
The application also provides an embodiment of the computer. The computer includes a processor and a memory. The memory is used to store non-transitory computer-readable instructions (e.g., one or more computer program modules). The processor is configured to execute those instructions which, when executed, may perform one or more steps of the method for panchromatic sharpening of Qimingxing-1 hyperspectral images described above. The memory and the processor may be interconnected by a bus system and/or other forms of connection mechanisms.
For example, the processor may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or other form of processing unit having data processing capabilities and/or program execution capabilities. For example, the Central Processing Unit (CPU) may be an X86 or ARM architecture, or the like. The processor may be a general purpose processor or a special purpose processor, and may control other components in the computer to perform the desired functions.
For example, the memory may comprise any combination of one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory can include, for example, random-access memory (RAM) and/or cache memory. Non-volatile memory may include, for example, read-only memory (ROM), a hard disk, erasable programmable read-only memory (EPROM), compact disc read-only memory (CD-ROM), USB memory, flash memory, and the like. One or more computer program modules may be stored on the computer-readable storage medium and executed by the processor to perform various functions of the computer.
The present application also provides an embodiment of a computer-readable storage medium storing non-transitory computer-readable instructions that, when executed by a computer, can implement one or more steps of the method for panchromatic sharpening of Qimingxing-1 hyperspectral images described above. The method and system provided by the embodiments of the application, when implemented in software and sold or used as an independent product, can be stored in such a computer-readable storage medium. For a description of the storage medium, refer to the corresponding description of the memory in the computer embodiment, which is not repeated here.
The fusion results are evaluated visually in terms of global and local tone, local spatial detail, and the saliency of small targets. Objective evaluation computes the spectral angle, spatial correlation coefficient, spectral distortion index, spatial distortion index, and no-reference index of the fusion image. The following formulas are standard, so the individual symbols are not annotated in detail.
1) Spectral Angle (SAM): SAM reflects the spectral fidelity between the fusion image and the reference hyperspectral image by computing the spectral angle of corresponding pixels. Let $v_{F,i}$ and $v_{H,i}$ be the vectors of band gray values at position $i$ in the fusion image and the hyperspectral image, respectively:

$$\mathrm{SAM}(v_{F,i}, v_{H,i}) = \arccos\!\left(\frac{\langle v_{F,i}, v_{H,i}\rangle}{\|v_{F,i}\|_2\,\|v_{H,i}\|_2}\right) \qquad (10)$$

The average of the spectral angles over all pixels is taken as the spectral angle of the two images. The smaller the SAM value, the higher the spectral fidelity.
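A minimal implementation of the per-pixel spectral angle of Eq. (10), averaged over all pixels; the cube layout (bands, height, width) and the guard against zero-norm pixels are assumptions:

```python
import numpy as np

def sam(fused, ref):
    """Mean spectral angle in radians between two (N, H, W) cubes (Eq. 10)."""
    f = fused.reshape(fused.shape[0], -1).astype(float)
    r = ref.reshape(ref.shape[0], -1).astype(float)
    dot = (f * r).sum(axis=0)
    norms = np.linalg.norm(f, axis=0) * np.linalg.norm(r, axis=0)
    cos = np.clip(dot / np.maximum(norms, 1e-12), -1.0, 1.0)
    return float(np.mean(np.arccos(cos)))
```

Because the angle depends only on spectral direction, SAM is invariant to a global brightness scaling of either cube.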
2) Spatial Correlation Coefficient (SCC): a high-pass filter extracts the high-frequency information of the panchromatic image and of the fusion image, and the correlation coefficient measures the correlation of that high-frequency information:

$$\mathrm{SCC} = \frac{1}{N}\sum_{k=1}^{N} \mathrm{CC}\!\left(\tilde F_k,\, \widetilde{PAN}\right) \qquad (11)$$

where $\tilde F_k$ is the high-frequency information of the $k$-th band of the fusion image and $\widetilde{PAN}$ is the high-frequency information of the panchromatic image. The larger the SCC, the better the spatial correlation between the fusion image and the panchromatic image.
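A sketch of Eq. (11); the text does not fix the high-pass filter, so a 3x3 Laplacian with edge-replicating borders is assumed here, and the band-wise correlations are averaged:

```python
import numpy as np

LAPLACIAN = np.array([[-1., -1., -1.],
                      [-1.,  8., -1.],
                      [-1., -1., -1.]])

def highpass(img):
    # 3x3 Laplacian filtering with edge-replicating borders.
    p = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for di in range(3):
        for dj in range(3):
            out += LAPLACIAN[di, dj] * p[di:di + img.shape[0],
                                         dj:dj + img.shape[1]]
    return out

def scc(fused, pan):
    """Mean correlation between high-pass details of each band and the pan."""
    ph = highpass(pan).ravel()
    coeffs = [np.corrcoef(highpass(band).ravel(), ph)[0, 1]
              for band in fused]
    return float(np.mean(coeffs))
```

Because the high-pass filter removes constant offsets and the correlation removes scaling, bands that share the pan image's detail score 1 regardless of brightness.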
3) Spectral distortion index $D_\lambda$: $D_\lambda$ characterizes the spectral loss between the fusion image and the hyperspectral image, defined as:

$$D_\lambda = \left(\frac{1}{N(N-1)}\sum_{k=1}^{N}\ \sum_{\substack{r=1\\ r\neq k}}^{N} \left|Q(F_k, F_r) - Q(HS_k, HS_r)\right|^{p}\right)^{1/p} \qquad (12)$$

where $p$ amplifies the spectral differences and is typically set to 1. The smaller $D_\lambda$, the smaller the spectral distortion.
Here $Q$ is the Universal Image Quality Index (UIQI), which characterizes the overall fusion performance by computing the correlation, luminance similarity, and contrast similarity between the fusion image and the reference image. Commonly referred to as the Q index, it is defined as:

$$Q(x, y) = \frac{\sigma_{xy}}{\sigma_x \sigma_y}\cdot\frac{2\bar{x}\bar{y}}{\bar{x}^2+\bar{y}^2}\cdot\frac{2\sigma_x \sigma_y}{\sigma_x^2+\sigma_y^2} \qquad (13)$$

where $\sigma_x$ and $\sigma_y$ are the standard deviations of the fusion image and the reference image, $\sigma_{xy}$ is their covariance, and $\bar{x}$ and $\bar{y}$ are their respective means. The three factors represent the correlation, the mean-luminance similarity, and the contrast similarity, respectively. The higher the Q value, the more similar the fusion image and the reference image.
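The Q index of Eq. (13) in its global (whole-image) form; in practice UIQI is usually computed over sliding windows (e.g. 8x8) and averaged, which is omitted here for brevity, and the small stabilizing constant is an assumption:

```python
import numpy as np

def uiqi(x, y):
    """Universal Image Quality Index of Eq. (13), whole-image version."""
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    # correlation * luminance similarity * contrast similarity, combined
    # into the standard single-fraction form (small term avoids 0/0).
    return 4 * cov * mx * my / ((vx + vy) * (mx**2 + my**2) + 1e-12)
```

An image compared with itself scores 1, and uncorrelated images score near 0.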
4) Spatial distortion index $D_s$: $D_s$ computes the loss of spatial detail between the fusion image and the panchromatic image:

$$D_s = \left(\frac{1}{N}\sum_{k=1}^{N}\left|Q(F_k, PAN) - Q(HS_k, PAN_1)\right|^{q}\right)^{1/q} \qquad (14)$$

where $PAN_1$ is the panchromatic image downsampled to the resolution of the original hyperspectral image. In practice it must be ensured that the upsampled hyperspectral image is precisely registered with the panchromatic image; otherwise the quality index is meaningless. $q$ amplifies the spatial-detail distortion differences. The smaller $D_s$, the smaller the spatial-detail distortion.
5) No-reference index (QNR): QNR is the principal no-reference index for evaluating panchromatic sharpening performance; it jointly characterizes spectral and spatial distortion and is defined as:

$$\mathrm{QNR} = (1-D_\lambda)^{\alpha}\,(1-D_s)^{\beta} \qquad (15)$$

where $\alpha$ and $\beta$ are parameters balancing the spectral distortion and the spatial distortion. The larger the QNR, the better the fusion performance; its maximum value is 1.
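The three no-reference quantities of Eqs. (12), (14), and (15) can be combined as sketched below; the window-less Q helper and equal-shape inputs are simplifying assumptions (in real use the hyperspectral cube sits at the lower resolution and Q is window-based):

```python
import numpy as np
from itertools import combinations

def _q(x, y):
    # Global Universal Image Quality Index (window-less simplification).
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return 4 * cov * mx * my / ((x.var() + y.var()) * (mx**2 + my**2) + 1e-12)

def d_lambda(fused, hs, p=1):
    """Spectral distortion: inter-band Q differences (Eq. 12)."""
    n = fused.shape[0]
    acc = 0.0
    for k, r in combinations(range(n), 2):
        acc += 2 * abs(_q(fused[k], fused[r]) - _q(hs[k], hs[r])) ** p
    return (acc / (n * (n - 1))) ** (1.0 / p)

def d_s(fused, pan, hs, pan_low, q=1):
    """Spatial distortion: band-to-pan Q differences (Eq. 14)."""
    n = fused.shape[0]
    acc = sum(abs(_q(fused[k], pan) - _q(hs[k], pan_low)) ** q
              for k in range(n))
    return (acc / n) ** (1.0 / q)

def qnr(fused, pan, hs, pan_low, alpha=1.0, beta=1.0):
    """QNR = (1 - D_lambda)^alpha * (1 - D_s)^beta (Eq. 15)."""
    return ((1 - d_lambda(fused, hs)) ** alpha *
            (1 - d_s(fused, pan, hs, pan_low)) ** beta)
```

When the fused cube reproduces the inter-band and band-to-pan relations of the inputs exactly, both distortions vanish and QNR reaches its maximum of 1.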
Claims (8)
1. A method for panchromatic sharpening of hyperspectral images of the Qimingxing-1 satellite, characterized by comprising the following steps:
downsampling the panchromatic image $PAN$ to obtain a low-resolution panchromatic image $PAN_1$;
reconstructing a low-resolution fusion image $F_1$ from the original hyperspectral image $HS$ and the downsampled panchromatic image $PAN_1$ using the GIHS-TV method;
upsampling the low-resolution fusion image $F_1$ to obtain a high-resolution hyperspectral image $HS_2$; and
reconstructing a high-resolution hyperspectral fusion image $F_2$ from the hyperspectral image $HS_2$ and the panchromatic image $PAN$ using the GIHS-TV method.
2. The method for panchromatic sharpening of Qimingxing-1 hyperspectral images of claim 1, wherein
the objective function of the GIHS-TV method at the low-resolution scale is expressed as:

$$\min_{I_1'}\ \|I_1' - I_1\|_1 + \lambda\,\|\nabla I_1' - \nabla PAN_1\|_1 \qquad (1)$$

where $I_1$ and $I_1'$ are the spatial components of the original hyperspectral image $HS$ and of the fusion image $F_1$, respectively; the objective function keeps their gray-level distributions consistent, so this term serves as the fidelity term, and $I_1$ is computed as the spatial component under the IHS transform:

$$I_1 = \frac{1}{N}\sum_{k=1}^{N} HS_k \qquad (2)$$

where $N$ is the total number of bands and $HS_k$ is the $k$-th band of the original hyperspectral image;
the gradient of the new spatial component $I_1'$ is kept consistent with the gradient of the panchromatic image $PAN_1$, serving as the regularization term;
the IRN-TV iterative method is used to solve for $I_1'$: by the variable substitution $\mathrm{Diff} = I_1' - PAN_1$, the problem becomes an L1-norm total-variation minimization in $\mathrm{Diff}$:

$$\min_{\mathrm{Diff}}\ \|\mathrm{Diff} - (I_1 - PAN_1)\|_1 + \lambda\,\mathrm{TV}(\mathrm{Diff}) \qquad (3)$$

which is solved for $\mathrm{Diff}$, and the new spatial component is recovered by (4):

$$I_1' = \mathrm{Diff} + PAN_1 \qquad (4)$$

the residual image is computed:

$$\delta_1 = I_1' - I_1 \qquad (5)$$

and, based on the residual image $\delta_1$, detail injection is performed on each band of the original hyperspectral image $HS$ with the GIHS-TV method to obtain the low-resolution fusion image $F_1$:

$$F_{1,k} = HS_k + g_k\,\delta_1 \qquad (6)$$

where $F_{1,k}$ and $HS_k$ denote the $k$-th band of the fusion image $F_1$ and of the original hyperspectral image $HS$, respectively, and the gain $g_k$ is set to 1 for every $k$.
3. The method for panchromatic sharpening of Qimingxing-1 hyperspectral images of claim 2, wherein
the objective function of the GIHS-TV method at the high-resolution scale is expressed as:

$$\min_{I_2'}\ \|I_2' - I_2\|_1 + \lambda\,\|\nabla I_2' - \nabla PAN\|_1 \qquad (7)$$

where $I_2$ and $I_2'$ are the spatial components of the hyperspectral image $HS_2$ and of the hyperspectral fusion image $F_2$, respectively; the objective function keeps their gray-level distributions consistent, so this term serves as the fidelity term, and $I_2$ is computed as the spatial component under the IHS transform:

$$I_2 = \frac{1}{N}\sum_{k=1}^{N} HS_{2,k} \qquad (8)$$

where $N$ is the total number of bands and $HS_{2,k}$ is the $k$-th band of the hyperspectral image $HS_2$;
the gradient of the spatial component $I_2'$ is kept consistent with the gradient of the panchromatic image $PAN$, serving as the regularization term;
the IRN-TV iterative method is used to solve for $I_2'$: by the variable substitution $\mathrm{Diff} = I_2' - PAN$, the problem becomes an L1-norm total-variation minimization in $\mathrm{Diff}$, which is solved for $\mathrm{Diff}$;
the spatial component is recovered via $I_2' = \mathrm{Diff} + PAN$;
the residual image is computed as $\delta_2 = I_2' - I_2 = PAN + \mathrm{Diff} - I_2$;
and, based on the residual image $\delta_2$, detail injection is performed on each band of the hyperspectral image $HS_2$ with the GIHS-TV method to obtain the hyperspectral fusion image $F_2$:

$$F_{2,k} = HS_{2,k} + g_k\,\delta_2 \qquad (9)$$

where $k$ indexes the bands and the gain $g_k$ is set to 1 for every $k$.
4. A system for panchromatic sharpening of hyperspectral images of the Qimingxing-1 satellite, comprising:
a downsampling module configured to downsample the panchromatic image $PAN$ into a low-resolution panchromatic image $PAN_1$;
a first fusion module configured to reconstruct a low-resolution fusion image $F_1$ from the original hyperspectral image $HS$ and the downsampled panchromatic image $PAN_1$ using the GIHS-TV method;
an upsampling module configured to upsample the low-resolution fusion image $F_1$ into a high-resolution hyperspectral image $HS_2$; and
a second fusion module configured to reconstruct a high-resolution hyperspectral fusion image $F_2$ from the hyperspectral image $HS_2$ and the panchromatic image $PAN$ using the GIHS-TV method.
5. The system for full-color sharpening of a satellite hyperspectral image of Starter, first satellite of claim 4,
the objective function of the GIHS-TV method at the low resolution scale is expressed as:
(10)
wherein HS and F denote the original hyperspectral image and the low-resolution fusion image respectively; the objective function takes the consistency of their gray-level distributions as the fidelity term, and the spatial component I under the IHS transform is selected for the computation:
I = (1/N) · Σ_{k=1}^{N} HS_k    (11)
wherein N denotes the total number of bands and HS_k denotes the original hyperspectral image of the k-th band;
the gradient of the new spatial component I_new is kept consistent with the gradient of the downsampled full-color image PAN_1/2, and this consistency serves as the regularization term;
the IRNTV iterative method is used to solve for I_new: through the variable substitution Diff = I_new − PAN_1/2, the problem is changed into an L1-norm total variation minimization over the variable Diff, and Diff is solved for:
min_{Diff} ||Diff + PAN_1/2 − I||_2^2 + λ · ||∇Diff||_1    (12)
the new spatial component I_new is then solved by equation (13):
I_new = Diff + PAN_1/2    (13)
the residual image δ is computed:
δ = I_new − I = PAN_1/2 + Diff − I    (14)
based on the residual image δ, the GIHS-TV method performs detail injection on each band of the original hyperspectral image HS to obtain the low-resolution fusion image F:
F_k = HS_k + g_k · δ    (15)
wherein F_k and HS_k denote the k-th band of the fusion image F and of the original hyperspectral image HS respectively, and the injection gain g_k takes the value 1 for every k.
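As a worked illustration of Eqs. (11)-(15), and not the patent's actual IRNTV implementation: the sketch below replaces the L1-norm total variation of Eq. (12) with an L2 (Tikhonov) gradient penalty so that plain gradient descent suffices; all names and parameter values are assumptions.

```python
import numpy as np

def solve_diff(i, pan, lam=0.5, n_iter=300, tau=0.1):
    """Stand-in for the Eq. (12) solve. Assumption: the L1-norm TV
    term is swapped for an L2 gradient penalty, i.e. we minimize
    ||Diff + PAN - I||^2 + lam * ||grad Diff||^2 by gradient descent
    (periodic boundaries via np.roll)."""
    diff = i - pan                      # warm start: I_new = I
    for _ in range(n_iter):
        # 5-point Laplacian of Diff (gradient of the smoothness term)
        lap = (np.roll(diff, 1, 0) + np.roll(diff, -1, 0)
               + np.roll(diff, 1, 1) + np.roll(diff, -1, 1) - 4.0 * diff)
        grad = 2.0 * (diff + pan - i) - 2.0 * lam * lap
        diff -= tau * grad
    return diff

def fuse_low_resolution(hs, pan_lo):
    """Eqs. (11)-(15) at the low-resolution scale, hs of shape (H, W, N)."""
    i = hs.mean(axis=2)                # Eq. (11): I = (1/N) sum_k HS_k
    diff = solve_diff(i, pan_lo)       # Eq. (12): TV-type solve (surrogate)
    i_new = diff + pan_lo              # Eq. (13)
    delta = i_new - i                  # Eq. (14): residual image
    return hs + delta[..., None]       # Eq. (15): g_k = 1 injection
```

When the panchromatic intensity already matches the IHS component, Diff converges toward the constant offset and the injected residual vanishes, so the hyperspectral cube passes through unchanged.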
6. The system for full-color sharpening of a hyperspectral image of the Qimingxing-1 satellite according to claim 5, wherein
the objective function of the GIHS-TV method at the high-resolution scale is expressed as:
min_{I_2,new} ||I_2,new − I_2||_2^2 + λ · ||∇I_2,new − ∇PAN||_1    (16)
wherein HS_2 and F_2 denote the high-resolution hyperspectral image and the hyperspectral fusion image respectively; the objective function takes the consistency of their gray-level distributions as the fidelity term, and the spatial component I_2 under the IHS transform is selected for the computation:
I_2 = (1/N) · Σ_{k=1}^{N} HS_2,k    (17)
wherein N denotes the total number of bands and HS_2,k denotes the k-th band of the hyperspectral image HS_2;
the gradient of the spatial component I_2,new is kept consistent with the gradient of the full-color image PAN, and this consistency serves as the regularization term;
the IRNTV iterative method is used to solve for I_2,new: through the variable substitution Diff = I_2,new − PAN, the problem is changed into an L1-norm total variation minimization over the variable Diff, and Diff is solved for;
the spatial component I_2,new is then obtained through I_2,new = Diff + PAN;
the residual image is computed as δ_2 = I_2,new − I_2 = PAN + Diff − I_2;
based on the residual image δ_2, the GIHS-TV method performs detail injection on each band of the hyperspectral image HS_2 to obtain the hyperspectral fusion image F_2:
F_2,k = HS_2,k + g_k · δ_2    (18)
wherein k denotes the band index, F_2,k denotes the k-th band of F_2, and the injection gain g_k takes the value 1 for every k.
7. A computer, comprising:
a processor;
a memory including one or more computer program modules;
wherein the one or more computer program modules are stored in the memory and configured to be executed by the processor, the one or more computer program modules comprising instructions for implementing the method of full-color sharpening of a hyperspectral image of the Qimingxing-1 satellite according to any one of claims 1-3.
8. A computer-readable storage medium storing non-transitory computer-readable instructions, wherein the non-transitory computer-readable instructions, when executed by a computer, implement the method of full-color sharpening of a hyperspectral image of the Qimingxing-1 satellite according to any one of claims 1-3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311334228.XA CN117078563B (en) | 2023-10-16 | 2023-10-16 | Full-color sharpening method and system for hyperspectral image of first satellite of staring star |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117078563A true CN117078563A (en) | 2023-11-17 |
CN117078563B CN117078563B (en) | 2024-02-02 |
Family
ID=88717589
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311334228.XA Active CN117078563B (en) | 2023-10-16 | 2023-10-16 | Full-color sharpening method and system for hyperspectral image of first satellite of staring star |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117078563B (en) |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090318815A1 (en) * | 2008-05-23 | 2009-12-24 | Michael Barnes | Systems and methods for hyperspectral medical imaging |
KR20120000732A (en) * | 2010-06-28 | 2012-01-04 | 서울대학교산학협력단 | An automatic segmentation method for object-based analysis using high resolution satellite imagery |
CN103617597A (en) * | 2013-10-25 | 2014-03-05 | 西安电子科技大学 | A remote sensing image fusion method based on difference image sparse representation |
CN104867124A (en) * | 2015-06-02 | 2015-08-26 | 西安电子科技大学 | Multispectral image and full-color image fusion method based on dual sparse non-negative matrix factorization |
CN105023261A (en) * | 2015-07-22 | 2015-11-04 | 太原理工大学 | Remote sensing image fusion method based on AGIHS and low-pass filter |
CN109859110A (en) * | 2018-11-19 | 2019-06-07 | 华南理工大学 | The panchromatic sharpening method of high spectrum image of control convolutional neural networks is tieed up based on spectrum |
CN110533620A (en) * | 2019-07-19 | 2019-12-03 | 西安电子科技大学 | The EO-1 hyperion and panchromatic image fusion method of space characteristics are extracted based on AAE |
CN111008936A (en) * | 2019-11-18 | 2020-04-14 | 华南理工大学 | Multispectral image panchromatic sharpening method |
CN112528914A (en) * | 2020-12-19 | 2021-03-19 | 东南数字经济发展研究院 | Satellite image full-color enhancement method for gradually integrating detail information |
CN112634137A (en) * | 2020-12-28 | 2021-04-09 | 西安电子科技大学 | Hyperspectral and full-color image fusion method based on AE extraction of multi-scale spatial spectrum features |
US20210319534A1 (en) * | 2020-04-08 | 2021-10-14 | Mitsubishi Electric Research Laboratories, Inc. | Systems and Methods for Blind Multi-Spectral Image Fusion |
CN113570536A (en) * | 2021-07-31 | 2021-10-29 | 中国人民解放军61646部队 | Panchromatic and multispectral image real-time fusion method based on CPU and GPU cooperative processing |
CN114663301A (en) * | 2022-03-05 | 2022-06-24 | 西北工业大学 | Convolutional neural network panchromatic sharpening method based on wavelet layer |
CN115345792A (en) * | 2022-08-10 | 2022-11-15 | 河南大学 | Panchromatic sharpening method based on U-shaped pyramid residual error structure |
Non-Patent Citations (2)
Title |
---|
MILOUD CHIKR EL-MEZOUAR et al.: "Edge Preservation in Ikonos Multispectral and Panchromatic Imagery Pan-sharpening", 1st Taibah University International Conference on Computing and Information Technology, pages 1 - 6 *
罗舒月 (LUO Shuyue): "Research on Pan-sharpening Algorithms for the Fusion of Panchromatic and Multispectral Remote Sensing Images", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II, no. 09, pages 028 - 1 *
Also Published As
Publication number | Publication date |
---|---|
CN117078563B (en) | 2024-02-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||