CN117197018A - Multiplication transformation remote sensing image fusion method for maintaining spectral distribution - Google Patents
- Publication number: CN117197018A (application number CN202311254447.7A)
- Authority
- CN
- China
- Prior art keywords
- image
- multispectral image
- multispectral
- full
- sampled
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/10—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
Landscapes
- Image Processing (AREA)
Abstract
The application discloses a multiplicative transformation remote sensing image fusion method that maintains the spectral distribution, comprising the following steps: upsampling the multispectral image to the size of the panchromatic image; computing the histogram statistics of the panchromatic and multispectral images and normalizing the panchromatic image and the up-sampled multispectral image respectively; performing multiple linear regression between the up-sampled multispectral image and the panchromatic image to obtain a fitted low-resolution panchromatic image; compensating and mapping the texture details and spectral intensities of the up-sampled multispectral image by means of multiplicative transformation and inverse normalization; and finally mapping the spectral distribution of the fused image to be consistent with the multispectral image by histogram matching, so as to obtain the final fused image. The method effectively reduces the detail distortion and spectral distortion produced in the fusion process, so that the fused image exhibits sharp object edge contours in its details, and pronounced color contrast and strong color layering in its spectrum.
Description
Technical Field
The application relates to the technical field of digital image processing, in particular to a multiplicative transformation remote sensing image fusion method for maintaining spectral distribution.
Background
Panchromatic and multispectral image fusion is an important technology in the remote sensing field, aimed at combining a high-resolution panchromatic image with a multispectral image to obtain a composite image carrying both spatial and spectral information. Remote sensing acquires information about the Earth's surface via sensors mounted on carriers such as satellites and aircraft, and is widely applied in environmental monitoring, resource management, urban planning, and other fields. Panchromatic images offer high spatial resolution but limited spectral information, whereas multispectral images contain multiple narrow spectral bands at comparatively low spatial resolution. Fusing the two types of images therefore exploits their respective advantages and provides richer, more accurate information.
Currently, mainstream panchromatic and multispectral image fusion techniques mostly rely on conventional digital image processing. The earliest remote sensing image fusion used pixel-level methods, i.e., simply adding or multiplying the gray information of the panchromatic image with the spectral information of the multispectral image. However, this approach ignores the complex relationship between spectral and spatial information, causing information loss in the fused image. To overcome these problems while preserving computational efficiency, researchers proposed transformation-based fusion methods. Existing additive-transformation methods mainly include principal component analysis, the Laplacian transform, and the wavelet transform; these retain spectral information well but lack target texture details, so artifacts appear in the images. Multiplicative-transformation methods mainly include the IHS transform, the Brovey transform, and the Gram-Schmidt orthogonal transform; these largely solve the loss of texture detail during fusion, but exhibit a certain degree of spectral distortion compared with additive methods.
Therefore, how to solve the problems of detail distortion and spectrum distortion existing in the transformation-based method and improve the fusion quality of remote sensing images under different ground object types is a technical problem that needs to be solved by those skilled in the art.
Disclosure of Invention
In view of this, the application provides a multiplicative transformation remote sensing image fusion method to solve at least some of the above technical problems. Normalization and inverse normalization of the distributions are added on top of the multiplicative transformation to adaptively adjust the distribution, solving the spectral distortion caused by inconsistent data distributions. A histogram matching step is further applied to the fused image, mapping its data distribution to be consistent with that of the multispectral image. This preserves the original spectral information of the fused image while ensuring that texture details are not lost, yields pronounced color contrast and strong color layering, and effectively improves the quality of the fused image.
In order to achieve the above purpose, the technical scheme adopted by the application is as follows:
the embodiment of the application provides a multiplicative transformation remote sensing image fusion method for maintaining spectral distribution, which comprises the following steps:
s1, upsampling a multispectral image to the spatial resolution of a full-color image;
s2, counting pixel histograms of the full-color image and the multispectral image, and respectively carrying out normalization operation on the full-color image and the up-sampled multispectral image;
s3, performing multiple linear regression solution on the up-sampled multispectral image and the panchromatic image to obtain a fitted low-resolution panchromatic image;
s4, performing compensation and mapping on texture details and spectrum intensity of the up-sampled multispectral image by utilizing multiplicative transformation and inverse normalization operation to obtain a primary fusion image;
and S5, mapping the spectrum distribution of the obtained primary fusion image to be consistent with the multispectral image by using a histogram matching method, and obtaining a final fusion image.
Further, the step S2 includes:
s21, calculating the mean value and the variance of each wave band of the full-color image and the multispectral image respectively by counting the pixel histograms of the full-color image and the multispectral image, wherein the calculation formula is as follows:
where pan represents the panchromatic image, mul represents the multispectral image, Mean represents the mean, Std represents the standard deviation, k represents the kth band of the multispectral image, W and H are the width and height of the panchromatic image, w and h are the width and height of the multispectral image, i = 1, ..., W×H, and the constraints W = 4×w and H = 4×h are satisfied;
s22, respectively normalizing data distribution of the full-color image and the up-sampled multispectral image according to the mean value and the variance obtained in the step S21, and mapping the full-color image and the multispectral image into the same data distribution through normalization operation; the normalization formula is:
wherein the first symbol in the formula represents the up-sampled multispectral image, and F represents the constraint that adjusts the data distribution.
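The formula images for S21-S22 are not reproduced in this text; under the assumption that F is ordinary per-band z-score normalization, the statistics and mapping would plausibly read:

```latex
\mathrm{Mean}_{pan}=\frac{1}{WH}\sum_{i=1}^{WH} pan_i ,\qquad
\mathrm{Std}_{pan}=\sqrt{\frac{1}{WH}\sum_{i=1}^{WH}\left(pan_i-\mathrm{Mean}_{pan}\right)^{2}} ,\qquad
F(x)=\frac{x-\mathrm{Mean}}{\mathrm{Std}}
```

with Mean and Std of each multispectral band k computed analogously over its w×h pixels, and F applied to the panchromatic image with its own statistics and to each band of the up-sampled multispectral image with that band's statistics. This is a hedged reconstruction, not necessarily the patent's exact notation.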
Further, the step S3 includes:
s31, performing multiple linear regression solution on the up-sampled multispectral image and the full-color image, and calculating and solving the weighting coefficient of each wave band by adopting a least square method, wherein the calculation formula is as follows:
wherein b_k represents the weighting coefficient required for the kth band of the up-sampled multispectral image, and K is the total number of bands of the multispectral image;
s32, after the weighting coefficient of each wave band of the up-sampled multispectral image is obtained through calculation, constructing a low-resolution panchromatic image through weighted summation, wherein the weighted summation formula is as follows:
wherein the left-hand term of the formula represents the fitted low-resolution panchromatic image.
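Written out, the least-squares fit of S31 and the weighted sum of S32 would plausibly be (a hedged reconstruction, since the original formula images are unavailable):

```latex
\{b_k\}=\arg\min_{b}\ \sum_{i=1}^{WH}\Bigl(pan_i-\sum_{k=1}^{K} b_k\,\widetilde{mul}_i^{\,k}\Bigr)^{2},\qquad
\widehat{pan}_i=\sum_{k=1}^{K} b_k\,\widetilde{mul}_i^{\,k}
```

where the tilde denotes the up-sampled multispectral image and the hat denotes the fitted low-resolution panchromatic image; both symbols are notation introduced here for readability.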
Further, the step S4 includes:
s41, calculating the texture details required to be supplemented for obtaining the up-sampled multispectral image by using the low-resolution panchromatic image, wherein the calculation formula is as follows:
wherein D_i represents the texture details;
s42, supplementing texture details into the up-sampled multispectral image through multiplicative transformation, and performing inverse normalization operation to map data distribution to the distribution range of the multispectral image, wherein the formula is as follows:
wherein the left-hand term of the formula represents the calculated preliminary fusion image.
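A plausible Brovey-style reading of S41-S42 (an assumption, since the patent's exact formula images are not reproduced here) is a per-pixel ratio detail followed by inverse normalization:

```latex
D_i=\frac{pan_i}{\widehat{pan}_i},\qquad
fused_i^{\,k}=\bigl(D_i\cdot\widetilde{mul}_i^{\,k}\bigr)\cdot\mathrm{Std}_{mul}^{k}+\mathrm{Mean}_{mul}^{k}
```

where the quantities inside the parentheses are in the normalized domain, and the multiplication by Std plus the addition of Mean constitutes the inverse normalization back to the multispectral image's distribution.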
Further, the step S5 specifically includes:
s51, respectively calculating cumulative histograms of the primary fusion image and the multispectral image, and setting a range of pixel values;
s52, sequentially calculating absolute values of differences between each value and each value of the multi-spectrum image histogram by using Euclidean distance in the histogram of the primary fusion image to obtain a difference value table between the two histograms;
s53, finding the minimum value in the difference value table, and establishing gray level mapping between the two histograms;
s54, mapping through gray level, and finally searching mapped pixel values by a dichotomy to obtain a final fusion image.
Further, in the step S51, the range of pixel values is set to 0-65535 (65536 gray levels, i.e. 16-bit data).
Compared with the prior art, the application has at least the following beneficial effects:
1. The application provides a multiplicative transformation remote sensing image fusion method that maintains the spectral distribution. Normalization and inverse normalization of the distributions are added on top of the multiplicative transformation to adaptively adjust the distribution, solving the spectral distortion caused by inconsistent data distributions; a histogram matching step is further applied to the fused image, mapping its data distribution to be consistent with that of the multispectral image. This preserves the original spectral information while ensuring that texture details are not lost, yields pronounced color contrast and strong color layering, and effectively improves the quality of the fused image.
2. The application preserves spectral information well while keeping the computational load small, improves the fusion quality of remote sensing images across different land-cover types, and can readily be deployed on various low-performance hardware platforms.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and drawings.
The technical scheme of the application is further described in detail through the drawings and the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
The accompanying drawings are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate the application and together with the embodiments of the application, serve to explain the application.
Fig. 1 is a schematic flow chart of a multiplicative transformation remote sensing image fusion method for spectrum distribution maintenance according to an embodiment of the present application.
FIG. 2a is a schematic diagram of an original multispectral image according to an embodiment of the present application.
Fig. 2b is a schematic diagram of an embodiment of the present application without histogram matching.
Fig. 2c is a schematic diagram of adding histogram matching according to an embodiment of the present application.
FIG. 3a is a schematic diagram of the original multiplicative transformation in a water-body shadow area according to an embodiment of the application.
Fig. 3b is a schematic diagram after adding the distribution-adaptive mapping in a water-body shadow area according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. Moreover, various numbers and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Referring to fig. 1, the present application provides a multiplicative transformation remote sensing image fusion method for maintaining spectral distribution, wherein the input of the method is performed on the basis of registration of full-color and multispectral remote sensing images, and the method mainly comprises the following steps:
s1, upsampling a multispectral image to the spatial resolution of a full-color image;
s2, counting pixel histograms of the full-color image and the multispectral image, and respectively carrying out normalization operation on the full-color image and the up-sampled multispectral image;
s3, performing multiple linear regression solution on the up-sampled multispectral image and the panchromatic image to obtain a fitted low-resolution panchromatic image;
s4, performing compensation and mapping on texture details and spectrum intensity of the up-sampled multispectral image by utilizing multiplicative transformation and inverse normalization operation to obtain a primary fusion image;
and S5, mapping the spectrum distribution of the obtained primary fusion image to be consistent with the multispectral image by using a histogram matching method, and obtaining a final fusion image.
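Step S1 can be sketched as follows, assuming the 4x resolution ratio stated later in the document. Nearest-neighbor upsampling is used here only to keep the sketch dependency-free; in practice bicubic interpolation is more typical, and the function name is illustrative:

```python
import numpy as np

def upsample_ms(mul, scale=4):
    """Nearest-neighbor upsampling of a (bands, h, w) multispectral cube
    to the panchromatic grid (bands, h*scale, w*scale)."""
    return mul.repeat(scale, axis=1).repeat(scale, axis=2)

mul = np.arange(2 * 2 * 2, dtype=np.float64).reshape(2, 2, 2)
up = upsample_ms(mul)
print(up.shape)  # (2, 8, 8)
```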
The following describes in detail the specific embodiments of the steps described above:
in a specific embodiment, the step S2 specifically includes:
s21, calculating the mean value and the variance of each wave band of the full-color image and the multispectral image respectively by counting the pixel histograms of the full-color image and the multispectral image, wherein the calculation formula is as follows:
where pan represents the panchromatic image, mul represents the multispectral image, Mean represents the mean, Std represents the standard deviation, k represents the kth band of the multispectral image, W and H are the width and height of the panchromatic image, w and h are the width and height of the multispectral image, i = 1, ..., W×H, and the constraints W = 4×w and H = 4×h are satisfied;
s22, respectively carrying out data distribution normalization on the full-color image and the up-sampled multispectral image according to the mean value and the variance obtained in the step S21, wherein the specific formula is as follows:
wherein the first symbol in the formula represents the up-sampled multispectral image, and F represents the constraint that adjusts the data distribution. Through the normalization operation, the panchromatic and multispectral images are mapped into the same data distribution.
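Steps S21-S22 can be sketched as follows, under the assumption (the formula images are not reproduced in this text) that F is ordinary per-band z-score normalization; the function names are illustrative:

```python
import numpy as np

def band_stats(img):
    """Per-band mean and standard deviation of a (bands, H, W) image."""
    return img.mean(axis=(1, 2)), img.std(axis=(1, 2))

def normalize(img, mean, std):
    """Map each band to zero mean / unit variance (the assumed form of F)."""
    return (img - mean[:, None, None]) / std[:, None, None]

pan = np.random.default_rng(0).uniform(0, 1023, (1, 8, 8))
m, s = band_stats(pan)
pan_n = normalize(pan, m, s)  # now shares the distribution of any other F-mapped image
```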
In a specific embodiment, the step S3 specifically includes:
s31, performing multiple linear regression solution on the up-sampled multispectral image and the full-color image, and calculating and solving the weighting coefficient b of each wave band by adopting a least square method k The calculation formula is as follows:
wherein b_k represents the weighting coefficient required for the kth band of the up-sampled multispectral image, and K is the total number of bands of the multispectral image (typically K = 4);
s32, calculating the weighting coefficient b of each wave band of the up-sampled multispectral image k Thereafter, a low resolution panchromatic image is constructed by weighted summation, the formula:
wherein the left-hand term of the formula represents the fitted low-resolution panchromatic image.
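The least-squares fit of S31-S32 might be implemented as below (a sketch; `fit_lowres_pan` and the array layout are illustrative assumptions):

```python
import numpy as np

def fit_lowres_pan(mul_up, pan):
    """S31: solve pan ~ sum_k b_k * mul_up[k] by least squares.
    S32: build the fitted low-resolution panchromatic image.
    mul_up: (K, H, W) up-sampled multispectral image; pan: (H, W)."""
    K = mul_up.shape[0]
    A = mul_up.reshape(K, -1).T                 # (H*W, K) design matrix
    b, *_ = np.linalg.lstsq(A, pan.ravel(), rcond=None)
    pan_low = np.tensordot(b, mul_up, axes=1)   # weighted band sum
    return b, pan_low

# sanity check: recover known band weights
rng = np.random.default_rng(1)
mul_up = rng.uniform(0, 1, (4, 8, 8))
true_b = np.array([0.1, 0.2, 0.3, 0.4])
pan = np.tensordot(true_b, mul_up, axes=1)
b, pan_low = fit_lowres_pan(mul_up, pan)
```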
In a specific embodiment, the step S4 specifically includes:
s41, calculating the texture details required to be supplemented for obtaining the up-sampled multispectral image by utilizing the low-resolution panchromatic image, wherein the calculation formula is as follows:
wherein D_i represents the texture details;
s42, supplementing texture details into the up-sampled multispectral image through multiplicative transformation, and performing inverse normalization operation to map data distribution to the distribution range of the multispectral image, wherein the formula is as follows:
wherein the left-hand term of the formula represents the calculated preliminary fusion image.
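One plausible Brovey-style reading of S41-S42, with the exact formulas unavailable, is the ratio-detail form sketched below; the epsilon guard against division by zero is an implementation detail not stated in the original:

```python
import numpy as np

EPS = 1e-8  # guard against division by zero in flat regions (an assumption)

def multiplicative_fuse(pan, pan_low, mul_up, mean, std):
    """S41: per-pixel detail ratio D = pan / pan_low.
    S42: inject D into each band multiplicatively, then inverse-normalize
    back to the multispectral distribution (assuming F was z-score)."""
    D = pan / (pan_low + EPS)                   # texture detail per pixel
    fused_norm = D[None, :, :] * mul_up         # multiplicative injection
    return fused_norm * std[:, None, None] + mean[:, None, None]

# when pan == pan_low, D ~ 1 and fusion reduces to inverse normalization
pan = np.full((4, 4), 2.0)
mul_up = np.ones((3, 4, 4))
mean = np.array([10.0, 20.0, 30.0])
std = np.array([1.0, 2.0, 3.0])
fused = multiplicative_fuse(pan, pan, mul_up, mean, std)
```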
In a specific embodiment, the step S5 specifically includes:
s51, respectively calculating cumulative histograms of the primary fusion image and the multispectral image, wherein the pixel value range is set to be 0-65536 due to the fact that the pixel range of the remote sensing image is larger;
s52, sequentially calculating absolute values of differences between each value and each value of the multi-spectrum image histogram by using Euclidean distance in the histogram of the primary fusion image to obtain a difference value table between the two histograms;
s53, finding the minimum value in the difference value table, and establishing gray level mapping between the two histograms;
s54, mapping through gray level, and finally searching mapped pixel values by a dichotomy to obtain a final fusion image.
The effect of the method of the application is verified as follows:
In this embodiment, the comparison before and after adding the histogram matching step is shown in Fig. 2: Fig. 2a is the original multispectral image, Fig. 2b is the result without histogram matching, and Fig. 2c is the result with histogram matching added. After histogram matching is added, the original spectral information of the fused image is retained while texture details are not lost, so the fused image shows pronounced color contrast and strong color layering, effectively improving the quality of the fused image.
FIG. 3 shows a graph of the results of the method of the present application compared with the multiplicative transformation method in a shadow area of a body of water, wherein FIG. 3a is an effect graph of the original multiplicative transformation, and FIG. 3b is an effect graph of the method of the present application. As can be seen from fig. 3, the method of the present application effectively avoids spectral distortion generated in the multiplicative transformation fusion process in the shadow region of the water body, and proves the effectiveness of the method of the present application.
It should be noted that: the multiplicative transformation remote sensing image fusion method for maintaining the spectral distribution is proposed mainly on the basis of analyzing and understanding panchromatic and multispectral image fusion; the method is evidently also applicable to the fusion of other multispectral remote sensing images.
From the description of the above embodiments, those skilled in the art will appreciate that the present application aims to solve the detail distortion and spectral distortion of transformation-based methods while supporting fast operation on low-end hardware, for which a multiplicative transformation remote sensing image fusion method maintaining the spectral distribution is provided. The method improves upon the multiplicative transformation: it preserves spectral information well while keeping the computational load small, retains detail texture while maintaining spectral saturation, and improves the fusion quality of remote sensing images across different land-cover types. It generalizes well, can readily be deployed on various low-performance hardware platforms, and can be applied to fusion products of different satellites.
In addition, in an embodiment of the present application, there is provided a storage medium having stored thereon one or more programs readable by a computing device, the one or more programs including instructions, which when executed by the computing device, cause the computing device to perform a method of multiplicative transformation remote sensing image fusion for spectral distribution preservation as described above.
In this embodiment, the storage medium may be, for example, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the storage medium include: portable computer disks, hard disks, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static Random Access Memory (SRAM), portable compact disk read-only memory (CD-ROM), digital Versatile Disks (DVD), memory sticks, floppy disks, mechanical coding devices, and any suitable combination of the foregoing.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It is to be noted that the term 'comprising' does not exclude the presence of elements or steps other than those listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer.
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (6)
1. The multiplicative transformation remote sensing image fusion method for maintaining the spectral distribution is characterized by comprising the following steps of:
s1, upsampling a multispectral image to the spatial resolution of a full-color image;
s2, counting pixel histograms of the full-color image and the multispectral image, and respectively carrying out normalization operation on the full-color image and the up-sampled multispectral image;
s3, performing multiple linear regression solution on the up-sampled multispectral image and the panchromatic image to obtain a fitted low-resolution panchromatic image;
s4, performing compensation and mapping on texture details and spectrum intensity of the up-sampled multispectral image by utilizing multiplicative transformation and inverse normalization operation to obtain a primary fusion image;
and S5, mapping the spectrum distribution of the obtained primary fusion image to be consistent with the multispectral image by using a histogram matching method, and obtaining a final fusion image.
2. The method of claim 1, wherein the step S2 includes:
s21, calculating the mean value and the variance of each wave band of the full-color image and the multispectral image respectively by counting the pixel histograms of the full-color image and the multispectral image, wherein the calculation formula is as follows:
where pan represents the panchromatic image, mul represents the multispectral image, Mean represents the mean, Std represents the standard deviation, k represents the kth band of the multispectral image, W and H are the width and height of the panchromatic image, w and h are the width and height of the multispectral image, i = 1, ..., W×H, and the constraints W = 4×w and H = 4×h are satisfied;
s22, respectively normalizing data distribution of the full-color image and the up-sampled multispectral image according to the mean value and the variance obtained in the step S21, and mapping the full-color image and the multispectral image into the same data distribution through normalization operation; the normalization formula is:
wherein the first symbol in the formula represents the up-sampled multispectral image, and F represents the constraint that adjusts the data distribution.
3. The method of claim 1, wherein the step S3 includes:
s31, performing multiple linear regression solution on the up-sampled multispectral image and the full-color image, and calculating and solving the weighting coefficient of each wave band by adopting a least square method, wherein the calculation formula is as follows:
wherein b_k represents the weighting coefficient required for the kth band of the up-sampled multispectral image, and K is the total number of bands of the multispectral image;
s32, after the weighting coefficient of each wave band of the up-sampled multispectral image is obtained through calculation, constructing a low-resolution panchromatic image through weighted summation, wherein the weighted summation formula is as follows:
wherein the left-hand term of the formula represents the fitted low-resolution panchromatic image.
4. The method of claim 1, wherein the step S4 includes:
s41, calculating the texture details required to be supplemented for obtaining the up-sampled multispectral image by using the low-resolution panchromatic image, wherein the calculation formula is as follows:
wherein D_i represents the texture details;
s42, supplementing texture details into the up-sampled multispectral image through multiplicative transformation, and performing inverse normalization operation to map data distribution to the distribution range of the multispectral image, wherein the formula is as follows:
wherein the left-hand term of the formula represents the calculated preliminary fusion image.
5. The method of claim 1, wherein the step S5 includes:
S51, respectively calculating the cumulative histograms of the preliminary fusion image and the multispectral image, and setting the range of pixel values;
S52, for each value of the preliminary fusion image histogram, sequentially calculating the absolute difference (Euclidean distance) from each value of the multispectral image histogram, obtaining a difference table between the two histograms;
S53, finding the minimum value in the difference table, and establishing a gray-level mapping between the two histograms;
S54, applying the gray-level mapping, and finally locating the mapped pixel values by binary search to obtain the final fusion image.
6. The method for fusing a multiplicative transformed remote sensing image with spectral distribution maintained as recited in claim 5, wherein in said step S51, the range of pixel values is set to be 0-65536.
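The histogram-specification loop of steps S51–S54 can be sketched as follows. This is a simplified illustration: `levels=256` stands in for the patent's 0–65536 range, and `np.searchsorted` plays the role of the claim's binary search for the closest cumulative-histogram value; the function name `histogram_match` is an assumption.

```python
import numpy as np

def histogram_match(fused, ref, levels=256):
    """Match the preliminary fusion image's cumulative histogram to the
    reference multispectral image's (steps S51-S54 sketch).
    fused, ref: integer arrays with values in [0, levels)."""
    bins = np.arange(levels + 1)
    # step S51: cumulative (normalized) histograms of both images
    cdf_f = np.cumsum(np.histogram(fused.ravel(), bins=bins)[0]) / fused.size
    cdf_r = np.cumsum(np.histogram(ref.ravel(), bins=bins)[0]) / ref.size
    # steps S52-S54: for each gray level of `fused`, binary-search the
    # reference CDF for the closest value, giving the gray-level mapping
    mapping = np.searchsorted(cdf_r, cdf_f).clip(0, levels - 1)
    return mapping[fused.astype(np.int64)].astype(fused.dtype)
```

Applying the mapping as a lookup table keeps the operation O(N) per image once the per-level search is done, which is why the claim restricts the search to the histogram's gray levels rather than individual pixels.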
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311254447.7A CN117197018A (en) | 2023-09-27 | 2023-09-27 | Multiplication transformation remote sensing image fusion method for maintaining spectral distribution |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117197018A true CN117197018A (en) | 2023-12-08 |
Family
ID=88997898
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311254447.7A Pending CN117197018A (en) | 2023-09-27 | 2023-09-27 | Multiplication transformation remote sensing image fusion method for maintaining spectral distribution |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117197018A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107016641A (en) * | 2016-09-23 | 2017-08-04 | 北京航空航天大学 | A kind of panchromatic and hyperspectral image fusion method based on improvement ratio transformation |
CN109886904A (en) * | 2019-01-25 | 2019-06-14 | 北京市遥感信息研究所 | A kind of SAR image and low resolution Multispectral Image Fusion Methods and system |
CN111383203A (en) * | 2019-11-07 | 2020-07-07 | 北京航空航天大学 | Panchromatic and multispectral remote sensing image fusion method based on regional fitting |
CN112085684A (en) * | 2020-07-23 | 2020-12-15 | 中国资源卫星应用中心 | Method and device for fusing remote sensing images |
CN115063336A (en) * | 2022-08-18 | 2022-09-16 | 北京理工大学 | Full-color and multispectral image fusion method and device and medium thereof |
CN115760666A (en) * | 2022-11-18 | 2023-03-07 | 北京航空航天大学 | Remote sensing image fusion method combining ratio transformation and distribution transformation |
Non-Patent Citations (1)
Title |
---|
Li Xiaoling; Nie Xiangfei; Huang Haibo; Zhang Yue: "Image fusion based on improved guided filtering and quantum genetic algorithm", Electronics Optics & Control, no. 02, 20 August 2019 (2019-08-20) *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8547389B2 (en) | Capturing image structure detail from a first image and color from a second image | |
CN110533620B (en) | Hyperspectral and full-color image fusion method based on AAE extraction spatial features | |
US20110243438A1 (en) | Generation of multi-resolution image pyramids | |
CN107358586A (en) | A kind of image enchancing method, device and equipment | |
Vijayalakshmi et al. | A novel contrast enhancement technique using gradient-based joint histogram equalization | |
Chen et al. | A new process for the segmentation of high resolution remote sensing imagery | |
CN104036461B (en) | A kind of Infrared Complex Background suppressing method based on Federated filter | |
Mahmood et al. | Human visual enhancement using multi scale retinex | |
CN111563866B (en) | Multisource remote sensing image fusion method | |
CN111340741B (en) | Particle swarm optimization gray image enhancement method based on quaternion and L1 norm | |
CN117422631A (en) | Infrared image enhancement method based on adaptive filtering layering | |
Huang et al. | Multi-feature combined for building shadow detection in GF-2 Images | |
CN117197018A (en) | Multiplication transformation remote sensing image fusion method for maintaining spectral distribution | |
KR102160687B1 (en) | Aviation image fusion method | |
CN115937302A (en) | Hyperspectral image sub-pixel positioning method combined with edge preservation | |
Huang | Wavelet for image fusion | |
Patil et al. | FWFusion: Fuzzy Whale Fusion model for MRI multimodal image fusion | |
CN116862809A (en) | Image enhancement method under low exposure condition | |
CN116091322A (en) | Super-resolution image reconstruction method and computer equipment | |
CN116246138A (en) | Infrared-visible light image target level fusion method based on full convolution neural network | |
CN116109535A (en) | Image fusion method, device and computer readable storage medium | |
CN111462025B (en) | Infrared and visible light image fusion method based on multi-scale low-rank matrix decomposition | |
CN115100075A (en) | Hyperspectral panchromatic sharpening method based on spectral constraint and residual error attention network | |
CN110895790A (en) | Scene image super-resolution method based on posterior degradation information estimation | |
Hong et al. | An improved color consistency optimization method based on the reference image contaminated by clouds |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||