CN112419212A - Infrared and visible light image fusion method based on side window guide filtering - Google Patents

Infrared and visible light image fusion method based on side window guide filtering

Info

Publication number
CN112419212A
Authority
CN
China
Prior art keywords
infrared
image
visible light
layer
side window
Prior art date
Legal status
Granted
Application number
CN202011101778.3A
Other languages
Chinese (zh)
Other versions
CN112419212B (en)
Inventor
肖锐
石皓元
李明旭
Current Assignee
Kale Micro Vision Technology Yunnan Co ltd
Original Assignee
Kale Micro Vision Technology Yunnan Co ltd
Priority date
Filing date
Publication date
Application filed by Kale Micro Vision Technology Yunnan Co ltd filed Critical Kale Micro Vision Technology Yunnan Co ltd
Priority to CN202011101778.3A
Publication of CN112419212A
Application granted
Publication of CN112419212B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/20 Image enhancement or restoration by the use of local operators
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Abstract

The invention discloses an infrared and visible light image fusion method based on side window guide filtering, which belongs to the technical field of image processing and comprises the following steps: blurring the input infrared and visible light source images with a Gaussian filter to obtain large-scale base layers; subtracting the base layer images from the infrared and visible light source images to obtain detail layer images containing small-scale information; calculating the correlation coefficient of the infrared and visible light source images to obtain the base layer fusion weight coefficients and fusing the base layer images; obtaining initial detail layer weight maps with the maximum absolute value rule, optimizing them with side window guided filtering, and fusing the detail layer images; and combining the fused base layer image and the fused detail layer image to obtain the final fused image. The invention realizes the fusion of infrared and visible light images, and the fused image retains good contrast.

Description

Infrared and visible light image fusion method based on side window guide filtering
Technical Field
The invention relates to the technical field of image processing, in particular to a method for fusing an infrared light image and a visible light image based on side window guide filtering.
Background
For the same application scene, image sensors operating in different bands reflect different scene information. Image fusion across spectral bands is one of the research hotspots in the fields of computer vision and image processing. Infrared light and visible light have different imaging principles. Infrared imaging sensors capture the thermal radiation emitted by objects and are extremely sensitive to hot targets, but lack background texture detail. Visible light image sensors can capture more scene detail and texture information, but are susceptible to interference from the imaging conditions: lighting, fog, occlusion and the like can severely degrade image quality. Fusing infrared and visible light images provides more complementary information and is more beneficial to human observation and computer vision analysis. In recent years, infrared and visible light image fusion has been widely applied in video fusion, night vision, biometric identification, remote sensing, military, agricultural and other fields.
Over the past decades, a large number of image fusion methods have been proposed and applied in different fields, among which methods based on multi-scale decomposition, sparse representation and principal component analysis are the three most commonly used. At present, the multi-scale decomposition approach is the most active and most widely used. However, these methods generally suffer from drawbacks such as low computational efficiency, low contrast in the fused image, insufficient target prominence, and a tendency to produce halo and artifact phenomena.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides an infrared and visible light image fusion method based on side window guide filtering. First, the input infrared and visible light images are decomposed at multiple scales with Gaussian filtering to obtain a base layer image containing large-scale information and a detail layer image containing small-scale information. Then, different fusion strategies are applied at the different scale layers. The weight coefficients for the base layer fusion are obtained by calculating the correlation coefficient of the input infrared and visible light images, which ensures that the fused image has sufficient contrast. The initial detail layer weight maps are obtained by the maximum absolute value rule and are then optimized with side window guided filtering to obtain the detail layer fusion weight maps. Finally, the fused image is obtained by linear addition.
The invention provides an infrared and visible light image fusion method that combines Gaussian-filtering multi-scale image decomposition with side window guided filtering optimization of the weight maps, and mainly comprises the following points:
1: and a multi-scale decomposition tool of the Gaussian filter is utilized, so that the scale image can be effectively separated. Decomposing an input source image into a base layer containing large-scale information and a detail layer containing small-scale information, and fusing at different scale layers by adopting different fusion strategies. Meanwhile, the fused image is reconstructed by linear addition, so that the richness of fused image information is promoted, the algorithm complexity is reduced, and the efficiency is improved.
2: the weight map for detail layer image fusion can be effectively optimized by adopting an infrared and visible light image fusion algorithm based on side window filtering. The weight coefficient of the basic layer is obtained by calculating the correlation coefficient of the input infrared and visible light images, so that the overall contrast of the fused image can be maintained, a better visual effect is obtained, and the robustness of the algorithm is improved. And simultaneously, optimizing the detail layer fusion initial weight obtained by the maximum absolute value by using side window guide filtering. The original input image is used as a guide image of side window guide filtering, and the initial weight optimization effect of the detail layer with different roughness degrees can be obtained by setting the filtering radius and the standard deviation parameter of the side window guide filter, so that the significant information of the detail layer image can be retained to the maximum extent.
An infrared and visible light image fusion method based on side window guide filtering comprises the following steps:
Step 1: obtaining the base layer by multi-scale decomposition
For the input infrared source image I_ir and visible light source image I_vis, a Gaussian filter is applied to obtain the base layer images containing large-scale information; the detail layer images containing small-scale information are then obtained by subtracting the base layer images from the source images. The base layer decomposition is:
B_ir = G_{r,σ}(I_ir), B_vis = G_{r,σ}(I_vis)
wherein G_{r,σ} denotes the Gaussian filtering operation, r and σ are respectively the radius and standard deviation of the Gaussian filter, and B_ir and B_vis respectively correspond to the infrared and visible light base layer images obtained by the multi-scale decomposition.
The Gaussian filter is commonly used in digital image processing and is characterized by simple computation and a good smoothing effect. The input infrared and visible light images are decomposed at multiple scales with Gaussian filtering, and information at different scales can be separated effectively by setting different filter radius and standard deviation parameters. At the same time, edge information at different scales is preserved during the decomposition, which benefits the final fusion result.
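For illustration only, a minimal Python sketch of this decomposition step is given below. It assumes NumPy and OpenCV (cv2) are available, that the source images are loaded as single-channel arrays, and that the file names and the radius and standard deviation values are placeholders rather than the parameters fixed by the invention.

```python
import cv2
import numpy as np

def base_layer(src, radius=15, sigma=5.0):
    """Large-scale base layer via Gaussian filtering (step 1).

    radius and sigma play the role of the filter radius r and standard
    deviation sigma of the Gaussian filter; the defaults are illustrative only.
    """
    ksize = 2 * radius + 1  # odd kernel size derived from the radius
    return cv2.GaussianBlur(src.astype(np.float32), (ksize, ksize), sigma)

# Placeholder file names: decompose both source images into their base layers.
I_ir = cv2.imread("ir.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
I_vis = cv2.imread("vis.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
B_ir, B_vis = base_layer(I_ir), base_layer(I_vis)
```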
Step 2: obtaining the detail layer by multi-scale decomposition
The base layer images are subtracted from the infrared source image I_ir and the visible light source image I_vis to obtain the detail layer images containing small-scale information. The process is as follows:
D_ir = I_ir - B_ir, D_vis = I_vis - B_vis
wherein D_ir and D_vis respectively correspond to the infrared and visible light detail layer images obtained by the multi-scale decomposition.
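Continuing the sketch above (same imports and variables), the detail layers are simply the residuals left after removing the base layers:

```python
def detail_layer(src, base):
    """Small-scale detail layer: D = I - B (step 2)."""
    return src.astype(np.float32) - base

D_ir = detail_layer(I_ir, B_ir)
D_vis = detail_layer(I_vis, B_vis)
```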
Step 3: base layer weight coefficient acquisition
The correlation coefficient between the input infrared source image I_ir and visible light source image I_vis is calculated:
r_c = corr(I_ir, I_vis)
wherein corr(·,·) denotes the correlation coefficient of the input source images. The weight coefficients w_ir and w_vis for the fusion of the infrared and visible light base layer images are then distributed according to this correlation coefficient. Obtaining the base layer fusion coefficients from the correlation coefficient enables the final fused image to retain good contrast and improves the robustness of the whole fusion algorithm.
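The invention only specifies that the base layer weights are distributed according to the correlation coefficient of the source images; the sketch below therefore assumes a Pearson correlation coefficient and an illustrative affine mapping onto weights that sum to one, which is an assumption and not the formula of the invention.

```python
def base_weights(I_ir, I_vis):
    """Base layer fusion weights from the source-image correlation (step 3).

    ASSUMPTION: the correlation coefficient is taken as the Pearson
    coefficient, and the affine mapping below onto weights summing to one
    is illustrative only.
    """
    r_c = np.corrcoef(I_ir.ravel(), I_vis.ravel())[0, 1]
    w_ir = 0.5 * (1.0 + r_c)   # maps r_c in [-1, 1] onto a weight in [0, 1]
    w_vis = 1.0 - w_ir         # the two base layer weights sum to one
    return w_ir, w_vis
```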
Step 4: detail layer weight map acquisition
Initial detail layer weight maps are obtained from the infrared and visible light detail layer images of step 2 by the maximum absolute value rule, giving P_ir for the infrared image and P_vis for the visible light image. The initial detail layer weights are then optimized with side window guided filtering:
W_ir = SWGF_{s,σ_s}(P_ir, I_ir), W_vis = SWGF_{s,σ_s}(P_vis, I_vis)
wherein SWGF denotes the side window guided filtering operation, s denotes the size of the side window guided filter, and σ_s denotes its standard deviation, which controls the degree of blurring. The input infrared and visible light source images I_ir and I_vis serve here as the guide images of the side window guided filtering. W_ir and W_vis respectively correspond to the fusion weight maps of the infrared and visible light detail layer images.
In extracting the initial detail layer weight maps, the maximum absolute value rule effectively extracts the saliency information of the input infrared and visible light images and highlights the target objects. Optimizing the initial detail layer weight maps with side window guided filtering greatly improves the visual effect of the fused image and avoids halo and artifact phenomena. The original input images serve as the guide images of the side window guided filtering, and by setting the filter radius and standard deviation parameters of the side window guided filter, initial detail layer weights of different degrees of coarseness can be obtained, which benefits the final fusion result.
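A sketch of the detail layer weight construction follows. The maximum absolute value rule is implemented directly; for the optimization step, OpenCV's plain guided filter (cv2.ximgproc.guidedFilter from opencv-contrib-python) is used purely as a runnable stand-in, since side window guided filtering is not part of standard OpenCV. This substitution is not the filter claimed by the invention, and the radius and eps values are placeholders.

```python
def detail_weights(D_ir, D_vis, I_ir, I_vis, radius=7, eps=100.0):
    """Detail layer fusion weight maps (step 4).

    The initial maps follow the maximum absolute value rule; they are then
    smoothed with a guided filter using the source images as guides.
    NOTE: the invention uses *side window* guided filtering; the plain
    cv2.ximgproc.guidedFilter call below is only a stand-in.
    """
    P_ir = (np.abs(D_ir) >= np.abs(D_vis)).astype(np.float32)  # binary initial weight map
    P_vis = 1.0 - P_ir
    W_ir = cv2.ximgproc.guidedFilter(I_ir, P_ir, radius, eps)
    W_vis = cv2.ximgproc.guidedFilter(I_vis, P_vis, radius, eps)
    return W_ir, W_vis
```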
Step 5: preliminary fusion of the base layer and the detail layer
The base layer and the detail layer are preliminarily fused with the infrared and visible light base layer weights and detail layer weights obtained in steps 3 and 4:
B_F = w_ir·B_ir + w_vis·B_vis, D_F = W_ir·D_ir + W_vis·D_vis
wherein B_F is the fused base layer image of the infrared and visible light images, and D_F is the fused detail layer image of the infrared and visible light images.
Step 6: weighted reconstruction of the fused image
The final fused image is obtained by adding the fused base layer image and the fused detail layer image obtained in step 5:
F = B_F + D_F
wherein F is the final fusion result.
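Steps 5 and 6 reduce to two weighted sums and one addition; a minimal sketch, continuing the ones above and assuming 8-bit source images:

```python
def fuse(B_ir, B_vis, D_ir, D_vis, w_ir, w_vis, W_ir, W_vis):
    """Steps 5-6: per-layer weighted fusion followed by linear reconstruction."""
    B_F = w_ir * B_ir + w_vis * B_vis   # fused base layer
    D_F = W_ir * D_ir + W_vis * D_vis   # fused detail layer
    F = B_F + D_F                       # final fused image
    return np.clip(F, 0, 255).astype(np.uint8)  # assumes an 8-bit intensity range
```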
The invention has the following beneficial effects. For the infrared and visible light image fusion task, Gaussian filtering is used to decompose the input infrared and visible light source images at multiple scales into a base layer containing large-scale information and a detail layer containing small-scale information. In the base layer fusion, the correlation coefficient of the source images is used, which ensures the contrast and robustness of the final fused image. In the detail layer fusion, side window guided filtering is combined to optimize the initial detail layer weights obtained by the maximum absolute value rule, which greatly improves the visual effect of the fused image and avoids halo and artifact phenomena. With the invention, as long as infrared and visible light images of the same scene are input, effective multi-scale fusion can be carried out and a high-quality fused image obtained.
Drawings
FIG. 1 is a flow chart of an algorithm;
FIG. 2(a) is an input infrared image;
FIG. 2(b) is an input visible light image;
FIG. 3(a) is a base layer for infrared image multi-scale decomposition;
FIG. 3(b) is a detail layer of the infrared image multi-scale decomposition;
FIG. 4(a) is a base layer for multi-scale decomposition of a visible light image;
FIG. 4(b) is a detail layer of a multi-scale decomposition of a visible light image;
FIG. 5(a) is an initial weight of an infrared image detail layer;
FIG. 5(b) is an initial weight of a visible image detail layer;
FIG. 6(a) is a fusion weight graph of infrared images;
FIG. 6(b) is a fusion weight graph of visible light images;
FIG. 7(a) is a base layer fusion map;
FIG. 7(b) is a detail layer fusion diagram;
fig. 8 shows the result of image fusion between infrared light and visible light.
Detailed Description
The technical solution of the present invention is described in detail and fully with reference to the accompanying drawings.
Figure 1 shows a flow chart of the present invention.
Fig. 2 is an example of a set of infrared and visible light images of the same scene, where fig. 2(a) is an input infrared image and fig. 2(b) is an input visible light image.
Fig. 3-4 show the layers of the multi-scale decomposition with Gaussian filtering. Fig. 3(a) and fig. 4(a) are the base layers obtained by decomposing the infrared and visible light source images, respectively, and mainly contain the large-scale information in the images. Fig. 3(b) and fig. 4(b) are the detail layers obtained by decomposing the infrared and visible light source images, respectively, and mainly contain the small-scale information in the images. Fig. 5(a) and 5(b) are the initial weights of the infrared and visible image detail layers, respectively, reflecting the regions of greatest interest to human vision in the respective images. Fig. 6(a) and 6(b) are the detail layer fusion weight maps obtained after the initial weight maps of the infrared and visible light images are processed by side window guide filtering; they help suppress halos and artifacts and give a better subjective visual effect.
See Yin H, Gong Y, Qiu G. Side window guided filtering [J]. Signal Processing, 2019, 165: 315-330.
In this embodiment, when the input infrared and visible light source images are decomposed by scale with Gaussian filtering, the filter radius r and standard deviation σ of the Gaussian filter are set to fixed values, and the corresponding base layer and detail layer images are obtained.
The base layer fusion coefficients are obtained by calculating the correlation coefficient between the input infrared and visible light source images, and the fused base layer image is obtained by linear weighted summation of the infrared and visible light base layer images according to the value of the correlation coefficient.
In this embodiment, the initial detail layer weight maps are obtained by the maximum absolute value rule and are then optimized with side window guided filtering to obtain the detail layer fusion weight maps. The input infrared and visible light source images serve as the guide images of the side window guided filter, whose size s and standard deviation σ_s are set to fixed values. The fused infrared and visible light detail layer image is obtained by linear weighted summation of the detail layer fusion weight maps and the detail layer images.
Finally, the final fused image is obtained by linearly adding the fused base layer image and the fused detail layer image of the infrared and visible light images. The final fusion result is shown in fig. 8: the fused image has good contrast, the saliency information of the infrared and visible light images is well preserved, and the subjective visual effect is good.
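Tying the per-step sketches above together, one possible end-to-end driver for this embodiment could look as follows; the file names are placeholders and the parameter values are not the ones fixed in this embodiment.

```python
def fuse_ir_vis(ir_path, vis_path):
    I_ir = cv2.imread(ir_path, cv2.IMREAD_GRAYSCALE).astype(np.float32)
    I_vis = cv2.imread(vis_path, cv2.IMREAD_GRAYSCALE).astype(np.float32)

    B_ir, B_vis = base_layer(I_ir), base_layer(I_vis)                   # step 1
    D_ir, D_vis = detail_layer(I_ir, B_ir), detail_layer(I_vis, B_vis)  # step 2
    w_ir, w_vis = base_weights(I_ir, I_vis)                             # step 3
    W_ir, W_vis = detail_weights(D_ir, D_vis, I_ir, I_vis)              # step 4
    return fuse(B_ir, B_vis, D_ir, D_vis, w_ir, w_vis, W_ir, W_vis)     # steps 5-6

# Example call with placeholder file names.
cv2.imwrite("fused.png", fuse_ir_vis("ir.png", "vis.png"))
```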

Claims (9)

1. An infrared and visible light image fusion method based on side window guide filtering is characterized by comprising the following steps:
step one: performing multi-scale decomposition on the input infrared source image I_ir and visible light source image I_vis with a Gaussian filter to obtain the base layer images containing large-scale information;
step two: subtracting the base layer images from the infrared source image I_ir and the visible light source image I_vis to obtain the detail layer images containing small-scale information;
step three: obtaining the weight coefficient of the base layer image fusion by calculating the correlation coefficient between the input infrared source image I_ir and visible light source image I_vis;
step four: respectively distributing the weight coefficients for fusing the infrared and visible light base layer images according to the correlation coefficient of the source images;
step five: obtaining the initial detail layer weight maps from the detail layer images containing small-scale information obtained in step two by the maximum absolute value rule;
step six: optimizing the initial detail layer weight maps obtained in step five with side window guided filtering;
step seven: preliminarily fusing the base layer and the detail layer with the infrared and visible light base layer weights and detail layer weights obtained in step four and step six;
step eight: adding the fused base layer image and the fused detail layer image obtained in step seven to obtain the final fused image.
2. The infrared and visible light image fusion method based on side window guide filtering according to claim 1, characterized in that in step one the process is:
B_ir = G_{r,σ}(I_ir), B_vis = G_{r,σ}(I_vis)
wherein G_{r,σ} denotes the Gaussian filtering operation, r and σ respectively denote the radius and standard deviation of the Gaussian filter, and B_ir and B_vis respectively correspond to the infrared and visible light base layer images obtained by the multi-scale decomposition.
3. The infrared and visible light image fusion method based on side window guide filtering according to claim 1, characterized in that in step two the process is:
D_ir = I_ir - B_ir, D_vis = I_vis - B_vis
wherein D_ir and D_vis respectively correspond to the infrared and visible light detail layer images obtained by the multi-scale decomposition.
4. The infrared and visible light image fusion method based on side window guide filtering according to claim 1, characterized in that in step three the process is:
r_c = corr(I_ir, I_vis)
wherein corr(·,·) denotes the correlation coefficient of the input source images.
5. The infrared and visible light image fusion method based on side window guide filtering according to claim 1, characterized in that in step four the weight coefficients w_ir and w_vis, respectively corresponding to the infrared and visible light base layer image fusion, are distributed according to the correlation coefficient r_c.
6. The infrared and visible light image fusion method based on side window guide filtering according to claim 1, characterized in that in step five the initial detail layer weight maps are obtained by the maximum absolute value rule, P_ir denoting the initial detail layer weight map of the infrared image and P_vis denoting the initial detail layer weight map of the visible light image.
7. The infrared and visible light image fusion method based on side window guide filtering according to claim 1, characterized in that in step six the process is:
W_ir = SWGF_{s,σ_s}(P_ir, I_ir), W_vis = SWGF_{s,σ_s}(P_vis, I_vis)
wherein SWGF denotes the side window guided filtering operation, s denotes the size of the side window guided filter, σ_s denotes the standard deviation of the side window guided filter and controls the degree of blurring, the input infrared and visible light source images I_ir and I_vis serve here as the guide images of the side window guided filtering, and W_ir and W_vis respectively correspond to the fusion weight maps of the infrared and visible light detail layer images.
8. The infrared and visible light image fusion method based on side window guide filtering according to claim 1, characterized in that in step seven the process is:
B_F = w_ir·B_ir + w_vis·B_vis, D_F = W_ir·D_ir + W_vis·D_vis
wherein B_F denotes the fused base layer image of the infrared and visible light images and D_F denotes the fused detail layer image of the infrared and visible light images.
9. The infrared and visible light image fusion method based on side window guide filtering according to claim 1, characterized in that in step eight the process is:
F = B_F + D_F
wherein F is the final fusion result.
CN202011101778.3A 2020-10-15 Infrared and visible light image fusion method based on side window guide filtering Active CN112419212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011101778.3A CN112419212B (en) 2020-10-15 Infrared and visible light image fusion method based on side window guide filtering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011101778.3A CN112419212B (en) 2020-10-15 Infrared and visible light image fusion method based on side window guide filtering

Publications (2)

Publication Number Publication Date
CN112419212A true CN112419212A (en) 2021-02-26
CN112419212B CN112419212B (en) 2024-05-17


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104809734A (en) * 2015-05-11 2015-07-29 中国人民解放军总装备部军械技术研究所 Infrared image and visible image fusion method based on guide filtering
KR101788660B1 (en) * 2016-08-12 2017-10-20 포항공과대학교 산학협력단 Apparatus and method for removing haze in a single image
CN107169944A (en) * 2017-04-21 2017-09-15 北京理工大学 A kind of infrared and visible light image fusion method based on multiscale contrast
CN111179209A (en) * 2019-12-20 2020-05-19 上海航天控制技术研究所 Infrared and visible light image information fusion method and device based on feature guidance
CN111223069A (en) * 2020-01-14 2020-06-02 天津工业大学 Image fusion method and system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HUI YIN et al.: "Side Window Filtering", 2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 9 January 2020 (2020-01-09), pages 8750 - 8758 *
HUIBIN YAN et al.: "A General Perceptual Infrared and Visible Image Fusion Framework Based on Linear Filter and Side Window Filtering Technology", IEEE ACCESS, 23 December 2019 (2019-12-23), pages 3029 - 3041, XP011766057, DOI: 10.1109/ACCESS.2019.2961626 *
RUI TAO et al.: "Multi-focus Image Fusion Based on Side Window Filtering Technique and Majority Filter", 2019 IEEE 5TH INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATIONS (ICCC), 13 April 2020 (2020-04-13), pages 327 - 331 *
钱进 et al.: "Image fusion based on side window filtering and block Bézier interpolation", Journal of Changchun University of Science and Technology (Natural Science Edition), 30 June 2020 (2020-06-30), pages 7 - 12 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113129243A (en) * 2021-03-10 2021-07-16 同济大学 Blood vessel image enhancement method and system based on infrared and visible light image fusion
CN113763368A (en) * 2021-09-13 2021-12-07 中国空气动力研究与发展中心超高速空气动力研究所 Large-size test piece multi-type damage detection characteristic analysis method
CN113793318A (en) * 2021-09-13 2021-12-14 中国空气动力研究与发展中心超高速空气动力研究所 Multi-region complex damage defect characteristic comprehensive analysis method
CN113793318B (en) * 2021-09-13 2023-04-07 中国空气动力研究与发展中心超高速空气动力研究所 Multi-region complex damage defect characteristic comprehensive analysis method
CN113935922A (en) * 2021-10-21 2022-01-14 燕山大学 Infrared and visible light image feature enhancement fusion method
CN114092369A (en) * 2021-11-19 2022-02-25 中国直升机设计研究所 Image fusion method based on visual saliency mapping and least square optimization
CN114757912A (en) * 2022-04-15 2022-07-15 电子科技大学 Material damage detection method, system, terminal and medium based on image fusion
CN115578304A (en) * 2022-12-12 2023-01-06 四川大学 Multi-band image fusion method and system combining saliency region detection
CN115578304B (en) * 2022-12-12 2023-03-10 四川大学 Multi-band image fusion method and system combining saliency region detection

Similar Documents

Publication Publication Date Title
CN111209810B (en) Boundary frame segmentation supervision deep neural network architecture for accurately detecting pedestrians in real time through visible light and infrared images
CN112733950A (en) Power equipment fault diagnosis method based on combination of image fusion and target detection
CN109754384B (en) Infrared polarization image fusion method of uncooled infrared focal plane array
WO2021098083A1 (en) Multispectral camera dynamic stereo calibration algorithm based on salient feature
CN107977950B (en) Rapid and effective video image fusion method based on multi-scale guide filtering
CN113837974B (en) NSST domain power equipment infrared image enhancement method based on improved BEEPS filtering algorithm
CN112184604A (en) Color image enhancement method based on image fusion
CN114782298B (en) Infrared and visible light image fusion method with regional attention
CN113012140A (en) Digestive endoscopy video frame effective information region extraction method based on deep learning
CN110910456A (en) Stereo camera dynamic calibration algorithm based on Harris angular point mutual information matching
CN114187214A (en) Infrared and visible light image fusion system and method
CN110060218A (en) Remote sensing image processing method based on GIS-Geographic Information System
CN116757986A (en) Infrared and visible light image fusion method and device
CN105608674B (en) A kind of image enchancing method based on image registration, interpolation and denoising
CN116823694B (en) Infrared and visible light image fusion method and system based on multi-focus information integration
CN117392496A (en) Target detection method and system based on infrared and visible light image fusion
CN110827375B (en) Infrared image true color coloring method and system based on low-light-level image
CN112734636A (en) Fusion method of multi-source heterogeneous remote sensing images
CN110084774B (en) Method for minimizing fusion image by enhanced gradient transfer and total variation
CN112419212A (en) Infrared and visible light image fusion method based on side window guide filtering
CN116883303A (en) Infrared and visible light image fusion method based on characteristic difference compensation and fusion
CN112419212B (en) Infrared and visible light image fusion method based on side window guide filtering
Moghimi et al. A joint adaptive evolutionary model towards optical image contrast enhancement and geometrical reconstruction approach in underwater remote sensing
CN113962904B (en) Method for filtering and denoising hyperspectral image
CN115330874B (en) Monocular depth estimation method based on superpixel processing shielding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant