CN112884690B - Infrared and visible light image fusion method based on three-scale decomposition

Infrared and visible light image fusion method based on three-scale decomposition

Info

Publication number
CN112884690B
CN112884690B
Authority
CN
China
Prior art keywords
image
visible light
infrared
vis
representing
Prior art date
Legal status
Active
Application number
CN202110220561.2A
Other languages
Chinese (zh)
Other versions
CN112884690A (en)
Inventor
任龙 (Ren Long)
张海峰 (Zhang Haifeng)
单福强 (Shan Fuqiang)
张辉 (Zhang Hui)
冯佳 (Feng Jia)
Current Assignee
Xi'an Institute of Optics and Precision Mechanics of CAS
Original Assignee
Xi'an Institute of Optics and Precision Mechanics of CAS
Priority date
Filing date
Publication date
Application filed by Xi'an Institute of Optics and Precision Mechanics of CAS
Priority to CN202110220561.2A
Publication of CN112884690A
Application granted
Publication of CN112884690B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10048: Infrared image

Abstract

The invention relates to an infrared and visible light image fusion method, in particular to an infrared and visible light image fusion method based on three-scale decomposition. The invention aims to solve the technical problem that existing infrared and visible light image fusion methods are difficult to make both real-time and high in fusion quality. The method decomposes the visible light image and the infrared image at three scales into a background brightness layer, a salient feature layer and a detail layer, fuses the different decomposition layers with different fusion methods, then adds the fused decomposition layers and optimizes the result to obtain the target fusion image. The steps are simple and time-saving, which ensures the real-time performance of infrared and visible light image fusion; background brightness information is retained, which improves the quality of the fused image, so that both the fusion quality and the fusion effect are good.

Description

Infrared and visible light image fusion method based on three-scale decomposition
Technical Field
The invention relates to an infrared and visible light image fusion method, in particular to an infrared and visible light image fusion method based on three-scale decomposition.
Background
At present, image fusion methods based on scale decomposition mostly use either multi-scale decomposition or two-scale decomposition. Multi-scale decomposition generally decomposes the infrared and visible light images each into a base layer and several groups of detail layers, with three detail layers per group; although this can achieve a good fusion effect, the steps are complicated and time-consuming, so the real-time requirement of image fusion is difficult to meet. Two-scale decomposition methods rely on an edge-preserving filter to decompose the infrared image and the visible image each into a base layer and a detail layer; this is fast, but background brightness information is difficult to retain, so the visual effect of the final fused image is poor.
Disclosure of Invention
The invention aims to solve the technical problem that the existing infrared and visible light image fusion method is difficult to simultaneously meet real-time performance and better fusion effect, and provides an infrared and visible light image fusion method based on three-scale decomposition.
In order to solve the technical problems, the technical solution provided by the invention is as follows:
An infrared and visible light image fusion method based on three-scale decomposition is characterized by comprising the following steps:
1) Decomposing the infrared image and the visible light image in three scales
Filtering the infrared image and the visible light image respectively by using a Gaussian filter to obtain an infrared background brightness layer image and a visible light background brightness layer image;
respectively calculating the infrared image and the visible light image to obtain an infrared image fusion weight and a visible light image fusion weight, multiplying the infrared image fusion weight by the infrared image to obtain an infrared image salient feature layer image, and multiplying the visible light image fusion weight by the visible light image to obtain a visible light image salient feature layer image;
respectively carrying out guide filtering on the infrared image and the visible light image by using a guide filter, and respectively subtracting the infrared background brightness layer image and the visible light background brightness layer image from the infrared image and the visible light image after the guide filtering to obtain an infrared detail layer image and a visible light detail layer image;
2) Carrying out image fusion on the images obtained by decomposition
2.1) Fusing the infrared background brightness layer image with the visible light background brightness layer image to obtain a fused background brightness layer image; fusing the infrared salient feature layer image with the visible light salient feature layer image to obtain a fused salient feature layer image; fusing the infrared detail layer image with the visible light detail layer image to obtain a fused detail layer image;
2.2) Adding the fused background brightness layer image, the fused salient feature layer image and the fused detail layer image to obtain an initial fused image;
2.3) Optimizing the initial fused image with an optimization model and a gradient descent method to obtain a final fused image.
Further, in step 1), the filtering formula of the Gaussian filter is as follows:
G_IR = Gaussian(P_IR, 7, 0, 0.5)
G_VIS = Gaussian(P_VIS, 7, 0, 0.5);
wherein:
G_IR represents the infrared background luminance layer image;
G_VIS represents the visible light background luminance layer image;
P_IR represents the infrared image;
P_VIS represents the visible light image;
7 is the filter size;
0 and 0.5 are the mean and variance of the Gaussian filter, respectively.
Further, the calculation formulas for the infrared image fusion weight and the visible light image fusion weight in step 1) are as follows:
[formula for W_IR, shown only as an image in the original publication]
[formula for W_VIS, shown only as an image in the original publication]
wherein:
W_IR represents the infrared image fusion weight;
W_VIS represents the visible light image fusion weight;
P̄_IR represents the mean of the infrared image;
P̄_VIS represents the mean of the visible light image;
the infrared image salient feature layer image and the visible light image salient feature layer image are obtained with the following formulas:
S_IR = P_IR .* W_IR
S_VIS = P_VIS .* W_VIS
where .* denotes pixel-wise multiplication;
S_IR represents the infrared image salient feature layer image;
S_VIS represents the visible light image salient feature layer image.
Further, the calculation formulas used in step 1) are as follows:
the filtering formula of the guided filter:
q_i = a_k * I_i + b_k
[cost function E(a_k, b_k), shown only as an image in the original publication]
wherein:
I_i represents the input image, namely the infrared image P_IR or the visible light image P_VIS;
q_i represents the output image, namely the guided-filtered infrared image E_IR or the guided-filtered visible light image E_VIS;
a_k and b_k are the filter coefficients, obtained by minimizing E(a_k, b_k);
ω_k is the filtering window;
ε is the regularization parameter;
the infrared detail layer image and the visible light detail layer image are obtained with the following formulas:
D_IR = E_IR - G_IR
D_VIS = E_VIS - G_VIS
wherein:
D_IR represents the infrared image detail layer image;
D_VIS represents the visible light image detail layer image.
Further, in step 2.1), the calculation formula of the fused background brightness layer image G_F is as follows:
[fusion formula for G_F, shown only as an image in the original publication]
the calculation formula of the fused salient feature layer image S_F is as follows:
S_F = S_IR + S_VIS
the calculation formula of the fused detail layer image D_F is as follows:
D_F = max(D_IR, D_VIS)
where max() takes the larger of the two values at each pixel as the output.
Further, the calculation formula used in step 2.2) is as follows:
F = G_F + S_F + D_F
where F denotes the initial fused image.
Further, in step 2.3), the optimization model is as follows:
[optimization model for F*, shown only as an image in the original publication]
wherein:
F* represents the optimized final fused image;
∇ is the gradient operator; ε_1 is a balance factor, taken as 0.8.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides an infrared and visible light image fusion method based on three-scale decomposition, which relates to key technologies and optimization related to infrared and visible light image fusion.
Drawings
FIG. 1 is a schematic diagram of an infrared and visible light image fusion method based on three-scale decomposition according to the present invention;
FIG. 2 is a fused background luminance layer image according to an embodiment of the present invention;
FIG. 3 is a fused salient feature layer image of an embodiment of the present invention;
FIG. 4 is a fused detail layer image of an embodiment of the invention;
FIG. 5 is an initial fused image of an embodiment of the present invention;
FIG. 6 is a final fused image of an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the figures and examples.
According to the infrared and visible light image fusion method based on three-scale decomposition, the visible light image and the infrared image are subjected to three-scale decomposition, different fusion methods are adopted according to different decomposition layers for fusion, and then the fused decomposition layers are added and optimized to finally obtain the target fusion image. As shown in fig. 1, the method specifically comprises the following steps:
1) Decomposing the infrared image and the visible light image in three scales
Filtering the infrared image and the visible light image respectively by using a Gaussian filter to obtain an infrared background brightness layer image and a visible light background brightness layer image;
the filtering formula of the Gaussian filter is as follows:
G IR =Gaussian(P IR ,7,0,0.5)
G VIS =Gaussian(P VIS ,7,0,0.5);
wherein the content of the first and second substances,
G IR representing an infrared background luminance layer image;
G VIS representing a visible light background luminance layer image;
P IR representing an infrared image;
P VIS representing a visible light image;
7 is the filtering size, and the larger the size is, the more fuzzy the filtered image is;
0 and 0.5 are mean and variance of gaussian filtering, respectively;
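As a concrete illustration of this sub-step, the 7-tap Gaussian filtering can be sketched in NumPy as a separable convolution. The text gives 0.5 as the variance of the filter; it is treated below as the standard deviation (a common reading of such parameter lists), and edge-replicating border handling is an assumption, since the text does not specify it:

```python
import numpy as np

def gaussian_kernel1d(size: int = 7, sigma: float = 0.5) -> np.ndarray:
    # Discrete Gaussian samples on [-size//2, size//2], normalized to sum to 1.
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def background_layer(img: np.ndarray, size: int = 7, sigma: float = 0.5) -> np.ndarray:
    # Separable Gaussian filtering (rows, then columns) with edge replication,
    # producing the background luminance layer G from the input image P.
    k = gaussian_kernel1d(size, sigma)
    pad = size // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, tmp)
```

With sigma = 0.5 the kernel is sharply peaked, so the background layer stays close to the input while still suppressing pixel-level variation.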
respectively calculating the infrared image and the visible light image to obtain an infrared image fusion weight and a visible light image fusion weight, multiplying the infrared image fusion weight by the infrared image to obtain an infrared image salient feature layer image, and multiplying the visible light image fusion weight by the visible light image to obtain a visible light image salient feature layer image;
The calculation formulas for the infrared image fusion weight and the visible light image fusion weight are as follows:
[formula for W_IR, shown only as an image in the original publication]
[formula for W_VIS, shown only as an image in the original publication]
wherein:
W_IR represents the infrared image fusion weight;
W_VIS represents the visible light image fusion weight;
P̄_IR represents the mean of the infrared image;
P̄_VIS represents the mean of the visible light image;
the infrared image salient feature layer image and the visible light image salient feature layer image are obtained with the following formulas:
S_IR = P_IR .* W_IR
S_VIS = P_VIS .* W_VIS
where .* denotes pixel-wise multiplication;
S_IR represents the infrared image salient feature layer image;
S_VIS represents the visible light image salient feature layer image;
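The published text renders the two weight formulas only as images, so they cannot be reproduced here; the sketch below therefore substitutes a simple illustrative rule in which the weights are derived from the two image means P̄_IR and P̄_VIS and sum to one. That rule is a hypothetical stand-in; only the pixel-wise products S = P .* W follow the text exactly:

```python
import numpy as np

def salient_feature_layers(p_ir: np.ndarray, p_vis: np.ndarray):
    # Illustrative stand-in for the weight formulas (given only as images in
    # the source): a mean-ratio split with w_ir + w_vis = 1.
    m_ir, m_vis = p_ir.mean(), p_vis.mean()
    w_ir = m_ir / (m_ir + m_vis)
    w_vis = 1.0 - w_ir
    # S_IR = P_IR .* W_IR and S_VIS = P_VIS .* W_VIS (pixel-wise products).
    return p_ir * w_ir, p_vis * w_vis
```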
respectively carrying out guide filtering on the infrared image and the visible light image by using a guide filter, and respectively subtracting the infrared background brightness layer image and the visible light background brightness layer image from the infrared image and the visible light image after the guide filtering to obtain an infrared detail layer image and a visible light detail layer image;
The filtering formula of the guided filter:
q_i = a_k * I_i + b_k
[cost function E(a_k, b_k), shown only as an image in the original publication]
wherein:
I_i represents the input image, namely the infrared image P_IR or the visible light image P_VIS;
q_i represents the output image, namely the guided-filtered infrared image E_IR or the guided-filtered visible light image E_VIS;
a_k and b_k are the filter coefficients, obtained by minimizing E(a_k, b_k);
ω_k is the filtering window;
ε is the regularization parameter;
the infrared image detail layer image and the visible light image detail layer image are obtained with the following formulas:
D_IR = E_IR - G_IR
D_VIS = E_VIS - G_VIS
wherein:
D_IR represents the infrared image detail layer image;
D_VIS represents the visible light image detail layer image;
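A self-contained sketch of this decomposition step: a guided filter in the usual box-filter closed form (with the image as its own guide, since no separate guide image is mentioned), followed by subtraction of the background luminance layer. The window radius r and regularization ε are not specified in the text, so the values below are assumptions:

```python
import numpy as np

def box_mean(img: np.ndarray, r: int) -> np.ndarray:
    # Mean over a (2r+1)x(2r+1) window via an integral image,
    # after edge-replicating padding.
    size = 2 * r + 1
    padded = np.pad(img, r, mode="edge")
    c = np.pad(padded.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    h, w = img.shape
    s = (c[size:size + h, size:size + w] - c[:h, size:size + w]
         - c[size:size + h, :w] + c[:h, :w])
    return s / float(size * size)

def guided_filter(guide: np.ndarray, src: np.ndarray,
                  r: int = 4, eps: float = 0.04) -> np.ndarray:
    # q_i = a_k * I_i + b_k, with a_k, b_k minimizing the windowed fit error
    # E(a_k, b_k); this is the standard closed-form solution.
    mean_i, mean_p = box_mean(guide, r), box_mean(src, r)
    cov_ip = box_mean(guide * src, r) - mean_i * mean_p
    var_i = box_mean(guide * guide, r) - mean_i * mean_i
    a = cov_ip / (var_i + eps)
    b = mean_p - a * mean_i
    return box_mean(a, r) * guide + box_mean(b, r)

def detail_layer(p: np.ndarray, g_background: np.ndarray) -> np.ndarray:
    # D = E - G: guided-filtered image minus the background luminance layer.
    e = guided_filter(p, p)  # self-guided, as when filtering P_IR or P_VIS
    return e - g_background
```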
2) Carrying out image fusion on the images obtained by decomposition
2.1) Fusing the infrared background luminance layer image with the visible light background luminance layer image to obtain a fused background luminance layer image; fusing the infrared salient feature layer image with the visible light salient feature layer image to obtain a fused salient feature layer image; fusing the infrared detail layer image with the visible light detail layer image to obtain a fused detail layer image;
there are several kinds of selectable fusion methods for each decomposition layer, some of which may adopt the existing method, and one fusion method is given for each decomposition layer as follows:
The calculation formula of the fused background brightness layer image G_F is as follows:
[fusion formula for G_F, shown only as an image in the original publication]
the calculation formula of the fused salient feature layer image S_F is as follows:
S_F = S_IR + S_VIS
the calculation formula of the fused detail layer image D_F is as follows:
D_F = max(D_IR, D_VIS)
where max() takes the larger of the two values at each pixel as the output;
FIG. 2 is a fused background luminance layer image; FIG. 3 is a fused salient feature layer image; FIG. 4 is a fused detail layer image;
2.2) Add the fused background brightness layer image, the fused salient feature layer image and the fused detail layer image to obtain an initial fused image. Fig. 5 shows the initial fused image; because overexposure in part of the area causes loss of detail there, the initial fused image needs to be optimized;
The calculation formula used is as follows:
F = G_F + S_F + D_F
wherein F represents the initial fused image;
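Steps 2.1) and 2.2) combine the three fused layers. The salient-feature and detail rules below follow the stated formulas; the background-brightness fusion formula appears only as an image in the published text, so a plain average is assumed here purely for illustration:

```python
import numpy as np

def initial_fusion(g_ir, g_vis, s_ir, s_vis, d_ir, d_vis):
    # G_F: assumed average of the two background brightness layers
    # (the actual formula is shown only as an image in the source).
    g_f = 0.5 * (g_ir + g_vis)
    s_f = s_ir + s_vis             # S_F = S_IR + S_VIS
    d_f = np.maximum(d_ir, d_vis)  # D_F = max(D_IR, D_VIS), pixel-wise
    return g_f + s_f + d_f         # F = G_F + S_F + D_F
```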
2.3) The initial fused image is optimized with an optimization model and a gradient descent method; the optimization model is as follows:
[optimization model for F*, shown only as an image in the original publication]
wherein:
F* represents the optimized final fused image;
∇ is the gradient operator; ε_1 is a balance factor, taken as 0.8;
The initial fused image is optimized to obtain the final fused image. Fig. 6 shows the final fused image; it can be seen that infrared and visible light image fusion performed with the method of the present invention has good fusion quality and meets the design requirements.
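The optimization model itself is reproduced only as an image in the published text; what is stated is that it involves a gradient operator and a balance factor ε_1 = 0.8 and is solved by gradient descent. A common model consistent with that description, minimizing a data-fidelity term plus an ε_1-weighted gradient-smoothness term, is sketched below as an assumption, not as the patented model:

```python
import numpy as np

def refine(f_init: np.ndarray, eps1: float = 0.8,
           step: float = 0.05, iters: int = 100) -> np.ndarray:
    # Assumed objective: F* = argmin ||F* - F||^2 + eps1 * ||grad F*||^2,
    # minimized by explicit gradient descent. The smoothness term contributes
    # -2 * eps1 * Laplacian(F*), approximated with a 4-neighbour stencil
    # (periodic boundaries via np.roll, a simplification).
    f = f_init.astype(float).copy()
    for _ in range(iters):
        lap = (np.roll(f, 1, axis=0) + np.roll(f, -1, axis=0)
               + np.roll(f, 1, axis=1) + np.roll(f, -1, axis=1) - 4.0 * f)
        grad = 2.0 * (f - f_init) - 2.0 * eps1 * lap
        f -= step * grad
    return f
```

This kind of refinement pulls overexposed outliers toward their neighbourhood while the data term keeps the result close to the initial fusion.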
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Those skilled in the art may modify the specific technical solutions described in the foregoing embodiments or substitute some of their technical features; such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions protected by the present invention.

Claims (5)

1. An infrared and visible light image fusion method based on three-scale decomposition is characterized by comprising the following steps:
1) Decomposing the infrared image and the visible light image in three scales
Filtering the infrared image and the visible light image respectively by using a Gaussian filter to obtain an infrared background brightness layer image and a visible light background brightness layer image;
respectively calculating the infrared image and the visible light image to obtain an infrared image fusion weight and a visible light image fusion weight, multiplying the infrared image fusion weight by the infrared image to obtain an infrared image salient feature layer image, and multiplying the visible light image fusion weight by the visible light image to obtain a visible light image salient feature layer image;
respectively carrying out guide filtering on the infrared image and the visible light image by using a guide filter, and respectively subtracting the infrared background brightness layer image and the visible light background brightness layer image from the infrared image and the visible light image after the guide filtering to obtain an infrared detail layer image and a visible light detail layer image;
the calculation formulas for the infrared image fusion weight and the visible light image fusion weight are as follows:
[formula for W_IR, shown only as an image in the original publication]
[formula for W_VIS, shown only as an image in the original publication]
wherein:
W_IR represents the infrared image fusion weight;
W_VIS represents the visible light image fusion weight;
P̄_IR represents the mean of the infrared image;
P̄_VIS represents the mean of the visible light image;
P_IR represents the infrared image;
P_VIS represents the visible light image;
the infrared image salient feature layer image and the visible light image salient feature layer image are obtained with the following formulas:
S_IR = P_IR .* W_IR
S_VIS = P_VIS .* W_VIS
where .* denotes pixel-wise multiplication;
S_IR represents the infrared image salient feature layer image;
S_VIS represents the visible light image salient feature layer image;
2) Carrying out image fusion on the images obtained by decomposition
2.1) Fusing the infrared background luminance layer image with the visible light background luminance layer image to obtain a fused background luminance layer image; fusing the infrared salient feature layer image with the visible light salient feature layer image to obtain a fused salient feature layer image; fusing the infrared detail layer image with the visible light detail layer image to obtain a fused detail layer image;
2.2) Adding the fused background luminance layer image, the fused salient feature layer image and the fused detail layer image to obtain an initial fused image;
2.3) Optimizing the initial fused image with an optimization model and a gradient descent method to obtain a final fused image;
the optimization model is as follows:
[optimization model for F*, shown only as an image in the original publication]
wherein:
F* represents the optimized final fused image;
∇ is the gradient operator; ε_1 is a balance factor, taken as 0.8;
F denotes the initial fused image.
2. The infrared and visible light image fusion method based on three-scale decomposition according to claim 1, characterized in that:
in step 1), the filtering formula of the Gaussian filter is as follows:
G_IR = Gaussian(P_IR, 7, 0, 0.5)
G_VIS = Gaussian(P_VIS, 7, 0, 0.5);
wherein:
G_IR represents the infrared background luminance layer image;
G_VIS represents the visible light background luminance layer image;
P_IR represents the infrared image;
P_VIS represents the visible light image;
7 is the filter size;
0 and 0.5 are the mean and variance of the Gaussian filter, respectively.
3. The infrared and visible light image fusion method based on three-scale decomposition according to claim 2, characterized in that:
the calculation formulas used in step 1) are as follows:
the filtering formula of the guided filter:
q_i = a_k * I_i + b_k
[cost function E(a_k, b_k), shown only as an image in the original publication]
wherein:
I_i represents the input image, namely the infrared image P_IR or the visible light image P_VIS;
q_i represents the output image, namely the guided-filtered infrared image E_IR or the guided-filtered visible light image E_VIS;
a_k and b_k are the filter coefficients, obtained by minimizing E(a_k, b_k);
ω_k is the filtering window;
ε is the regularization parameter;
the infrared image detail layer image and the visible light image detail layer image are obtained with the following formulas:
D_IR = E_IR - G_IR
D_VIS = E_VIS - G_VIS
wherein:
D_IR represents the infrared image detail layer image;
D_VIS represents the visible light image detail layer image.
4. The infrared and visible light image fusion method based on three-scale decomposition according to claim 3, characterized in that:
in step 2.1), the calculation formula of the fused background brightness layer image G_F is as follows:
[fusion formula for G_F, shown only as an image in the original publication]
the calculation formula of the fused salient feature layer image S_F is as follows:
S_F = S_IR + S_VIS
the calculation formula of the fused detail layer image D_F is as follows:
D_F = max(D_IR, D_VIS)
where max() takes the larger of the two values at each pixel as the output.
5. The infrared and visible light image fusion method based on three-scale decomposition according to claim 4, characterized in that:
the calculation formula used in step 2.2) is as follows:
F = G_F + S_F + D_F
where F denotes the initial fused image.
CN202110220561.2A 2021-02-26 2021-02-26 Infrared and visible light image fusion method based on three-scale decomposition Active CN112884690B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110220561.2A CN112884690B (en) 2021-02-26 2021-02-26 Infrared and visible light image fusion method based on three-scale decomposition


Publications (2)

Publication Number Publication Date
CN112884690A (en) 2021-06-01
CN112884690B (en) 2023-01-06

Family

ID=76054895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110220561.2A Active CN112884690B (en) 2021-02-26 2021-02-26 Infrared and visible light image fusion method based on three-scale decomposition

Country Status (1)

Country Link
CN (1) CN112884690B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419312B (en) * 2022-03-31 2022-07-22 南京智谱科技有限公司 Image processing method and device, computing equipment and computer readable storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
WO2017020595A1 (en) * 2015-08-05 2017-02-09 武汉高德红外股份有限公司 Visible light image and infrared image fusion processing system and fusion method
CN109242888A (en) * 2018-09-03 2019-01-18 中国科学院光电技术研究所 A kind of infrared and visible light image fusion method of combination saliency and non-down sampling contourlet transform
CN109509164A (en) * 2018-09-28 2019-03-22 洛阳师范学院 A kind of Multisensor Image Fusion Scheme and system based on GDGF
CN109509163A (en) * 2018-09-28 2019-03-22 洛阳师范学院 A kind of multi-focus image fusing method and system based on FGF
CN109614976A (en) * 2018-11-02 2019-04-12 中国航空工业集团公司洛阳电光设备研究所 A kind of heterologous image interfusion method based on Gabor characteristic
CN110490914A (en) * 2019-07-29 2019-11-22 广东工业大学 It is a kind of based on brightness adaptively and conspicuousness detect image interfusion method
AU2020100178A4 (en) * 2020-02-04 2020-03-19 Huang, Shuying DR Multiple decision maps based infrared and visible image fusion
CN111223069A (en) * 2020-01-14 2020-06-02 天津工业大学 Image fusion method and system


Non-Patent Citations (5)

Title
Image Fusion With Guided Filtering; Shutao Li; IEEE Transactions on Image Processing; 2013-07 *
Infrared and visible image fusion using total variation model; Yong Ma et al.; Neurocomputing; 2016-08-19 *
An infrared and visible light image fusion method based on multi-scale low-rank decomposition; Chen Chaoqi et al.; Acta Optica Sinica; 2020-06-10 (No. 11) *
Multi-scale edge detection for true-color images with detail enhancement and denoising; Xiao Feng et al.; Computer Engineering and Applications; 2011 *
Portrait skin beautification using multiple feature masks; Lu Xiaohui et al.; Journal of Zhejiang University (Engineering Science); 2017-09 (No. 12) *

Also Published As

Publication number Publication date
CN112884690A (en) 2021-06-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant