CN110084774B - Method for minimizing fusion image by enhanced gradient transfer and total variation


Info

Publication number
CN110084774B
CN110084774B
Authority
CN
China
Prior art keywords
image
fusion
term
fused
infrared
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910288177.9A
Other languages
Chinese (zh)
Other versions
CN110084774A (en)
Inventor
罗晓清
张战成
尹云飞
张宝成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangnan University
Original Assignee
Jiangnan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangnan University
Priority to CN201910288177.9A
Publication of CN110084774A
Application granted
Publication of CN110084774B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10048 Infrared image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for minimizing a fused image by enhanced gradient transfer and total variation, and belongs to the field of image fusion. The method mainly solves the problem that texture information of the target and the background is not detailed when an infrared image and a visible image are fused, by constraining the fused image to have pixel intensities similar to the infrared image and the visible image, and gradients similar to the infrared image and the visible image. We convert the fusion problem into an L₁-TV minimization problem and use three parameters, m, λ₁ and λ₂, to control the relationship between the data fidelity term and the regularization term, so as to maintain the thermal radiation and appearance information of the source images simultaneously. Compared with traditional fusion methods, the invention can fully integrate the target and texture detail information of infrared and visible light images, effectively protect image details, improve the visual effect, and greatly improve the quality of the fused image.

Description

Method for minimizing fusion image by enhanced gradient transfer and total variation
Technical Field
The invention belongs to the field of image fusion and relates to a method for minimizing the fusion of infrared and visible light images by enhanced gradient transfer and total variation. It is a fusion method in the technical field of infrared and visible light image processing and is widely applied in commercial and military fields.
Background
As a research branch and focus in the field of image fusion, and with the rapid development of thermal radiation imaging technology, infrared and visible light image fusion has become a research hotspot at home and abroad. The infrared image can accurately provide information such as the position of the target object, while the visible light image can accurately provide texture details and background information. Infrared and visible light image fusion can therefore effectively integrate the target feature information of the infrared image with the scene detail information of the visible light image to obtain a fused image with more comprehensive information. Infrared and visible light imaging sensors provide complementary information, so the fused image contains more comprehensive and richer information, better conforms to the visual characteristics of humans or machines, and is more conducive to further image analysis and processing and to automatic target recognition. In pixel-level image fusion, the first problem to be solved is to determine the most important information in the source images and transfer it into the fused image with as little change as possible, in particular distortion or loss. To address this problem, many approaches have been proposed over the past few decades, including pyramid-based methods, wavelet transforms, curvelet transforms, multi-resolution singular value decomposition, guided filtering, multi-focus methods, sparse representation, and the like. Averaging the source images pixel by pixel is the simplest strategy; however, this direct approach produces many undesirable effects, such as reduced contrast. To solve this problem, methods based on multi-scale transforms have been proposed, which involve three basic steps: first, decompose each source image into a multi-scale representation with low- and high-frequency information; then, fuse the multi-scale representations according to some fusion rule; finally, apply the inverse transform to the composite multi-scale coefficients to construct the fused image. Methods based on multi-scale transforms can provide better performance because they are consistent with the human visual system, and real-world objects usually consist of structures at different scales. Examples of such methods include Laplacian pyramids, discrete wavelet transforms, non-subsampled contourlet transforms, and the like. Multi-scale methods have met with great success in many cases; however, they use the same representation for different source images and attempt to preserve the same salient features, such as edges and lines, in the source images. For infrared and visible image fusion, the thermal radiation information in the infrared image is characterized by pixel intensities, and the target typically has greater intensity than the background and can thus be easily detected; texture information in the visible image, on the other hand, is mainly characterized by gradients, and gradients with large magnitude (e.g. edges) provide detailed information about the scene. It is therefore inappropriate to use the same representation for both types of images during fusion. Instead, to preserve as much important information as possible, the fused image should retain the main intensity distribution of the infrared image and the gradient variation of the visible image.
The original variational model for additive noise removal is divided into a regularization term and a data fidelity term: the regularization term suppresses noise, while the data fidelity term keeps the denoised image similar to the observed image and preserves the edge features of the image. Building on this, fusion of infrared and visible images by minimizing gradient transfer and total variation, abbreviated Gradient Transfer Fusion (GTF), was proposed. It represents fusion as an optimization problem whose objective function consists of a data fidelity term and a regularization term. The data fidelity term constrains the fused image to have pixel intensities similar to the given infrared image, while the regularization term ensures that the gradient distribution of the visible image can be transferred into the fused image; the L₁ norm is used to promote sparsity of the gradient, and the optimization problem can then be solved by existing L₁-TV minimization techniques. Although GTF captures background information well, the target is not sufficiently salient and the fused image has low contrast. To this end, the present invention proposes an improved algorithm that fuses infrared and visible images by enhanced gradient transfer and total variation minimization. The method maintains the gradients and pixel intensities of both the infrared and visible light images simultaneously, and introduces three parameters to properly adjust the relationship between the regularization term and the fidelity term, thereby obtaining a better fusion effect.
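For reference, the GTF objective as formulated in the gradient transfer fusion literature (with the L₁ choice of norms) can be written as:

$$\min_x \; \|x-u\|_1 + \lambda\,\|\nabla x - \nabla v\|_1,$$

where u is the infrared image, v the visible image, and the single parameter λ trades off intensity fidelity against gradient transfer; the present invention generalizes this trade-off with the three parameters m, λ₁ and λ₂ described below.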
To improve the performance of the fused image, the choice of fusion rule is also important. The invention selects a fusion rule based on gradient transfer and total variation minimization, which maintains the gradients and pixel intensities of the infrared and visible light images simultaneously, so that the target in the fused image is more accurate and the background details are clear, improving the quality of the fused image.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a method for minimizing the fusion of infrared and visible light images by enhanced gradient transfer and total variation. It solves the problems that fused images obtained by existing infrared and visible light image fusion methods lose target details and have unclear background textures; it fully integrates the structural and functional information of infrared and visible light images of different modalities, effectively protects image details, enhances image details, textures and edge contours, improves the visual effect of the fused image, and improves the quality of the fused image.
The technical scheme adopted for solving the technical problems is as follows:
a method of enhancing gradient transfer and total variation to minimize fusion of images, comprising the steps of:
(1) Carrying out gradient transfer transformation on the infrared image and the visible light image to be fused to obtain corresponding gradient and pixel intensity;
(2) Establishing a data fidelity term and a regularization term, wherein the data fidelity term is used for constraining the fusion image to have pixel intensities similar to those of the given infrared image and visible light image, and the regularization term is used for ensuring that gradient distribution in the infrared image and the visible light image can be transferred into the fusion image;
(2.1) data Fidelity term
$$\varepsilon_1(x) = \|x-u\|_p^p + m\,\|x-v\|_p^p \tag{I}$$
wherein x represents the fused image; ε₁(x) represents the fidelity term of the fused image; p represents the norm of the fidelity term; u represents the infrared image; v represents the visible light image; and the x, u and v images are all of size a×b. Infrared images are typically characterized by pixel intensities, as is the difference in pixel intensity between the target and the background in the visible image. Because of this pixel intensity difference between target and background, the target is usually clearly visible in the infrared image, and the texture of the target is also quite clear in some visible images. The data fidelity term is therefore characterized by pixel intensity, so that pixel intensity information is not lost.
(2.2) regularization term
$$\varepsilon_2(x) = \lambda_1\,\|\nabla x - \nabla v\|_q^q + \lambda_2\,\|\nabla x - \nabla u\|_q^q \tag{II}$$
wherein ε₂(x) represents the regularization term of the fused image; q represents the norm of the regularization term; ∇x represents the gradient of the fused image; ∇v represents the gradient of the visible light image; and ∇u represents the gradient of the infrared image. As seen in the fidelity term, the pixel intensity distribution of the target and the background is preserved there, while the detailed appearance information of the target and the scene is essentially characterized by the image gradient; the regularization term is therefore characterized by pixel gradients to enhance the complete visual depiction of the target and the background, so that the fused image carries more detailed appearance information.
(3) The fusion problem is expressed as an initial objective function in conjunction with formulas (I) and (II):
$$\varepsilon(x) = \|x-u\|_p^p + m\,\|x-v\|_p^p + \lambda_1\,\|\nabla x - \nabla v\|_q^q + \lambda_2\,\|\nabla x - \nabla u\|_q^q \tag{III}$$
wherein ε(x) represents the objective function, and m, λ₁ and λ₂ are parameters used to adjust the trade-off between x and u, v. The first and second terms constrain the fused image x to have pixel intensities similar to the infrared image u and the visible light image v, while the third and fourth terms require the fused image x to have gradients similar to the visible light image v and the infrared image u, respectively; m, λ₁ and λ₂ weight these requirements. The objective function (III) makes the fused image more visually preferable: the detailed textures of its appearance are clear, the target is highlighted without losing detail information, and background detail information is retained and enhanced while the target texture remains clear and prominent.
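As an illustration only (not part of the patent text), a minimal numpy sketch of evaluating the objective (III) with p = q = 1 might look as follows. It uses forward differences with the r(i)/b(i) boundary convention described in step (4) below; the anisotropic form of the per-pixel gradient magnitude is an implementation choice here, not something the patent specifies:

```python
import numpy as np

def grad(z):
    """Forward differences; r(i) = b(i) = i on the last column/row,
    so the difference there is zero."""
    gx = np.zeros_like(z, dtype=float)
    gy = np.zeros_like(z, dtype=float)
    gx[:, :-1] = z[:, 1:] - z[:, :-1]   # horizontal first-order difference
    gy[:-1, :] = z[1:, :] - z[:-1, :]   # vertical first-order difference
    return gx, gy

def objective(x, u, v, m, lam1, lam2):
    """Objective (III) with p = q = 1."""
    gxx, gxy = grad(x)
    gux, guy = grad(u)
    gvx, gvy = grad(v)
    fidelity = np.abs(x - u).sum() + m * np.abs(x - v).sum()
    regular = (lam1 * (np.abs(gxx - gvx) + np.abs(gxy - gvy)).sum()
               + lam2 * (np.abs(gxx - gux) + np.abs(gxy - guy)).sum())
    return fidelity + regular
```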
(4) Optimization using total variation minimization
Regarding the p, q norms in formula (III): since we want to preserve the thermal radiation information of the infrared image and the relatively high pixel intensities in the visible image, we set p = 1; and to promote sparsity of the gradient, the L₁ norm is also used to minimize the gradient differences, i.e. q = 1. For an image of size a×b, we use y ∈ R^{ab×1}, a column-vector form of its pixel intensities with gray values ranging from 0 to 255. Let y = x − v, i.e. x = y + v; the optimization (III) can then be rewritten as:
$$T(y) = \|y-(u-v)\|_1 + m\,\|y\|_1 + \lambda_1 \sum_i \left|\nabla_i y\right| + \lambda_2 \sum_i \left|\nabla_i y - \nabla_i (u-v)\right| \tag{IV}$$
where y denotes the difference between the fused image x and the visible light image v, and T(y) denotes the energy functional to be minimized. For each pixel i, ∇_i y = (∇_i^1 y, ∇_i^2 y) ∈ R² denotes the image gradient at pixel i, with ∇_i^1 y and ∇_i^2 y corresponding to the horizontal and vertical first-order differences respectively, i.e. ∇_i^1 y = y_{r(i)} − y_i and ∇_i^2 y = y_{b(i)} − y_i, where r(i) and b(i) denote the nearest neighbors to the right of and below pixel i. Furthermore, if pixel i is located in the last row or column, both r(i) and b(i) are set to i. The objective function (IV) is convex and therefore has a globally optimal solution; the algorithm adjusts the values of m, λ₁ and λ₂ so that the fidelity term and the regularization term reach appropriate points, allowing the fused image to retain the thermal radiation information and the clear appearance texture information of the two source images.
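To make the vectorized column-vector form concrete, the sketch below (illustrative, not from the patent) builds sparse difference operators matching the r(i)/b(i) convention, so that D1 @ y and D2 @ y hold the horizontal and vertical first-order differences of y ∈ R^{ab×1}; the gradient terms of (IV) then read ‖D1 y‖₁ + ‖D2 y‖₁ in the anisotropic form:

```python
import numpy as np
from scipy import sparse

def difference_operators(a, b):
    """Sparse D1, D2 with (D1 y)_i = y_{r(i)} - y_i and (D2 y)_i = y_{b(i)} - y_i
    for an a-by-b image flattened row-major; on the last column/row
    r(i) = b(i) = i, so the corresponding rows are zero."""
    n = a * b
    idx = np.arange(n).reshape(a, b)
    eye = sparse.identity(n, format="csr")

    right = idx.copy()
    right[:, :-1] = idx[:, 1:]          # r(i); the last column maps to itself
    P1 = sparse.csr_matrix((np.ones(n), (np.arange(n), right.ravel())),
                           shape=(n, n))

    below = idx.copy()
    below[:-1, :] = idx[1:, :]          # b(i); the last row maps to itself
    P2 = sparse.csr_matrix((np.ones(n), (np.arange(n), below.ravel())),
                           shape=(n, n))

    return P1 - eye, P2 - eye
```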
(5) Solve T(y) using a generalized total-variation functional minimization method: problem (IV) is a standard L₁-TV minimization problem. The idea of the solution is to decompose formula (IV) into two parts, a fidelity term and a regularization term, solve the problem step by step, and finally combine the two terms to obtain the following formula:
$$T\left(y^{(k+1)}\right) = T_f\left(y^{(k+1)}\right) + T_{r_1}\left(y^{(k+1)}\right) + T_{r_2}\left(y^{(k+1)}\right) + C\left(y^{(k)}\right) \tag{V}$$

where the first term T_f is the fidelity term and the second and third terms T_{r1} and T_{r2} are regularization terms; C(y^{(k)}) is a constant; T_f and T_{r1}, T_{r2} denote the iterative functions of the fidelity term and the regularization terms, respectively; and y^{(k)} denotes the result of iterating y k times.
(6) For formula (V), T(y) must be computed by existing L₁-TV minimization optimization iterations, i.e.: loop over k = 0, 1, …; when T^{(k)}(y) meets the convergence condition, end the iteration; otherwise set k = k + 1 and return to (5).
(7) Determine the globally optimal solution x* of the fused image: x* = y* + v, where y* is the minimizer of T(y), giving the final fused image x*.
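The patent defers to existing L₁-TV minimization techniques in steps (5) and (6) without fixing a particular algorithm. Purely as an illustration, the sketch below minimizes T(y) of formula (IV) by plain subgradient descent (a generic stand-in under the stated convexity, not the patent's own solver) and then recovers the fused image via x* = y* + v as in step (7):

```python
import numpy as np

def grad(z):
    """Forward differences with r(i) = b(i) = i on the last column/row."""
    gx = np.zeros_like(z, dtype=float)
    gy = np.zeros_like(z, dtype=float)
    gx[:, :-1] = z[:, 1:] - z[:, :-1]
    gy[:-1, :] = z[1:, :] - z[:-1, :]
    return gx, gy

def grad_adj(px, py):
    """Adjoint of grad (a negative divergence) under the same boundary rule."""
    out = np.zeros_like(px, dtype=float)
    out[:, :-1] -= px[:, :-1]
    out[:, 1:] += px[:, :-1]
    out[:-1, :] -= py[:-1, :]
    out[1:, :] += py[:-1, :]
    return out

def minimize_T(u, v, m, lam1, lam2, iters=2000):
    """Subgradient descent on T(y) of (IV); returns the best iterate seen."""
    d = u.astype(float) - v.astype(float)      # u - v
    gdx, gdy = grad(d)
    y = d.copy()                               # start at x = u, i.e. y = u - v
    best, best_val = y.copy(), np.inf
    for k in range(iters):
        gyx, gyy = grad(y)
        val = (np.abs(y - d).sum() + m * np.abs(y).sum()
               + lam1 * (np.abs(gyx) + np.abs(gyy)).sum()
               + lam2 * (np.abs(gyx - gdx) + np.abs(gyy - gdy)).sum())
        if val < best_val:                     # subgradient steps are not monotone
            best, best_val = y.copy(), val
        g = (np.sign(y - d) + m * np.sign(y)
             + lam1 * grad_adj(np.sign(gyx), np.sign(gyy))
             + lam2 * grad_adj(np.sign(gyx - gdx), np.sign(gyy - gdy)))
        y = y - g / np.sqrt(k + 1.0)           # diminishing step size
    return best

# usage sketch: u, v are registered gray images of equal size in [0, 255];
# the parameter values here are examples, not prescribed by the patent
# y_star = minimize_T(u, v, m=4.0, lam1=4.0, lam2=4.0)
# x_star = np.clip(y_star + v, 0, 255)        # step (7): x* = y* + v
```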
The invention has the beneficial effects that:
1. The invention adopts a method of fusing infrared and visible light images based on enhanced gradient transfer and total variation minimization, with fusion performed in the spatial domain, so that the gradients and pixel intensities of the two source images can be maintained simultaneously and detail features of the source images, such as textures and contours, can be fully fused, giving a fused image with higher definition, richer information and better quality.
2. The infrared and visible light image fusion method provided by the invention uses three parameters to adjust the proportional relationship between the fused image and the source images, so the desired fusion result can be tuned according to requirements; with its flexible structure and low computational complexity, the method can satisfy a wide range of needs.
Drawings
Fig. 1 (a) shows the fused image of the present invention with parameters m = 0, λ₁ = 4, λ₂ = 0.
Fig. 1 (b) shows the fused image of the GTF method with parameter λ = 4.
Fig. 1 (c) shows the fused image of the present invention with parameters m = 4, λ₁ = 4, λ₂ = 0.
Fig. 2 (a) shows the SSIM index of the fused images of the present invention for different source images, keeping m = 0, λ₁ = 4 fixed while λ₂ increases from 0 to 40 in steps of 4.
Fig. 2 (b) shows the SSIM index of the fused images of the present invention for different source images, keeping λ₁ = 4, λ₂ = 0 fixed while m increases from 0 to 40 in steps of 4.
Fig. 3 shows subjective fusion results of the present invention for different source images, keeping m = 0, λ₁ = 4 fixed while λ₂ increases from 0 to 40 in steps of 4; (a-1) to (a-5) show the subjective Bunker fusion images for λ₂ = 0, 8, 12, 16, 20; (b-1) to (b-5) the subjective Lake fusion images for the same λ₂ values; (c-1) to (c-5) the subjective Tank fusion images for the same λ₂ values.
Fig. 4 shows subjective fusion results of the present invention for different source images, keeping λ₁ = 4, λ₂ = 0 fixed while m increases from 0 to 40 in steps of 4; (a-1) to (a-5) show the subjective Bunker fusion images for m = 0, 4, 8, 16, 40; (b-1) to (b-5) the subjective Lake fusion images for the same m values; (c-1) to (c-5) the subjective Tank fusion images for the same m values.
Fig. 5 shows fusion results of the present invention on 6 different pairs of source images: (a-1) to (f-1) are the visible light images of Bunker, Lake, One man in front of house, Sandpath, NATO-camp and Tank; (a-2) to (f-2) are the corresponding infrared images. Rows 3 to 11 show the results of fusing each visible/infrared pair with different fusion methods, from top to bottom: the proposed method, GTF, LP, RP, Wavelet, DTCWT, CVT, MSVD and LP-SR.
Fig. 6 is a quantitative comparison of the fused images on the MI index. For each image, the fusion methods are, in order: LP, RP, Wavelet, DTCWT, CVT, MSVD, LP-SR, GTF and the proposed method.
Fig. 7 is a quantitative comparison of the fused images on the EN index. For each image, the fusion methods are, in order: LP, RP, Wavelet, DTCWT, CVT, MSVD, LP-SR, GTF and the proposed method.
Fig. 8 is a quantitative comparison of the fused images on the Yang index. For each image, the fusion methods are, in order: LP, RP, Wavelet, DTCWT, CVT, MSVD, LP-SR, GTF and the proposed method.
Fig. 9 is a quantitative comparison of the fused images on the Chen index. For each image, the fusion methods are, in order: LP, RP, Wavelet, DTCWT, CVT, MSVD, LP-SR, GTF and the proposed method.
Fig. 10 is a flow chart of the method of the present invention.
Detailed Description
An embodiment of the present invention (fusion of infrared and visible light images) is described in detail below with reference to the accompanying drawings. The embodiment is carried out on the premise of the technical solution of the present invention; the detailed implementation and specific operation steps are as follows:
Step 1: For formula (III), we set the parameters m and λ₂ equal to 0; the formula can then be converted into the following form:
$$\varepsilon(x) = \|x-u\|_1 + \lambda_1\,\|\nabla x - \nabla v\|_1 \tag{VI}$$
For formula (VI), let λ₁ = 4; the fused image obtained is shown in Fig. 1 (a). Next, let λ = 4 in the original GTF model; the fused image obtained is shown in Fig. 1 (b).
Step 2: Likewise for formula (III), we set m = 4, λ₁ = 4, λ₂ = 0; the fused image obtained is shown in Fig. 1 (c).
Step 3: To verify the role of λ₂ in formula (III), we first set m = 0, λ₁ = 4, and then increase λ₂ gradually from 0 to 40 in steps of 4. The fusion results obtained on 6 source image pairs (Bunker, Lake, One man in front of house, Sandpath, NATO-camp and Tank) are evaluated with the SSIM objective index; as λ₂ increases, the index gradually approaches 1 but never equals 1, as shown in Fig. 2 (a), which is consistent with the intended fusion objective. The subjective fusion images corresponding to λ₂ = 0, 8, 12, 16, 20 are shown in Fig. 3.
Step 4: To verify the role of m in formula (III), we first set λ₁ = 4, λ₂ = 0, and then increase m gradually from 0 to 40 in steps of 4. The fusion results obtained on the same 6 source image pairs are evaluated with the SSIM objective index; as m increases, the index gradually approaches 1 but never equals 1, as shown in Fig. 2 (b), which is consistent with the intended fusion objective. The subjective fusion images corresponding to m = 0, 4, 8, 16, 40 are shown in Fig. 4.
Simulation experiment:
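Before turning to the comparison experiments, the parameter sweep of Steps 3 and 4 above could be scripted as follows. This is an illustrative sketch only: it reuses the minimize_T stand-in defined earlier, and it assumes the SSIM index is averaged over the two source images, a protocol the patent does not spell out.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim

def sweep_lambda2(u, v, m=0.0, lam1=4.0):
    """Step 3: hold m = 0 and lambda1 = 4, sweep lambda2 = 0..40 in steps of 4."""
    for lam2 in range(0, 44, 4):
        y = minimize_T(u, v, m=m, lam1=lam1, lam2=float(lam2))
        x = np.clip(y + v, 0, 255)
        score = 0.5 * (ssim(x, u.astype(float), data_range=255)
                       + ssim(x, v.astype(float), data_range=255))
        print(f"lambda2 = {lam2:2d}  SSIM = {score:.4f}")
```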
to verify the feasibility and effectiveness of the present invention, fusion experiments were performed according to the method of the present invention using the method of the present invention on 6 sets of infrared and visible images, as shown in the first and second rows of fig. 5.
Comparison of the fusion results in Fig. 5 shows that the fused image obtained by this method is faithful to the original information to the greatest extent, better maintains features such as targets and textures in the images to be fused, and effectively avoids loss of target texture and background pixel intensity; the contrast and definition of the image are therefore higher, the details are more prominent, the subjective visual effect is best, and the fusion result is the most satisfactory.
Figs. 6, 7, 8 and 9 show objective evaluation indices of the fusion results obtained by the various fusion methods; a higher bar indicates a better value of the evaluation index achieved on the source image by the corresponding fusion method. For a few images the objective index of this method is not the highest, because weak gradients in the infrared image and weak pixel intensities in the visible image of those source pairs lead to a fused image inferior to those of other fusion methods on those indices.
As can be seen from the data in Figs. 6, 7, 8 and 9, the fused image obtained by the method of the present invention is superior to those of other fusion methods on objective evaluation indices such as mutual information (MI), information entropy (EN), Yang and Chen. MI measures the mutual information between the fused image and the source images: the larger it is, the higher the correlation between the fused image and the sources, and the better the fusion effect. Entropy reflects the amount of information carried by the image: the larger the entropy, the more information is contained and the better the fusion effect. The Yang index measures image fusion quality based on structural similarity. The Chen index evaluates how a contrast-sensitivity-function model of the human visual system perceives the spatial frequencies of the image: the image is first partitioned into blocks, and the saliency of each region is then calculated.
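For the two indices with standard closed forms, entropy and mutual information can be computed from gray-level histograms as sketched below (illustrative; the patent does not specify implementations, and reporting MI(x, u) + MI(x, v) for a fused image x is a common convention assumed here):

```python
import numpy as np

def entropy(img, bins=256):
    """Information entropy EN of a gray image with values in [0, 255]."""
    counts, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def mutual_information(a, b, bins=256):
    """Mutual information MI between two gray images of equal size."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins,
                                 range=[[0, 256], [0, 256]])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal of a
    py = pxy.sum(axis=0, keepdims=True)      # marginal of b
    nz = pxy > 0
    return (pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum()

# fusion MI for fused image x and sources u, v (assumed convention):
# mi = mutual_information(x, u) + mutual_information(x, v)
```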

Claims (1)

1. A method for minimizing a fused image by enhanced gradient transfer and total variation, comprising the following steps:
(1) Carrying out gradient transfer transformation on the infrared image and the visible light image to be fused to obtain corresponding gradient and pixel intensity;
(2) Establishing a data fidelity term and a regularization term, wherein the data fidelity term is used for constraining the fusion image to have pixel intensities similar to those of the given infrared image and visible light image, and the regularization term is used for ensuring that gradient distribution in the infrared image and the visible light image can be transferred into the fusion image;
(2.1) establishing data Fidelity terms
$$\varepsilon_1(x) = \|x-u\|_p^p + m\,\|x-v\|_p^p \tag{I}$$
wherein x represents the fused image; ε₁(x) represents the fidelity term of the fused image; p represents the norm of the fidelity term; u represents the infrared image; v represents the visible light image; and the x, u and v images are all of size a×b;
(2.2) establishing a regularization term
$$\varepsilon_2(x) = \lambda_1\,\|\nabla x - \nabla v\|_q^q + \lambda_2\,\|\nabla x - \nabla u\|_q^q \tag{II}$$
wherein ε₂(x) represents the regularization term of the fused image; q represents the norm of the regularization term; ∇x represents the gradient of the fused image; ∇v represents the gradient of the visible light image; and ∇u represents the gradient of the infrared image;
(3) Combining the data fidelity term with the regularization term to obtain an initial objective function
$$\varepsilon(x) = \|x-u\|_p^p + m\,\|x-v\|_p^p + \lambda_1\,\|\nabla x - \nabla v\|_q^q + \lambda_2\,\|\nabla x - \nabla u\|_q^q \tag{III}$$
wherein ε(x) represents the objective function; m, λ₁ and λ₂ are parameters used to adjust the trade-off between x and u, v;
(4) For an image of size a×b, use y ∈ R^{ab×1}, a column-vector form of its pixel intensities with gray values ranging from 0 to 255; let p = 1 and q = 1, and let y = x − v, i.e. x = y + v; given an initial allowed error ε > 0, the fusion problem is transformed into the following minimum-energy functional model:
$$T(y) = \|y-(u-v)\|_1 + m\,\|y\|_1 + \lambda_1 \sum_i \left|\nabla_i y\right| + \lambda_2 \sum_i \left|\nabla_i y - \nabla_i (u-v)\right| \tag{IV}$$
wherein y represents the difference between the fused image x and the visible light image v; T(y) represents the minimized energy functional; i represents a pixel; ∇_i y = (∇_i^1 y, ∇_i^2 y) ∈ R² represents the gradient of pixel i, where ∇_i^1 y and ∇_i^2 y correspond to the horizontal and vertical first-order differences respectively, i.e. ∇_i^1 y = y_{r(i)} − y_i and ∇_i^2 y = y_{b(i)} − y_i; r(i) and b(i) represent the nearest neighbors to the right of and below pixel i, and when pixel i is in the last row or column, both r(i) and b(i) are set to i;
since the objective function (IV) is convex, it has a globally optimal solution; by adjusting the values of m, λ₁ and λ₂ in the objective function (IV), the fidelity term and the regularization term reach appropriate points, so that the fused image obtained by fusing the two source images retains the thermal radiation information and the clear appearance texture information of both;
(5) Solving T (y) by using a generalized total variation functional minimization method:
decomposing the formula (IV) into two parts of a fidelity term and a regularization term, solving the two parts step by step, and finally combining the two parts to obtain the following formula:
$$T\left(y^{(k+1)}\right) = T_f\left(y^{(k+1)}\right) + T_{r_1}\left(y^{(k+1)}\right) + T_{r_2}\left(y^{(k+1)}\right) + C\left(y^{(k)}\right) \tag{V}$$

wherein the first term T_f is the fidelity term; the second term T_{r1} and the third term T_{r2} are regularization terms; C(y^{(k)}) is a constant; T_f and T_{r1}, T_{r2} represent the iterative functions of the fidelity term and the regularization terms respectively; and y^{(k)} represents the result of iterating y k times;
(6) For formula (V), T(y) is computed by existing L₁-TV minimization iterations, namely: loop over k = 0, 1, …; when T^{(k)}(y) meets the convergence condition, end the iteration; otherwise set k = k + 1 and return to (5);
(7) Determine the globally optimal solution x* of the fused image by x* = y* + v, where y* is the minimizer of T(y), finally obtaining the final fused image x*.
CN201910288177.9A 2019-04-11 2019-04-11 Method for minimizing fusion image by enhanced gradient transfer and total variation Active CN110084774B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910288177.9A CN110084774B (en) 2019-04-11 2019-04-11 Method for minimizing fusion image by enhanced gradient transfer and total variation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910288177.9A CN110084774B (en) 2019-04-11 2019-04-11 Method for minimizing fusion image by enhanced gradient transfer and total variation

Publications (2)

Publication Number Publication Date
CN110084774A (en) 2019-08-02
CN110084774B (en) 2023-05-05

Family

ID=67414771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910288177.9A Active CN110084774B (en) 2019-04-11 2019-04-11 Method for minimizing fusion image by enhanced gradient transfer and total variation

Country Status (1)

Country Link
CN (1) CN110084774B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110544225A (en) * 2019-08-08 2019-12-06 北京首贝科技发展有限公司 lightweight image fusion algorithm and device based on weak computing power
CN115082968B (en) * 2022-08-23 2023-03-28 天津瑞津智能科技有限公司 Behavior identification method based on infrared light and visible light fusion and terminal equipment
CN115620030B (en) * 2022-12-06 2023-04-18 浙江正泰智维能源服务有限公司 Image matching method, device, equipment and medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014078985A1 (en) * 2012-11-20 2014-05-30 Thomson Licensing Method and apparatus for image regularization
CN103914815A (en) * 2012-12-31 2014-07-09 诺基亚公司 Image fusion method and device
CN107945145A (en) * 2017-11-17 2018-04-20 西安电子科技大学 Infrared image fusion Enhancement Method based on gradient confidence Variation Model

Also Published As

Publication number Publication date
CN110084774A (en) 2019-08-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant