CN111861957B - Image fusion method and device - Google Patents

Image fusion method and device Download PDF

Info

Publication number
CN111861957B
CN111861957B CN202010633192.5A CN202010633192A
Authority
CN
China
Prior art keywords
coefficient matrix
detail coefficient
target
layer
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010633192.5A
Other languages
Chinese (zh)
Other versions
CN111861957A (en)
Inventor
符采灵
陈云娜
金羽锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TCL China Star Optoelectronics Technology Co Ltd
Original Assignee
TCL China Star Optoelectronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TCL China Star Optoelectronics Technology Co Ltd filed Critical TCL China Star Optoelectronics Technology Co Ltd
Priority to CN202010633192.5A priority Critical patent/CN111861957B/en
Publication of CN111861957A publication Critical patent/CN111861957A/en
Application granted granted Critical
Publication of CN111861957B publication Critical patent/CN111861957B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20064 Wavelet transform [DWT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The application relates to an image fusion method and an image fusion device. A plurality of images to be fused corresponding to the same scene are obtained, and multi-layer wavelet decomposition is performed on each image to be fused by using a two-dimensional discrete wavelet transform function, so as to obtain an approximation coefficient matrix corresponding to the low-frequency layer and a detail coefficient matrix group corresponding to each of a plurality of high-frequency layers. Weighted calculation is then performed on the approximation coefficient matrices corresponding to the plurality of images to be fused to obtain a target approximation coefficient matrix; a first target detail coefficient matrix group is determined according to a first preset algorithm and the detail coefficient matrix group corresponding to the highest high-frequency layer, and a second target detail coefficient matrix group is determined according to a second preset algorithm and the detail coefficient matrix groups corresponding to the high-frequency layers of the remaining layers. A fused image is then generated according to the target approximation coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group, so that the fused image is clearer and the fusion effect is improved.

Description

Image fusion method and device
[Technical Field]
The present application relates to the field of image processing technologies, and in particular, to an image fusion method and apparatus.
[Background Art]
Image fusion refers to integrating a set of images of the same target from multiple information sources into one image with more details and information according to a corresponding fusion technique, so as to achieve image sharpening. In recent years, with the continuous development of image fusion technology, it has been widely applied in fields such as remote sensing, medical imaging, and computer vision.
Image fusion can be classified into pixel-level fusion, feature-level fusion and decision-level fusion. Pixel-level fusion retains richer image information and offers better accuracy and robustness, so it has been widely studied. Pixel-level fusion methods include non-multi-scale methods, multi-scale methods, and so on; the multi-scale methods include image fusion based on wavelet transform, in which wavelet transform is applied to each pair of original images to obtain a low-frequency layer and high-frequency layers, different fusion rules are then applied to the low-frequency layer and the high-frequency layers respectively, and finally an inverse wavelet transform yields the fused image. Conventional wavelet-based image fusion methods generally either apply a weighting rule to both the low-frequency and high-frequency layers, in which case the information entropy and standard deviation of the obtained fused image are lower than those of the original images, or apply a region average energy fusion rule, in which case the information entropy is higher than that of the original images but the standard deviation is still lower. In both cases the contrast of the fused image is not high enough and the definition is insufficient.
[Summary of the Invention]
The invention aims to provide an image fusion method and device, so as to solve the technical problem that the contrast and definition of the fused image obtained by the existing wavelet-transform-based image fusion method are not high enough.
In order to solve the above problems, an embodiment of the present application provides an image fusion method, including:
acquiring a plurality of images to be fused corresponding to the same scene;
performing multi-layer wavelet decomposition on each image to be fused by utilizing a two-dimensional discrete wavelet transformation function to obtain an approximate coefficient matrix corresponding to a low-frequency layer and a detail coefficient matrix group corresponding to each high-frequency layer in a plurality of high-frequency layers, wherein each detail coefficient matrix group comprises a plurality of detail coefficient matrixes, and different detail coefficient matrixes correspond to different decomposition directions;
weighting calculation is carried out on the approximate coefficient matrixes corresponding to the multiple images to be fused so as to obtain a target approximate coefficient matrix;
determining a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
determining a second target detail coefficient matrix group according to a second preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the residual layer number;
and generating a fusion image according to the target approximate coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group so as to fuse the plurality of images to be fused.
The determining a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number specifically includes:
comparing the detail coefficients in the same decomposition direction and the same coefficient position in the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
the detail coefficient with the largest numerical value at each coefficient position in each decomposition direction is formed into a corresponding first target detail coefficient matrix;
and taking the first target detail coefficient matrixes corresponding to different decomposition directions as a first target detail coefficient matrix group.
The determining a second target detail coefficient matrix group according to a second preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the remaining layers specifically includes:
expanding each detail coefficient matrix in the detail coefficient matrix group corresponding to the high-frequency layer in the residual layer number to obtain a corresponding expansion matrix;
dividing the expansion matrix into a plurality of submatrices with preset sizes by taking each detail coefficient as a center, wherein the detail coefficients are in one-to-one correspondence with the submatrices;
calculating the spatial frequency of each submatrix;
comparing the spatial frequencies of all the submatrices corresponding to the same decomposition direction and the same coefficient position in the same layer number;
in the same decomposition direction in the same layer number, the detail coefficient corresponding to the spatial frequency with the largest value at each coefficient position forms a second target detail coefficient matrix;
taking the second target detail coefficient matrixes corresponding to different decomposition directions in the same layer number as a second target detail coefficient matrix group, wherein each layer number corresponds to one second target detail coefficient matrix group.
When the multiple images to be fused are all color images, before the multi-layer wavelet decomposition is performed on the multiple images to be fused by using the two-dimensional discrete wavelet transform function, the method further comprises:
converting the plurality of color images into gray images;
the performing multi-layer wavelet decomposition on the plurality of images to be fused by using a two-dimensional discrete wavelet transform function comprises the following steps: performing multi-layer wavelet decomposition on the plurality of gray images by using a two-dimensional discrete wavelet transform function.
Wherein after generating the fused image according to the target approximation coefficient matrix, the first target detail coefficient matrix set, and the second target detail coefficient matrix set, the method further comprises:
the fused image is converted into a final color image.
In order to solve the above problem, an embodiment of the present application further provides an image fusion apparatus, including:
the acquisition module is used for acquiring a plurality of images to be fused corresponding to the same scene;
the decomposition module is used for carrying out multi-layer wavelet decomposition on each image to be fused by utilizing a two-dimensional discrete wavelet transformation function so as to obtain an approximate coefficient matrix corresponding to a low-frequency layer and a detail coefficient matrix group corresponding to each high-frequency layer in a plurality of high-frequency layers, wherein each detail coefficient matrix group comprises a plurality of detail coefficient matrixes, and different detail coefficient matrixes correspond to different decomposition directions;
the calculation module is used for carrying out weighted calculation on the approximate coefficient matrixes corresponding to the multiple images to be fused so as to obtain a target approximate coefficient matrix;
the first determining module is used for determining a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
the second determining module is used for determining a second target detail coefficient matrix group according to a second preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the residual layer number;
and the fusion module is used for generating a fusion image according to the target approximate coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group so as to fuse the plurality of images to be fused.
The first determining module is specifically configured to:
comparing the detail coefficients in the same decomposition direction and the same coefficient position in the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
the detail coefficient with the largest numerical value at each coefficient position in each decomposition direction is formed into a corresponding first target detail coefficient matrix;
and taking the first target detail coefficient matrixes corresponding to different decomposition directions as a first target detail coefficient matrix group.
The second determining module is specifically configured to:
expanding each detail coefficient matrix in the detail coefficient matrix group corresponding to the high-frequency layer of the residual layers to obtain a corresponding expansion matrix;
dividing the expansion matrix into a plurality of submatrices with preset sizes by taking each detail coefficient as a center, wherein the detail coefficients are in one-to-one correspondence with the submatrices;
calculating the spatial frequency of each submatrix;
comparing the spatial frequencies of all the submatrices corresponding to the same decomposition direction and the same coefficient position in the same layer number;
in the same decomposition direction in the same layer number, the detail coefficient corresponding to the spatial frequency with the largest value at each coefficient position forms a second target detail coefficient matrix;
and taking the second target detail coefficient matrixes corresponding to different decomposition directions in the same layer number as a second target detail coefficient matrix group, wherein each layer number corresponds to one second target detail coefficient matrix group.
Wherein, when the multiple images to be fused are all color images, the image fusion device further comprises a first conversion module for:
converting the plurality of color images into gray images before the decomposition module performs multi-layer wavelet decomposition on the plurality of images to be fused by utilizing a two-dimensional discrete wavelet transform function;
the decomposition module is specifically used for: and carrying out multi-layer wavelet decomposition on the plurality of gray images by utilizing a two-dimensional discrete wavelet transform function.
The image fusion device further comprises a second conversion module, wherein the second conversion module is used for:
and after the fusion module generates a fusion image according to the target approximate coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group, converting the fusion image into a final color image.
The beneficial effects of this application are: compared with the prior art, the detail coefficient matrix groups of high-frequency layers at different levels are processed with different algorithms, which increases the information entropy and standard deviation of the fused image, maximizes the information richness and contrast of the fused image, makes the fused image clearer, and improves the fusion effect.
[Description of the Drawings]
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is an application scenario schematic diagram of an image fusion method provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of an image fusion method according to an embodiment of the present application;
fig. 3 is another flow chart of an image fusion method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a two-layer wavelet decomposition process provided by an embodiment of the present application;
fig. 5 is a schematic structural diagram of an image fusion apparatus according to an embodiment of the present application;
fig. 6 is another schematic structural diagram of an image fusion apparatus according to an embodiment of the present application.
[Detailed Description]
The present application is described in further detail below with reference to the drawings and examples. It is specifically noted that the following examples are only for illustration of the present application, but do not limit the scope of the present application. Likewise, the following embodiments are only some, but not all, of the embodiments of the present application, and all other embodiments obtained by one of ordinary skill in the art without inventive effort are within the scope of the present application.
Referring to fig. 1, fig. 1 provides a schematic view of an application scenario of an image fusion system, where the image fusion system may include any one of the image fusion apparatuses provided in the embodiments of the present application.
As shown in fig. 1, two-dimensional discrete wavelet decomposition (2D-DWT) is first performed on the input images A1 and A2 to obtain, for each input image, a low-frequency layer, a first high-frequency layer, and a second (i.e., highest) high-frequency layer. The low-frequency layers, first high-frequency layers, and second high-frequency layers of the two input images A1 and A2 are fused under different fusion rules to obtain a target low-frequency layer, a target first high-frequency layer, and a target second high-frequency layer. Inverse two-dimensional discrete wavelet transform (2D-IDWT) is then performed on the target low-frequency layer, the target first high-frequency layer, and the target second high-frequency layer to obtain the final fused image A.
Referring to fig. 2, fig. 2 is a flow chart of an image fusion method provided in an embodiment of the present application, where the image fusion method may be applied to a terminal or a server, for example, a smart phone, a background server of online image processing software, etc., and the specific flow may be as follows:
s101: and acquiring a plurality of images to be fused corresponding to the same scene.
Specifically, the multiple images to be fused corresponding to the same scene may be understood as multiple images of the same target from different data sources, for example, images with different focus obtained by varying the camera lens, or infrared and visible-light images of skin tissue obtained by medical imaging. The multiple images to be fused should have the same size.
It should be noted that, when the plurality of images to be fused are all color images, referring to fig. 2, after step S101 the method should further include step S107: converting the color images into gray images. For example, an RGB image is converted into HSV space, where H (Hue) denotes hue, S (Saturation) denotes saturation or color purity, and V (Value) denotes brightness; like RGB, HSV is a color coding method. After conversion into HSV space, the luminance channel (i.e., the V channel, which serves as the gray image) is taken for the subsequent processing.
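As an illustration of this step, the following is a minimal sketch of the color-to-gray round trip using OpenCV; the input file name and the fused_v placeholder (the fused luminance produced by the rest of the pipeline) are hypothetical, and note that OpenCV loads images in BGR rather than RGB channel order.

```python
import cv2

# Step S107 sketch: color image -> HSV, keep the V (brightness) channel.
bgr = cv2.imread('input1.png')                 # hypothetical input file
hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
h, s, v = cv2.split(hsv)                       # v is the gray image to fuse

# ... the V channels of all inputs are fused by steps S102-S106 ...
fused_v = v                                    # placeholder for the fused V channel

# Step S108 sketch: put the fused luminance back and restore color.
color_out = cv2.cvtColor(cv2.merge([h, s, fused_v]), cv2.COLOR_HSV2BGR)
```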
S102: and carrying out multi-layer wavelet decomposition on each image to be fused by utilizing a two-dimensional discrete wavelet transformation function so as to obtain an approximate coefficient matrix corresponding to the low-frequency layer and a detail coefficient matrix group corresponding to each high-frequency layer in the plurality of high-frequency layers.
In this embodiment, each detail coefficient matrix group includes a plurality of detail coefficient matrices, and different detail coefficient matrices correspond to different decomposition directions.
Specifically, performing n-layer decomposition on an image with the two-dimensional discrete wavelet transform function yields one approximation coefficient matrix corresponding to the low-frequency layer and 3n detail coefficient matrices corresponding to the n high-frequency layers. Taking two-layer wavelet decomposition as an example, please refer to fig. 4, which is a process diagram of two-layer wavelet decomposition provided in the embodiment of the present application. As shown in the figure, the first-layer decomposition yields an approximation coefficient matrix I1_1a and three detail coefficient matrices I1_1h, I1_1v and I1_1d, which together form the detail coefficient matrix group of the first high-frequency layer; "h", "v" and "d" denote different decomposition directions, where "h" denotes the horizontal detail component, "v" the vertical detail component, and "d" the diagonal detail component. The second-layer wavelet decomposition, applied to I1_1a, then yields an approximation coefficient matrix I1_2a and detail coefficient matrices I1_2h, I1_2v and I1_2d, which together form the detail coefficient matrix group of the second, i.e. highest, high-frequency layer.
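As a concrete sketch of this two-layer decomposition, the snippet below uses the PyWavelets library; the 'haar' mother wavelet and the random stand-in image are assumptions, since the embodiment names neither.

```python
import numpy as np
import pywt  # PyWavelets

# Two-layer 2D discrete wavelet decomposition of one gray image to be fused.
gray = np.float64(np.random.rand(256, 256))     # stand-in for a real gray image
coeffs = pywt.wavedec2(gray, 'haar', level=2)
cA2, (cH2, cV2, cD2), (cH1, cV1, cD1) = coeffs
# cA2             -> approximation coefficient matrix (low-frequency layer)
# cH2, cV2, cD2   -> detail coefficient matrix group of the 2nd (highest) layer
# cH1, cV1, cD1   -> detail coefficient matrix group of the 1st layer
```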
S103: and carrying out weighted calculation on the approximate coefficient matrixes corresponding to the multiple images to be fused so as to obtain a target approximate coefficient matrix.
Specifically, taking two images to be fused as an example, for the coefficient I1_na(i,j) in the ith row and jth column of the nth-layer approximation coefficient matrix of image 1 and the coefficient I2_na(i,j) in the ith row and jth column of the nth-layer approximation coefficient matrix of image 2, the weighting may be performed with the following formula: I_na(i,j) = 0.5 × I1_na(i,j) + 0.5 × I2_na(i,j), where I_na(i,j) denotes the coefficient in the ith row and jth column of the target approximation coefficient matrix, and 0.5 is the weighting value.
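Vectorized, the weighting formula above is a single element-wise operation; a minimal sketch with illustrative array names:

```python
import numpy as np

cA_1 = np.random.rand(64, 64)   # stand-in: approximation matrix of image 1
cA_2 = np.random.rand(64, 64)   # stand-in: approximation matrix of image 2

# Step S103: I_na(i,j) = 0.5 * I1_na(i,j) + 0.5 * I2_na(i,j) for all i, j.
target_approx = 0.5 * cA_1 + 0.5 * cA_2
```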
S104: and determining a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number.
The first preset algorithm is a fusion algorithm based on the coefficient maximum value. For the detail coefficient matrices corresponding to the highest (nth) high-frequency layer obtained by wavelet transformation, the coefficient-maximum fusion algorithm is adopted, and the specific flow is as follows:
with continued reference to fig. 3, step S104 may specifically include the following sub-steps:
s1041: and comparing the detail coefficients in the same decomposition direction and the same coefficient position in the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number.
For example, as shown in fig. 4, the detail coefficient matrix group of the highest high-frequency layer obtained after two-dimensional discrete wavelet transform of image I1 includes I1_2h, I1_2v and I1_2d. Taking the fusion of two images as an example, the corresponding group for image I2 includes I2_2h, I2_2v and I2_2d, where "h", "v" and "d" denote different decomposition directions. Therefore, the detail coefficients at the same coefficient positions of I1_2h and I2_2h should be compared, and likewise for the "v" and "d" matrices.
S1042: and forming a corresponding first target detail coefficient matrix by using the detail coefficient with the largest numerical value at each coefficient position in each decomposition direction.
S1043: and taking the first target detail coefficient matrixes corresponding to different decomposition directions as a first target detail coefficient matrix group.
Specifically, each layer after decomposition has three decomposition directions "h", "v", and "d", and thus each first target detail coefficient matrix group includes three first target detail coefficient matrices.
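A sketch of steps S1041 to S1043 follows. The description says "largest numerical value"; for signed wavelet detail coefficients this is read here as largest absolute value, the common convention, which is an assumption on our part.

```python
import numpy as np

def fuse_max(d1, d2):
    # Keep, at each coefficient position, the coefficient of larger magnitude
    # (reading "largest numerical value" as largest absolute value).
    return np.where(np.abs(d1) >= np.abs(d2), d1, d2)

# Stand-ins for the highest-layer detail groups of the two input images,
# one matrix per decomposition direction 'h', 'v', 'd'.
top1 = [np.random.randn(64, 64) for _ in range(3)]
top2 = [np.random.randn(64, 64) for _ in range(3)]
first_target_group = [fuse_max(a, b) for a, b in zip(top1, top2)]
```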
S105: and determining a second target detail coefficient matrix group according to a second preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layers with the residual layers.
The second preset algorithm is a fusion algorithm based on regional spatial frequency. For the detail coefficient matrices corresponding to the high-frequency layers of the remaining layers obtained by wavelet transformation, the regional spatial frequency fusion algorithm is adopted, and the specific flow is as follows:
with continued reference to fig. 3, step S105 may specifically include the following sub-steps:
s1051: and expanding each detail coefficient matrix in the detail coefficient matrix groups corresponding to the high-frequency layers of the remaining layers to obtain corresponding expansion matrices.
Specifically, the spatial frequency in the subsequent steps is the spatial frequency of a local region of the detail coefficient matrix: an M×N small matrix is taken centered on each coefficient in the detail coefficient matrix, and the spatial frequency of that small matrix is calculated. However, so that detail coefficients at the edges of the matrix can also serve as centers for the spatial frequency calculation, the detail coefficient matrix is first expanded before the small matrices are divided. The number of expanded rows and columns is determined by the sizes of M and N; for example, if the original detail coefficient matrix is 1920×1080 and the small matrix is 3×3, the matrix obtained after zero expansion is 1922×1082. The numerical value used for the expansion is chosen as appropriate to the case.
S1052: dividing the expansion matrix into a plurality of submatrices with preset sizes by taking each detail coefficient as a center, wherein the detail coefficients correspond to the submatrices one by one.
Specifically, the extended matrix may be divided into 3×3 sub-matrices centered on each detail coefficient, with each detail coefficient corresponding to a 3×3 sub-matrix.
S1053: the spatial frequency of each sub-matrix is calculated.
Specifically, taking a submatrix of size M×N as an example, the spatial frequency is calculated as follows:
First, the horizontal spatial frequency of the submatrix is calculated. For the horizontal spatial frequency Row_Freq of the submatrix centered on the coefficient in the ith row and jth column of image I1, the calculation formula is:
Row_Freq = sqrt( (1/(M×N)) × Σ [I1(i,j) − I1(i,j−1)]² )
where I1(i,j) denotes the coefficient in the ith row and jth column of the submatrix, I1(i,j−1) denotes the coefficient in the ith row and (j−1)th column, and the sum runs over all horizontally adjacent coefficient pairs in the submatrix;
then the vertical spatial frequency of the submatrix is calculated. For the vertical spatial frequency Col_Freq of the submatrix centered on the coefficient in the ith row and jth column of image I1, the calculation formula is:
Col_Freq = sqrt( (1/(M×N)) × Σ [I1(i,j) − I1(i−1,j)]² )
where I1(i,j) denotes the coefficient in the ith row and jth column of the submatrix, I1(i−1,j) denotes the coefficient in the (i−1)th row and jth column, and the sum runs over all vertically adjacent coefficient pairs;
from the horizontal spatial frequency Row_Freq and the vertical spatial frequency Col_Freq, the total spatial frequency Spatial_Freq is calculated as:
Spatial_Freq = sqrt( Row_Freq² + Col_Freq² )
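The three formulas translate directly into the sketch below, which also performs the expansion of step S1051; the 3×3 window and the zero padding value follow the example given in the description.

```python
import numpy as np

def region_spatial_frequency(mat, m=3, n=3):
    """Spatial frequency of the M x N window centered on every coefficient."""
    pm, pn = m // 2, n // 2
    padded = np.pad(mat, ((pm, pm), (pn, pn)), mode='constant')  # zero expansion
    rows, cols = mat.shape
    freq = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            sub = padded[i:i + m, j:j + n]        # submatrix centered on (i, j)
            row_f2 = np.sum(np.diff(sub, axis=1) ** 2) / (m * n)  # Row_Freq^2
            col_f2 = np.sum(np.diff(sub, axis=0) ** 2) / (m * n)  # Col_Freq^2
            freq[i, j] = np.sqrt(row_f2 + col_f2)  # Spatial_Freq
    return freq
```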
s1054: and comparing the spatial frequencies of all the submatrices corresponding to the same decomposition direction and the same coefficient position in the same layer number.
S1055: and in the same decomposition direction in the same layer number, the detail coefficient corresponding to the spatial frequency with the largest value at each coefficient position forms a second target detail coefficient matrix.
Specifically, the spatial frequency reflects how the pixel gray levels of the image change in space: the larger the spatial frequency, the greater the gray-level variation and the richer the image detail. Therefore, in the same decomposition direction within the same layer, the detail coefficient corresponding to the spatial frequency with the largest value at each coefficient position is selected to form the second target detail coefficient matrix, which enriches the image information and improves the fusion effect.
S1056: and taking the second target detail coefficient matrixes corresponding to different decomposition directions in the same layer number as a second target detail coefficient matrix group, wherein each layer number corresponds to one second target detail coefficient matrix group.
Specifically, each layer number corresponds to a second target detail coefficient matrix group, and each second target detail coefficient matrix group comprises second target detail coefficient matrices with three different decomposition directions.
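With the local frequencies in hand, steps S1054 to S1056 reduce to an element-wise comparison of two frequency maps; a sketch reusing the region_spatial_frequency helper from the previous snippet, with illustrative array names:

```python
import numpy as np

d1 = np.random.randn(128, 128)   # stand-in: one direction of a remaining layer, image 1
d2 = np.random.randn(128, 128)   # stand-in: same direction and layer, image 2

sf1 = region_spatial_frequency(d1)   # helper defined in the previous sketch
sf2 = region_spatial_frequency(d2)
# Keep, at each position, the coefficient whose local spatial frequency is larger.
second_target_matrix = np.where(sf1 >= sf2, d1, d2)
```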
S106: and generating a fusion image according to the target approximate coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group.
In this embodiment, this step may directly perform multi-layer wavelet reconstruction on the target approximation coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group by using the inverse two-dimensional discrete wavelet transform function, the number of reconstruction layers being determined by the number of decomposition layers in the above steps, so as to obtain the final fused image.
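Continuing the two-layer PyWavelets example, reconstruction is a single call to the inverse transform; the stand-in coefficient shapes assume a 256×256 input and the 'haar' wavelet, as before.

```python
import numpy as np
import pywt

# Stand-in fused coefficients; in the real pipeline they come from S103-S105.
target_approx = np.random.rand(64, 64)                              # low-frequency
top_group = tuple(np.random.randn(64, 64) for _ in range(3))        # level 2 (S104)
layer1_group = tuple(np.random.randn(128, 128) for _ in range(3))   # level 1 (S105)

# Step S106: inverse 2D discrete wavelet transform over both layers.
fused = pywt.waverec2([target_approx, top_group, layer1_group], 'haar')
fused_image = np.clip(fused, 0, 255).astype(np.uint8)
```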
In some embodiments, as shown in fig. 3, when the plurality of images to be fused are all color images, step S108 may be further included after step S106: the fused image is converted into a final color image.
For ease of understanding, refer to Table 1, which lists the information entropy and standard deviation of the input images I1 and I2, the output image I_Avg of the weighted fusion rule, the output image I_Energy of the region average energy fusion rule, and the output image I_Fre+Max of the region spatial frequency plus coefficient maximum rules of the present application. As shown in Table 1, the information entropy and standard deviation of I_Avg are lower than those of the input images I1 and I2, indicating that the weighted fusion rule reduces the contrast of the fused image. The information entropy of I_Fre+Max is close to that of I_Energy, and both are larger than those of I1 and I2; however, the standard deviation of I_Fre+Max is also larger than that of the inputs, whereas that of I_Energy is not. This indicates that the fused image finally output by the present application carries richer information and higher contrast than the input images I1 and I2, and the fusion effect is better.
Image                 I1        I2        I_Avg     I_Energy   I_Fre+Max
Information entropy   7.2747    7.2770    7.2714    7.2825     7.2826
Standard deviation    45.6136   45.9840   45.5411   45.5305    46.3839
TABLE 1
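For reference, the two metrics of Table 1 can be computed as follows for an 8-bit gray image; this is a minimal sketch, not the evaluation code actually used for the table.

```python
import numpy as np

def information_entropy(img):
    """Shannon entropy (in bits) of the gray-level histogram of an 8-bit image."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking the log
    return float(-(p * np.log2(p)).sum())

def standard_deviation(img):
    return float(np.float64(img).std())
```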
In contrast to the prior art, the image fusion method provided by the present application includes: acquiring a plurality of images to be fused corresponding to the same scene; performing multi-layer wavelet decomposition on each image to be fused by using a two-dimensional discrete wavelet transform function to obtain an approximation coefficient matrix corresponding to the low-frequency layer and a detail coefficient matrix group corresponding to each of a plurality of high-frequency layers, where each detail coefficient matrix group includes a plurality of detail coefficient matrices and different detail coefficient matrices correspond to different decomposition directions; performing weighted calculation on the approximation coefficient matrices corresponding to the plurality of images to be fused to obtain a target approximation coefficient matrix; determining a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the highest high-frequency layer; determining a second target detail coefficient matrix group according to a second preset algorithm and the detail coefficient matrix groups corresponding to the high-frequency layers of the remaining layers; and generating a fused image according to the target approximation coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group, so as to fuse the plurality of images to be fused. By processing the detail coefficient matrix groups of high-frequency layers at different levels with different algorithms, the present application increases the information entropy and standard deviation of the fused image, maximizes the information richness and contrast of the fused image, makes the fused image clearer, and improves the fusion effect.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an image fusion apparatus according to an embodiment of the present application, where the image fusion apparatus includes:
(1) Acquisition Module 10
The acquisition module 10 is configured to acquire a plurality of images to be fused corresponding to the same scene.
(2) Decomposition module 20
The decomposition module 20 is configured to perform multi-layer wavelet decomposition on each image to be fused by using a two-dimensional discrete wavelet transform function, so as to obtain an approximate coefficient matrix corresponding to a low-frequency layer and a detail coefficient matrix group corresponding to each high-frequency layer in a plurality of high-frequency layers, where each detail coefficient matrix group includes a plurality of detail coefficient matrices, and different detail coefficient matrices correspond to different decomposition directions.
(3) Calculation module 30
The calculating module 30 is configured to perform weighted calculation on the approximation coefficient matrices corresponding to the multiple images to be fused, so as to obtain a target approximation coefficient matrix.
(4) First determination module 40
The first determining module 40 is configured to determine a first target detail coefficient matrix set according to a first preset algorithm and a detail coefficient matrix set corresponding to a high frequency layer with a highest layer number.
The first determining module 40 is specifically configured to:
comparing detail coefficients in the same decomposition direction and the same coefficient position in the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
forming a corresponding first target detail coefficient matrix by using the detail coefficient with the largest numerical value at each coefficient position in each decomposition direction;
and taking the first target detail coefficient matrixes corresponding to different decomposition directions as a first target detail coefficient matrix group.
(5) The second determination module 50
The second determining module 50 is configured to determine a second target detail coefficient matrix set according to a second preset algorithm and a detail coefficient matrix set corresponding to the high frequency layer with the remaining layers.
Wherein the second determining module 50 may specifically be configured to:
expanding each detail coefficient matrix in the detail coefficient matrix group corresponding to the high-frequency layers of the remaining layers to obtain a corresponding expansion matrix;
dividing the expansion matrix into a plurality of submatrices with preset sizes by taking each detail coefficient as a center, wherein the detail coefficients correspond to the submatrices one by one;
calculating the spatial frequency of each sub-matrix;
comparing the spatial frequencies of all the submatrices corresponding to the same decomposition direction and the same coefficient position in the same layer number;
in the same decomposition direction in the same layer number, the detail coefficient corresponding to the spatial frequency with the largest value at each coefficient position forms a second target detail coefficient matrix;
and taking the second target detail coefficient matrixes corresponding to different decomposition directions in the same layer number as a second target detail coefficient matrix group, wherein each layer number corresponds to one second target detail coefficient matrix group.
(6) Fusion module 60
The fusion module 60 is configured to generate a fused image according to the target approximation coefficient matrix, the first target detail coefficient matrix set, and the second target detail coefficient matrix set, so as to fuse the plurality of images to be fused.
In some embodiments, referring to fig. 6, fig. 6 is another schematic structural diagram of an image fusion apparatus according to an embodiment of the present application, when a plurality of images to be fused are color images, the image fusion apparatus further includes a first conversion module 70 configured to:
the plurality of color images are converted to gray scale images before the decomposition module 20 performs a multi-layer wavelet decomposition on the plurality of images to be fused using a two-dimensional discrete wavelet transform function.
In this embodiment, the decomposition module 20 may specifically be used for: and carrying out multi-layer wavelet decomposition on the plurality of gray images by utilizing a two-dimensional discrete wavelet transform function.
In this embodiment, the image fusion apparatus further includes a second conversion module 80 for:
after the fusion module 60 generates a fused image from the target approximation coefficient matrix, the first target detail coefficient matrix set, and the second target detail coefficient matrix set, the fused image is converted to a final color image.
In the implementation, each module, unit and/or sub-unit may be implemented as an independent entity, or may be implemented as the same entity or several entities, and the implementation of each module and/or unit may refer to the foregoing method embodiment, and the specific beneficial effects may refer to the beneficial effects in the foregoing method embodiment, which are not described herein again.
In contrast to the prior art, in the image fusion apparatus of this embodiment, the acquisition module 10 acquires a plurality of images to be fused corresponding to the same scene; the decomposition module 20 performs multi-layer wavelet decomposition on each image to be fused by using a two-dimensional discrete wavelet transform function to obtain an approximation coefficient matrix corresponding to the low-frequency layer and a detail coefficient matrix group corresponding to each of a plurality of high-frequency layers, where each detail coefficient matrix group includes a plurality of detail coefficient matrices and different detail coefficient matrices correspond to different decomposition directions; the calculation module 30 performs weighted calculation on the approximation coefficient matrices corresponding to the plurality of images to be fused to obtain a target approximation coefficient matrix; the first determination module 40 determines a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the highest high-frequency layer; meanwhile, the second determination module 50 determines a second target detail coefficient matrix group according to a second preset algorithm and the detail coefficient matrix groups corresponding to the high-frequency layers of the remaining layers; and the fusion module 60 generates a fused image according to the target approximation coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group, so as to fuse the plurality of images to be fused. Processing the detail coefficient matrix groups of high-frequency layers at different levels with different algorithms increases the information entropy and standard deviation of the fused image, maximizes its information richness and contrast, makes it clearer, and improves the fusion effect.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware; these instructions may be stored in a computer-readable storage medium and loaded and executed by a processor. To this end, an embodiment of the present application provides a storage medium in which a plurality of instructions are stored, the instructions being capable of being loaded by a processor to perform the steps of any embodiment of the image fusion method provided by the embodiments of the present application.
Wherein the storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, and the like.
Since the instructions stored in the storage medium can execute the steps of any embodiment of the image fusion method provided in the embodiments of the present application, they can achieve the beneficial effects achievable by any image fusion method provided in the embodiments of the present application, which are detailed in the previous embodiments and are not described herein again.
The foregoing description of the preferred embodiments of the present application is not intended to be limiting, but is intended to cover any and all modifications, equivalents, and alternatives falling within the spirit and principles of the present application.

Claims (8)

1. An image fusion method, comprising:
acquiring a plurality of images to be fused corresponding to the same scene;
performing multi-layer wavelet decomposition on each image to be fused by utilizing a two-dimensional discrete wavelet transformation function to obtain an approximate coefficient matrix corresponding to a low-frequency layer and a detail coefficient matrix group corresponding to each high-frequency layer in a plurality of high-frequency layers, wherein each detail coefficient matrix group comprises a plurality of detail coefficient matrixes, and different detail coefficient matrixes correspond to different decomposition directions;
weighting calculation is carried out on the approximate coefficient matrixes corresponding to the multiple images to be fused so as to obtain a target approximate coefficient matrix;
determining a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
determining a second target detail coefficient matrix group according to a second preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the residual layer number;
generating a fusion image according to the target approximate coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group so as to fuse the plurality of images to be fused;
the determining a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number specifically includes:
comparing the detail coefficients in the same decomposition direction and the same coefficient position in the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
the detail coefficient with the largest numerical value at each coefficient position in each decomposition direction is formed into a corresponding first target detail coefficient matrix;
and taking the first target detail coefficient matrixes corresponding to different decomposition directions as a first target detail coefficient matrix group.
2. The image fusion method according to claim 1, wherein the determining the second target detail coefficient matrix set according to the second preset algorithm and the detail coefficient matrix set corresponding to the high-frequency layer of the remaining layers specifically includes:
expanding each detail coefficient matrix in the detail coefficient matrix group corresponding to the high-frequency layer in the residual layer number to obtain a corresponding expansion matrix;
dividing the expansion matrix into a plurality of submatrices with preset sizes by taking each detail coefficient as a center, wherein the detail coefficients are in one-to-one correspondence with the submatrices;
calculating the spatial frequency of each submatrix;
comparing the spatial frequencies of all the submatrices corresponding to the same decomposition direction and the same coefficient position in the same layer number;
in the same decomposition direction in the same layer number, the detail coefficient corresponding to the spatial frequency with the largest value at each coefficient position forms a second target detail coefficient matrix;
and taking the second target detail coefficient matrixes corresponding to different decomposition directions in the same layer number as a second target detail coefficient matrix group, wherein each layer number corresponds to one second target detail coefficient matrix group.
3. The image fusion method according to claim 1, wherein when the plurality of images to be fused are color images, before the multi-layer wavelet decomposition of the plurality of images to be fused using the two-dimensional discrete wavelet transform function, further comprising:
converting the plurality of color images into gray images;
the wavelet decomposition of the plurality of images to be fused by using a two-dimensional discrete wavelet transform function comprises the following steps: and carrying out multi-layer wavelet decomposition on the plurality of gray images by utilizing a two-dimensional discrete wavelet transform function.
4. The image fusion method of claim 3, further comprising, after the generating a fused image from the target approximation coefficient matrix, the first target detail coefficient matrix set, and the second target detail coefficient matrix set:
the fused image is converted into a final color image.
5. An image fusion apparatus, comprising:
the acquisition module is used for acquiring a plurality of images to be fused corresponding to the same scene;
the decomposition module is used for carrying out multi-layer wavelet decomposition on each image to be fused by utilizing a two-dimensional discrete wavelet transformation function so as to obtain an approximate coefficient matrix corresponding to a low-frequency layer and a detail coefficient matrix group corresponding to each high-frequency layer in a plurality of high-frequency layers, wherein each detail coefficient matrix group comprises a plurality of detail coefficient matrixes, and different detail coefficient matrixes correspond to different decomposition directions;
the calculation module is used for carrying out weighted calculation on the approximate coefficient matrixes corresponding to the multiple images to be fused so as to obtain a target approximate coefficient matrix;
the first determining module is used for determining a first target detail coefficient matrix group according to a first preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
the second determining module is used for determining a second target detail coefficient matrix group according to a second preset algorithm and the detail coefficient matrix group corresponding to the high-frequency layer with the residual layer number;
the fusion module is used for generating a fusion image according to the target approximate coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group so as to fuse the plurality of images to be fused;
the first determining module is specifically configured to:
comparing the detail coefficients in the same decomposition direction and the same coefficient position in the detail coefficient matrix group corresponding to the high-frequency layer with the highest layer number;
the detail coefficient with the largest numerical value at each coefficient position in each decomposition direction is formed into a corresponding first target detail coefficient matrix;
and taking the first target detail coefficient matrixes corresponding to different decomposition directions as a first target detail coefficient matrix group.
6. The image fusion apparatus of claim 5, wherein the second determining module is specifically configured to:
expanding each detail coefficient matrix in the detail coefficient matrix group corresponding to the high-frequency layer in the residual layer number to obtain a corresponding expansion matrix;
dividing the expansion matrix into a plurality of submatrices with preset sizes by taking each detail coefficient as a center, wherein the detail coefficients are in one-to-one correspondence with the submatrices;
calculating the spatial frequency of each submatrix;
comparing the spatial frequencies of all the submatrices corresponding to the same decomposition direction and the same coefficient position in the same layer number;
in the same decomposition direction in the same layer number, the detail coefficient corresponding to the spatial frequency with the largest value at each coefficient position forms a second target detail coefficient matrix;
and taking the second target detail coefficient matrixes corresponding to different decomposition directions in the same layer number as a second target detail coefficient matrix group, wherein each layer number corresponds to one second target detail coefficient matrix group.
7. The image fusion apparatus of claim 5, wherein when the plurality of images to be fused are color images, the image fusion apparatus further comprises a first conversion module configured to:
converting the plurality of color images into gray images before the decomposition module performs multi-layer wavelet decomposition on the plurality of images to be fused by utilizing a two-dimensional discrete wavelet transform function;
the decomposition module is specifically used for: and carrying out multi-layer wavelet decomposition on the plurality of gray images by utilizing a two-dimensional discrete wavelet transform function.
8. The image fusion apparatus of claim 7, further comprising a second conversion module configured to:
and after the fusion module generates a fusion image according to the target approximate coefficient matrix, the first target detail coefficient matrix group and the second target detail coefficient matrix group, converting the fusion image into a final color image.
CN202010633192.5A 2020-07-02 2020-07-02 Image fusion method and device Active CN111861957B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010633192.5A CN111861957B (en) 2020-07-02 2020-07-02 Image fusion method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010633192.5A CN111861957B (en) 2020-07-02 2020-07-02 Image fusion method and device

Publications (2)

Publication Number Publication Date
CN111861957A CN111861957A (en) 2020-10-30
CN111861957B (en) 2024-03-08

Family

ID=73152094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010633192.5A Active CN111861957B (en) 2020-07-02 2020-07-02 Image fusion method and device

Country Status (1)

Country Link
CN (1) CN111861957B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116113978A (en) * 2020-11-05 2023-05-12 华为技术有限公司 Depth high dynamic range imaging based on wavelet transforms
CN115100081B (en) * 2022-08-24 2022-11-15 深圳佳弟子科技有限公司 LCD display screen gray scale image enhancement method, device, equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441766A (en) * 2008-11-28 2009-05-27 西安电子科技大学 SAR image fusion method based on multiple-dimension geometric analysis
RU2586585C1 (en) * 2015-04-07 2016-06-10 Закрытое акционерное общество "МНИТИ" (сокращенно ЗАО "МНИТИ") Method of increasing visual information content of digital images
CN105184761A (en) * 2015-08-28 2015-12-23 中国科学院深圳先进技术研究院 Image rain removing method based on wavelet analysis and system
CN107451984A (en) * 2017-07-27 2017-12-08 桂林电子科技大学 A kind of infrared and visual image fusion algorithm based on mixing multiscale analysis

Also Published As

Publication number Publication date
CN111861957A (en) 2020-10-30

Similar Documents

Publication Publication Date Title
WO2019192316A1 (en) Image related processing method and apparatus, device and storage medium
CN111861957B (en) Image fusion method and device
KR101291869B1 (en) Noise and/or flicker reduction in video sequences using spatial and temporal processing
US8639053B2 (en) Methods and systems for up-scaling a standard definition (SD) video to high definition (HD) quality
US11216910B2 (en) Image processing system, image processing method and display device
CN112887728A (en) Electronic device, control method and system of electronic device
CN110111269B (en) Low-illumination imaging algorithm and device based on multi-scale context aggregation network
US10602145B2 (en) Image encoding apparatus and control method thereof
CN111047543A (en) Image enhancement method, device and storage medium
US9031350B2 (en) Method for processing edges in an image and image processing apparatus
CN110852334A (en) System and method for adaptive pixel filtering
CN104036468A (en) Super-resolution reconstruction method for single-frame images on basis of pre-amplification non-negative neighbor embedding
Wielgus et al. Fast and adaptive bidimensional empirical mode decomposition for the real-time video fusion
CN111226256A (en) System and method for image dynamic range adjustment
CN113132695A (en) Lens shadow correction method and device and electronic equipment
CN110503002B (en) Face detection method and storage medium
CN108122218B (en) Image fusion method and device based on color space
CN105869129A (en) Residual heterogeneous noise elimination method for aiming at thermal infrared image after heterogeneous correction
CN111784594A (en) Infrared image contrast enhancement method and device
Silverman et al. Segmentation of hyperspectral images based on histograms of principal components
JP7093441B2 (en) Image processing method, equipment and storage medium
CN109544463A (en) The inverse tone mapping (ITM) method of image content-based
CN110113595B (en) Method and device for converting 2D video into 3D video and electronic equipment
WO2017094504A1 (en) Image processing device, image processing method, image capture device, and program
CN109636740B (en) Infrared image multi-scale intelligent non-uniformity correction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant