CN112233053B - Image fusion method, device, equipment and computer readable storage medium - Google Patents

Publication number: CN112233053B
Authority: CN (China)
Prior art keywords: image, frequency, infrared light, matrix, low
Legal status: Active (granted)
Application number: CN202011009953.6A
Original language: Chinese (zh)
Other versions: CN112233053A (application publication)
Inventors: 杨熙丞, 张东, 王松
Current and original assignee: Zhejiang Dahua Technology Co Ltd
Application filed by Zhejiang Dahua Technology Co Ltd; published as application CN112233053A and granted as CN112233053B.

Classifications

    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction (G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; G06T5/00 Image enhancement or restoration)
    • G06T2207/10048: Infrared image (G06T2207/00 Indexing scheme for image analysis or image enhancement; G06T2207/10 Image acquisition modality)
    • G06T2207/20221: Image fusion; Image merging (G06T2207/20 Special algorithmic details; G06T2207/20212 Image combination)
    • G06T2207/20224: Image subtraction (G06T2207/20 Special algorithmic details; G06T2207/20212 Image combination)

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application provides an image fusion method, apparatus, device, and computer readable storage medium. The fusion method includes the following steps: obtaining a visible light image and an infrared light image; applying low-pass filtering to the visible light image and the infrared light image respectively to obtain a visible light low-frequency image and an infrared light low-frequency image; subtracting the visible light low-frequency image from the visible light image to obtain a first visible light detail matrix, and subtracting the infrared light low-frequency image from the infrared light image to obtain a first infrared light detail matrix; performing high-frequency fusion processing with the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix; fusing the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image; and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix. This scheme improves the naturalness of detail in the fusion image.

Description

Image fusion method, device, equipment and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, and a computer readable storage medium for image fusion.
Background
With the development of monitoring technology, the demands on the images collected by monitoring equipment keep rising. For some special scenes in particular, such as low-light night scenes, ensuring image quality is a key problem in the monitoring field.
In the prior art, infrared texture information and visible light texture information are generally extracted from a visible light image captured by a visible-band sensor and an infrared light image captured by an infrared-band sensor; maximum fused texture information is obtained through a fusion texture coefficient, a reflection coefficient is derived from the fusion texture coefficient, and the fusion result is then obtained based on the infrared light image, the visible light image, the fusion texture coefficient, and the reflection coefficient. However, in the fusion results obtained by this scheme, the fused brightness and detail are abrupt and the result looks unnatural.
Disclosure of Invention
The application provides an image fusion method, apparatus, device, and computer readable storage medium, which mainly address the technical problem of how to improve the naturalness of detail in fused images.
To solve the above technical problem, the application provides an image fusion method, which includes the following steps:
obtaining a visible light image and an infrared light image;
applying low-pass filtering to the visible light image and the infrared light image respectively to obtain a visible light low-frequency image and an infrared light low-frequency image;
subtracting the visible light low-frequency image from the visible light image to obtain a first visible light detail matrix, and subtracting the infrared light low-frequency image from the infrared light image to obtain a first infrared light detail matrix;
performing high-frequency fusion processing with the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix;
fusing the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image;
and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
To solve the above technical problem, the application further provides an image fusion apparatus, which includes:
an acquisition unit, configured to acquire a visible light image and an infrared light image;
a filtering unit, configured to apply low-pass filtering to the visible light image and the infrared light image respectively to obtain a visible light low-frequency image and an infrared light low-frequency image;
a computing unit, configured to subtract the visible light low-frequency image from the visible light image to obtain a first visible light detail matrix, and to subtract the infrared light low-frequency image from the infrared light image to obtain a first infrared light detail matrix;
a high-frequency fusion unit, configured to perform high-frequency fusion processing with the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix;
a low-frequency fusion unit, configured to fuse the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image;
and a target fusion unit, configured to obtain a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
To solve the above technical problem, the application further provides a terminal device, which includes a memory and a processor coupled to the memory;
the memory is configured to store program data, and the processor is configured to execute the program data to implement any of the image fusion methods described above.
To solve the above technical problem, the application further provides a computer storage medium for storing program data which, when executed by a processor, implements any of the image fusion methods described above.
In this scheme, a visible light image and an infrared light image are acquired; low-pass filtering is applied to the visible light image and the infrared light image respectively to obtain a visible light low-frequency image and an infrared light low-frequency image; the visible light low-frequency image is subtracted from the visible light image to obtain a first visible light detail matrix, and the infrared light low-frequency image is subtracted from the infrared light image to obtain a first infrared light detail matrix; high-frequency fusion processing is performed with the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix; the visible light low-frequency image and the infrared light low-frequency image are fused to obtain a low-frequency fusion image; and a target fusion image is obtained based on the low-frequency fusion image and the high-frequency fusion matrix. This scheme improves the naturalness of detail in the fusion image.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. Clearly, the drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without inventive effort. Wherein:
FIG. 1 is a flowchart of a first embodiment of an image fusion method provided by the present application;
FIG. 2 is a flowchart of acquiring a high-frequency fusion matrix in the image fusion method shown in FIG. 1;
FIG. 3 is a flowchart of a second embodiment of an image fusion method provided by the present application;
FIG. 4 is a schematic diagram of an embodiment of an image fusion apparatus provided by the present application;
FIG. 5 is a schematic structural diagram of an embodiment of a terminal device provided by the present application;
FIG. 6 is a schematic diagram of an embodiment of a computer storage medium provided by the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
The application provides an image fusion method that can be applied to security monitoring and that aims to fuse a visible light image and an infrared light image captured in a low-light scene; the method can improve the naturalness of detail in the fused image. Referring to FIG. 1, FIG. 1 is a schematic flowchart of a first embodiment of the image fusion method provided by the present application. The image fusion method of this embodiment can be applied to an image fusion apparatus, and also to a server with data processing capability. The image fusion method in this embodiment specifically includes the following steps:
S101: a visible light image and an infrared light image are acquired.
The application acquires a visible light image and an infrared light image in a low-light scene and fuses them to obtain a fused image with natural detail. The image fusion apparatus of this embodiment can capture the visible light image and the infrared light image of the low-light environment in which it is located through a camera installed in the apparatus. Specifically, the camera may acquire the visible light image by exposure in the visible light band and the infrared light image by exposure in the infrared band, and the apparatus may acquire the two images at preset time intervals. One or more cameras may be provided and arranged at any position in the image fusion apparatus; to conveniently capture the visible light image and the infrared light image of the surrounding low-light environment, the camera may be arranged on the side of the apparatus facing the external environment.
S102: and respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image.
The image fusion apparatus obtains a visible light low-frequency image and an infrared light low-frequency image so that these low-frequency images can later be fused, together with the detail information, into an image with natural detail. The image fusion apparatus of this embodiment applies low-pass filtering to the visible light image to obtain the visible light low-frequency image, and applies low-pass filtering to the infrared light image to obtain the infrared light low-frequency image. The low-pass filter may be, for example, a mean filter, a Gaussian filter, or a guided filter, which is not limited in this embodiment.
S103: and performing difference on the visible light image and the visible light low-frequency image to obtain a first visible light detail matrix, and performing difference on the infrared light image and the infrared light low-frequency image to obtain a first infrared light detail matrix.
To obtain the detail matrices corresponding to the visible light image and the infrared light image, the image fusion apparatus of this embodiment subtracts the visible light low-frequency image acquired in S102 from the visible light image acquired in S101, and subtracts the infrared light low-frequency image acquired in S102 from the infrared light image acquired in S101, obtaining a first visible light detail matrix and a first infrared light detail matrix respectively.
Specifically, the image fusion apparatus subtracts the visible light low-frequency image from the visible light image to obtain the first visible light detail matrix, and subtracts the infrared light low-frequency image from the infrared light image to obtain the first infrared light detail matrix.
Further, the calculation of the first visible detail matrix satisfies the following equation:
V_detail=V1-V_base
Wherein V_detail is a first visible light detail matrix, V1 is a visible light image, and V_base is a visible light low-frequency image.
The calculation of the first infrared light detail matrix satisfies the following equation:
N_detail=N1-N_base
wherein, N_detail is a first infrared detail matrix, N1 is an infrared light image, and N_base is an infrared light low-frequency image.
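As an illustration of S102 and S103, the sketch below low-pass filters each source image with a simple mean (box) filter, one of the filter types the embodiment permits, and subtracts to obtain the first detail matrices. The image data and the filter radius are placeholders, not values from the patent.

```python
import numpy as np

def mean_lowpass(img, r=2):
    # Box (mean) low-pass filter of radius r with edge replication.
    # The embodiment also allows Gaussian or guided filtering here.
    padded = np.pad(img.astype(np.float64), r, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.float64)
    k = 2 * r + 1
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

# S102: low-frequency images; S103: detail matrices by subtraction.
rng = np.random.default_rng(0)
V1 = rng.random((32, 32))      # visible light image (stand-in data)
N1 = rng.random((32, 32))      # infrared light image (stand-in data)
V_base = mean_lowpass(V1)      # visible light low-frequency image
N_base = mean_lowpass(N1)      # infrared light low-frequency image
V_detail = V1 - V_base         # first visible light detail matrix
N_detail = N1 - N_base         # first infrared light detail matrix
```

Note that adding V_base back onto V_detail reproduces V1 exactly, which is what allows the low-frequency and high-frequency parts to be recombined later without losing information.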
S104: and carrying out high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix.
Based on the first visible light detail matrix and the first infrared light detail matrix acquired in S103, the image fusion apparatus of this embodiment performs high-frequency fusion processing to obtain a high-frequency fusion matrix.
S105: and carrying out fusion processing on the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image.
Based on the visible light low-frequency image and the infrared light low-frequency image acquired in S102, the image fusion apparatus of this embodiment performs low-frequency fusion processing to obtain a low-frequency fusion image.
S106: and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
To improve the naturalness of detail in the fused image, the image fusion apparatus of this embodiment fuses the low-frequency fusion image acquired in S105 with the high-frequency fusion matrix acquired in S104 to obtain a target fusion image.
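The excerpt does not spell out how S106 combines the two parts; the following is a minimal sketch, assuming simple additive recombination of the low-frequency fusion image and the high-frequency fusion matrix, clipped to an 8-bit display range:

```python
import numpy as np

def recombine(low_freq_fusion, high_freq_matrix, lo=0.0, hi=255.0):
    # Assumed additive recombination: add the fused high-frequency detail
    # back onto the low-frequency fusion image, then clip to range.
    return np.clip(low_freq_fusion + high_freq_matrix, lo, hi)
```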
In this scheme, the image fusion apparatus acquires a visible light image and an infrared light image; applies low-pass filtering to each to obtain a visible light low-frequency image and an infrared light low-frequency image; subtracts the visible light low-frequency image from the visible light image to obtain a first visible light detail matrix, and subtracts the infrared light low-frequency image from the infrared light image to obtain a first infrared light detail matrix; performs high-frequency fusion processing with the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix; fuses the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image; and obtains a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix, thereby improving the naturalness of detail in the fused image.
With continued reference to FIG. 2, FIG. 2 is a schematic flowchart of acquiring the high-frequency fusion matrix in the image fusion method shown in FIG. 1. To improve the naturalness of detail in the fused image, on the basis of the above embodiment, the image fusion method of this embodiment further includes the following steps:
S201: and converting the first visible light detail matrix by using a visible light detail weight function to obtain a second visible light detail matrix.
To adjust the visible light image detail to an appropriate intensity, the image fusion apparatus of this embodiment converts the first visible light detail matrix with a visible light detail weight function to obtain a second visible light detail matrix.
The image fusion apparatus of this embodiment traverses the pixels in the first visible light detail matrix, applies the visible light detail weight function to the absolute value of each selected pixel, and restores the original sign, obtaining the converted pixels that form the second visible light detail matrix.
Specifically, the image fusion apparatus selects a pixel in the first visible light detail matrix, looks up a preset visible light detail weight function, and evaluates the weight function at the absolute value of the selected pixel, so as to convert the first visible light detail matrix into the second visible light detail matrix.
The conversion of the first visible light detail matrix and the second visible light detail matrix satisfies the following equation:
V_detail2(i,j)=sign(V_detail(i,j))*w_vis_detail(abs(V_detail(i,j)))
Wherein V_detail2 is the second visible light detail matrix; i, j denote the ordinate and abscissa of the visible light image; abs is the absolute value function; V_detail is the first visible light detail matrix; w_vis_detail is the visible light detail weight function; and the sign function takes the sign of an element of the first visible light detail matrix.
It should be noted that the sign function satisfies the following formula:
sign(V_detail(i,j)) = { 1, if V_detail(i,j) > 0; 0, if V_detail(i,j) = 0; -1, if V_detail(i,j) < 0 }
As can be seen from the above, the sign function depends on the first visible light detail matrix. Wherein i, j denote the ordinate and abscissa of the visible light image, V_detail is the first visible light detail matrix, and the sign function takes the sign of an element of the first visible light detail matrix.
Further, the visible light detail weight function satisfies the following equation:
w_vis_detail(x) = { low_y, if x ≤ low_x; low_y + (high_y - low_y) * (x - low_x) / (high_x - low_x), if low_x < x < high_x; high_y, if x ≥ high_x }
Wherein w_vis_detail is the visible light detail weight function, and x is the value of the selected pixel in the first visible light detail matrix, namely V_detail(i,j); low_x and high_x are the abscissas of turning point 1 and turning point 2 of the visible light detail weight function respectively, with low_x < high_x; low_y and high_y are the ordinates of turning point 1 and turning point 2 respectively, with low_y < high_y.
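A sketch of S201, under the assumption that the detail weight curve is piecewise-linear between its two turning points; the turning-point coordinates below are placeholder values, not values from the patent:

```python
import numpy as np

def w_vis_detail(x, low_x=2.0, low_y=1.0, high_x=30.0, high_y=20.0):
    # Piecewise-linear weight: flat at low_y below low_x, flat at high_y
    # above high_x, linearly interpolated between the two turning points
    # (np.interp clamps to the endpoint values outside [low_x, high_x]).
    return np.interp(x, [low_x, high_x], [low_y, high_y])

def convert_detail(V_detail):
    # V_detail2(i,j) = sign(V_detail(i,j)) * w_vis_detail(abs(V_detail(i,j)))
    return np.sign(V_detail) * w_vis_detail(np.abs(V_detail))
```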
S202: and converting the first infrared light detail matrix by using an infrared light detail weight function to obtain a second infrared light detail matrix.
To adjust the infrared light image detail to an appropriate intensity, the image fusion apparatus of this embodiment converts the first infrared light detail matrix with an infrared light detail weight function to obtain a second infrared light detail matrix.
The image fusion apparatus of this embodiment traverses the pixels in the first infrared light detail matrix, applies the infrared light detail weight function to the absolute value of each selected pixel, and restores the original sign, obtaining the converted pixels that form the second infrared light detail matrix.
Specifically, the image fusion apparatus selects a pixel in the first infrared light detail matrix, looks up a preset infrared light detail weight function, and evaluates the weight function at the absolute value of the selected pixel, so as to convert the first infrared light detail matrix into the second infrared light detail matrix.
The conversion of the first infrared light detail matrix and the second infrared light detail matrix satisfies the following equation:
N_detail2(i,j)=sign(N_detail(i,j))*w_nir_detail(abs(N_detail(i,j)))
Wherein N_detail2 is the second infrared light detail matrix; i, j denote the ordinate and abscissa of the infrared light image; abs is the absolute value function; N_detail is the first infrared light detail matrix; w_nir_detail is the infrared light detail weight function; and the sign function takes the sign of an element of the first infrared light detail matrix.
In this step, the infrared light detail weight function and the sign function take the same form as in S201, and the description is not repeated here.
S203: a difference image between the infrared light low-frequency image and the visible light low-frequency image is calculated.
To improve the naturalness of detail in the fused image, the image fusion apparatus of this embodiment performs difference processing on the infrared light low-frequency image and the visible light low-frequency image, i.e., calculates a difference image between the infrared light low-frequency image and the visible light low-frequency image, so that a third infrared light detail matrix can be obtained from the difference image.
S204: and carrying out high-frequency and low-frequency joint control processing on the difference image and the second infrared detail matrix to obtain a third infrared detail matrix.
The image fusion device of the embodiment performs high-frequency and low-frequency joint control processing on the difference image acquired in S203 and the second infrared light detail matrix acquired in S202 to obtain a third infrared light detail matrix.
Specifically, the calculation of the third infrared light detail matrix satisfies the following equation:
N_detail3(i,j) = N_detail2(i,j) * min(1, k1 / (abs(diff_base(i,j)) + eps))
Wherein diff_base is the difference image between the infrared light low-frequency image and the visible light low-frequency image; i, j denote the ordinate and abscissa of the infrared light image; eps is a small constant that prevents the divisor from being 0; N_detail2 is the second infrared light detail matrix; N_detail3 is the third infrared light detail matrix; and k1 is the infrared light high-frequency suppression parameter.
In a specific embodiment, the smaller k1 is, the more strongly the second infrared light detail matrix is suppressed; likewise, the larger the difference between the infrared light low-frequency pixel and the visible light low-frequency pixel at a location, the more strongly the second infrared light detail matrix is suppressed at that location.
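A sketch of S203 and S204, assuming a suppression gain of min(1, k1 / (|diff_base| + eps)); this matches the stated roles of k1 and eps, though the patent's exact expression may differ:

```python
import numpy as np

def joint_control(N_detail2, N_base, V_base, k1=8.0, eps=1e-6):
    # S203: difference image between the two low-frequency images
    # (the sign convention is an assumption; only |diff_base| is used).
    diff_base = N_base - V_base
    # S204: smaller k1, or a larger low-frequency difference, suppresses
    # the second infrared light detail matrix more strongly.
    gain = np.minimum(1.0, k1 / (np.abs(diff_base) + eps))
    return N_detail2 * gain
```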
S205: and performing low-pass filtering treatment on the third infrared light detail matrix to obtain a fourth infrared light detail matrix, and performing low-pass filtering treatment on the second visible light detail matrix to obtain a third visible light detail matrix.
To reduce noise in the third infrared light detail matrix and the second visible light detail matrix, and to increase the detail intensity of the fused image, the image fusion apparatus of this embodiment applies low-pass filtering to the third infrared light detail matrix to obtain a fourth infrared light detail matrix, and applies low-pass filtering to the second visible light detail matrix to obtain a third visible light detail matrix. As in S102, the low-pass filter may be a mean filter, a Gaussian filter, a guided filter, or the like.
In a specific embodiment, the image fusion apparatus applies a mean filter to the third infrared light detail matrix and the second visible light detail matrix, obtaining the fourth infrared light detail matrix and the third visible light detail matrix respectively. Since the same mean filtering is applied to both matrices, this embodiment only describes in detail how the fourth infrared light detail matrix is obtained from the third infrared light detail matrix.
Specifically, the calculation of the fourth infrared light detail matrix satisfies the following equation:
N_detail4(x,y) = Σ(i = -r1..r1) Σ(j = -r1..r1) w(i,j) * N_detail3(x+i, y+j)
Wherein N_detail4 is the fourth infrared light detail matrix, N_detail3 is the third infrared light detail matrix, x, y are the coordinates of an infrared light image pixel, r1 is the filtering radius, and w is the weight matrix of the mean filtering.
For the weight matrix w of the mean filtering, in a specific embodiment, the image fusion apparatus may take a 5*5 neighborhood as an example, in which case the weight matrix of the mean filtering satisfies the following formula:
w = (1/25) * [[1,1,1,1,1], [1,1,1,1,1], [1,1,1,1,1], [1,1,1,1,1], [1,1,1,1,1]]
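Since the filter is a plain mean filter, every entry of the 5*5 weight matrix is 1/25 and the weights sum to one. A sketch of this filtering step (edge handling by replication is an assumption, as the patent does not specify it):

```python
import numpy as np

# 5*5 mean-filter weight matrix: all entries 1/25, summing to 1.
r1 = 2
w = np.full((2 * r1 + 1, 2 * r1 + 1), 1.0 / 25.0)

def filter_with_weights(mat, w, r):
    # Weighted sum over an edge-replicated (2r+1)x(2r+1) neighbourhood.
    padded = np.pad(mat.astype(np.float64), r, mode="edge")
    h, wd = mat.shape
    out = np.zeros((h, wd), dtype=np.float64)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += w[dy, dx] * padded[dy:dy + h, dx:dx + wd]
    return out
```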
S206: and performing excessive value inhibition processing on the third visible light detail matrix to obtain a fourth visible light detail matrix.
To prevent excessive values in the third visible light detail matrix from harming the naturalness of detail in the fused image, the image fusion apparatus of this embodiment performs excessive-value suppression on the third visible light detail matrix, using the third visible light detail matrix and the fourth infrared light detail matrix acquired in S205, to obtain a fourth visible light detail matrix.
Specifically, the calculation of the fourth visible light detail matrix satisfies the following equation:
V_detail4(i,j) = { V_detail3(i,j), if V_detail3(i,j) < N_detail4(i,j); w_vis_inhibit * N_detail4(i,j) + (1 - w_vis_inhibit) * V_detail3(i,j), otherwise }
Wherein w_vis_inhibit is the weight parameter for suppressing the visible light detail matrix, V_detail3 is the third visible light detail matrix, V_detail4 is the fourth visible light detail matrix, and N_detail4 is the fourth infrared light detail matrix.
It should be noted that, when the weight parameter of the suppression visible light detail matrix is 1, the excessive value in the third visible light detail matrix is completely suppressed; when the weight parameter of the suppression visible light detail matrix is 0, the excessive value in the third visible light detail matrix is not suppressed at all.
In a specific embodiment, the image fusion apparatus obtains the fourth visible light detail matrix by comparing the values of the third visible light detail matrix and the fourth infrared light detail matrix. Specifically, if the value of the third visible light detail matrix is smaller than the value of the fourth infrared light detail matrix, the fourth visible light detail matrix is V_detail3(i,j); if the value of the third visible light detail matrix is greater than or equal to the value of the fourth infrared light detail matrix, the fourth visible light detail matrix is w_vis_inhibit * N_detail4(i,j) + (1 - w_vis_inhibit) * V_detail3(i,j).
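The comparison described above can be sketched as follows: where the visible detail stays below the infrared detail it is kept as-is, and otherwise it is blended toward the infrared detail according to w_vis_inhibit:

```python
import numpy as np

def suppress_excessive(V_detail3, N_detail4, w_vis_inhibit=0.5):
    # w_vis_inhibit = 1 replaces excessive visible detail entirely with
    # the infrared detail; w_vis_inhibit = 0 leaves it untouched.
    blended = w_vis_inhibit * N_detail4 + (1.0 - w_vis_inhibit) * V_detail3
    return np.where(V_detail3 < N_detail4, V_detail3, blended)
```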
S207: and comparing the numerical values of the fourth visible light detail matrix and the fourth infrared light detail matrix.
The image fusion device of the embodiment performs high-frequency fusion processing by using the acquired first visible light detail matrix, first infrared light detail matrix, fourth visible light detail matrix and fourth infrared light detail matrix to obtain a high-frequency fusion matrix.
Specifically, the image fusion device compares the values of the fourth visible light detail matrix and the fourth infrared light detail matrix, and if the value of the fourth visible light detail matrix is greater than the value of the fourth infrared light detail matrix, S208 is executed, and the first infrared light detail matrix is used as a high-frequency fusion matrix; if the value of the fourth visible light detail matrix is smaller than the value of the fourth infrared light detail matrix, S209 is executed, and the first visible light detail matrix is used as a high-frequency fusion matrix.
Further, the calculation of the high-frequency fusion matrix satisfies the following equation:
F_detail(i,j) = { N_detail(i,j), if V_detail4(i,j) > N_detail4(i,j); V_detail(i,j), otherwise }
Wherein F_detail is the high-frequency fusion matrix, N_detail is the first infrared light detail matrix, V_detail is the first visible light detail matrix, N_detail4 is the fourth infrared light detail matrix, and V_detail4 is the fourth visible light detail matrix.
S208: and if the numerical value of the fourth visible light detail matrix is larger than that of the fourth infrared light detail matrix, taking the first infrared light detail matrix as a high-frequency fusion matrix.
S209: and if the value of the fourth visible light detail matrix is smaller than that of the fourth infrared light detail matrix, taking the first visible light detail matrix as a high-frequency fusion matrix.
In this scheme, the image fusion device converts the first visible light detail matrix by using the visible light detail weight function to obtain the second visible light detail matrix, thereby adjusting the detail intensity of the visible light image; converts the first infrared light detail matrix by using the infrared light detail weight function to obtain the second infrared light detail matrix, thereby adjusting the detail intensity of the infrared light image; calculates a difference image between the infrared light low-frequency image and the visible light low-frequency image, and performs high-frequency and low-frequency joint control processing on the difference image and the second infrared light detail matrix to obtain the third infrared light detail matrix, thereby strengthening the suppression of the second infrared light detail matrix; performs low-pass filtering on the third infrared light detail matrix to obtain the fourth infrared light detail matrix, and on the second visible light detail matrix to obtain the third visible light detail matrix, thereby reducing noise in those matrices and increasing the detail intensity of the fused image; and performs excessive-value suppression on the third visible light detail matrix by using the third visible light detail matrix and the fourth infrared light detail matrix to obtain the fourth visible light detail matrix, thereby avoiding the influence of excessive values in the third visible light detail matrix on the naturalness of the fused image details.
With continued reference to fig. 3, fig. 3 is a flowchart illustrating a second embodiment of an image fusion method according to the present application. Specifically, the image fusion method of the present embodiment includes the steps of:
S301: A visible light image and an infrared light image are acquired.
S302: and respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image.
S303: and performing difference on the visible light image and the visible light low-frequency image to obtain a first visible light detail matrix, and performing difference on the infrared light image and the infrared light low-frequency image to obtain a first infrared light detail matrix.
S304: and carrying out high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix.
For detailed descriptions of S301 to S304 in this embodiment, reference may be made to the related descriptions in the above embodiments, which are not repeated here.
S305: and performing conversion processing on the infrared light low-frequency image by using the infrared light suppression weight function to obtain the infrared light low-frequency suppression image.
In order to reduce the influence of excessively bright infrared light on the infrared light low-frequency image, the image fusion device of this embodiment performs conversion processing on the infrared light low-frequency image by using the infrared light suppression weight function to obtain the infrared light low-frequency suppression image.
The image fusion device of the embodiment traverses the pixel points in the infrared light low-frequency image, calculates the pixel value of the selected pixel point in the infrared light low-frequency image by using the infrared light suppression weight function, and obtains the pixel point after the pixel point in the infrared light low-frequency image is converted, so as to form the infrared light low-frequency suppression image.
Specifically, the image fusion device of the embodiment selects a pixel point in the infrared light low-frequency image, queries a preset infrared light suppression weight function, and calculates a pixel value of the selected pixel point by using the infrared light suppression weight function so as to convert the infrared light low-frequency image into an infrared light low-frequency suppression image.
The conversion of the infrared light low-frequency image and the infrared light low-frequency suppression image satisfies the following equation:
N_base2(i,j)=w_nir_inhibit*(N_base(i,j))
Wherein N_Base2 is an infrared light low-frequency inhibition image; w_nir_inhibit is an infrared light inhibition weight function; n_base is an infrared light low frequency image.
It should be noted that the value of the infrared light suppression weight function satisfies the following formula:
Wherein x1 is the abscissa of the inflection point of the infrared light suppression weight function; x is the pixel value of the selected pixel point in the infrared light low-frequency image, namely N_base(i,j); w_nir_inhibit is the infrared light suppression weight function; and k2 represents the degree of suppression of the infrared light low-frequency image: when k2 is 1, the image fusion device does not suppress the infrared light low-frequency image, and when k2 is less than 1, the smaller k2 is, the stronger the suppression of the infrared light low-frequency image.
Further, in a specific embodiment, when the input pixel value x is smaller than x1, the image fusion device does not process the pixel value of the input selected pixel point in the infrared light low-frequency image.
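Since the exact formula of the weight function is not reproduced above, the following NumPy sketch implements only its stated properties: w_nir_inhibit equals 1 below the knee x1 (no suppression), and k2 < 1 controls how strongly bright pixels are suppressed. The linear ramp from 1 down to k2 used here is an assumption, not the patent's exact formula:

```python
import numpy as np

def nir_lowfreq_suppress(n_base, x1=200.0, k2=0.6, x_max=255.0):
    """Infrared light low-frequency suppression (S305), as a sketch.

    Pixels below the knee x1 pass through unchanged (w = 1); above it,
    the weight ramps linearly down to k2 at the maximum pixel value.
    The ramp shape is an illustrative assumption.
    """
    x = np.asarray(n_base, dtype=np.float64)
    ramp = 1.0 - (1.0 - k2) * (x - x1) / (x_max - x1)
    w_nir_inhibit = np.where(x < x1, 1.0, np.clip(ramp, k2, 1.0))
    # N_base2(i,j) = w_nir_inhibit * N_base(i,j)
    return w_nir_inhibit * x
```

With x1 = 200 and k2 = 0.6, a pixel at 100 is untouched while a saturated pixel at 255 is attenuated to 153, illustrating how k2 bounds the strongest suppression.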
S306: and carrying out low-frequency fusion on the infrared light low-frequency inhibition image and the visible light low-frequency image by using a low-frequency fusion technology to obtain a low-frequency fusion image.
The image fusion apparatus of the present embodiment performs low-frequency fusion processing on the infrared light low-frequency suppression image and the visible light low-frequency image based on the infrared light low-frequency suppression image acquired in S305, to obtain a low-frequency fusion image.
Specifically, the calculation of the low-frequency fusion image satisfies the following equation:
F_base(i,j)=w_nir(i,j)*N_base2(i,j)+(1-w_nir(i,j))*V_base(i,j)
Wherein w_nir is the infrared light fusion proportion matrix, N_base2 is the infrared light low-frequency suppression image, V_base is the visible light low-frequency image, and F_base is the low-frequency fusion image.
In a specific embodiment, the weight value of each position in the infrared light fusion proportion matrix may be the same or different. The infrared light fusion proportion matrix can be obtained by the difference value of the infrared light low-frequency inhibition image and the visible light low-frequency image. The acquisition of the infrared light fusion proportion matrix meets the following formula:
w_nir(i,j)=min(k3*(1-abs(V_base(i,j)-N_base2(i,j))),1)
Wherein abs is the absolute value function, V_base is the visible light low-frequency image, N_base2 is the infrared light low-frequency suppression image, and k3 is a parameter for adjusting the infrared light fusion ratio: the larger k3 is, the higher the average fusion ratio of infrared light.
In a specific embodiment, when the pixel difference between the infrared light low-frequency suppression image and the visible light low-frequency image at a certain position is large, less infrared light brightness is fused; when the pixel difference at that position is small, more infrared light brightness is fused.
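The low-frequency fusion of S306 can be sketched directly from the two formulas above. Pixel values are assumed normalized to [0, 1] so that the absolute difference also lies in [0, 1]; the function name is illustrative:

```python
import numpy as np

def low_frequency_fuse(v_base, n_base2, k3=2.0):
    """Low-frequency fusion (S306).

    w_nir(i,j) = min(k3 * (1 - |V_base - N_base2|), 1): the infrared
    fusion ratio is large where the suppressed infrared low-frequency
    image and the visible low-frequency image agree, and small where
    they differ, so disagreeing regions keep more visible brightness.
    """
    v_base = np.asarray(v_base, dtype=np.float64)
    n_base2 = np.asarray(n_base2, dtype=np.float64)
    w_nir = np.minimum(k3 * (1.0 - np.abs(v_base - n_base2)), 1.0)
    # F_base(i,j) = w_nir*N_base2(i,j) + (1 - w_nir)*V_base(i,j)
    return w_nir * n_base2 + (1.0 - w_nir) * v_base
```

Where the two images agree exactly, w_nir saturates at 1 and the infrared brightness is taken wholesale; a large disagreement drives w_nir down and the result toward the visible brightness.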
S307: and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
The image fusion device of this embodiment performs fusion processing by using the high-frequency fusion matrix acquired in S304 and the low-frequency fusion image acquired in S306, so as to obtain a target fusion image.
Specifically, the calculation of the target fusion image satisfies the following equation:
F_result=F_base+F_detail
Wherein F_result is a target fusion image, F_base is a low-frequency fusion image, and F_detail is a high-frequency fusion matrix.
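Putting S301 to S307 together, the second embodiment can be sketched end to end as follows. The intermediate weighting steps are deliberately simplified — the low-pass filter is a plain box blur, high-frequency fusion picks the larger-magnitude detail, and low-frequency fusion is a plain average — so this shows only the base-plus-detail structure F_result = F_base + F_detail, not the patent's full weight-function and suppression machinery:

```python
import numpy as np

def box_blur(img, k=3):
    """Simple mean filter standing in for the unspecified low-pass filter."""
    pad = k // 2
    padded = np.pad(np.asarray(img, dtype=np.float64), pad, mode='edge')
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse(vis, nir):
    """Simplified end-to-end sketch of S301-S307 (assumptions noted above)."""
    v_base, n_base = box_blur(vis), box_blur(nir)          # S302
    v_detail, n_detail = vis - v_base, nir - n_base        # S303
    # S304 (simplified): keep the stronger detail response per pixel.
    f_detail = np.where(np.abs(v_detail) >= np.abs(n_detail), v_detail, n_detail)
    f_base = 0.5 * (v_base + n_base)                       # S305-S306 (simplified)
    return f_base + f_detail                               # S307: F_base + F_detail
```

Because detail = image - base, two identical inputs reconstruct themselves exactly, which is a quick sanity check on the decomposition.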
In this scheme, the image fusion device acquires a visible light image and an infrared light image; performs low-pass filtering processing on the visible light image and the infrared light image respectively to obtain a visible light low-frequency image and an infrared light low-frequency image; differences the visible light image and the visible light low-frequency image to obtain a first visible light detail matrix, and differences the infrared light image and the infrared light low-frequency image to obtain a first infrared light detail matrix; performs high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix; performs conversion processing on the infrared light low-frequency image by using an infrared light suppression weight function to obtain an infrared light low-frequency suppression image; performs low-frequency fusion on the infrared light low-frequency suppression image and the visible light low-frequency image by using a low-frequency fusion technology to obtain a low-frequency fusion image; and obtains a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
The image fusion device of this embodiment performs conversion processing on the infrared light low-frequency image by using the infrared light suppression weight function to obtain the infrared light low-frequency suppression image, thereby reducing the influence of excessively bright infrared light on the infrared light low-frequency image; performs low-frequency fusion on the infrared light low-frequency suppression image and the visible light low-frequency image by using a low-frequency fusion technology to obtain a low-frequency fusion image; and performs fusion processing by using the low-frequency fusion image and the high-frequency fusion matrix to obtain the target fusion image, thereby improving the naturalness of the details of the target fusion image.
In order to implement the image fusion method of the above embodiment, another terminal device is provided in the present application, and referring specifically to fig. 4, fig. 4 is a schematic structural diagram of an embodiment of an image fusion apparatus provided in the present application.
The image fusion apparatus 400 includes an acquisition unit 41, a filtering unit 42, a calculation unit 43, a high-frequency fusion unit 44, a low-frequency fusion unit 45, and a target fusion unit 46.
An acquisition unit 41 for acquiring a visible light image and an infrared light image.
And a filtering unit 42, configured to perform low-pass filtering processing on the visible light image and the infrared light image, so as to obtain a visible light low-frequency image and an infrared light low-frequency image.
The calculating unit 43 is configured to perform a difference between the visible light image and the visible light low-frequency image to obtain a first visible light detail matrix, and perform a difference between the infrared light image and the infrared light low-frequency image to obtain a first infrared light detail matrix.
The high-frequency fusion unit 44 is configured to perform high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix, so as to obtain a high-frequency fusion matrix.
The low-frequency fusion unit 45 is configured to perform fusion processing on the visible light low-frequency image and the infrared light low-frequency image, so as to obtain a low-frequency fusion image.
The target fusion unit 46 is configured to obtain a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
In order to implement the image fusion method of the above embodiment, another terminal device is provided in the present application, and referring specifically to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of the terminal device provided in the present application.
The terminal device 500 comprises a memory 51 and a processor 52, wherein the memory 51 and the processor 52 are coupled.
The memory 51 is used for storing program data and the processor 52 is used for executing the program data to implement the image fusion method of the above-described embodiment.
In this embodiment, the processor 52 may also be referred to as a CPU (central processing unit). The processor 52 may be an integrated circuit chip having signal processing capabilities. The processor 52 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The general-purpose processor may be a microprocessor, or the processor 52 may be any conventional processor or the like.
The present application also provides a computer storage medium 600, as shown in fig. 6, where the computer storage medium 600 is configured to store program data 61, and the program data 61, when executed by a processor, is configured to implement the image fusion method according to the embodiment of the present application.
The method described in the image fusion method embodiments of the present application, when implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a device such as a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing description is only of embodiments of the present application, and is not intended to limit the scope of the application, and all equivalent structures or equivalent processes using the descriptions and the drawings of the present application or directly or indirectly applied to other related technical fields are included in the scope of the present application.

Claims (9)

1. A method of fusing images, the method comprising:
obtaining a visible light image and an infrared light image;
respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image;
The visible light image and the visible light low-frequency image are subjected to difference to obtain a first visible light detail matrix, and the infrared light image and the infrared light low-frequency image are subjected to difference to obtain a first infrared light detail matrix;
performing high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix;
Performing fusion processing on the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image;
Obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix;
the step of performing high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix comprises the following steps:
Converting the first visible light detail matrix by using a visible light detail weight function to obtain a second visible light detail matrix;
the step of converting the first infrared light detail matrix by using an infrared light detail weight function to obtain a second infrared light detail matrix, further comprises the following steps:
calculating a difference image between the infrared light low-frequency image and the visible light low-frequency image;
Performing high-frequency and low-frequency joint control processing on the difference image and the second infrared detail matrix by using the obtained infrared high-frequency suppression parameters to obtain a third infrared detail matrix, wherein the low-frequency pixel difference value of a target position in the difference image and the suppression capability of the infrared high-frequency suppression parameters on the second infrared detail matrix corresponding to the target position are positively correlated;
and performing high-frequency fusion processing by using the second visible light detail matrix and the third infrared light detail matrix to obtain a high-frequency fusion matrix.
2. The fusion method according to claim 1, wherein the step of performing high-frequency fusion processing using the second visible light detail matrix and the third infrared light detail matrix to obtain a high-frequency fusion matrix further comprises:
performing low-pass filtering processing on the third infrared light detail matrix to obtain a fourth infrared light detail matrix;
and performing low-pass filtering processing on the second visible light detail matrix to obtain a third visible light detail matrix, and further comprising the steps of:
Performing excessive value inhibition processing on the third visible light detail matrix to obtain a fourth visible light detail matrix;
and performing high-frequency fusion processing by using the first visible light detail matrix, the first infrared light detail matrix, the fourth visible light detail matrix and the fourth infrared light detail matrix to obtain a high-frequency fusion matrix.
3. The fusion method according to claim 2, wherein the step of performing high-frequency fusion processing using the first visible light detail matrix, the first infrared light detail matrix, the fourth visible light detail matrix, and the fourth infrared light detail matrix to obtain a high-frequency fusion matrix includes:
comparing the numerical values of the fourth visible light detail matrix and the fourth infrared light detail matrix;
if the value of the fourth visible light detail matrix is larger than that of the fourth infrared light detail matrix, the first infrared light detail matrix is used as a high-frequency fusion matrix;
and if the numerical value of the fourth visible light detail matrix is smaller than that of the fourth infrared light detail matrix, taking the first visible light detail matrix as a high-frequency fusion matrix.
4. The fusion method according to claim 1, wherein the step of performing a conversion process on the first visible light detail matrix by using a visible light detail weight function to obtain a second visible light detail matrix includes:
traversing the pixel points in the first visible light detail matrix, and calculating the absolute value of the pixel value of the selected pixel point in the first visible light detail matrix by using the visible light detail weight function to obtain the pixel point after the pixel point in the first visible light detail matrix is converted so as to form a second visible light detail matrix;
The step of converting the first infrared light detail matrix by using an infrared light detail weight function to obtain a second infrared light detail matrix further comprises the following steps:
traversing the pixel points in the first infrared light detail matrix, and calculating the absolute value of the pixel value of the selected pixel point in the first infrared light detail matrix by using the infrared light detail weight function to obtain the pixel point after the pixel point in the first infrared light detail matrix is converted so as to form a second infrared light detail matrix.
5. The fusion method according to claim 1, wherein the step of performing fusion processing on the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image comprises:
Converting the infrared light low-frequency image by using an infrared light suppression weight function to obtain an infrared light low-frequency suppression image;
and performing low-frequency fusion on the infrared light low-frequency inhibition image and the visible light low-frequency image by using a low-frequency fusion technology to obtain a low-frequency fusion image.
6. The fusion method according to claim 5, wherein the step of converting the infrared light low-frequency image using an infrared light suppression weight function to obtain the infrared light low-frequency suppression image further comprises:
And traversing the pixel points in the infrared light low-frequency image, and calculating the pixel value of the selected pixel point in the infrared light low-frequency image by using the infrared light suppression weight function to obtain the pixel point converted by the pixel point in the infrared light low-frequency image so as to form an infrared light low-frequency suppression image.
7. An image fusion apparatus, the apparatus comprising:
An acquisition unit configured to acquire a visible light image and an infrared light image;
The filtering unit is used for respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image;
The computing unit is used for differentiating the visible light image and the visible light low-frequency image to obtain a first visible light detail matrix, and differentiating the infrared light image and the infrared light low-frequency image to obtain a first infrared light detail matrix;
The high-frequency fusion unit is used for carrying out high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix, and comprises the following steps: converting the first visible light detail matrix by using a visible light detail weight function to obtain a second visible light detail matrix; the step of converting the first infrared light detail matrix by using an infrared light detail weight function to obtain a second infrared light detail matrix, further comprises the following steps: calculating a difference image between the infrared light low-frequency image and the visible light low-frequency image; performing high-frequency and low-frequency joint control processing on the difference image and the second infrared detail matrix by using the obtained infrared high-frequency suppression parameters to obtain a third infrared detail matrix, wherein the low-frequency pixel difference value of a target position in the difference image and the suppression capability of the infrared high-frequency suppression parameters on the second infrared detail matrix corresponding to the target position are positively correlated; performing high-frequency fusion processing by using the second visible light detail matrix and the third infrared light detail matrix to obtain a high-frequency fusion matrix;
The low-frequency fusion unit is used for carrying out fusion processing on the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image;
And the target fusion unit is used for obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
8. A terminal device, the device comprising a memory and a processor coupled to the memory;
The memory is configured to store program data, and the processor is configured to execute the program data to implement the image fusion method according to any one of claims 1 to 7.
9. A computer readable storage medium for storing program data which, when executed by a processor, is adapted to carry out the method of fusion of images according to any one of claims 1 to 7.
CN202011009953.6A 2020-09-23 2020-09-23 Image fusion method, device, equipment and computer readable storage medium Active CN112233053B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011009953.6A CN112233053B (en) 2020-09-23 2020-09-23 Image fusion method, device, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN112233053A CN112233053A (en) 2021-01-15
CN112233053B (en) 2024-05-07

Family

ID=74108551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011009953.6A Active CN112233053B (en) 2020-09-23 2020-09-23 Image fusion method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112233053B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112950732B (en) * 2021-02-23 2022-04-01 北京三快在线科技有限公司 Image generation method and device, storage medium and electronic equipment
CN114677316B (en) * 2022-05-27 2022-11-25 深圳顶匠科技有限公司 Real-time visible light image and infrared image multi-channel fusion method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104504673A (en) * 2014-12-30 2015-04-08 武汉大学 Visible light and infrared images fusion method based on NSST and system thereof
CN105069768A (en) * 2015-08-05 2015-11-18 武汉高德红外股份有限公司 Visible-light image and infrared image fusion processing system and fusion method
KR101841939B1 (en) * 2016-12-12 2018-03-27 인천대학교 산학협력단 Image Processing Method using Fusion of Visible and Infrared Data
CN111586314A (en) * 2020-05-25 2020-08-25 浙江大华技术股份有限公司 Image fusion method and device and computer storage medium
Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant