CN112233053A - Image fusion method, device, equipment and computer readable storage medium - Google Patents

Image fusion method, device, equipment and computer readable storage medium

Info

Publication number
CN112233053A
Authority
CN
China
Prior art keywords
image
frequency
infrared light
matrix
low
Prior art date
Legal status
Pending
Application number
CN202011009953.6A
Other languages
Chinese (zh)
Inventor
杨熙丞
张东
王松
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202011009953.6A
Publication of CN112233053A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20224 Image subtraction

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application provides an image fusion method, device, equipment, and a computer-readable storage medium. The fusion method comprises the following steps: acquiring a visible light image and an infrared light image; performing low-pass filtering on the visible light image and the infrared light image respectively to obtain a visible light low-frequency image and an infrared light low-frequency image; subtracting the visible light low-frequency image from the visible light image to obtain a first visible light detail matrix, and subtracting the infrared light low-frequency image from the infrared light image to obtain a first infrared light detail matrix; performing high-frequency fusion processing with the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix; fusing the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fused image; and obtaining a target fusion image based on the low-frequency fused image and the high-frequency fusion matrix. This scheme improves the naturalness of the fused image details.

Description

Image fusion method, device, equipment and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular to an image fusion method, device, equipment, and computer-readable storage medium.
Background
With the development of monitoring technology, the requirements on the images acquired by monitoring equipment continue to rise. Especially in special scenes, such as low-light night scenes, ensuring image quality is a key problem in the monitoring field.
In the prior art, infrared texture information and visible texture information are typically extracted from a visible light image captured by a visible-band sensor and an infrared light image captured by an infrared-band sensor. Fused texture coefficients are obtained by taking the maximum of the texture coefficients, a reflection coefficient is derived from the fused texture coefficients, and the fusion result is then computed from the infrared light image, the visible light image, the fused texture coefficients, and the reflection coefficient. However, the fusion result obtained in this way often exhibits abrupt transitions in brightness and detail and looks unnatural.
Disclosure of Invention
The application provides an image fusion method, device, equipment, and computer-readable storage medium, and mainly addresses the technical problem of how to improve the naturalness of fused image details.
In order to solve the above technical problem, the present application provides an image fusion method, where the fusion method includes:
acquiring a visible light image and an infrared light image;
respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image;
subtracting the visible light low-frequency image from the visible light image to obtain a first visible light detail matrix, and subtracting the infrared light low-frequency image from the infrared light image to obtain a first infrared light detail matrix;
carrying out high-frequency fusion processing by utilizing the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix;
fusing the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fused image;
and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
In order to solve the above technical problem, the present application provides an image fusion apparatus, the apparatus including:
an acquisition unit configured to acquire a visible light image and an infrared light image;
the filtering unit is used for respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image;
the calculation unit is used for subtracting the visible light low-frequency image from the visible light image to obtain a first visible light detail matrix, and subtracting the infrared light low-frequency image from the infrared light image to obtain a first infrared light detail matrix;
the high-frequency fusion unit is used for performing high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix;
the low-frequency fusion unit is used for carrying out fusion processing on the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image;
and the target fusion unit is used for obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
To solve the above technical problem, the present application provides a terminal device, which includes a memory and a processor coupled to the memory;
the memory is used for storing program data, and the processor is used for executing the program data to realize the fusion method of the images.
To solve the above technical problem, the present application further provides a computer storage medium for storing program data; the program data, when executed by a processor, implements the image fusion method described above.
In the above scheme, a visible light image and an infrared light image are acquired; low-pass filtering is performed on each to obtain a visible light low-frequency image and an infrared light low-frequency image; the visible light low-frequency image is subtracted from the visible light image to obtain a first visible light detail matrix, and the infrared light low-frequency image is subtracted from the infrared light image to obtain a first infrared light detail matrix; high-frequency fusion processing is performed with the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix; the visible light low-frequency image and the infrared light low-frequency image are fused to obtain a low-frequency fused image; and a target fusion image is obtained based on the low-frequency fused image and the high-frequency fusion matrix. This scheme improves the naturalness of the fused image details.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art based on them without creative effort. Wherein:
FIG. 1 is a schematic flow chart diagram illustrating a first embodiment of an image fusion method provided in the present application;
FIG. 2 is a schematic flow chart of obtaining a high-frequency fusion matrix in the image fusion method shown in FIG. 1;
FIG. 3 is a flowchart illustrating a second embodiment of an image fusion method provided in the present application;
FIG. 4 is a schematic structural diagram of an embodiment of an image fusion apparatus provided in the present application;
FIG. 5 is a schematic structural diagram of an embodiment of a terminal device provided in the present application;
FIG. 6 is a block diagram of an embodiment of a computer storage medium provided herein.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The application provides an image fusion method that can be applied to security monitoring and that fuses a visible light image and an infrared light image captured in a low-light scene. The method can improve the naturalness of fused image details. Referring to FIG. 1, FIG. 1 is a schematic flow diagram of a first embodiment of the image fusion method provided by the application. The image fusion method of this embodiment can be applied to an image fusion device, and also to a server with data-processing capability. The method specifically comprises the following steps:
s101: a visible light image and an infrared light image are acquired.
In this embodiment, a visible light image and an infrared light image captured in a low-light scene are acquired and fused to obtain a fusion image with natural details. The image fusion device can capture the visible light image and the infrared light image of the low-light environment it is located in through a camera installed in the device: the camera acquires the visible light image by sensing the visible light band and the infrared light image by sensing the infrared band. The image fusion device may acquire these images at preset time intervals. One or more cameras can be provided, and a camera can be installed at any position in the image fusion device; to conveniently capture the low-light environment, the camera is preferably installed on the side of the device facing the external environment.
S102: and respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image.
The image fusion device obtains a visible light low-frequency image and an infrared light low-frequency image so that they can later be combined with the detail information into a fusion image with natural details. The image fusion device of this embodiment performs low-pass filtering on the visible light image to obtain the visible light low-frequency image, and performs low-pass filtering on the infrared light image to obtain the infrared light low-frequency image. Low-pass filters such as mean filtering, Gaussian filtering, and guided filtering may be used; this embodiment does not limit the choice of filter.
S103: and subtracting the visible light image from the visible light low-frequency image to obtain a first visible light detail matrix, and subtracting the infrared light image from the infrared light low-frequency image to obtain a first infrared light detail matrix.
To obtain the detail matrices corresponding to the visible light image and the infrared light image, the image fusion apparatus of this embodiment subtracts the visible light low-frequency image obtained in S102 from the visible light image obtained in S101, and subtracts the infrared light low-frequency image obtained in S102 from the infrared light image obtained in S101, yielding the first visible light detail matrix and the first infrared light detail matrix respectively.
Specifically, the image fusion device of this embodiment subtracts the visible light low-frequency image from the visible light image to obtain the first visible light detail matrix, and subtracts the infrared light low-frequency image from the infrared light image to obtain the first infrared light detail matrix.
Further, the calculation of the first visible light detail matrix satisfies the following equation:
V_detail=V1-V_base
where V_detail is the first visible light detail matrix, V1 is the visible light image, and V_base is the visible light low-frequency image.
The calculation of the first infrared light detail matrix satisfies the following formula:
N_detail=N1-N_base
wherein N_detail is the first infrared light detail matrix, N1 is the infrared light image, and N_base is the infrared light low-frequency image.
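The two steps above (S102 low-pass filtering and S103 subtraction) can be sketched as follows. This is a minimal NumPy illustration using a simple mean (box) filter, one of the low-pass filters the text mentions; the image contents and filter radius are placeholders, not values from the patent.

```python
import numpy as np

def box_low_pass(img: np.ndarray, radius: int = 2) -> np.ndarray:
    """Mean-filter low pass: average over a (2r+1) x (2r+1) window, edge-replicated."""
    k = 2 * radius + 1
    padded = np.pad(img.astype(np.float64), radius, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

# S102/S103: low-pass each source image, then subtract to get the detail matrices.
V1 = np.random.default_rng(0).uniform(0, 255, (8, 8))   # visible light image (stand-in)
N1 = np.random.default_rng(1).uniform(0, 255, (8, 8))   # infrared light image (stand-in)

V_base = box_low_pass(V1)      # visible light low-frequency image
N_base = box_low_pass(N1)      # infrared light low-frequency image
V_detail = V1 - V_base         # first visible light detail matrix (V_detail = V1 - V_base)
N_detail = N1 - N_base         # first infrared light detail matrix (N_detail = N1 - N_base)
```

By construction each source image decomposes exactly into its low-frequency image plus its detail matrix, which is what lets S106 later reassemble a full-band result from the two fused parts.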
S104: and performing high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix.
The image fusion device of this embodiment performs high-frequency fusion processing on the first visible light detail matrix and the first infrared light detail matrix based on the first visible light detail matrix and the first infrared light detail matrix acquired in S103, to obtain a high-frequency fusion matrix.
S105: and carrying out fusion processing on the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image.
The image fusion device of the present embodiment performs low-frequency fusion processing on the visible light low-frequency image and the infrared light low-frequency image based on the visible light low-frequency image and the infrared light low-frequency image acquired in S102, to obtain a low-frequency fusion image.
S106: and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
In order to improve the natural degree of the details of the fused image, the image fusion apparatus of the present embodiment fuses the low-frequency fused image acquired in S105 and the high-frequency fused matrix acquired in S104 to obtain the target fused image.
In this scheme, the image fusion device acquires a visible light image and an infrared light image; performs low-pass filtering on each to obtain a visible light low-frequency image and an infrared light low-frequency image; subtracts the visible light low-frequency image from the visible light image to obtain a first visible light detail matrix, and subtracts the infrared light low-frequency image from the infrared light image to obtain a first infrared light detail matrix; performs high-frequency fusion processing with the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix; fuses the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fused image; and obtains a target fusion image based on the low-frequency fused image and the high-frequency fusion matrix, thereby improving the naturalness of the fused image details.
Referring to fig. 2, fig. 2 is a schematic flow chart illustrating the process of obtaining the high-frequency fusion matrix in the image fusion method shown in fig. 1. In order to improve the natural degree of the details of the fused image, on the basis of the above embodiment, the image fusion method of the present embodiment further includes the following steps:
s201: and converting the first visible light detail matrix by using a visible light detail weight function to obtain a second visible light detail matrix.
In order to adjust the appropriate visible light image detail intensity, the image fusion apparatus of this embodiment performs conversion processing on the first visible light detail matrix by using the visible light weight function, so as to obtain a second visible light detail matrix.
The image fusion device of this embodiment traverses the pixel points in the first visible light detail matrix. For each selected pixel point, it queries the preset visible light detail weight function, applies it to the absolute value of the pixel value, and restores the sign; the converted pixel values form the second visible light detail matrix. In this way, the first visible light detail matrix is converted into the second visible light detail matrix.
The conversion of the first visible light detail matrix and the second visible light detail matrix satisfies the following equation:
V_detail2(i,j)=sign(V_detail(i,j))*w_vis_detail(abs(V_detail(i,j)))
where V_detail2 is the second visible light detail matrix, i and j respectively represent the vertical and horizontal coordinates of the visible light image, abs is the absolute value function, V_detail is the first visible light detail matrix, w_vis_detail is the visible light detail weight function, and the sign function is used to obtain the sign of each element in the first visible light detail matrix.
It should be noted that the value of the sign function satisfies the following formula:
sign(x) = 1, if x > 0; 0, if x = 0; -1, if x < 0
where x is the corresponding element of the first visible light detail matrix V_detail(i, j), so the value of the sign function is determined by the first visible light detail matrix.
Further, the visible light detail weight function satisfies the following formula:
w_vis_detail(x) = low_y, if x <= low_x; low_y + (x - low_x) * (high_y - low_y) / (high_x - low_x), if low_x < x < high_x; high_y, if x >= high_x
wherein w_vis_detail is the visible light detail weight function, and x is the absolute value of the selected pixel point in the first visible light detail matrix, namely abs(V_detail(i, j)); low_x and high_x are respectively the abscissas of turning point 1 and turning point 2 of the visible light detail weight function, and low_x is less than high_x; low_y and high_y are respectively the ordinates of turning point 1 and turning point 2, and low_y is less than high_y.
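A minimal sketch of the conversion in S201. The exact shape of the weight function is not fully given in this text beyond its two turning points, so the piecewise-linear curve through (low_x, low_y) and (high_x, high_y) below, and all parameter values, are assumptions for illustration.

```python
import numpy as np

def piecewise_weight(x, low_x, low_y, high_x, high_y):
    """Assumed piecewise-linear weight curve through the turning points
    (low_x, low_y) and (high_x, high_y); np.interp clamps outside that range."""
    return np.interp(x, [low_x, high_x], [low_y, high_y])

def convert_detail(detail, low_x=2.0, low_y=1.0, high_x=40.0, high_y=30.0):
    # V_detail2(i, j) = sign(V_detail(i, j)) * w_vis_detail(abs(V_detail(i, j)))
    return np.sign(detail) * piecewise_weight(np.abs(detail), low_x, low_y, high_x, high_y)
```

Because the weight is applied to the absolute value and the sign is restored afterwards, positive and negative detail values are remapped symmetrically.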
S202: and converting the first infrared light detail matrix by using an infrared light detail weight function to obtain a second infrared light detail matrix.
In order to adjust the appropriate infrared light image detail intensity, the image fusion apparatus of this embodiment performs conversion processing on the first infrared light detail matrix by using the infrared light detail weight function, so as to obtain a second infrared light detail matrix.
The image fusion device of this embodiment traverses the pixel points in the first infrared light detail matrix. For each selected pixel point, it queries the preset infrared light detail weight function, applies it to the absolute value of the pixel value, and restores the sign; the converted pixel values form the second infrared light detail matrix. In this way, the first infrared light detail matrix is converted into the second infrared light detail matrix.
The conversion of the first infrared light detail matrix and the second infrared light detail matrix satisfies the following equation:
N_detail2(i,j)=sign(N_detail(i,j))*w_nir_detail(abs(N_detail(i,j)))
where N_detail2 is the second infrared light detail matrix, i and j respectively represent the vertical and horizontal coordinates of the infrared light image, abs is the absolute value function, N_detail is the first infrared light detail matrix, w_nir_detail is the infrared light detail weight function, and the sign function is used to obtain the sign of each element in the first infrared light detail matrix.
In this step, reference may be made to S201 for values of the infrared light detail weight function and the sign function, which are not repeated herein.
S203: and calculating a difference image between the infrared light low-frequency image and the visible light low-frequency image.
In order to improve the natural degree of the fused image details, the image fusion device of the embodiment performs difference processing on the infrared light low-frequency image and the visible light low-frequency image, that is, calculates a difference image between the infrared light low-frequency image and the visible light low-frequency image, so as to obtain a third infrared light detail matrix according to the difference image.
S204: and carrying out high-frequency and low-frequency joint control processing on the difference image and the second infrared detail matrix to obtain a third infrared detail matrix.
The image fusion device of this embodiment performs high-frequency and low-frequency joint control processing on the difference image acquired in S203 and the second infrared light detail matrix acquired in S202, so as to obtain a third infrared light detail matrix.
Specifically, the calculation of the third infrared light detail matrix satisfies the following equation:
N_detail3(i, j) = N_detail2(i, j) * min(1, k1 / (abs(diff_base(i, j)) + eps))
wherein diff_base is the difference image between the infrared light low-frequency image and the visible light low-frequency image; i and j respectively represent the vertical and horizontal coordinates of the infrared light image; eps is a very small constant that prevents division by zero; N_detail2 is the second infrared light detail matrix; N_detail3 is the third infrared light detail matrix; and k1 is the infrared light high-frequency suppression parameter.
In a specific embodiment, the smaller k1 is, the more strongly the second infrared light detail matrix is suppressed; and the larger the difference between the infrared light low-frequency pixel and the visible light low-frequency pixel at a given position, the more strongly the second infrared light detail matrix is suppressed at that position.
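A sketch of the joint control step, assuming a suppression weight of the form min(1, k1 / (|diff_base| + eps)); the original formula is not legible in this text, so this particular form is only an assumption consistent with the stated behaviour of k1 and diff_base.

```python
import numpy as np

def joint_control(N_detail2, diff_base, k1=8.0, eps=1e-6):
    """Suppress infrared detail where the IR and visible low-frequency images disagree.

    Assumed weight: min(1, k1 / (|diff_base| + eps)). A smaller k1, or a larger
    low-frequency difference at a position, suppresses the second infrared light
    detail matrix more strongly at that position; eps avoids division by zero.
    """
    w = np.minimum(1.0, k1 / (np.abs(diff_base) + eps))
    return N_detail2 * w
```

Where the two low-frequency images agree (diff_base near zero) the weight saturates at 1 and the infrared detail passes through unchanged.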
S205: and performing low-pass filtering processing on the third infrared light detail matrix to obtain a fourth infrared light detail matrix, and performing low-pass filtering processing on the second visible light detail matrix to obtain a third visible light detail matrix.
In order to reduce noise in the third infrared light detail matrix and the second visible light detail matrix and to increase the detail intensity of the fused image, the image fusion device of this embodiment performs low-pass filtering on the third infrared light detail matrix to obtain a fourth infrared light detail matrix, and performs low-pass filtering on the second visible light detail matrix to obtain a third visible light detail matrix. Low-pass filters such as mean filtering, Gaussian filtering, and guided filtering may be used for this step.
In a specific embodiment, the image fusion device performs low-pass filtering on the third infrared light detail matrix and the second visible light detail matrix with a mean filter to obtain the fourth infrared light detail matrix and the third visible light detail matrix, respectively. Since the two matrices are filtered in the same way, this embodiment only describes applying the mean filter to the third infrared light detail matrix to obtain the fourth infrared light detail matrix.
Specifically, the calculation of the fourth infrared light detail matrix satisfies the following equation:
N_detail4(x, y) = sum over i = -r1..r1, j = -r1..r1 of w(i, j) * N_detail3(x + i, y + j)
wherein N_detail4 is the fourth infrared light detail matrix, N_detail3 is the third infrared light detail matrix, x and y are the pixel coordinates in the infrared light image, r1 is the filtering radius, and w is the weight matrix of the mean filter.
As for the mean-filter weight matrix w, in a specific embodiment the image fusion device may take a 5 × 5 neighborhood as an example, in which case the weight matrix satisfies the following formula:
w(i, j) = 1/25, for every i, j in the 5 × 5 neighborhood
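The double sum above can be sketched for a single pixel with the 5 × 5 uniform weight matrix; the detail-matrix contents below are placeholders.

```python
import numpy as np

r1 = 2                                   # filtering radius -> 5 x 5 neighborhood
k = 2 * r1 + 1
w = np.full((k, k), 1.0 / (k * k))       # uniform mean-filter weight matrix; sums to 1

# Apply the formula at one pixel (x, y) of an edge-padded detail matrix:
N_detail3 = np.pad(np.arange(36, dtype=float).reshape(6, 6), r1, mode="edge")
x, y = 3 + r1, 3 + r1                    # pixel (3, 3) in padded coordinates
N_detail4_xy = sum(
    w[i + r1, j + r1] * N_detail3[x + i, y + j]
    for i in range(-r1, r1 + 1)
    for j in range(-r1, r1 + 1)
)
```

Because the weights are uniform and sum to 1, the double sum at each pixel is simply the mean of its 5 × 5 neighborhood.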
S206: And performing excessive value suppression processing on the third visible light detail matrix to obtain a fourth visible light detail matrix.
In order to avoid the influence of the excessive value in the third visible light detail matrix on the natural degree of the fused image detail, the image fusion device of this embodiment performs the excessive value suppression processing on the third visible light detail matrix by using the third visible light detail matrix and the fourth infrared light detail matrix acquired in S205, so as to obtain a fourth visible light detail matrix.
Specifically, the calculation of the fourth visible light detail matrix satisfies the following equation:
V_detail4(i, j) = V_detail3(i, j), if V_detail3(i, j) < N_detail4(i, j); w_vis_inhibit * N_detail4(i, j) + (1 - w_vis_inhibit) * V_detail3(i, j), otherwise
wherein w_vis_inhibit is the weight parameter for suppressing the visible light detail matrix, V_detail3 is the third visible light detail matrix, V_detail4 is the fourth visible light detail matrix, and N_detail4 is the fourth infrared light detail matrix.
It should be noted that when w_vis_inhibit is 1, an excessive value in the third visible light detail matrix is completely suppressed (replaced by the corresponding fourth infrared light detail value); when w_vis_inhibit is 0, an excessive value in the third visible light detail matrix is not suppressed at all.
In a specific embodiment, the image fusion device obtains the fourth visible light detail matrix by comparing the values of the third visible light detail matrix and the fourth infrared light detail matrix. Specifically, if the value of the third visible light detail matrix is less than the value of the fourth infrared light detail matrix, the fourth visible light detail matrix takes the value V_detail3(i, j); if the value of the third visible light detail matrix is greater than or equal to the value of the fourth infrared light detail matrix, the fourth visible light detail matrix takes the value w_vis_inhibit * N_detail4(i, j) + (1 - w_vis_inhibit) * V_detail3(i, j).
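The branch logic of S206 can be sketched as follows; the matrix values are placeholders.

```python
import numpy as np

def suppress_excess(V_detail3, N_detail4, w_vis_inhibit=0.5):
    """Blend over-strong visible detail toward the infrared detail level.

    Where V_detail3 >= N_detail4, output
    w_vis_inhibit * N_detail4 + (1 - w_vis_inhibit) * V_detail3;
    elsewhere keep V_detail3 unchanged.
    """
    blended = w_vis_inhibit * N_detail4 + (1.0 - w_vis_inhibit) * V_detail3
    return np.where(V_detail3 < N_detail4, V_detail3, blended)
```

With w_vis_inhibit = 1 the excessive visible values are fully replaced by the infrared detail; with w_vis_inhibit = 0 they pass through unchanged, matching the note above.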
S207: comparing the magnitude of the fourth visible light detail matrix and the fourth infrared light detail matrix.
The image fusion device of this embodiment performs high-frequency fusion processing by using the acquired first visible light detail matrix, first infrared light detail matrix, fourth visible light detail matrix, and fourth infrared light detail matrix, to obtain a high-frequency fusion matrix.
Specifically, the image fusion device compares the magnitude of the fourth visible light detail matrix and the magnitude of the fourth infrared light detail matrix, and if the magnitude of the fourth visible light detail matrix is greater than the magnitude of the fourth infrared light detail matrix, S208 is executed, and the first infrared light detail matrix is taken as the high-frequency fusion matrix; if the value of the fourth visible light detail matrix is smaller than the value of the fourth infrared light detail matrix, S209 is performed to use the first visible light detail matrix as the high frequency fusion matrix.
Further, the calculation of the high frequency fusion matrix satisfies the following formula:
F_detail(i, j) = N_detail(i, j), if V_detail4(i, j) > N_detail4(i, j); V_detail(i, j), otherwise
wherein F_detail is the high-frequency fusion matrix, N_detail is the first infrared light detail matrix, V_detail is the first visible light detail matrix, N_detail4 is the fourth infrared light detail matrix, and V_detail4 is the fourth visible light detail matrix.
S208: and if the numerical value of the fourth visible light detail matrix is greater than that of the fourth infrared light detail matrix, taking the first infrared light detail matrix as a high-frequency fusion matrix.
S209: and if the numerical value of the fourth visible light detail matrix is smaller than that of the fourth infrared light detail matrix, taking the first visible light detail matrix as a high-frequency fusion matrix.
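The selection rule of S207-S209 can be sketched per pixel as follows. The tie case (the two values being equal) is not specified in the text and is assigned to the visible light branch here; the function name is chosen for illustration.

```python
import numpy as np

def fuse_high_frequency(v_detail, n_detail, v_detail4, n_detail4):
    """High-frequency fusion matrix selection (S207-S209).

    Where the fourth visible light detail matrix exceeds the fourth
    infrared light detail matrix, the FIRST infrared light detail matrix
    is taken; otherwise the first visible light detail matrix is taken.
    """
    return np.where(v_detail4 > n_detail4, n_detail, v_detail)
```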
In the scheme, the image fusion device converts the first visible light detail matrix by using a visible light detail weight function to obtain a second visible light detail matrix, and the detail intensity of the visible light image is adjusted; converting the first infrared detail matrix by using an infrared detail weight function to obtain a second infrared detail matrix, and adjusting the detail intensity of the infrared image; calculating a difference image between the infrared light low-frequency image and the visible light low-frequency image, and performing high-frequency and low-frequency joint control processing on the difference image and the second infrared light detail matrix to obtain a third infrared light detail matrix, so that the suppression on the second infrared light detail matrix is enhanced; the third infrared light detail matrix is subjected to low-pass filtering processing to obtain a fourth infrared light detail matrix, and the second visible light detail matrix is subjected to low-pass filtering processing to obtain a third visible light detail matrix, so that noise in the third infrared light detail matrix and the second visible light detail matrix is reduced, and detail intensity of the fused image is increased; and performing excessive value suppression processing on the third visible light detail matrix by using the third visible light detail matrix and the fourth infrared light detail matrix to obtain a fourth visible light detail matrix, and avoiding the influence of the excessive value in the third visible light detail matrix on the natural degree of the fused image details.
Referring to fig. 3, fig. 3 is a flowchart illustrating a second embodiment of an image fusion method according to the present application. Specifically, the image fusion method of the present embodiment includes the following steps:
s301: a visible light image and an infrared light image are acquired.
S302: and respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image.
S303: and subtracting the visible light image from the visible light low-frequency image to obtain a first visible light detail matrix, and subtracting the infrared light image from the infrared light low-frequency image to obtain a first infrared light detail matrix.
S304: and performing high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix.
The detailed descriptions of S301 to S304 in this embodiment can refer to the related descriptions in the above embodiments, and are not repeated herein.
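The decomposition of S302-S303 can be sketched as follows. This is a minimal NumPy sketch that assumes a simple box filter as the low-pass filter (the embodiment does not fix a particular kernel); the function names are chosen for illustration.

```python
import numpy as np

def box_blur(img, k=5):
    """Separable box low-pass filter with edge replication, standing in
    for the unspecified low-pass filtering of S302."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    kernel = np.ones(k) / k
    # filter rows, then columns
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, out)
    return out

def decompose(image, k=5):
    """S302-S303: base = low-pass(image); detail = image - base.

    Applied once to the visible light image and once to the infrared
    light image, this yields the low-frequency images and the first
    detail matrices."""
    base = box_blur(image, k)
    return base, image.astype(np.float64) - base
```

A flat image decomposes into itself plus an all-zero detail matrix, which is a quick sanity check for the base/detail split.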
S305: and converting the infrared light low-frequency image by using the infrared light suppression weight function to obtain the infrared light low-frequency suppression image.
In order to reduce the influence of the over-bright infrared light on the infrared light low-frequency image, the image fusion apparatus of this embodiment performs conversion processing on the infrared light low-frequency image by using an infrared light suppression weight function, so as to obtain an infrared light low-frequency suppression image.
The image fusion device of this embodiment traverses the pixel points in the infrared light low-frequency image, calculates the pixel value of each selected pixel point by using the infrared light suppression weight function, and obtains the converted pixel points, which together form the infrared light low-frequency suppression image.
Specifically, the image fusion apparatus of this embodiment selects a pixel point in the infrared low-frequency image, queries a preset infrared light suppression weight function, and calculates a pixel value of the selected pixel point by using the infrared light suppression weight function, so as to convert the infrared low-frequency image into the infrared low-frequency suppression image.
The conversion from the infrared light low-frequency image to the infrared light low-frequency suppression image satisfies the following formula:
N_base2(i,j)=w_nir_inhibit*(N_base(i,j))
wherein N_base2 is the infrared light low-frequency suppression image, w_nir_inhibit is the infrared light suppression weight function, and N_base is the infrared light low-frequency image.
It should be noted that the value of the infrared light suppression weight function satisfies the following formula:
[Formula figure: w_nir_inhibit(x) equals 1 when x < x1, and takes a k2-controlled attenuation value when x ≥ x1.]
wherein x1 is the abscissa of the transition point in the infrared light suppression weight function, and x is the pixel value of the selected pixel point in the infrared light low-frequency image, i.e. N_base(i, j); w_nir_inhibit is the infrared light suppression weight function; k2 represents the degree of suppression of the infrared light low-frequency image: when k2 is 1, the image fusion device does not suppress the infrared light low-frequency image, and when k2 is less than 1, the smaller k2 is, the more strongly the image fusion device suppresses the infrared light low-frequency image.
Further, in a specific embodiment, when the input pixel value x is smaller than x1, the image fusion device leaves the pixel value of the selected pixel point in the infrared light low-frequency image unchanged.
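The suppression of S305 can be sketched as follows. The exact weight formula appears only as a figure in the original, so this sketch is a hypothetical reconstruction: it keeps pixels below the transition point x1 unchanged and linearly attenuates the excess above x1 by k2, which matches the stated properties (k2 = 1 means no suppression; smaller k2 means stronger suppression; inputs below x1 are not processed). The parameter defaults are illustrative.

```python
import numpy as np

def nir_inhibit(n_base, x1=180.0, k2=0.5):
    """Hypothetical infrared light low-frequency suppression (S305).

    Pixels below the transition point x1 pass through unchanged; the
    portion of brighter pixels above x1 is scaled by k2, reducing the
    influence of over-bright infrared light on the low-frequency image.
    """
    x = n_base.astype(np.float64)
    suppressed = x1 + k2 * (x - x1)
    return np.where(x < x1, x, suppressed)
```

With these defaults, a pixel at 200 is pulled down to 190, a pixel at 100 is untouched, and setting k2 = 1 reproduces the input exactly.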
S306: and carrying out low-frequency fusion on the infrared light low-frequency suppression image and the visible light low-frequency image by using a low-frequency fusion technology to obtain a low-frequency fusion image.
The image fusion device of this embodiment performs low-frequency fusion processing on the infrared light low-frequency suppression image and the visible light low-frequency image based on the infrared light low-frequency suppression image acquired in S305, to obtain a low-frequency fusion image.
Specifically, the calculation of the low-frequency fusion image satisfies the following formula:
F_base(i,j)=w_nir(i,j)*N_base2(i,j)+(1-w_nir(i,j))*V_base(i,j)
wherein w_nir is the infrared light fusion proportion matrix, N_base2 is the infrared light low-frequency suppression image, V_base is the visible light low-frequency image, and F_base is the low-frequency fusion image.
In a specific embodiment, the weight value of each position in the infrared light fusion proportion matrix may be the same or different. The infrared light fusion proportion matrix can be obtained by the difference value of the infrared light low-frequency suppression image and the visible light low-frequency image. The acquisition of the infrared light fusion proportion matrix meets the following formula:
w_nir(i,j)=min(k3*(1-abs(V_base(i,j)-N_base2(i,j))),1)
wherein abs is the absolute value function, V_base is the visible light low-frequency image, N_base2 is the infrared light low-frequency suppression image, and k3 is a parameter for adjusting the infrared light fusion ratio; the larger k3 is, the higher the average fusion ratio of the infrared light.
In a specific embodiment, the larger the difference between the pixel of the infrared light low-frequency suppression image and the pixel of the visible light low-frequency image at a certain position, the less infrared light brightness is fused; the smaller the difference, the more infrared light brightness is fused.
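The low-frequency fusion of S306 can be sketched directly from the two formulas above. Because the ratio uses 1 - |V_base - N_base2|, the sketch assumes pixel values normalized to [0, 1] so that the absolute difference stays in range; the function name and the default for k3 are illustrative.

```python
import numpy as np

def fuse_low_frequency(v_base, n_base2, k3=1.0):
    """S306: low-frequency fusion with a difference-driven ratio matrix.

    w_nir(i,j) = min(k3 * (1 - |V_base - N_base2|), 1): the larger the
    local difference between the two low-frequency images, the less
    infrared brightness is fused at that position.
    """
    w_nir = np.minimum(k3 * (1.0 - np.abs(v_base - n_base2)), 1.0)
    return w_nir * n_base2 + (1.0 - w_nir) * v_base
```

For example, with k3 = 1, identical pixels fuse entirely from the infrared side (w_nir = 1), while a difference of 0.5 gives an even 50/50 blend.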
S307: and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
The image fusion device of the present embodiment performs fusion processing using the high-frequency fusion matrix obtained in S304 and the low-frequency fusion image obtained in S306, to obtain a target fusion image.
Specifically, the calculation of the target fusion image satisfies the following formula:
F_result=F_base+F_detail
wherein F_result is the target fusion image, F_base is the low-frequency fusion image, and F_detail is the high-frequency fusion matrix.
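The final step of S307 is a plain addition; the sketch below adds a clip to the valid pixel range as a practical safeguard, which the patent formula itself does not include.

```python
import numpy as np

def fuse_target(f_base, f_detail, lo=0.0, hi=255.0):
    """S307: F_result = F_base + F_detail, clipped to the pixel range.

    The clipping is an implementation precaution added here, since the
    sum of base and detail can leave the representable range."""
    return np.clip(f_base + f_detail, lo, hi)
```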
In the scheme, the image fusion device acquires a visible light image and an infrared light image; respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image; the visible light image and the visible light low-frequency image are subjected to subtraction to obtain a first visible light detail matrix, and the infrared light image and the infrared light low-frequency image are subjected to subtraction to obtain a first infrared light detail matrix; carrying out high-frequency fusion processing by utilizing the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix; converting the infrared light low-frequency image by using an infrared light suppression weight function to obtain an infrared light low-frequency suppression image; carrying out low-frequency fusion on the infrared light low-frequency suppression image and the visible light low-frequency image by using a low-frequency fusion technology to obtain a low-frequency fusion image; and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix. 
The image fusion device of the embodiment converts the infrared light low-frequency image by using the infrared light suppression weight function to obtain the infrared light low-frequency suppression image, so that the influence of over-bright infrared light on the infrared light low-frequency image is reduced; the low-frequency fusion technology is utilized to perform low-frequency fusion on the infrared light low-frequency suppression image and the visible light low-frequency image to obtain a low-frequency fusion image, and the low-frequency fusion image and the high-frequency fusion matrix are utilized to perform fusion processing to obtain a target fusion image, so that the natural degree of details of the target fusion image is improved.
To implement the image fusion method according to the foregoing embodiment, the present application provides another terminal device, and specifically refer to fig. 4, where fig. 4 is a schematic structural diagram of an embodiment of an image fusion apparatus according to the present application.
The image fusion apparatus 400 includes an acquisition unit 41, a filtering unit 42, a calculation unit 43, a high-frequency fusion unit 44, a low-frequency fusion unit 45, and a target fusion unit 46.
An acquisition unit 41 for acquiring a visible light image and an infrared light image.
And the filtering unit 42 is configured to perform low-pass filtering processing on the visible light image and the infrared light image respectively to obtain a visible light low-frequency image and an infrared light low-frequency image.
The calculating unit 43 is configured to obtain a first visible light detail matrix by subtracting the visible light image from the visible light low-frequency image, and obtain a first infrared light detail matrix by subtracting the infrared light image from the infrared light low-frequency image.
And the high-frequency fusion unit 44 is configured to perform high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix.
And the low-frequency fusion unit 45 is used for performing fusion processing on the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image.
And a target fusion unit 46, configured to obtain a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
To implement the image fusion method of the above embodiment, the present application provides another terminal device, and specifically refer to fig. 5, where fig. 5 is a schematic structural diagram of an embodiment of the terminal device provided in the present application.
The terminal device 500 comprises a memory 51 and a processor 52, wherein the memory 51 and the processor 52 are coupled.
The memory 51 is used for storing program data, and the processor 52 is used for executing the program data to realize the fusion method of the images of the above-described embodiment.
In the present embodiment, the processor 52 may also be referred to as a CPU (Central Processing Unit). Processor 52 may be an integrated circuit chip having signal processing capabilities. The processor 52 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general purpose processor may be a microprocessor, or the processor 52 may be any conventional processor or the like.
The present application further provides a computer storage medium 600, as shown in fig. 6, the computer storage medium 600 is used for storing program data 61, and the program data 61 is used for implementing the fusion method of the images as described in the method embodiment of the present application when being executed by the processor.
The method involved in the embodiments of the image fusion method of the present application, when implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a device, for example, a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in whole or in part in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program codes, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. A method for fusing images, the method comprising:
acquiring a visible light image and an infrared light image;
respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image;
the visible light image and the visible light low-frequency image are subjected to subtraction to obtain a first visible light detail matrix, and the infrared light image and the infrared light low-frequency image are subjected to subtraction to obtain a first infrared light detail matrix;
carrying out high-frequency fusion processing by utilizing the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix;
fusing the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fused image;
and obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
2. The fusion method according to claim 1, wherein the step of performing high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix comprises:
converting the first visible light detail matrix by using a visible light detail weight function to obtain a second visible light detail matrix;
converting the first infrared light detail matrix by using an infrared light detail weight function to obtain a second infrared light detail matrix;
calculating a difference image between the infrared light low-frequency image and the visible light low-frequency image;
carrying out high-frequency and low-frequency joint control processing on the difference image and the second infrared detail matrix to obtain a third infrared detail matrix;
and performing high-frequency fusion processing by using the second visible light detail matrix and the third infrared light detail matrix to obtain a high-frequency fusion matrix.
3. The fusion method according to claim 2, wherein the step of performing high-frequency fusion processing by using the second visible light detail matrix and the third infrared light detail matrix to obtain a high-frequency fusion matrix further comprises:
performing low-pass filtering processing on the third infrared light detail matrix to obtain a fourth infrared light detail matrix;
performing low-pass filtering processing on the second visible light detail matrix to obtain a third visible light detail matrix;
performing excessive value suppression processing on the third visible light detail matrix to obtain a fourth visible light detail matrix;
and performing high-frequency fusion processing by using the first visible light detail matrix, the first infrared light detail matrix, the fourth visible light detail matrix and the fourth infrared light detail matrix to obtain a high-frequency fusion matrix.
4. The fusion method according to claim 3, wherein the step of performing high-frequency fusion processing by using the first visible light detail matrix, the first infrared light detail matrix, the fourth visible light detail matrix, and the fourth infrared light detail matrix to obtain a high-frequency fusion matrix comprises:
comparing the magnitude of the values of the fourth visible light detail matrix and the fourth infrared light detail matrix;
if the numerical value of the fourth visible light detail matrix is larger than the numerical value of the fourth infrared light detail matrix, taking the first infrared light detail matrix as a high-frequency fusion matrix;
and if the numerical value of the fourth visible light detail matrix is smaller than the numerical value of the fourth infrared light detail matrix, taking the first visible light detail matrix as a high-frequency fusion matrix.
5. The fusion method according to claim 2, wherein the step of transforming the first visible light detail matrix by using the visible light detail weight function to obtain a second visible light detail matrix comprises:
traversing the pixels in the first visible light detail matrix, and calculating the absolute value of the pixel value of the selected pixel in the first visible light detail matrix by using the visible light detail weight function to obtain the converted pixel in the first visible light detail matrix so as to form a second visible light detail matrix;
the step of converting the first infrared light detail matrix by using the infrared light detail weight function to obtain a second infrared light detail matrix further includes:
and traversing the pixels in the first infrared light detail matrix, and calculating the absolute value of the pixel value of the selected pixel in the first infrared light detail matrix by using the infrared light detail weight function to obtain the converted pixel in the first infrared light detail matrix so as to form a second infrared light detail matrix.
6. The fusion method according to claim 1, wherein the step of fusing the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image comprises:
converting the infrared light low-frequency image by using an infrared light suppression weight function to obtain an infrared light low-frequency suppression image;
and carrying out low-frequency fusion on the infrared light low-frequency suppression image and the visible light low-frequency image by using a low-frequency fusion technology to obtain a low-frequency fusion image.
7. The fusion method according to claim 6, wherein the step of converting the infrared light low-frequency image by using an infrared light suppression weight function to obtain the infrared light low-frequency suppression image further comprises:
and traversing the pixels in the infrared light low-frequency image, and calculating the pixel value of the selected pixel in the infrared light low-frequency image by using the infrared light suppression weight function to obtain the converted pixel of the pixel in the infrared light low-frequency image so as to form the infrared light low-frequency suppression image.
8. An image fusion apparatus, characterized in that the apparatus comprises:
an acquisition unit configured to acquire a visible light image and an infrared light image;
the filtering unit is used for respectively carrying out low-pass filtering processing on the visible light image and the infrared light image to obtain a visible light low-frequency image and an infrared light low-frequency image;
the calculation unit is used for subtracting the visible light image from the visible light low-frequency image to obtain a first visible light detail matrix, and subtracting the infrared light image from the infrared light low-frequency image to obtain a first infrared light detail matrix;
the high-frequency fusion unit is used for performing high-frequency fusion processing by using the first visible light detail matrix and the first infrared light detail matrix to obtain a high-frequency fusion matrix;
the low-frequency fusion unit is used for carrying out fusion processing on the visible light low-frequency image and the infrared light low-frequency image to obtain a low-frequency fusion image;
and the target fusion unit is used for obtaining a target fusion image based on the low-frequency fusion image and the high-frequency fusion matrix.
9. A terminal device, comprising a memory and a processor coupled to the memory;
wherein the memory is used for storing program data, and the processor is used for executing the program data to realize the fusion method of the images according to any one of claims 1-7.
10. A computer-readable storage medium for storing program data which, when executed by a processor, is configured to implement the method of fusing images according to any one of claims 1 to 7.
CN202011009953.6A 2020-09-23 2020-09-23 Image fusion method, device, equipment and computer readable storage medium Pending CN112233053A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011009953.6A CN112233053A (en) 2020-09-23 2020-09-23 Image fusion method, device, equipment and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN112233053A true CN112233053A (en) 2021-01-15

Family

ID=74108551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011009953.6A Pending CN112233053A (en) 2020-09-23 2020-09-23 Image fusion method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112233053A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112950732A (en) * 2021-02-23 2021-06-11 北京三快在线科技有限公司 Image generation method and device, storage medium and electronic equipment
CN114677316A (en) * 2022-05-27 2022-06-28 深圳顶匠科技有限公司 Real-time visible light image and infrared image multi-channel fusion method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104504673A (en) * 2014-12-30 2015-04-08 武汉大学 Visible light and infrared images fusion method based on NSST and system thereof
CN105069768A (en) * 2015-08-05 2015-11-18 武汉高德红外股份有限公司 Visible-light image and infrared image fusion processing system and fusion method
KR101841939B1 (en) * 2016-12-12 2018-03-27 인천대학교 산학협력단 Image Processing Method using Fusion of Visible and Infrared Data
CN111586314A (en) * 2020-05-25 2020-08-25 浙江大华技术股份有限公司 Image fusion method and device and computer storage medium



Similar Documents

Publication Publication Date Title
CN109712102B (en) Image fusion method and device and image acquisition equipment
JP4640508B2 (en) Image processing apparatus, image processing method, program, and imaging apparatus
TWI596573B (en) Image processing device for reducing image noise and the method thereof
JP5144202B2 (en) Image processing apparatus and program
JP4858609B2 (en) Noise reduction device, noise reduction method, and noise reduction program
KR101327789B1 (en) Method and apparatus for reducing various noises of image simultaneously
CN111402258A (en) Image processing method, image processing device, storage medium and electronic equipment
JP2012165365A (en) Digital image stabilization method with adaptive filtering
CN104335565A (en) Image processing method with detail-enhancing filter with adaptive filter core
CN112233053A (en) Image fusion method, device, equipment and computer readable storage medium
US10127640B2 (en) Method and device for reducing noise in a component of a picture
TW201120809A (en) System and method for processing an image edge
CN111311482A (en) Background blurring method and device, terminal equipment and storage medium
CN112767291A (en) Visible light image and infrared image fusion method and device and readable storage medium
CN113168669A (en) Image processing method and device, electronic equipment and readable storage medium
JP2010268426A (en) Image processing apparatus, image processing method, and program
CN111383178A (en) Image enhancement method and device and terminal equipment
CN106709890B (en) Method and device for low-illumination video image processing
CN111311481A (en) Background blurring method and device, terminal equipment and storage medium
US11202045B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
JP2007334457A (en) Image processor and image processing method
JP2007323635A (en) Recursive filtering of video image
JP2009194721A (en) Image signal processing device, image signal processing method, and imaging device
US20120128076A1 (en) Apparatus and method for reducing blocking artifacts
CN112419161A (en) Image processing method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination