CN114418905A - Image fusion algorithm for pixel sampling center staggered overlapping arrangement dual-band detector - Google Patents


Info

Publication number
CN114418905A
Authority
CN
China
Prior art date
Legal status
Granted
Application number
CN202111434259.3A
Other languages
Chinese (zh)
Other versions
CN114418905B (en)
Inventor
赵灿兵
粟宇路
苏俊波
李谦
洪闻青
苏兰
王海洋
杨绍明
罗正钦
曾兴容
陈绍林
徐思轶
李学宽
Current Assignee
Kunming Institute of Physics
Original Assignee
Kunming Institute of Physics
Priority date
Filing date
Publication date
Application filed by Kunming Institute of Physics filed Critical Kunming Institute of Physics
Priority to CN202111434259.3A
Publication of CN114418905A
Application granted
Publication of CN114418905B
Status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 3/00: Geometric image transformations in the plane of the image
    • G06T 3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4053: Scaling based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10048: Infrared image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

The invention relates to an image fusion algorithm for a dual-band detector whose pixel sampling centers are arranged in a staggered, overlapping pattern, and belongs to the technical field of image processing. The algorithm comprises the following steps: separating the spatial information difference from the band information difference; recombining the spatial and band information; generating a black-and-white fused image based on spatial characteristics; and generating and converting an LAB-space color fused image. For this new type of dual-band detector, in which the pixel sampling centers of the two bands are staggered and overlapping, the method effectively exploits the high-frequency information produced by staggered sampling while preserving the band difference information, and maps the band characteristics through a color model better matched to human vision, yielding a color fused image with a better visual effect.

Description

Image fusion algorithm for pixel sampling center staggered overlapping arrangement dual-band detector
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to an image fusion algorithm of a dual-band detector with staggered and overlapped pixel sampling centers.
Background
Dual-band infrared imaging technology can simultaneously obtain infrared radiation information from two atmospheric windows, combining the advantages of the two single-band imaging technologies. Through this complementarity, dual-band infrared imaging improves the adaptability of infrared imaging equipment to complex environmental conditions and raises the success rate of detecting and identifying targets. How to fuse and display the different band and spatial information through an image fusion algorithm, so as to present a better visual effect, is of great significance for exploiting the advantages of dual-band infrared imaging.
At present, most dual-band infrared detectors use a stacked pixel structure in which the sampling centers of the two bands' pixels coincide, as shown in Fig. 2(b). The images of the two bands obtained by such a detector are spatially identical; only the band difference exists, so fusion only needs to extract and fuse that band difference. Current dual-band image fusion algorithms are essentially designed for this situation. In a new generation of dual-band detector proposed by the French CEA-Leti, the sampling centers of the two bands' pixels are staggered, as shown in Fig. 2(a). This structure facilitates the design of small-pixel dual-band detectors with simultaneous integration and output; moreover, because the medium-wave and long-wave pixels sample different positions simultaneously, the total spatial sampling frequency is increased and more detail can be obtained. However, the two bands' sampling centers are spatially offset, so the obtained images contain both a spatial difference and a band difference, with the two kinds of difference information coupled together. If a strategy similar to that used for the coincident-center case is adopted during fusion, image quality may ultimately be degraded. A new algorithm is therefore needed that separates and recombines the spatial and band information, exploiting the sampling-frequency gain brought by the staggered sampling centers while extracting and preserving the band characteristics. Overcoming these shortcomings of the prior art is thus a pressing problem in the field of image processing.
Disclosure of Invention
The invention aims to provide a novel image fusion algorithm for a dual-band detector whose pixel sampling centers are staggered and overlapping, capable of separating and recombining the spatial and band information, mapping the band characteristics through a color model better matched to human vision, exploiting the advantages of such a detector, and obtaining a color fused image with a better visual effect.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
the image fusion algorithm of the pixel sampling center staggered dual-band detector comprises the following steps:
step (1), separating spatial information difference and waveband information difference: the method comprises the steps of up-sampling a medium-long wave dual-waveband infrared image, and then performing frequency domain decomposition on the up-sampled image to obtain a high-frequency component and a low-frequency component;
and (2) recombining the spatial information and the waveband information: according to the pixel staggering condition, carrying out staggered arrangement or shifting on the long-wave high-frequency components to finish space difference recombination; taking the ratio of the low-frequency components of the two wave bands as a reference, and recombining the extracted high-frequency components and the extracted low-frequency components to ensure that the ratio of the recombined image is consistent with the ratio of the low-frequency components, and enhancing the high-frequency information in the recombining process;
and (3) generating a black-white fusion image based on the spatial characteristics: averaging the low-frequency components of the two wave bands to obtain low-frequency components of a black-white fused image, generating local contrast information by respectively combining the medium-wave high-frequency component, the long-wave high-frequency component and the recombined high-frequency component with the corresponding low-frequency components, selecting the high-frequency information with the maximum local contrast as the high-frequency components of the black-white fused image, and finally adding the high-frequency components and the low-frequency components to obtain the fused image;
and (4) generating and converting an LAB space color fusion image: adopting the black-white fusion image generated in the step (3) as an L component of the LAB space color fusion image; then, calculating a red-green component A and a blue-yellow component B of the LAB space color fusion image according to the intermediate wave and long wave gray level images recombined in the step (2); and finally, converting the color image in the LAB space into the RGB space to obtain a RGB space color fusion image.
Further, in step (1), preferably, the upsampling may adopt an interpolation algorithm or a super-resolution reconstruction algorithm.
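As a concrete illustration of the interpolation option, the sketch below doubles the resolution of a band image with bilinear interpolation via `scipy.ndimage.zoom` (the array contents and sizes are placeholders, not data from the patent):

```python
import numpy as np
from scipy import ndimage

def upsample_2x(img: np.ndarray) -> np.ndarray:
    """Double the image resolution with bilinear interpolation (spline order 1)."""
    return ndimage.zoom(img.astype(np.float64), 2, order=1)

mw = np.arange(64.0).reshape(8, 8)   # stand-in medium-wave frame
mw_up = upsample_2x(mw)              # 8x8 -> 16x16
```

A super-resolution reconstruction algorithm could be substituted for `upsample_2x` without changing the rest of the pipeline.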
Further, preferably, in step (2), the ratio of the two bands' low-frequency components is used as a reference, and the extracted high-frequency and low-frequency components are recombined so that the ratio of the recombined images is consistent with the ratio of the low-frequency components, with the high-frequency information enhanced during recombination; the specific formulas are as follows:
[The two recombination formulas are rendered as images in the original and are not reproduced here.]
where IBmw(i,j) and IBlw(i,j) are the medium-wave and long-wave gray images after band-information recombination, α is the high-frequency detail enhancement coefficient, ILmw(i,j) and ILlw(i,j) are the decomposed medium-wave and long-wave low-frequency components, IHmw(i,j) and IHlw(i,j) are the decomposed medium-wave and long-wave high-frequency components, and i and j denote pixel position coordinates.
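Since the recombination formulas appear only as images in the original, the sketch below implements one plausible form that satisfies the stated constraint: both bands share a common detail factor, so the per-pixel ratio of the recombined images equals the ratio of the low-frequency components. The pooled-detail term and the name `alpha` are assumptions, not the patent's exact formula:

```python
import numpy as np

def recombine_bands(IL_mw, IL_lw, IH_mw, IH_lw, alpha=0.5, eps=1e-6):
    """Recombine band and spatial information. Both bands are scaled by one
    common gain, so IB_mw / IB_lw equals IL_mw / IL_lw pixel by pixel (the
    ratio constraint stated in the text); the concrete form is an assumption."""
    detail = (IH_mw + IH_lw) / (IL_mw + IL_lw + eps)  # pooled high-frequency detail
    gain = 1.0 + alpha * detail                       # alpha: enhancement coefficient
    return IL_mw * gain, IL_lw * gain

IL_mw = np.full((4, 4), 100.0); IL_lw = np.full((4, 4), 50.0)
IH_mw = np.ones((4, 4));        IH_lw = 2.0 * np.ones((4, 4))
IB_mw, IB_lw = recombine_bands(IL_mw, IL_lw, IH_mw, IH_lw)
```

Whatever concrete formula is used, the multiplicative factor must be identical for both bands so the band characteristic difference carried by the low-frequency ratio survives recombination.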
Further, it is preferable that the specific method of step (3) is:
firstly, averaging low-frequency components of two wave bands to obtain the low-frequency component of a black-white fusion image, wherein the calculation formula is as follows:
ILgfu(i,j) = (ILmw(i,j) + ILlw(i,j)) / 2
where ILgfu(i,j) is the low-frequency component of the black-and-white fused image;
then, dividing a recombined high-frequency component obtained by recombining the medium-wave high frequency, the long-wave high frequency and the space difference by a corresponding low-frequency component to obtain local contrast information, and selecting the high-frequency information with the maximum local contrast as the high-frequency component of the black-white fused image;
C(i,j) = |IH(i,j)| / IL(i,j), IH(i,j) ∈ {IHmw(i,j), IHlw(i,j), IHrb(i,j)}
IHgfu(i,j) = the candidate IH(i,j) whose local contrast C(i,j) is largest
where IHgfu(i,j) is the high-frequency component of the black-and-white fused image, IHrb(i,j) is the recombined high-frequency component obtained by spatial-difference recombination, and IL(i,j) is the low-frequency component corresponding to each candidate;
and finally, adding the high-frequency component and the low-frequency component of the black-white fused image to obtain a black-white fused image, wherein the calculation formula is as follows:
Igfu(i,j)=ILgfu(i,j)+IHgfu(i,j)
in the formula Igfu(i, j) is the final black and white fused image.
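A minimal sketch of step (3), assuming local contrast is measured as |IH|/IL and that the recombined high-frequency component is compared against the averaged low frequency (the exact pairing of candidates with low-frequency references is printed as an image in the original, so it is an assumption here):

```python
import numpy as np

def fuse_grayscale(IL_mw, IL_lw, IH_mw, IH_lw, IH_rb, eps=1e-6):
    """Average the two bands' low frequencies, then select, per pixel, the
    high-frequency candidate with the largest local contrast |IH| / IL."""
    IL_gfu = (IL_mw + IL_lw) / 2.0
    cands = np.stack([IH_mw, IH_lw, IH_rb])   # candidate high frequencies
    refs = np.stack([IL_mw, IL_lw, IL_gfu])   # corresponding low frequencies (assumed)
    contrast = np.abs(cands) / (refs + eps)
    idx = np.argmax(contrast, axis=0)         # per-pixel winner
    IH_gfu = np.take_along_axis(cands, idx[None], axis=0)[0]
    return IL_gfu + IH_gfu

shape = (4, 4)
fused = fuse_grayscale(np.full(shape, 10.0), np.full(shape, 10.0),
                       np.full(shape, 1.0), np.full(shape, 2.0), np.full(shape, 3.0))
```

In this toy input the recombined component has the largest contrast everywhere, so it is selected at every pixel.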
Further, it is preferable that the specific method of step (4) is:
The black-and-white fused image generated in step (3) is adopted as the L component of the LAB-space color fused image; then features where the long wave is stronger are mapped to reddish hues and features where the medium wave is stronger to bluish hues, with the tone controlled by coefficients; the calculation formulas are as follows:
Acfu(i,j)=K1(IBlw(i,j)-IBmw(i,j))
Bcfu(i,j)=K2(IBlw(i,j)-IBmw(i,j))
where Acfu(i,j) and Bcfu(i,j) are respectively the red-green component A and the blue-yellow component B of the LAB-space color fused image, and K1 and K2 are color control coefficients;
finally, the color image in the LAB space is converted to the RGB space, resulting in R, G, B components of the color-fused image for image transmission and display.
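The final step requires a CIELAB-to-sRGB conversion. Below is a self-contained reference implementation of that standard colorimetric transform (D65 white point); it is generic color science, not code from the patent:

```python
import numpy as np

def lab_to_rgb(L, A, B):
    """Convert CIELAB (D65 white point) arrays to sRGB values in [0, 1]."""
    fy = (L + 16.0) / 116.0
    fx = fy + A / 500.0
    fz = fy - B / 200.0

    def finv(t):  # inverse of the CIELAB companding function
        d = 6.0 / 29.0
        return np.where(t > d, t ** 3, 3.0 * d * d * (t - 4.0 / 29.0))

    # LAB -> XYZ (D65 reference white)
    X = 0.95047 * finv(fx)
    Y = 1.00000 * finv(fy)
    Z = 1.08883 * finv(fz)
    # XYZ -> linear sRGB
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
    lin = np.stack([r, g, b], axis=-1)
    # linear -> gamma-encoded sRGB
    srgb = np.where(lin <= 0.0031308,
                    12.92 * lin,
                    1.055 * np.clip(lin, 0.0, None) ** (1.0 / 2.4) - 0.055)
    return np.clip(srgb, 0.0, 1.0)

white = lab_to_rgb(np.array([100.0]), np.array([0.0]), np.array([0.0]))
```

L = 100 with A = B = 0 maps to pure white, which is a convenient sanity check for the conversion.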
The high frequency component obtained by the frequency domain decomposition of the invention mainly comprises the spatial information difference, and the low frequency component mainly comprises the waveband information difference. The decomposition of the high and low frequency components may take any form of high pass or low pass filter.
In the present invention, the high-frequency detail enhancement coefficient α is set according to the desired detail enhancement effect and can be set to 0.5. In images recombined according to the formulas of the invention, the per-pixel ratio of the two bands is consistent with that of the low-frequency components, so the band characteristic difference is preserved.
In the present invention, K1 and K2 are color control coefficients that can be set according to the desired chromaticity and saturation; both can be set to 1.
In the invention, the long-wave high-frequency component is rearranged or shifted according to the pixel stagger to complete the spatial-difference recombination. As shown in Fig. 2, the medium-wave and long-wave pixel sampling centers are staggered by half a pixel in both the row and column directions; the algorithm can therefore interpolate the image to double resolution and shift one band back by one pixel in both the row and column directions.
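Under the half-pixel stagger described above, the spatial-difference recombination can be sketched as follows: after 2× interpolation the stagger becomes a whole-pixel offset, so one band's high-frequency component is shifted back by one pixel in both directions. The wrap-around edge handling of `np.roll` is a simplification; replicate-padding would avoid edge artifacts:

```python
import numpy as np

def align_staggered_hf(IH_lw_up: np.ndarray) -> np.ndarray:
    """After 2x upsampling, the half-pixel stagger between the two bands
    corresponds to a one-pixel offset; shift the long-wave high-frequency
    component back by one pixel in both the row and column directions."""
    return np.roll(IH_lw_up, shift=(-1, -1), axis=(0, 1))

hf = np.arange(16.0).reshape(4, 4)   # stand-in upsampled high-frequency component
aligned = align_staggered_hf(hf)
```

After this shift the long-wave high frequency is registered with the medium-wave grid, so the two can be compared and recombined pixel by pixel.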
Compared with the prior art, the invention has the beneficial effects that:
For the new type of dual-band detector whose two bands' pixel sampling centers are staggered and overlapping, the sampling centers have a spatial offset, so the obtained images contain both spatial and band differences, coupled together; the two kinds of difference information must therefore be separated and recombined during fusion. The method effectively exploits the high-frequency information produced by staggered sampling while preserving the band difference information, and maps the band characteristics through a color model better matched to human vision, yielding a color fused image with a better visual effect.
Existing dual-band image fusion techniques target detectors with fully coincident pixels or separate detectors, and cannot be directly reused for the new detector with staggered pixel sampling centers. The proposed algorithm fully exploits the advantages of this detector, recovering sub-pixel high-frequency information from the spatial difference between the two bands. For a four-bar target within the characteristic frequency, as long as one band yields a resolvable image, the fused image retains the ability to resolve the target; for a four-bar target beyond the characteristic frequency, even when neither band alone yields a resolvable image, the fused image improves the completeness of the target image and produces a barely resolvable one.
In addition, prior-art color fusion is based on RGB and similar spaces, without considering the influence of the color space on color accuracy. The proposed algorithm fuses in the LAB color space, achieving higher color accuracy than RGB-like spaces and a better visual effect.
Drawings
FIG. 1 is a flow chart of an image fusion algorithm of a pixel sampling center staggered overlapping arrangement dual-band detector according to the invention;
FIG. 2 is a schematic diagram of a staggered overlapping arrangement of pixel sampling centers of a dual-band detector; wherein, (a) is a staggered arrangement structure; (b) is of a laminated structure; d represents the center distance of the pixels;
FIG. 3 is a process result obtained by simulation using a simulation image; wherein, (a) is a long-wave infrared image; (b) is a medium wave infrared image; (c) is a black and white fused image; (d) is a color fused image.
Detailed Description
The present invention will be described in further detail with reference to examples.
Those skilled in the art will appreciate that the following examples merely illustrate the invention and should not be taken as limiting its scope. Where specific techniques or conditions are not indicated, the examples follow the techniques or conditions described in the literature of the art or the product specifications. Materials or equipment whose manufacturers are not indicated are conventional products available commercially.
Example 1
As shown in Fig. 1, the image fusion algorithm for a dual-band detector with staggered pixel sampling centers comprises the following steps:
Step (1), separating the spatial information difference from the band information difference: up-sample the medium-wave and long-wave dual-band infrared images, then decompose the up-sampled images in the frequency domain to obtain high-frequency and low-frequency components;
Step (2), recombining the spatial information and the band information: according to the pixel stagger, rearrange or shift the long-wave high-frequency component to complete the spatial-difference recombination; taking the ratio of the two bands' low-frequency components as a reference, recombine the extracted high-frequency and low-frequency components so that the ratio of the recombined images is consistent with the ratio of the low-frequency components, enhancing the high-frequency information during recombination;
Step (3), generating a black-and-white fused image based on spatial characteristics: average the two bands' low-frequency components to obtain the low-frequency component of the black-and-white fused image; combine the medium-wave, long-wave and recombined high-frequency components with their corresponding low-frequency components to generate local contrast information; select the high-frequency information with the largest local contrast as the high-frequency component of the black-and-white fused image; finally, add the high-frequency and low-frequency components to obtain the fused image;
Step (4), generating and converting the LAB-space color fused image: use the black-and-white fused image generated in step (3) as the L component of the LAB-space color fused image; then compute the red-green component A and the blue-yellow component B of the LAB-space color fused image from the medium-wave and long-wave gray images recombined in step (2); finally, convert the LAB-space color image to RGB space to obtain the RGB color fused image.
Example 2
As shown in Fig. 1, the image fusion algorithm for a dual-band detector with staggered pixel sampling centers comprises the following steps:
Step (1), separating the spatial information difference from the band information difference: up-sample the medium-wave and long-wave dual-band infrared images, then decompose the up-sampled images in the frequency domain to obtain high-frequency and low-frequency components;
Step (2), recombining the spatial information and the band information: according to the pixel stagger, rearrange or shift the long-wave high-frequency component to complete the spatial-difference recombination; taking the ratio of the two bands' low-frequency components as a reference, recombine the extracted high-frequency and low-frequency components so that the ratio of the recombined images is consistent with the ratio of the low-frequency components, enhancing the high-frequency information during recombination;
Step (3), generating a black-and-white fused image based on spatial characteristics: average the two bands' low-frequency components to obtain the low-frequency component of the black-and-white fused image; combine the medium-wave, long-wave and recombined high-frequency components with their corresponding low-frequency components to generate local contrast information; select the high-frequency information with the largest local contrast as the high-frequency component of the black-and-white fused image; finally, add the high-frequency and low-frequency components to obtain the fused image;
Step (4), generating and converting the LAB-space color fused image: use the black-and-white fused image generated in step (3) as the L component of the LAB-space color fused image; then compute the red-green component A and the blue-yellow component B of the LAB-space color fused image from the medium-wave and long-wave gray images recombined in step (2); finally, convert the LAB-space color image to RGB space to obtain the RGB color fused image.
In the step (1), the upsampling can adopt an interpolation algorithm or a super-resolution reconstruction algorithm.
In step (2), the ratio of the two bands' low-frequency components is used as a reference, and the extracted high-frequency and low-frequency components are recombined so that the ratio of the recombined images is consistent with the ratio of the low-frequency components, with the high-frequency information enhanced during recombination; the specific formulas are as follows:
[The two recombination formulas are rendered as images in the original and are not reproduced here.]
where IBmw(i,j) and IBlw(i,j) are the medium-wave and long-wave gray images after band-information recombination, α is the high-frequency detail enhancement coefficient, ILmw(i,j) and ILlw(i,j) are the decomposed medium-wave and long-wave low-frequency components, IHmw(i,j) and IHlw(i,j) are the decomposed medium-wave and long-wave high-frequency components, and i and j denote pixel position coordinates.
The specific method of the step (3) is as follows:
firstly, averaging low-frequency components of two wave bands to obtain the low-frequency component of a black-white fusion image, wherein the calculation formula is as follows:
ILgfu(i,j) = (ILmw(i,j) + ILlw(i,j)) / 2
where ILgfu(i,j) is the low-frequency component of the black-and-white fused image;
then, dividing a recombined high-frequency component obtained by recombining the medium-wave high frequency, the long-wave high frequency and the space difference by a corresponding low-frequency component to obtain local contrast information, and selecting the high-frequency information with the maximum local contrast as the high-frequency component of the black-white fused image;
C(i,j) = |IH(i,j)| / IL(i,j), IH(i,j) ∈ {IHmw(i,j), IHlw(i,j), IHrb(i,j)}
IHgfu(i,j) = the candidate IH(i,j) whose local contrast C(i,j) is largest
where IHgfu(i,j) is the high-frequency component of the black-and-white fused image, IHrb(i,j) is the recombined high-frequency component obtained by spatial-difference recombination, and IL(i,j) is the low-frequency component corresponding to each candidate;
and finally, adding the high-frequency component and the low-frequency component of the black-white fused image to obtain a black-white fused image, wherein the calculation formula is as follows:
Igfu(i,j)=ILgfu(i,j)+IHgfu(i,j)
in the formula Igfu(i, j) is the final black and white fused image.
The specific method of the step (4) is as follows:
adopting the black-white fusion image generated in the step (3) as an L component of the LAB space color fusion image; then, mapping the features of long wave intensity to red bias, mapping the features of medium wave intensity to blue bias, and performing tone control, wherein the calculation formula is as follows:
Acfu(i,j)=K1(IBlw(i,j)-IBmw(i,j))
Bcfu(i,j)=K2(IBlw(i,j)-IBmw(i,j))
where Acfu(i,j) and Bcfu(i,j) are respectively the red-green component A and the blue-yellow component B of the LAB-space color fused image, and K1 and K2 are color control coefficients;
finally, the color image in the LAB space is converted to the RGB space, resulting in R, G, B components of the color-fused image for image transmission and display.
Example 3
As shown in fig. 1, the image fusion algorithm of the pixel sampling center staggered dual-band detector comprises the following steps:
step (1), separating spatial information difference and wave band information difference;
recombining the spatial information and the wave band information;
generating a black-white fusion image based on the spatial characteristics;
and (4) generating an LAB space color fusion image according to the wave band characteristics and converting the LAB space color fusion image into an RGB space.
Further, the step (1) specifically comprises:
up-sample the medium-wave and long-wave dual-band infrared images simultaneously, doubling the image resolution so that the high-frequency information obtained by staggered sampling can be conveniently extracted and recombined; the up-sampling may adopt an interpolation or super-resolution reconstruction algorithm;
the upsampled image is decomposed into a high-frequency component and a low-frequency component in a frequency domain, and the frequency domain decomposition can be performed through any form of high-pass or low-pass filter, and the key point is to select a proper filtering frequency to separate difference information (generally, one half of the sampling frequency). The high frequency component mainly contains spatial information difference, and the low frequency component mainly contains band information difference.
Imw(i,j)=ILmw(i,j)+IHmw(i,j)
Ilw(i,j)=ILlw(i,j)+IHlw(i,j)
where Imw(i,j) and Ilw(i,j) are the medium-wave and long-wave images, ILmw(i,j) and ILlw(i,j) are the decomposed medium-wave and long-wave low-frequency components, IHmw(i,j) and IHlw(i,j) are the decomposed medium-wave and long-wave high-frequency components, and i and j denote pixel position coordinates.
Further, the step (2) specifically comprises:
according to the pixel staggering condition, carrying out staggered arrangement or shifting on the long-wave high-frequency components to finish space difference recombination;
the ratio of the low-frequency components of the two wave bands is used as a reference, the extracted high-frequency components and the extracted low-frequency components are recombined, so that the ratio of the recombined image is consistent with the ratio of the low-frequency components, in the process of recombination, high-frequency information is enhanced, and the calculation formula is as follows:
[The two recombination formulas are rendered as images in the original and are not reproduced here.]
where IBmw(i,j) and IBlw(i,j) are the medium-wave and long-wave gray images after band-information recombination; the high-frequency detail enhancement coefficient α may be determined according to the ratio of the high-frequency and low-frequency components.
Further, the step (3) specifically comprises:
firstly, averaging low-frequency components of two wave bands to obtain low-frequency components of a black-white fusion image, wherein a calculation formula is as follows:
ILgfu(i,j) = (ILmw(i,j) + ILlw(i,j)) / 2
where ILgfu(i,j) is the low-frequency component of the black-and-white fused image;
Then divide the medium-wave high-frequency, long-wave high-frequency and spatially recombined high-frequency components by their corresponding low-frequency components to obtain local contrast information, and select the high-frequency information with the largest local contrast as the high-frequency component of the black-and-white fused image; different methods may be used to compute the local contrast, the key point being to obtain better detail contrast.
C(i,j) = |IH(i,j)| / IL(i,j), IH(i,j) ∈ {IHmw(i,j), IHlw(i,j), IHrb(i,j)}
IHgfu(i,j) = the candidate IH(i,j) whose local contrast C(i,j) is largest
where IHgfu(i,j) is the high-frequency component of the black-and-white fused image, IHrb(i,j) is the recombined high-frequency component obtained by spatial-difference recombination, and IL(i,j) is the low-frequency component corresponding to each candidate;
and finally, adding the high-frequency component and the low-frequency component to obtain a fused image, wherein the calculation formula is as follows:
Igfu(i,j)=ILgfu(i,j)+IHgfu(i,j)
in the formula Igfu(i, j) is the final black and white fused image.
Further, the step (4) specifically comprises:
the LAB color space divides the color space into an L (brightness) component, an A (red green) component and a B (blue yellow) component according to the color visual effect of human eyes, and the correlation of the three components is low, thereby being beneficial to the control of colors. By adopting the space to carry out color fusion, L, A, B images of three components are generated according to the wave band characteristics of the two-wave band image.
First, since the human eye is more sensitive to a high frequency signal among luminance components, the black-and-white fused image having high frequency spatial information generated in step (3) is directly adopted as the L component of the color fused image.
Then, according to the difference of the medium wave and long wave characteristics, generating color components A and B, and considering the infrared radiation characteristics of the conventional scene, mapping the characteristics of the long wave intensity to the reddish color, mapping the characteristics of the medium wave intensity to the bluish color, and controlling the tone by the coefficient, wherein the calculation formula is as follows:
Acfu(i,j)=K1(IBlw(i,j)-IBmw(i,j))
Bcfu(i,j)=K2(IBlw(i,j)-IBmw(i,j))
where Acfu(i,j) and Bcfu(i,j) are respectively the red-green component A and the blue-yellow component B of the color fused image, and K1 and K2 are color control coefficients that can be set according to the desired chromaticity and saturation.
Finally, the color image in the LAB space is converted to the RGB space, resulting in R, G, B components of the color-fused image for image transmission and display.
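The A and B mapping above can be sketched as follows (NumPy is used for illustration; the function name and the example coefficients K1 = K2 = 1 are assumptions, not taken from the patent):

```python
import numpy as np

def color_components(IB_lw, IB_mw, k1=1.0, k2=1.0):
    """Map the band difference to the LAB chroma channels.

    Stronger long-wave radiation pushes A positive (toward red);
    stronger medium-wave radiation pushes A/B negative (toward blue).
    """
    A = k1 * (IB_lw - IB_mw)  # red-green component
    B = k2 * (IB_lw - IB_mw)  # blue-yellow component
    return A, B
```

The coefficients trade off saturation against hue neutrality and would normally be tuned per sensor.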
Application Example
The invention provides an image fusion algorithm for a dual-band detector with staggered pixel sampling centers; the processing flow is shown in Figure 1, and the specific implementation comprises the following steps:
Step 1: separation of spatial information difference and band information difference:
The medium-wave and long-wave infrared images are up-sampled simultaneously to double the image resolution; the up-sampling may use an interpolation or super-resolution reconstruction algorithm. As a simple verification, a bilinear interpolation algorithm is used here.
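As an illustration, the 2× bilinear up-sampling used here for simple verification can be sketched in NumPy as follows (the function name and array handling are illustrative, not from the patent):

```python
import numpy as np

def upsample_2x_bilinear(img):
    """Double the resolution of a 2-D image by bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, 2 * h)   # target rows in source coordinates
    xs = np.linspace(0, w - 1, 2 * w)   # target cols in source coordinates
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)      # clamp at the bottom/right edges
    x1 = np.minimum(x0 + 1, w - 1)
    fy = (ys - y0)[:, None]             # fractional offsets
    fx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - fx) + img[np.ix_(y0, x1)] * fx
    bot = img[np.ix_(y1, x0)] * (1 - fx) + img[np.ix_(y1, x1)] * fx
    return top * (1 - fy) + bot * fy
```

A super-resolution network could replace this function without changing the rest of the pipeline.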
The up-sampled images are low-pass filtered to decompose the low-frequency components, through which the band characteristic difference can be extracted. A Gaussian low-pass filter is adopted; the calculation formula is as follows:
ILmw(i,j)=Imw(i,j)*gmw(i,j)
ILlw(i,j)=Ilw(i,j)*glw(i,j)
in the formula, ILmw(i,j) and ILlw(i,j) are the decomposed medium-wave and long-wave low-frequency components, Imw(i,j) and Ilw(i,j) are the medium-wave and long-wave images, and gmw(i,j) and glw(i,j) are Gaussian filter templates; low-pass filtering is realized by convolving the image with the filter template, and the adopted 3×3 Gaussian filter template is as follows:
[3×3 Gaussian filter template, rendered as a figure in the original document and not reproduced here]
The low-frequency component is then subtracted from the original image to obtain the high-frequency component; the spatial information difference produced by the staggered sampling of the two bands resides mainly in the high-frequency components. The calculation formula is as follows:
IHmw(i,j)=Imw(i,j)-ILmw(i,j)
IHlw(i,j)=Ilw(i,j)-ILlw(i,j)
in the formula, IHmw(i,j) and IHlw(i,j) are the extracted medium-wave and long-wave high-frequency components.
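The decomposition of step 1 can be sketched as follows. The patent's exact 3×3 Gaussian template is rendered only as a figure; the common normalized kernel [[1,2,1],[2,4,2],[1,2,1]]/16 is assumed here for illustration:

```python
import numpy as np

# Assumed normalized 3x3 Gaussian template (sums to 1).
GAUSS_3x3 = np.array([[1, 2, 1],
                      [2, 4, 2],
                      [1, 2, 1]], dtype=float) / 16.0

def decompose(img, kernel=GAUSS_3x3):
    """Split an image into low-frequency (Gaussian-blurred) and
    high-frequency (residual) components, so img == low + high."""
    pad = np.pad(img, 1, mode='edge')
    low = np.zeros_like(img, dtype=float)
    for di in range(3):          # 3x3 convolution; the kernel is symmetric,
        for dj in range(3):      # so correlation equals convolution
            low += kernel[di, dj] * pad[di:di + img.shape[0],
                                        dj:dj + img.shape[1]]
    high = img - low
    return low, high
```

Both bands are decomposed with the same routine; only the template may differ per band.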
Step 2: recombination of spatial information and band information:
According to the pixel stagger, the long-wave high-frequency component is rearranged or shifted to complete the spatial difference recombination.
Taking the ratio of the low-frequency components of the two bands as a reference, the extracted high-frequency and low-frequency components are recombined so that the ratio of the recombined images is consistent with the ratio of the low-frequency components; high-frequency information is enhanced during recombination. The calculation formula is as follows:
[Medium-wave and long-wave band-information recombination formulas, rendered as figures in the original document and not reproduced here]
in the formula, IBmw(i,j) and IBlw(i,j) are the medium-wave and long-wave gray images after band-information recombination, and the high-frequency detail enhancement factor (rendered as a figure in the original document) may be set to 0.5 here.
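The recombination formulas themselves appear only as figures in this text. The sketch below is one plausible construction that satisfies the two stated constraints — the recombined images keep the ratio of the low-frequency components, and high-frequency detail is boosted by the enhancement factor — and is an assumption, not necessarily the patent's exact formula:

```python
import numpy as np

def recombine_bands(IL_mw, IH_mw, IL_lw, IH_lw, k=0.5, eps=1e-6):
    """Band-information recombination (illustrative construction only).

    A shared, enhanced detail term is applied multiplicatively to both
    low-frequency images, so IB_mw / IB_lw == IL_mw / IL_lw everywhere.
    """
    detail = (1.0 + k) * (IH_mw + IH_lw) / (IL_mw + IL_lw + eps)
    IB_mw = IL_mw * (1.0 + detail)
    IB_lw = IL_lw * (1.0 + detail)
    return IB_mw, IB_lw
```

The ratio-preservation property is what the A/B chroma mapping in step (4) relies on.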
Step 3: generating a black-and-white fused image based on spatial characteristics:
First, the low-frequency components of the two bands are averaged to obtain the low-frequency component of the black-and-white fused image; the calculation formula is as follows:
ILgfu(i,j)=(ILmw(i,j)+ILlw(i,j))/2
in the formula, ILgfu(i,j) is the low-frequency component of the black-and-white fused image;
then, local contrast information is generated from the medium-wave high-frequency component, the long-wave high-frequency component and the recombined high-frequency component obtained by spatial difference recombination, each divided by its corresponding low-frequency component, and the high-frequency information with the maximum local contrast is selected as the high-frequency component of the black-and-white fused image. The calculation formula is as follows:
C(i,j)=IH(i,j)/IL(i,j)
IHgfu(i,j)=IH*(i,j), where IH*(i,j) is the candidate high-frequency component whose local contrast |C(i,j)| is maximal,
and IH(i,j)∈{IHmw(i,j),IHlw(i,j),IHrb(i,j)}
in the formula, IHgfu(i, j) is the high frequency component of the black and white blended image, IHrb(i, j) is a high frequency component reconstructed using the two-band spatial difference information;
and finally, adding the high-frequency component and the low-frequency component to obtain a fused image, wherein the calculation formula is as follows:
Igfu(i,j)=ILgfu(i,j)+IHgfu(i,j)
in the formula Igfu(i, j) is the final black and white fused image.
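Step 3 can be sketched as follows (illustrative NumPy; the eps guard against division by zero is an addition the patent does not mention):

```python
import numpy as np

def fuse_grayscale(IL_mw, IL_lw, highs, lows, eps=1e-6):
    """Black-and-white fusion: average the low-frequency components, then
    at each pixel keep the high-frequency candidate with the largest
    local contrast |IH / IL| among the medium-wave, long-wave and
    spatially-recombined candidates."""
    IL_gfu = 0.5 * (IL_mw + IL_lw)
    IH = np.stack(highs)                  # (n, H, W) candidate details
    IL = np.stack(lows)                   # matching low-frequency references
    contrast = np.abs(IH / (IL + eps))
    best = np.argmax(contrast, axis=0)    # per-pixel winning candidate
    IH_gfu = np.take_along_axis(IH, best[None], axis=0)[0]
    return IL_gfu + IH_gfu
```

`highs` and `lows` are equal-length lists pairing each high-frequency candidate with its reference low-frequency image.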
Step 4: generating the LAB-space color fusion image according to band characteristics and converting it into RGB space:
the LAB color space separates color into an L (lightness) component, an A (red-green) component and a B (blue-yellow) component according to the color perception of the human eye; the correlation among the three components is low, which facilitates color control. Color fusion is performed in this space, and the three component images L, A and B are generated according to the band characteristics of the dual-band image.
First, since the human eye is more sensitive to high-frequency signals in the luminance component, the black-and-white fused image with high-frequency spatial information generated in step 3 is directly adopted as the L component of the color fused image.
Then, the color components A and B are generated from the difference between the medium-wave and long-wave characteristics. Considering the infrared radiation characteristics of typical scenes, regions with stronger long-wave radiation are mapped toward red and regions with stronger medium-wave radiation toward blue, with the hue controlled by coefficients. The calculation formula is as follows:
Acfu(i,j)=K1(IBlw(i,j)-IBmw(i,j))
Bcfu(i,j)=K2(IBlw(i,j)-IBmw(i,j))
in the formula, Acfu(i,j) and Bcfu(i,j) are respectively the red-green component A and the blue-yellow component B of the color fusion image, and K1 and K2 are color control coefficients, both set to 1 here.
Finally, the color image in the LAB space is converted into the RGB space (any common color space conversion method may be used and is not described here), and the R, G, B components of the color fusion image are obtained for image transmission and display.
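The patent leaves the LAB-to-RGB conversion to common methods. For completeness, a minimal sketch of the standard CIELAB → XYZ → sRGB path is given below; the D65 white point and sRGB matrix constants are the usual published values, not taken from the patent:

```python
import numpy as np

# D65 reference white.
XN, YN, ZN = 0.95047, 1.0, 1.08883

def lab_to_rgb(L, a, b):
    """Convert CIELAB arrays (L in [0, 100]) to sRGB values in [0, 1]."""
    fy = (L + 16.0) / 116.0
    fx = fy + a / 500.0
    fz = fy - b / 200.0

    def f_inv(t):  # inverse of the CIELAB companding function
        d = 6.0 / 29.0
        return np.where(t > d, t ** 3, 3 * d * d * (t - 4.0 / 29.0))

    x, y, z = XN * f_inv(fx), YN * f_inv(fy), ZN * f_inv(fz)

    # XYZ -> linear sRGB (standard matrix).
    r_lin = 3.2406 * x - 1.5372 * y - 0.4986 * z
    g_lin = -0.9689 * x + 1.8758 * y + 0.0415 * z
    b_lin = 0.0557 * x - 0.2040 * y + 1.0570 * z

    def gamma(c):  # linear -> sRGB transfer function, with gamut clipping
        c = np.clip(c, 0.0, 1.0)
        return np.where(c <= 0.0031308, 12.92 * c,
                        1.055 * c ** (1.0 / 2.4) - 0.055)

    return gamma(r_lin), gamma(g_lin), gamma(b_lin)
```

A neutral input (a = b = 0) maps to equal R, G and B, which is a quick sanity check for any implementation used in place of this sketch.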
A dual-band image was obtained by simulating the imaging characteristics of a staggered overlapped-sampling dual-band detector and processed with the proposed algorithm, giving the effect shown in Figure 3. The top row of four-bar target images shows that the algorithm preserves the difference characteristics of the two bands, and the middle row of four-bar target images near the characteristic frequency shows that the algorithm can recover sub-pixel high-frequency information.
Existing dual-band image fusion techniques are designed for detectors with completely overlapped pixels or for discrete detectors, and cannot be fully reused for image fusion on the new type of detector with staggered pixel sampling centers. A fully overlapped detector cannot adequately sample information above the characteristic frequency, so its fused image cannot fully recover sub-pixel high-frequency information; a discrete detector is limited by the accuracy of the registration algorithm when fusing high-frequency image information. The algorithm proposed by the invention makes full use of the advantages of the new detector with staggered pixel sampling centers and recovers sub-pixel high-frequency information from the spatial difference between the two bands. For a four-bar target at the characteristic frequency, as long as one band yields a resolvable image, the fused image retains the ability to resolve the target; for a four-bar target beyond the characteristic frequency, even when neither band alone yields a resolvable image, the fused image improves the completeness of the four-bar target image and yields a barely resolvable image.
In addition, existing color fusion techniques operate in RGB or similar color spaces and do not consider the influence of the color space on color accuracy. The algorithm proposed by the invention performs fusion in the LAB color space, which achieves higher color accuracy than RGB and similar spaces and a better visual effect.
The foregoing shows and describes the general principles, essential features and advantages of the invention. It will be understood by those skilled in the art that the invention is not limited to the embodiments described above, which are given in the specification only to illustrate its principles; various changes and modifications may be made without departing from the spirit and scope of the invention, and such changes and modifications fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.

Claims (5)

1. The image fusion algorithm of the pixel sampling center staggered dual-band detector is characterized by comprising the following steps of:
step (1), separating spatial information difference and waveband information difference: the method comprises the steps of up-sampling a medium-long wave dual-waveband infrared image, and then performing frequency domain decomposition on the up-sampled image to obtain a high-frequency component and a low-frequency component;
and (2) recombining the spatial information and the waveband information: according to the pixel staggering condition, carrying out staggered arrangement or shifting on the long-wave high-frequency components to finish space difference recombination; taking the ratio of the low-frequency components of the two wave bands as a reference, and recombining the extracted high-frequency components and the extracted low-frequency components to ensure that the ratio of the recombined image is consistent with the ratio of the low-frequency components, and enhancing the high-frequency information in the recombining process;
and (3) generating a black-white fusion image based on the spatial characteristics: averaging the low-frequency components of the two wave bands to obtain low-frequency components of a black-white fused image, generating local contrast information by respectively combining the medium-wave high-frequency component, the long-wave high-frequency component and the recombined high-frequency component with the corresponding low-frequency components, selecting the high-frequency information with the maximum local contrast as the high-frequency components of the black-white fused image, and finally adding the high-frequency components and the low-frequency components to obtain the fused image;
and (4) generating and converting an LAB space color fusion image: adopting the black-white fusion image generated in the step (3) as an L component of the LAB space color fusion image; then, calculating a red-green component A and a blue-yellow component B of the LAB space color fusion image according to the intermediate wave and long wave gray level images recombined in the step (2); and finally, converting the color image in the LAB space into the RGB space to obtain a RGB space color fusion image.
2. The image fusion algorithm of the dual-band detector with staggered and overlapped pixel sampling centers according to claim 1, wherein in the step (1), the up-sampling can adopt an interpolation algorithm or a super-resolution reconstruction algorithm.
3. The image fusion algorithm of a dual-band detector with staggered and overlapped pixel sampling centers according to claim 1, wherein in the step (2), the ratio of the low-frequency components of the two bands is used as a reference, the extracted high-frequency components and the extracted low-frequency components are regrouped, so that the ratio of the regrouped images is consistent with the ratio of the low-frequency components, and in the regrouping process, high-frequency information is enhanced, and the specific method is as follows:
[Medium-wave and long-wave band-information recombination formulas, rendered as figures in the original document and not reproduced here]
in the formula, IBmw(i,j) and IBlw(i,j) are the medium-wave and long-wave gray images after band-information recombination, the high-frequency detail enhancement factor is rendered as a figure in the original document, ILmw(i,j) and ILlw(i,j) are the decomposed medium-wave and long-wave low-frequency components, IHmw(i,j) and IHlw(i,j) are the decomposed medium-wave and long-wave high-frequency components, and i and j denote pixel position coordinates.
4. The image fusion algorithm of the pixel sampling center staggered and overlapped dual-band detector according to claim 3, wherein the specific method in the step (3) is as follows:
firstly, averaging low-frequency components of two wave bands to obtain the low-frequency component of a black-white fusion image, wherein the calculation formula is as follows:
ILgfu(i,j)=(ILmw(i,j)+ILlw(i,j))/2
in the formula, ILgfu(i,j) is the low-frequency component of the black-and-white fused image;
then, dividing the medium-wave high-frequency component, the long-wave high-frequency component and the recombined high-frequency component obtained by spatial difference recombination each by the corresponding low-frequency component to obtain local contrast information, and selecting the high-frequency information with the maximum local contrast as the high-frequency component of the black-and-white fused image;
C(i,j)=IH(i,j)/IL(i,j)
IHgfu(i,j)=IH*(i,j), where IH*(i,j) is the candidate high-frequency component whose local contrast |C(i,j)| is maximal,
and IH(i,j)∈{IHmw(i,j),IHlw(i,j),IHrb(i,j)}
in the formula, IHgfu(i, j) is the high frequency component of the black and white blended image, IHrb(i, j) is a recombined high-frequency component obtained by spatial difference recombination;
and finally, adding the high-frequency component and the low-frequency component of the black-white fused image to obtain a black-white fused image, wherein the calculation formula is as follows:
Igfu(i,j)=ILgfu(i,j)+IHgfu(i,j)
in the formula Igfu(i, j) is the final black and white fused image.
5. The image fusion algorithm of the pixel sampling center staggered and overlapped dual-band detector according to claim 3, wherein the specific method in the step (4) is as follows:
adopting the black-white fusion image generated in the step (3) as an L component of the LAB space color fusion image; then, mapping the features of long wave intensity to red bias, mapping the features of medium wave intensity to blue bias, and performing tone control, wherein the calculation formula is as follows:
Acfu(i,j)=K1(IBlw(i,j)-IBmw(i,j))
Bcfu(i,j)=K2(IBlw(i,j)-IBmw(i,j))
in the formula, Acfu(i, j) and Bcfu(i, j) Red Green component A and blue yellow component B, K of the LAB space color fusion image, respectively1And K2Is a color control coefficient;
finally, the color image in the LAB space is converted to the RGB space, resulting in R, G, B components of the color-fused image for image transmission and display.
CN202111434259.3A 2021-11-29 2021-11-29 Image fusion method for dual-band detector with staggered and overlapped pixel sampling centers Active CN114418905B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111434259.3A CN114418905B (en) 2021-11-29 2021-11-29 Image fusion method for dual-band detector with staggered and overlapped pixel sampling centers

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111434259.3A CN114418905B (en) 2021-11-29 2021-11-29 Image fusion method for dual-band detector with staggered and overlapped pixel sampling centers

Publications (2)

Publication Number Publication Date
CN114418905A true CN114418905A (en) 2022-04-29
CN114418905B CN114418905B (en) 2024-04-16

Family

ID=81266143

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111434259.3A Active CN114418905B (en) 2021-11-29 2021-11-29 Image fusion method for dual-band detector with staggered and overlapped pixel sampling centers

Country Status (1)

Country Link
CN (1) CN114418905B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106023129A (en) * 2016-05-26 2016-10-12 西安工业大学 Infrared and visible light image fused automobile anti-blooming video image processing method
WO2017020595A1 (en) * 2015-08-05 2017-02-09 武汉高德红外股份有限公司 Visible light image and infrared image fusion processing system and fusion method
CN108364277A (en) * 2017-12-20 2018-08-03 南昌航空大学 A kind of infrared small target detection method of two-hand infrared image fusion
CN111815548A (en) * 2020-07-07 2020-10-23 昆明物理研究所 Medium-long wave dual-waveband infrared image fusion method
WO2021120404A1 (en) * 2019-12-17 2021-06-24 大连理工大学 Infrared and visible light fusing method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017020595A1 (en) * 2015-08-05 2017-02-09 武汉高德红外股份有限公司 Visible light image and infrared image fusion processing system and fusion method
CN106023129A (en) * 2016-05-26 2016-10-12 西安工业大学 Infrared and visible light image fused automobile anti-blooming video image processing method
CN108364277A (en) * 2017-12-20 2018-08-03 南昌航空大学 A kind of infrared small target detection method of two-hand infrared image fusion
WO2021120404A1 (en) * 2019-12-17 2021-06-24 大连理工大学 Infrared and visible light fusing method
CN111815548A (en) * 2020-07-07 2020-10-23 昆明物理研究所 Medium-long wave dual-waveband infrared image fusion method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
PU JIEXIN; TANG GUOLIANG; HUANG XINHAN; ZHANG ZHIYONG: "Image fusion algorithm based on multi-wavelet and fractal dimension", Computer Engineering and Applications, no. 32, 11 November 2006 (2006-11-11) *
SHEN YU; DANG JIANWU; FENG XIN; WANG YANGPING; HOU YUE: "Infrared and visible light image fusion based on Tetrolet transform", Spectroscopy and Spectral Analysis, no. 06, 15 June 2013 (2013-06-15) *
XUAN LICHAO; XIE YICAI: "A new remote sensing image fusion algorithm based on Curvelet transform", Computer Technology and Development, no. 05, 10 May 2009 (2009-05-10) *
WANG SHAODI; LI YIKUN; YANG SHUWEN: "A remote sensing image fusion method coupling the spatial domain and transform domain", Remote Sensing Information, no. 02, 15 April 2018 (2018-04-15) *
WANG HAIHUI; PENG JIAXIONG: "Research on image fusion based on multi-wavelet transform", Journal of Image and Graphics, no. 08, 25 August 2004 (2004-08-25) *

Also Published As

Publication number Publication date
CN114418905B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
TWI278818B (en) System for improving display resolution
Losson et al. Comparison of color demosaicing methods
Alleysson et al. Linear demosaicing inspired by the human visual system
JP4376912B2 (en) Color enhancement method and color gamut mapping method
US6608632B2 (en) Methods and systems for improving display resolution in images using sub-pixel sampling and visual error filtering
CN109447922B (en) Improved IHS (induction heating system) transformation remote sensing image fusion method and system
CN103327323B (en) High bit depth video maps to the efficient tone of low bit depth displays
CN104539919B (en) Demosaicing method and device of image sensor
CN106897972A (en) A kind of self-adapting histogram underwater picture Enhancement Method of white balance and dark primary
US6775420B2 (en) Methods and systems for improving display resolution using sub-pixel sampling and visual error compensation
US7194147B2 (en) Methods and systems for improving display resolution in achromatic images using sub-pixel sampling and visual error filtering.
CN112528914B (en) Satellite image full-color enhancement method for gradually integrating detail information
CN117372564B (en) Method, system and storage medium for reconstructing multispectral image
US7609307B2 (en) Heterogeneity-projection hard-decision interpolation method for color reproduction
KR20210096925A (en) Flexible Color Correction Method for Massive Aerial Orthoimages
CN111563866A (en) Multi-source remote sensing image fusion method
JP2004524729A (en) Display resolution enhancement method and system using sub-pixel sampling and visual distortion filtering
CN114418905B (en) Image fusion method for dual-band detector with staggered and overlapped pixel sampling centers
CN105844640A (en) Color image quality evaluation method based on gradient
CN112734679A (en) Fusion defogging method for medical operation video images
CN110310249A (en) Visual enhancement method for remote sensing images
KR102397148B1 (en) Color Correction Method Using Low Resolution Color Image Of Large-capacity Aerial Orthoimage
CN109191368B (en) Method, system equipment and storage medium for realizing splicing and fusion of panoramic pictures
Kolta et al. A hybrid demosaicking algorithm using frequency domain and wavelet methods
Alleysson et al. Linear minimum mean square error demosaicking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant