CN111815548A - Medium-long wave dual-waveband infrared image fusion method - Google Patents


Info

Publication number
CN111815548A
CN111815548A (application CN202010655336.7A)
Authority
CN
China
Prior art keywords
image
wave
channel
medium
long
Prior art date
Legal status
Granted
Application number
CN202010655336.7A
Other languages
Chinese (zh)
Other versions
CN111815548B (en)
Inventor
张润琦
洪闻青
苏俊波
刘传明
王晓东
赵灿兵
杨波
Current Assignee
Kunming Institute of Physics
Original Assignee
Kunming Institute of Physics
Priority date
Filing date
Publication date
Application filed by Kunming Institute of Physics filed Critical Kunming Institute of Physics
Priority to CN202010655336.7A priority Critical patent/CN111815548B/en
Publication of CN111815548A publication Critical patent/CN111815548A/en
Application granted granted Critical
Publication of CN111815548B publication Critical patent/CN111815548B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20032 Median filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a medium- and long-wave dual-band infrared image fusion method comprising the following steps: reading the images to be fused, processing the Y channel, extracting detail features from the medium-wave image, extracting difference features between the medium-wave and long-wave images, calculating reference means for the Cb and Cr channels, calculating reference variances for the Cb and Cr channels, calculating the means and variances of the source images, performing color transfer and feature superposition, and outputting the final fused image. Based on color-transfer theory, the proposed medium-long-wave infrared fusion algorithm needs no reference image, achieves a good visual effect in every scene, offers extremely high environmental adaptability, preserves more radiation and detail characteristics, and is easy to popularize and apply.

Description

Medium-long wave dual-waveband infrared image fusion method
Technical Field
The invention relates to the field of image fusion, in particular to a method for fusing medium-long wave dual-waveband infrared images.
Background
In the field of infrared medium- and long-wave image fusion, a large body of algorithm research already exists. These algorithms can be divided into grayscale fusion and color fusion. Human vision resolves color far better than intensity: people can distinguish only a few dozen gray levels, but thousands of colors of different hue, saturation, and lightness. Extensive research and experiments have shown that, compared with black-and-white imagery, color images help an observer read and understand the scene being viewed, yielding higher environmental-perception sensitivity, higher target-recognition accuracy, and shorter reaction times for judgment. For these reasons, color fusion has received more attention than grayscale fusion in recent years.
Because a grayscale image can hold far less information than a color image, grayscale fusion concentrates on retaining the most effective information from each source image, whereas color fusion pays less attention to information selection. The research difficulty in color fusion is instead how to highlight the difference features, details, and characteristics of the source images while keeping the result in a color region that the human eye can observe comfortably. Current color fusion methods fall into three main categories: color mapping, color transfer, and color lookup tables. Typical color-mapping methods include NRL and TNO; they are simple and easy to implement, but suffer color distortion and severe color cast in some scenes. Color transfer gives the fused image a color appearance close to that of a reference image, but the fusion result depends heavily on the reference image, and detail features are easily lost during the transfer. Color lookup tables run fast and produce reasonably natural colors without a reference image, but their color stability is poor in actual video fusion, with colors drifting and recognizability degrading over time. Among existing infrared dual-band fusion methods, few are designed to fully exploit the band characteristics.
NRL, TNO, and similar algorithms were the most widely applied color fusion algorithms early on, while color transfer is the algorithm most used in current engineering practice. Color transfer was first proposed in the lαβ space: the multiband black-and-white images are first fused into a preliminary pseudo-color image, and the color components of that pseudo-color image are then linearly transformed against a color reference image so that the mean and variance of each color component match those of the reference, thereby transferring the overall color distribution of the reference image to the pseudo-color image. The method transfers well in lαβ, YCbCr, YUV, and other color spaces. Although color transfer effectively alleviates the color-cast problem of classical infrared fusion algorithms such as NRL and TNO, its fusion effect is limited by the choice of reference image: the reference image's mean and variance not only influence the detail and feature saliency of the fused image but also determine how comfortable the result is for human observation.
Therefore, how to overcome the above problems, ensure that the fused image shows good infrared-radiation characteristics in every scene, and keep the image comfortable for human observation at all times has become an urgent problem for color image fusion.
Disclosure of Invention
To address the shortcomings that existing algorithms expose in engineering applications, the invention provides a medium- and long-wave dual-band color image fusion method with high environmental adaptability and prominent band characteristics.
The original purpose of color transfer is to shift the colors of the fused image into a desired region via a reference image (for example, using a forest reference image for a forest scene) so that human eyes can observe it comfortably. However, the effect of different reference images varies greatly across scenes, causing detail loss and low contrast; a poorly chosen reference image makes the result hard to observe, defeating the original purpose of color transfer. The reference mean and variance used in color transfer are therefore particularly important. Instead of a reference image, the present method derives suitable reference means and variances from the difference features and detail features of the source images, processing each channel separately. Although the fused image then cannot be toned to an arbitrary target color, the difference features and detail information of the source images are fully highlighted, and the fused image stays in a color region comfortable for human observation in every scene.
In order to achieve the purpose, the algorithm adopts the following technical scheme:
a method for fusing medium-long wave dual-waveband infrared images comprises the following steps:
step (1) reading an image to be fused
The medium-wave and long-wave infrared images to be fused are placed into the YCbCr color space: the medium-wave image feeds the Cb channel, the long-wave image feeds the Cr channel, and their weighted average feeds the Y channel. The medium-wave image is denoted F_MW, the long-wave image F_LW, and the weighted-average image F_AVG.
Step (2), Y-channel processing
The weighted-average image F_AVG input to the Y channel in step (1) undergoes single-scale Retinex enhancement, yielding the final Y-channel output image F_Y.
Step (3) extracting detail features of the medium wave image
Detail features are extracted from the medium-wave image obtained in step (1); the medium-wave detail feature is denoted D_MW.
Step (4) extracting difference characteristics of the medium wave image and the long wave image
From the long-wave image F_LW and the medium-wave image F_MW obtained in step (1), the difference features of the two bands are calculated; the long-wave difference feature is denoted dif_LW and the medium-wave difference feature dif_MW.
Step (5), calculating Cb channel and Cr channel reference mean value
From the medium-wave detail feature D_MW obtained in step (3), calculate the Cb-channel reference mean μ_Cb; from the long-wave difference feature dif_LW obtained in step (4), calculate the Cr-channel reference mean μ_Cr.
Step (6) calculating reference variances of Cb channel and Cr channel
From the long-wave difference feature dif_LW and the medium-wave difference feature dif_MW obtained in step (4), calculate the Cb-channel reference variance σ_Cb² and the Cr-channel reference variance σ_Cr².
Step (7), calculating the mean value and the variance of the source image
The mean and variance of the medium-wave image are denoted μ_MW and σ_MW²; the mean and variance of the long-wave image are denoted μ_LW and σ_LW².
step (8), color transfer and superposition of features
Step (9) of outputting the final fused image
The YCbCr color-space output images obtained in steps (2) and (8) are converted to the RGB color space for output.
Further, in the step (2), the specific method is as follows:
FYthe calculation method is as follows:
F_Y(i,j) = ln F_AVG(i,j) - ln[F(i,j) * ln F_AVG(i,j)];
where * denotes convolution, F(i,j) is the surround function, and i, j are the image coordinate positions. The surround function is given by:
F(i,j) = K·exp[-(i² + j²)/c²];
wherein c is the standard deviation of the Gaussian function, and the value of K is such that the surrounding function F (i, j) satisfies the following relationship:
∫∫F(i,j)didj=1。
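The Y-channel enhancement above can be sketched in NumPy, taking * as convolution. The FFT-based circular convolution and the kernel normalization are implementation assumptions, not specified by the patent:

```python
import numpy as np

def gaussian_surround(shape, c=80.0):
    """Surround function F(i,j) = K * exp(-(i^2 + j^2) / c^2), with K
    chosen so the discrete kernel sums to 1 (the unit-integral condition)."""
    h, w = shape
    i = np.arange(h) - h // 2
    j = np.arange(w) - w // 2
    ii, jj = np.meshgrid(i, j, indexing="ij")
    f = np.exp(-(ii ** 2 + jj ** 2) / c ** 2)
    return f / f.sum()

def retinex_y_channel(f_avg, c=80.0):
    """F_Y = ln(F_AVG) - ln(F * ln(F_AVG)), with '*' implemented as
    FFT-based circular convolution (an implementation assumption)."""
    log_img = np.log(f_avg)  # requires F_AVG > 0
    kernel = gaussian_surround(f_avg.shape, c)
    blurred = np.real(np.fft.ifft2(np.fft.fft2(log_img) *
                                   np.fft.fft2(np.fft.ifftshift(kernel))))
    return log_img - np.log(blurred)  # requires the blurred log-image > 0
```

In practice the input would be kept strictly above 1 so both logarithms stay defined; boundary handling (circular here) is another free choice.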
further, in the step (3), the detail feature extraction is performed on the medium wave source image, which is specifically as follows:
performing median filtering on the medium wave image obtained in the step (1), and marking the filtered image as BMWDetails of the medium wave image feature DMWThe calculation formula is as follows:
DMW=FMW-BMW
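This step can be sketched with a sliding-window median in NumPy. Edge padding is an implementation assumption; the 15×15 template size comes from the embodiment:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def medium_wave_details(f_mw, ksize=15):
    """D_MW = F_MW - B_MW, where B_MW is the median-filtered base layer.
    ksize=15 matches the [15, 15] template used in the embodiment."""
    pad = ksize // 2
    padded = np.pad(f_mw, pad, mode="edge")  # padding mode is an assumption
    windows = sliding_window_view(padded, (ksize, ksize))
    b_mw = np.median(windows, axis=(-2, -1))  # per-pixel window median
    return f_mw - b_mw
```

Flat regions yield zero detail; only local deviations from the median base layer survive in D_MW.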
further, in step (4), the long-wave image difference difLWAnd medium wave image difference difMWThe calculation formula is as follows:
difMW=FMW-FLW∩FMW
difLW=FLW-FLW∩FMW
where F_LW ∩ F_MW is the common component of the medium-wave and long-wave images, obtained by comparing the two images pixel by pixel and taking the minimum:
F_LW(i,j) ∩ F_MW(i,j) = Min{F_LW(i,j), F_MW(i,j)};
i, j are image coordinate positions.
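A minimal NumPy sketch of this difference-feature extraction:

```python
import numpy as np

def difference_features(f_mw, f_lw):
    """The common component F_LW ∩ F_MW is the pixel-wise minimum of the
    two bands; each band's difference feature is its excess over it."""
    common = np.minimum(f_lw, f_mw)
    dif_mw = f_mw - common
    dif_lw = f_lw - common
    return dif_mw, dif_lw
```

By construction both features are non-negative, and at every pixel at least one of the two is zero, so each feature isolates where its band exceeds the other.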
Further, in step (5), the calculation formulas are:
μ_Cb = Mid - mean(D_MW);
μ_Cr = Mid - mean(dif_LW);
where Mid is the channel-data median value and mean denotes the image mean; Mid is 128 when the data bit width is 8 bits.
Further, in step (6), from the long-wave difference feature dif_LW and the medium-wave difference feature dif_MW obtained in step (4), the Cb-channel reference variance σ_Cb² and the Cr-channel reference variance σ_Cr² are calculated as:
σ_Cb² = -a × mean(dif_MW) + b;
σ_Cr² = -a × mean(dif_LW) + b;
where a and b are linear-mapping parameters with values in the range 0 to 1.
Further, in step (7), the mean and variance of the medium-wave image are denoted μ_MW and σ_MW², and the mean and variance of the long-wave image μ_LW and σ_LW², calculated as:
μ_MW = (1/(m·n)) Σ_{i=1..m} Σ_{j=1..n} F_MW(i,j);
σ_MW² = (1/(m·n)) Σ_{i=1..m} Σ_{j=1..n} [F_MW(i,j) - μ_MW]²;
μ_LW = (1/(m·n)) Σ_{i=1..m} Σ_{j=1..n} F_LW(i,j);
σ_LW² = (1/(m·n)) Σ_{i=1..m} Σ_{j=1..n} [F_LW(i,j) - μ_LW]²;
wherein m and n are the number of image rows and columns, respectively; i, j are image coordinate positions.
Further, in step (8), from the medium-wave detail feature D_MW obtained in step (3), the long-wave difference feature dif_LW obtained in step (4), the Cb- and Cr-channel reference means obtained in step (5), the Cb- and Cr-channel reference variances obtained in step (6), and the means and variances of the medium-wave and long-wave images obtained in step (7), the final Cb- and Cr-channel output images F_Cb and F_Cr are calculated as:
F_Cb(i,j) = (σ_Cb/σ_MW)·[F_MW(i,j) - μ_MW] + μ_Cb + D_MW(i,j);
F_Cr(i,j) = (σ_Cr/σ_LW)·[F_LW(i,j) - μ_LW] + μ_Cr + dif_LW(i,j);
i, j are image coordinate positions.
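Step (8) can be sketched as classical mean/variance color transfer followed by feature superposition. The patent's exact formulas appear only as images, so the form below, chosen to be consistent with the mean logic of steps (5) and (7), is an assumption:

```python
import numpy as np

def transfer_channel(src, mu_ref, var_ref, feature):
    """Map src to the reference statistics, then superpose the feature:
    out = sqrt(var_ref)/std(src) * (src - mean(src)) + mu_ref + feature."""
    out = np.sqrt(var_ref) / src.std() * (src - src.mean()) + mu_ref
    return out + feature
```

Because the reference mean was defined in step (5) as Mid minus the feature's mean, the superposed channel's mean lands back on the median value (128 for 8-bit data).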
Further, in step (9), the conversion is calculated as:
F_R(i,j) = F_Y(i,j) + 1.402·[F_Cr(i,j) - 128];
F_G(i,j) = F_Y(i,j) - 0.344·[F_Cb(i,j) - 128] - 0.714·[F_Cr(i,j) - 128];
F_B(i,j) = F_Y(i,j) + 1.772·[F_Cb(i,j) - 128];
where F_R, F_G, and F_B are the R-channel, G-channel, and B-channel images of the RGB color space.
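The conversion matrix itself appears only as an image in the source; the sketch below uses the standard full-range (JPEG) YCbCr-to-RGB coefficients with chroma centred on 128 for 8-bit data, which is an assumption:

```python
import numpy as np

def ycbcr_to_rgb(f_y, f_cb, f_cr):
    """Standard full-range YCbCr -> RGB with an 8-bit chroma offset of 128."""
    f_r = f_y + 1.402 * (f_cr - 128.0)
    f_g = f_y - 0.344136 * (f_cb - 128.0) - 0.714136 * (f_cr - 128.0)
    f_b = f_y + 1.772 * (f_cb - 128.0)
    return f_r, f_g, f_b
```

With neutral chroma (Cb = Cr = 128) the output is achromatic, R = G = B = Y, which is the expected behaviour of any YCbCr inverse.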
In step (3), detail features are extracted from the medium-wave source image; the detail-extraction method used has low complexity and is easy to implement in engineering.
In step (4), difference features are extracted from the medium-wave and long-wave images. Borrowing TNO's idea for extracting the difference features of the two bands, the invention likewise uses a low-complexity, engineering-friendly extraction method.
In step (6), the reference variance required for color transfer is determined from the detail and difference features of the two-band source images. The core idea is to adjust the expected variance of the fused output according to the difference between the medium-wave and long-wave source images: the larger the difference, the smaller the reference variance, and the smaller the difference, the larger the reference variance. The invention uses a linear mapping, which has low complexity and is easy to implement in engineering, but many mapping schemes exist, including nonlinear ones, and the method is not limited to the one adopted here.
In step (8), the detail and difference features are superposed onto the color-transferred images. To keep the mean of each superposed channel image at the median value, the mean of the superposed feature is subtracted beforehand when computing the reference means in step (5).
Compared with the prior art, the invention has the following beneficial effects:
(1) No reference image is needed, and a fused image free of color cast and comfortable for human observation is obtained in a variety of complex scenes;
(2) The infrared band characteristics are fully considered: the medium-wave image has richer detail features and the long-wave image has more salient target features, and both are fully highlighted in the fused image;
(3) The brightness and contrast of the fused image remain appropriate in every scene, with higher total information entropy.
The method was verified mainly by simulation comparison against widely applied existing color fusion algorithms, using a large number of varied images actually acquired with a medium- and long-wave dual-band thermal infrared imager. Compared with other methods, the proposed method balances the saliency of the medium-wave detail features and the long-wave difference features; its image colors are suitable for long-term human observation in every scene, and it achieves better results in comprehensive performance and objective-index evaluation.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a medium and long wave image actually acquired using a medium and long wave dual band thermal imager, wherein graph (a) is a long wave image and graph (b) is a medium wave image;
FIG. 3 shows the actual fusion effect of the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples.
The following examples merely illustrate the present invention and should not be construed as limiting its scope. Where the examples do not specify particular techniques or conditions, they follow the techniques or conditions described in the literature of the art or the product specifications. The source-image acquisition equipment used in this embodiment is a common-optical-axis medium- and long-wave dual-band thermal infrared imager, so the images of the two bands require no registration or other image-fusion preprocessing.
Fig. 2(a) and 2(b) are long-wave and medium-wave images actually acquired with a medium-long-wave dual-band thermal imager; these two source images, of size 384 × 288, are used to explain the embodiment of the present invention.
As shown in fig. 1, the reference-image-free, color-transfer-based medium-long-wave dual-band image fusion method of the embodiment includes the following steps:
step (1) reading an image to be fused
The medium-wave and long-wave infrared images to be fused are placed into the YCbCr color space: the medium-wave image feeds the Cb channel, the long-wave image feeds the Cr channel, and their weighted average feeds the Y channel. The medium-wave image is denoted F_MW, the long-wave image F_LW, and the weighted-average image F_AVG, calculated as:
F_AVG(i,j) = [F_MW(i,j) + F_LW(i,j)]/2;
step (2), Y-channel processing
The weighted-average image F_AVG input to the Y channel in step (1) undergoes single-scale Retinex enhancement, yielding the final Y-channel output image F_Y, calculated as:
F_Y(i,j) = ln F_AVG(i,j) - ln[F(i,j) * ln F_AVG(i,j)];
where F (i, j) is the surround function. The formula is as follows:
F(i,j) = K·exp[-(i² + j²)/c²];
wherein c is the standard deviation of the Gaussian function, and the value of K is such that the surrounding function F (i, j) satisfies the following relationship:
∫∫F(i,j)didj=1;
where i and j are image coordinate positions, 1 ≤ i ≤ 384, 1 ≤ j ≤ 288, and the Gaussian standard deviation c is 80.
Step (3) extracting detail features of the medium wave image
Apply median filtering to the medium-wave image obtained in step (1); the filtered image is denoted B_MW. The medium-wave detail feature D_MW is calculated as:
D_MW = F_MW - B_MW;
where the median-filter template size is 15 × 15.
Step (4) extracting difference characteristics of the medium wave image and the long wave image
From the long-wave image F_LW and the medium-wave image F_MW obtained in step (1), the long-wave difference feature dif_LW and the medium-wave difference feature dif_MW are calculated as:
dif_MW = F_MW - F_LW ∩ F_MW;
dif_LW = F_LW - F_LW ∩ F_MW;
where F_LW ∩ F_MW is the common component of the medium-wave and long-wave images, obtained by comparing the two images pixel by pixel and taking the minimum:
F_LW(i,j) ∩ F_MW(i,j) = Min{F_LW(i,j), F_MW(i,j)};
where i and j are image coordinate positions, 1 ≤ i ≤ 384 and 1 ≤ j ≤ 288.
Step (5), calculating Cb channel and Cr channel reference mean value
From the medium-wave detail feature D_MW obtained in step (3), the Cb-channel reference mean μ_Cb is calculated; from the long-wave difference feature dif_LW obtained in step (4), the Cr-channel reference mean μ_Cr is calculated, as follows:
μ_Cb = 128 - mean(D_MW);
μ_Cr = 128 - mean(dif_LW);
where mean denotes averaging over the image.
Step (6) calculating reference variances of Cb channel and Cr channel
From the long-wave difference feature dif_LW and the medium-wave difference feature dif_MW obtained in step (4), the Cb-channel reference variance σ_Cb² and the Cr-channel reference variance σ_Cr² are calculated as:
σ_Cb² = -a × mean(dif_MW) + b;
σ_Cr² = -a × mean(dif_LW) + b;
where a is 0.02 and b is 0.007.
Step (7), calculating the mean value and the variance of the source image
The mean and variance of the medium-wave image are denoted μ_MW and σ_MW², and the mean and variance of the long-wave image μ_LW and σ_LW², calculated as:
μ_MW = (1/(m·n)) Σ_{i=1..m} Σ_{j=1..n} F_MW(i,j);
σ_MW² = (1/(m·n)) Σ_{i=1..m} Σ_{j=1..n} [F_MW(i,j) - μ_MW]²;
μ_LW = (1/(m·n)) Σ_{i=1..m} Σ_{j=1..n} F_LW(i,j);
σ_LW² = (1/(m·n)) Σ_{i=1..m} Σ_{j=1..n} [F_LW(i,j) - μ_LW]²;
where 1 ≤ i ≤ 384, 1 ≤ j ≤ 288, m = 384, and n = 288.
Step (8), color transfer and superposition of features
From the medium-wave detail feature D_MW obtained in step (3), the long-wave difference feature dif_LW obtained in step (4), the Cb- and Cr-channel reference means obtained in step (5), the Cb- and Cr-channel reference variances obtained in step (6), and the means and variances of the medium-wave and long-wave images obtained in step (7), the final Cb- and Cr-channel output images F_Cb and F_Cr are calculated as:
F_Cb(i,j) = (σ_Cb/σ_MW)·[F_MW(i,j) - μ_MW] + μ_Cb + D_MW(i,j);
F_Cr(i,j) = (σ_Cr/σ_LW)·[F_LW(i,j) - μ_LW] + μ_Cr + dif_LW(i,j);
where 1 ≤ i ≤ 384 and 1 ≤ j ≤ 288.
Step (9) of outputting the final fused image
The YCbCr color-space output images obtained in steps (2) and (8) are converted to the RGB color space for output, as follows:
F_R(i,j) = F_Y(i,j) + 1.402·[F_Cr(i,j) - 128];
F_G(i,j) = F_Y(i,j) - 0.344·[F_Cb(i,j) - 128] - 0.714·[F_Cr(i,j) - 128];
F_B(i,j) = F_Y(i,j) + 1.772·[F_Cb(i,j) - 128];
where F_R, F_G, and F_B are the R-channel, G-channel, and B-channel images of the RGB color space, 1 ≤ i ≤ 384, and 1 ≤ j ≤ 288.
The final output fused image is shown in fig. 3.
The method of the invention is not limited to 8-bit image data and can be applied at any data bit width.
The method is not limited to the YCbCr color space; it is described in detail only in YCbCr, but it is equally effective in YUV, lαβ, and other color spaces.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. A medium-long wave dual-band infrared image fusion method, characterized by comprising the following steps:
step (1) reading an image to be fused
The medium-wave and long-wave infrared images to be fused are placed into the YCbCr color space: the medium-wave image feeds the Cb channel, the long-wave image feeds the Cr channel, and their weighted average feeds the Y channel. The medium-wave image is denoted F_MW, the long-wave image F_LW, and the weighted-average image F_AVG.
Step (2), Y-channel processing
The weighted-average image F_AVG input to the Y channel in step (1) undergoes single-scale Retinex enhancement, yielding the final Y-channel output image F_Y.
Step (3) extracting detail features of the medium wave image
Detail features are extracted from the medium-wave image obtained in step (1); the medium-wave detail feature is denoted D_MW.
Step (4) extracting difference characteristics of the medium wave image and the long wave image
From the long-wave image F_LW and the medium-wave image F_MW obtained in step (1), the difference features of the two bands are calculated; the long-wave difference feature is denoted dif_LW and the medium-wave difference feature dif_MW.
Step (5), calculating Cb channel and Cr channel reference mean value
From the medium-wave detail feature D_MW obtained in step (3), calculate the Cb-channel reference mean μ_Cb; from the long-wave difference feature dif_LW obtained in step (4), calculate the Cr-channel reference mean μ_Cr.
Step (6) calculating reference variances of Cb channel and Cr channel
From the long-wave difference feature dif_LW and the medium-wave difference feature dif_MW obtained in step (4), calculate the Cb-channel reference variance σ_Cb² and the Cr-channel reference variance σ_Cr².
Step (7), calculating the mean value and the variance of the source image
The mean and variance of the medium-wave image are denoted μ_MW and σ_MW²; the mean and variance of the long-wave image are denoted μ_LW and σ_LW².
step (8), color transfer and superposition of features
Step (9) of outputting the final fused image
The YCbCr color-space output images obtained in steps (2) and (8) are converted to the RGB color space for output.
2. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that in step (2), F_Y is calculated as:
F_Y(i,j) = ln F_AVG(i,j) - ln[F(i,j) * ln F_AVG(i,j)];
where F (i, j) is the surround function and i, j is the image coordinate position, the formula is as follows:
F(i,j) = K·exp[-(i² + j²)/c²];
wherein c is the standard deviation of the Gaussian function, and the value of K is such that the surrounding function F (i, j) satisfies the following relationship:
∫∫F(i,j)didj=1。
3. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that in step (3), detail features are extracted from the medium-wave source image as follows: apply median filtering to the medium-wave image obtained in step (1); the filtered image is denoted B_MW, and the medium-wave detail feature D_MW is calculated as:
D_MW = F_MW - B_MW.
4. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that in step (4), the long-wave difference feature dif_LW and the medium-wave difference feature dif_MW are calculated as:
dif_MW = F_MW - F_LW ∩ F_MW;
dif_LW = F_LW - F_LW ∩ F_MW;
where F_LW ∩ F_MW is the common component of the medium-wave and long-wave images, obtained by comparing the two images pixel by pixel and taking the minimum:
F_LW(i,j) ∩ F_MW(i,j) = Min{F_LW(i,j), F_MW(i,j)};
i, j are image coordinate positions.
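The difference features of claim 4 are a pixelwise minimum followed by two subtractions; each band keeps only what it contributes beyond the common component. A NumPy sketch (function name is an illustrative assumption):

```python
import numpy as np

def difference_features(f_lw, f_mw):
    """dif_LW and dif_MW from claim 4: each band minus the pixelwise-minimum common component."""
    common = np.minimum(f_lw, f_mw)       # F_LW ∩ F_MW
    return f_lw - common, f_mw - common   # dif_LW, dif_MW
```

At every pixel at least one of the two difference images is zero, since the minimum always equals one of the inputs.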
5. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (5), the calculation formulas are as follows:
μCb = Mid − mean(DMW);
μCr = Mid − mean(difLW);
wherein Mid is the mid-point of the channel data range and mean(·) denotes the image mean; Mid takes the value 128 when the data bit width is 8 bits.
6. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (6), the Cb channel reference variance σCb² and the Cr channel reference variance σCr² are calculated from the long-wave image difference feature difLW and the medium-wave image difference feature difMW obtained in step (4) as follows:
σCb² = −a × mean(difMW) + b;
σCr² = −a × mean(difLW) + b;
wherein a and b are linear mapping parameters, each taking a value in the range 0 to 1.
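Claims 5 and 6 together define the reference statistics for the chroma channels. A combined NumPy sketch; the function name and the default values a = 0.5, b = 0.8 are illustrative assumptions:

```python
import numpy as np

def reference_stats(d_mw, dif_mw, dif_lw, a=0.5, b=0.8, bit_width=8):
    """Reference means (claim 5) and variances (claim 6) for the Cb and Cr channels."""
    mid = 2 ** (bit_width - 1)        # 128 for 8-bit data
    mu_cb = mid - d_mw.mean()         # μCb = Mid − mean(DMW)
    mu_cr = mid - dif_lw.mean()       # μCr = Mid − mean(difLW)
    var_cb = -a * dif_mw.mean() + b   # σCb² = −a·mean(difMW) + b
    var_cr = -a * dif_lw.mean() + b   # σCr² = −a·mean(difLW) + b
    return mu_cb, mu_cr, var_cb, var_cr
```

The negative slope means a larger inter-band difference pulls the reference variance down, which (with a, b in 0..1) keeps the chroma spread modest when the bands already disagree strongly.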
7. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (7), the mean and variance of the medium-wave image are recorded as μMW and σMW², and the mean and variance of the long-wave image are recorded as μLW and σLW², calculated as follows:
μMW = (1/(m·n)) Σᵢ Σⱼ FMW(i,j);
σMW² = (1/(m·n)) Σᵢ Σⱼ [FMW(i,j) − μMW]²;
μLW = (1/(m·n)) Σᵢ Σⱼ FLW(i,j);
σLW² = (1/(m·n)) Σᵢ Σⱼ [FLW(i,j) − μLW]²;
wherein m and n are the numbers of image rows and columns, respectively; i, j are the image coordinates.
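The per-image statistics of claim 7 are the ordinary mean and population variance over all m × n pixels; in NumPy this is one line each (function name is an illustrative assumption):

```python
import numpy as np

def image_stats(f):
    """Mean and population variance over all m x n pixels, as in step (7)."""
    mu = f.mean()                  # μ = (1/(m·n)) ΣΣ F(i,j)
    var = ((f - mu) ** 2).mean()   # σ² = (1/(m·n)) ΣΣ [F(i,j) − μ]²
    return mu, var
```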
8. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (8), the final output images FCb and FCr of the Cb channel and the Cr channel are calculated from the detail feature DMW of the medium-wave image obtained in step (3), the difference feature difLW of the long-wave image obtained in step (4), the Cb and Cr channel reference means obtained in step (5), the Cb and Cr channel reference variances obtained in step (6), and the means and variances of the medium-wave and long-wave images obtained in step (7), as follows:
FCb(i,j) = (σCb/σMW)·[FMW(i,j) − μMW] + μCb + DMW(i,j);
FCr(i,j) = (σCr/σLW)·[FLW(i,j) − μLW] + μCr + difLW(i,j);
i, j are image coordinate positions.
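The exact Cb/Cr mapping of claim 8 is reproduced only as an image in the patent text. A Reinhard-style linear statistics transfer with feature superposition, which the surrounding claims suggest, can be sketched as follows; the superposition term, the division guard, and the function name are assumptions for illustration, not the patented formula:

```python
import numpy as np

def transfer_channel(src, feature, mu_src, var_src, mu_ref, var_ref):
    """Map src onto the reference statistics (Reinhard-style color transfer),
    then superpose the feature image: one plausible reading of step (8)."""
    sigma_src = np.sqrt(var_src) + 1e-12   # guard against division by zero
    return np.sqrt(var_ref) / sigma_src * (src - mu_src) + mu_ref + feature
```

With src = FMW, feature = DMW, and the step (5)-(7) statistics, this yields an FCb whose mean and spread follow the reference values while the medium-wave detail rides on top.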
9. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (9), the conversion calculation formula is as follows:
FR(i,j) = FY(i,j) + 1.402·[FCr(i,j) − 128];
FG(i,j) = FY(i,j) − 0.344·[FCb(i,j) − 128] − 0.714·[FCr(i,j) − 128];
FB(i,j) = FY(i,j) + 1.772·[FCb(i,j) − 128];
wherein FR, FG and FB are the R channel, G channel, and B channel images of the RGB color space, respectively.
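Claim 9 is the standard full-range (JPEG/JFIF) YCbCr-to-RGB conversion for 8-bit data, which can be written directly (function name is an illustrative assumption):

```python
import numpy as np

def ycbcr_to_rgb(f_y, f_cb, f_cr):
    """Full-range (JPEG/JFIF) YCbCr -> RGB conversion for 8-bit data."""
    r = f_y + 1.402 * (f_cr - 128.0)
    g = f_y - 0.344136 * (f_cb - 128.0) - 0.714136 * (f_cr - 128.0)
    b = f_y + 1.772 * (f_cb - 128.0)
    return r, g, b
```

Neutral chroma (Cb = Cr = 128) maps every pixel to a gray value equal to its luma, so an all-common scene stays achromatic in the fused output.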
CN202010655336.7A 2020-07-07 2020-07-07 Medium-long wave double-band infrared image fusion method Active CN111815548B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010655336.7A CN111815548B (en) 2020-07-07 2020-07-07 Medium-long wave double-band infrared image fusion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010655336.7A CN111815548B (en) 2020-07-07 2020-07-07 Medium-long wave double-band infrared image fusion method

Publications (2)

Publication Number Publication Date
CN111815548A true CN111815548A (en) 2020-10-23
CN111815548B CN111815548B (en) 2023-11-03

Family

ID=72842035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010655336.7A Active CN111815548B (en) 2020-07-07 2020-07-07 Medium-long wave double-band infrared image fusion method

Country Status (1)

Country Link
CN (1) CN111815548B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113205470A (en) * 2021-03-19 2021-08-03 昆明物理研究所 Infrared medium-short wave double-color fusion method based on hue saturation mapping
CN114418905A (en) * 2021-11-29 2022-04-29 昆明物理研究所 Image fusion algorithm for pixel sampling center staggered overlapping arrangement dual-band detector

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101339653A (en) * 2008-01-30 2009-01-07 西安电子科技大学 Infrared and colorful visual light image fusion method based on color transfer and entropy information
CN101714251A (en) * 2009-12-22 2010-05-26 上海电力学院 Infrared and visual pseudo-color image fusion and enhancement method
CN102609927A (en) * 2012-01-12 2012-07-25 北京理工大学 Foggy visible light/infrared image color fusion method based on scene depth
CN105427268A (en) * 2015-12-01 2016-03-23 中国航空工业集团公司洛阳电光设备研究所 Medium-long-wave dual-waveband infrared image feature level color fusion method
CN107103596A (en) * 2017-04-27 2017-08-29 湖南源信光电科技股份有限公司 A kind of color night vision image interfusion method based on yuv space
CN109255774A (en) * 2018-09-28 2019-01-22 中国科学院长春光学精密机械与物理研究所 A kind of image interfusion method, device and its equipment
CN109658367A (en) * 2018-11-14 2019-04-19 国网新疆电力有限公司信息通信公司 Image interfusion method based on Color transfer

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101339653A (en) * 2008-01-30 2009-01-07 西安电子科技大学 Infrared and colorful visual light image fusion method based on color transfer and entropy information
CN101714251A (en) * 2009-12-22 2010-05-26 上海电力学院 Infrared and visual pseudo-color image fusion and enhancement method
CN102609927A (en) * 2012-01-12 2012-07-25 北京理工大学 Foggy visible light/infrared image color fusion method based on scene depth
CN105427268A (en) * 2015-12-01 2016-03-23 中国航空工业集团公司洛阳电光设备研究所 Medium-long-wave dual-waveband infrared image feature level color fusion method
CN107103596A (en) * 2017-04-27 2017-08-29 湖南源信光电科技股份有限公司 A kind of color night vision image interfusion method based on yuv space
CN109255774A (en) * 2018-09-28 2019-01-22 中国科学院长春光学精密机械与物理研究所 A kind of image interfusion method, device and its equipment
CN109658367A (en) * 2018-11-14 2019-04-19 国网新疆电力有限公司信息通信公司 Image interfusion method based on Color transfer

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
MA J等: "Infrared and visible image fusion methods and applications: A survey", 《INFORMATION FUSION》 *
ZHANG R等: "Adaptive infrared medium and long wave image fusion method based on color transfer algorithm", 《SEVENTH SYMPOSIUM ON NOVEL PHOTOELECTRONIC DETECTION TECHNOLOGY AND APPLICATIONS》 *
ZHANG Runqi et al.: "Medium and short wave dual-band infrared image fusion method under weak ambient radiation", Infrared Technology *
SU Yulu et al.: "Research on color fusion methods for medium and long wave infrared images", Infrared Technology *
XU Tingfa et al.: "True color transfer dual-band image fusion", Chinese Optics *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113205470A (en) * 2021-03-19 2021-08-03 昆明物理研究所 Infrared medium-short wave double-color fusion method based on hue saturation mapping
CN113205470B (en) * 2021-03-19 2022-08-30 昆明物理研究所 Infrared medium-short wave double-color fusion method based on hue saturation mapping
CN114418905A (en) * 2021-11-29 2022-04-29 昆明物理研究所 Image fusion algorithm for pixel sampling center staggered overlapping arrangement dual-band detector
CN114418905B (en) * 2021-11-29 2024-04-16 昆明物理研究所 Image fusion method for dual-band detector with staggered and overlapped pixel sampling centers

Also Published As

Publication number Publication date
CN111815548B (en) 2023-11-03

Similar Documents

Publication Publication Date Title
Zheng et al. Qualitative and quantitative comparisons of multispectral night vision colorization techniques
CN105574514B (en) The raw tomato automatic identifying method in greenhouse
CN110969631B (en) Method and system for dyeing hair by refined photos
CN105427268B (en) A kind of middle long wave dual-band infrared image feature level Color Fusion
CN108711160B (en) Target segmentation method based on HSI (high speed input/output) enhanced model
CN108230407B (en) Image processing method and device
CN111815548B (en) Medium-long wave double-band infrared image fusion method
CA3153067C (en) Picture-detecting method and apparatus
CN107993209A (en) Image processing method, device, computer-readable recording medium and electronic equipment
Lee et al. Color image enhancement using histogram equalization method without changing hue and saturation
CN112233024B (en) Medium-long wave double-band infrared image fusion method based on difference characteristic color mapping
CN106056565B (en) A kind of MRI and PET image fusion method decomposed based on Multiscale Morphological bilateral filtering and contrast is compressed
CN111445419A (en) Medical endoscope image enhancement method based on mathematical morphology
CN109658367A (en) Image interfusion method based on Color transfer
CN111275644A (en) Retinex algorithm-based underwater image enhancement method and device
US20140093168A1 (en) Image processing apparatus, image processing method and non-transitory computer readable medium
CN111563866A (en) Multi-source remote sensing image fusion method
CN111611940A (en) Rapid video face recognition method based on big data processing
Hsu et al. A novel automatic white balance method for color constancy under different color temperatures
CN111311502A (en) Method for processing foggy day image by using bidirectional weighted fusion
Saha et al. A novel method for automated correction of non-uniform/poor illumination of retinal images without creating false artifacts
CN113989138A (en) Method for extracting inflammation of facial skin image and forming red region spectrum
CN108198156B (en) Crop leaf image enhancement method and device
CN110807739A (en) Image color feature processing method, system and device for target detection and storage medium
Yuan et al. Full convolutional color constancy with adding pooling

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant