CN111815548A - Medium-long wave dual-waveband infrared image fusion method - Google Patents
Medium-long wave dual-waveband infrared image fusion method
- Publication number: CN111815548A (application CN202010655336.7A)
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06F18/253: Fusion techniques of extracted features
- G06T7/90: Determination of colour characteristics
- G06T2207/10048: Infrared image
- G06T2207/20032: Median filtering
- G06T2207/20221: Image fusion; Image merging
- Y02T10/40: Engine management systems
Abstract
The invention discloses a medium-long wave dual-waveband infrared image fusion method comprising the following steps: reading the images to be fused, processing the Y channel, extracting detail features of the medium wave image, extracting difference features between the medium wave and long wave images, calculating the reference means of the Cb and Cr channels, calculating the reference variances of the Cb and Cr channels, calculating the means and variances of the source images, performing color transfer and feature superposition, and outputting the final fused image. The proposed medium-long wave infrared image fusion algorithm is based on color transfer theory but needs no reference image, achieves a good visual effect in every scene, has extremely high environmental adaptability, preserves more radiation and detail characteristics, and is easy to popularize and apply.
Description
Technical Field
The invention relates to the field of image fusion, in particular to a method for fusing medium-long wave dual-waveband infrared images.
Background
In the field of infrared medium-long wave image fusion, a great deal of algorithm research exists; these algorithms can be divided into grayscale image fusion and color image fusion. The human visual system resolves color far better than gray level: people can distinguish only dozens of gray levels but thousands of colors of different hue, saturation and lightness. A large body of research and experiments shows that, compared with black-and-white images, color images help an observer read a scene cognitively, giving higher environmental perception sensitivity, higher target identification accuracy, and a shorter reaction time for judgment. Therefore, in recent years color fusion has received more attention than grayscale image fusion.
Because a grayscale image can hold far less information than a color image, color image fusion, unlike grayscale fusion algorithms whose purpose is to retain the most effective information of each source image, pays less attention to the selection of information. The research difficulty in color image fusion is how to highlight the difference features and details of the source images while keeping the colors in a range comfortable for human observation. Current color fusion methods fall into three main classes: color mapping, color transfer, and color lookup tables. Typical color mapping methods include NRL and TNO; they are simple and easy to implement, but suffer color distortion and severe color cast in some scenes. Color transfer gives the fused image a color appearance close to that of a reference image, but the result depends heavily on the reference image, and detail features are easily lost during transfer. Color lookup tables run fast and produce reasonably natural colors without a reference image, but their color stability is poor in actual video fusion, where color drift and reduced discriminability tend to appear over time. Among current infrared dual-waveband fusion methods, few are designed to fully exploit the band characteristics.
NRL, TNO and similar algorithms were the most widely applied early color fusion algorithms; color transfer is the algorithm most used in current engineering. Color transfer was first proposed in the lαβ space: the multiband black-and-white images are first fused into a preliminary pseudo-color image, then the color components of the pseudo-color image are linearly transformed using a color reference image so that the mean and variance of each color component match those of the reference, thereby transferring the overall color distribution of the reference image to the pseudo-color image. The method transfers color well in lαβ, YCbCr, YUV and other color spaces. Although color transfer effectively mitigates the color cast of classical infrared fusion algorithms such as NRL and TNO, its fusion effect is limited by the choice of reference image: the mean and variance of the reference not only influence the detail and feature saliency of the fused image but also determine how comfortable it is for human observation.
Therefore, how to overcome the above problems, ensure good infrared radiation rendition in every scene, and keep the image in a state comfortable for human observation has become an urgent problem for color image fusion.
Disclosure of Invention
To remedy the defects exposed by existing algorithms in engineering applications, the invention provides a medium-long wave dual-waveband color image fusion method with high environmental adaptability and prominent band characteristics.
The original purpose of color transfer is to tune the colors of the fused image into a desired range through a reference image (for example, using a forest reference image for a forest scene) so that human eyes can observe it comfortably. However, different reference images behave very differently across scenes, causing detail loss and low contrast, and an ill-chosen reference image makes the result hard to observe, defeating the original purpose of color transfer. The reference mean and variance used for color transfer are therefore particularly important. Instead of using a reference image, this method derives suitable reference means and variances from the difference features and detail features of the source images and processes the channels separately. Although the fused image cannot be toned to an arbitrary desired color this way, the difference features and detail information of the source images are fully highlighted, and the fused image stays in a color range comfortable for human observation in every scene.
In order to achieve the purpose, the algorithm adopts the following technical scheme:
a method for fusing medium-long wave dual-waveband infrared images comprises the following steps:
step (1) reading an image to be fused
Input the medium wave and long wave infrared images to be fused into the YCbCr color space: input the medium wave image into the Cb channel, input the long wave image into the Cr channel, and input the weighted average of the medium wave and long wave images into the Y channel. Record the medium wave image as F_MW, the long wave image as F_LW, and the weighted average image as F_AVG;
Step (2), Y-channel processing
Perform single-scale Retinex enhancement on the weighted average image F_AVG input to the Y channel in step (1) to obtain the final Y-channel output image F_Y;
Step (3) extracting detail features of the medium wave image
Extract detail features from the medium wave image obtained in step (1), recording the medium wave detail features as D_MW;
Step (4) extracting difference characteristics of the medium wave image and the long wave image
From the long wave image F_LW and the medium wave image F_MW obtained in step (1), calculate the difference features of the long wave and medium wave images, recording the long wave difference feature as dif_LW and the medium wave difference feature as dif_MW;
Step (5), calculating Cb channel and Cr channel reference mean value
From the medium wave detail features D_MW obtained in step (3), calculate the Cb channel reference mean μ_Cb; from the long wave difference feature dif_LW obtained in step (4), calculate the Cr channel reference mean μ_Cr;
Step (6) calculating reference variances of Cb channel and Cr channel
From the long wave difference feature dif_LW and the medium wave difference feature dif_MW obtained in step (4), calculate the Cb channel reference variance σ_Cb² and the Cr channel reference variance σ_Cr²;
Step (7), calculating the mean value and the variance of the source image
Record the mean and variance of the medium wave image as μ_MW and σ_MW², and the mean and variance of the long wave image as μ_LW and σ_LW²;
step (8), color transfer and superposition of features
Step (9) of outputting the final fused image
Convert the YCbCr color space output images obtained in step (2) and step (8) into the RGB color space for output.
Further, in the step (2), the specific method is as follows:
F_Y is calculated as follows:

F_Y(i,j) = ln F_AVG(i,j) - ln[F(i,j) * F_AVG(i,j)];

where * denotes convolution, F(i,j) is the surround function, and i, j are the image coordinates; the surround function is:

F(i,j) = K·exp(-(i² + j²)/c²);

where c is the standard deviation of the Gaussian function and K is chosen such that the surround function F(i,j) satisfies:

∫∫F(i,j) di dj = 1.
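As a concrete illustration, the single-scale Retinex enhancement of step (2) can be sketched in NumPy. The function names, the FFT-based circular convolution, and the +1 offset against ln(0) are choices of this sketch, not specified by the patent:

```python
import numpy as np

def gaussian_surround(shape, c):
    """Normalized Gaussian surround F (sum = 1), laid out in wrapped
    coordinates so it can be applied by circular convolution via FFT.
    The normalization plays the role of the constant K."""
    m, n = shape
    di = np.minimum(np.arange(m), m - np.arange(m))[:, None]
    dj = np.minimum(np.arange(n), n - np.arange(n))[None, :]
    F = np.exp(-(di**2 + dj**2) / c**2)
    return F / F.sum()

def ssr(img, c=80.0):
    """Single-scale Retinex: ln(I) - ln(F * I), with * a convolution."""
    I = img.astype(np.float64) + 1.0            # offset avoids ln(0)
    F = gaussian_surround(I.shape, c)
    blur = np.real(np.fft.ifft2(np.fft.fft2(I) * np.fft.fft2(F)))
    return np.log(I) - np.log(np.maximum(blur, 1e-12))
```

On a uniform image the surround average equals the image itself, so the output is zero everywhere, which matches the intent of the enhancement: only local contrast survives.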
further, in the step (3), the detail feature extraction is performed on the medium wave source image, which is specifically as follows:
performing median filtering on the medium wave image obtained in the step (1), and marking the filtered image as BMWDetails of the medium wave image feature DMWThe calculation formula is as follows:
DMW=FMW-BMW。
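The detail layer of step (3) is the residual between the medium wave image and its median-filtered version. A minimal NumPy-only sketch (the naive sliding-window median and the edge padding mode are this sketch's choices):

```python
import numpy as np

def median_filter(img, k=15):
    """Plain sliding-window k x k median filter with edge padding."""
    p = k // 2
    padded = np.pad(img, p, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return np.median(windows, axis=(-2, -1))

def medium_wave_details(F_MW, k=15):
    """D_MW = F_MW - B_MW: the high-frequency detail layer."""
    F_MW = F_MW.astype(np.float64)
    return F_MW - median_filter(F_MW, k)
```

A constant image yields an all-zero detail layer, while an isolated hot pixel survives almost entirely into D_MW, since the median suppresses it in B_MW.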
further, in step (4), the long-wave image difference difLWAnd medium wave image difference difMWThe calculation formula is as follows:
difMW=FMW-FLW∩FMW;
difLW=FLW-FLW∩FMW;
wherein FLW∩FMWThe common component of the medium wave image and the long wave image is realized by comparing the local parts of the medium wave image and the long wave image and taking the minimum, and the calculation formula is as follows:
FLW(i,j)∩FMW=Min{FLW(i,j),FMW(i,j)};
i, j are image coordinate positions.
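The formulas above reduce to a pixelwise minimum and two subtractions; a direct NumPy sketch (function name is illustrative):

```python
import numpy as np

def difference_features(F_MW, F_LW):
    """Common component = pixelwise minimum (F_LW ∩ F_MW); each band's
    difference feature is what it holds beyond that common part."""
    common = np.minimum(F_MW, F_LW)
    dif_MW = F_MW - common
    dif_LW = F_LW - common
    return dif_MW, dif_LW
```

At each pixel, exactly one of dif_MW and dif_LW is nonzero (the band that is brighter there), which is what lets the method route band-unique content into separate chroma channels.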
Further, in step (5), the calculation formulas are:

μ_Cb = Mid - mean(D_MW);
μ_Cr = Mid - mean(dif_LW);

where Mid is the median of the channel data range and mean denotes the image mean; Mid takes the value 128 when the data bit width is 8 bits.
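The reference means of step (5) pre-subtract the mean of the feature that will later be superposed, so that after superposition each chroma channel averages to Mid. A small sketch (function name is illustrative):

```python
import numpy as np

def reference_means(D_MW, dif_LW, mid=128.0):
    """mu_Cb / mu_Cr: offset the channel median by the mean of the
    feature superposed later, so the fused channel mean lands on mid."""
    mu_Cb = mid - D_MW.mean()
    mu_Cr = mid - dif_LW.mean()
    return mu_Cb, mu_Cr
```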
Further, in step (6), from the long wave difference feature dif_LW and the medium wave difference feature dif_MW obtained in step (4), the Cb channel reference variance σ_Cb² and the Cr channel reference variance σ_Cr² are calculated as:

σ_Cb² = -a×mean(dif_MW) + b;
σ_Cr² = -a×mean(dif_LW) + b;

where a and b are linear mapping parameters with values in the range 0 to 1.
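The linear mapping of step (6) is a one-liner per channel; the sketch below uses the embodiment's values a = 0.02, b = 0.007 only in the test, and the function name is illustrative:

```python
import numpy as np

def reference_variances(dif_MW, dif_LW, a, b):
    """Linear mapping: the larger the inter-band difference, the smaller
    the reference variance, and vice versa."""
    var_Cb = -a * dif_MW.mean() + b
    var_Cr = -a * dif_LW.mean() + b
    return var_Cb, var_Cr
```

As the text notes, any monotone decreasing mapping (including nonlinear ones) would serve the same purpose; the linear form is simply the cheapest to implement.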
Further, in step (7), the mean and variance of the medium wave image are recorded as μ_MW and σ_MW², and the mean and variance of the long wave image are recorded as μ_LW and σ_LW²; the calculation formulas are:

μ_MW = (1/(m×n)) Σ_{i=1..m} Σ_{j=1..n} F_MW(i,j);
σ_MW² = (1/(m×n)) Σ_{i=1..m} Σ_{j=1..n} [F_MW(i,j) - μ_MW]²;

and similarly for μ_LW and σ_LW² with F_LW; where m and n are the number of image rows and columns respectively, and i, j are the image coordinates.
Further, in step (8), from the medium wave detail feature D_MW obtained in step (3), the long wave difference feature dif_LW obtained in step (4), the Cb and Cr channel reference means obtained in step (5), the Cb and Cr channel reference variances obtained in step (6), and the means and variances of the medium wave and long wave images obtained in step (7), the final output images F_Cb and F_Cr of the Cb and Cr channels are calculated as:

F_Cb(i,j) = (σ_Cb/σ_MW)·[F_MW(i,j) - μ_MW] + μ_Cb + D_MW(i,j);
F_Cr(i,j) = (σ_Cr/σ_LW)·[F_LW(i,j) - μ_LW] + μ_Cr + dif_LW(i,j);

i, j are the image coordinates.
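Step (8) can be sketched as a Reinhard-style statistics transfer (rescale to the reference standard deviation, re-center on the reference mean) followed by addition of the band feature. The function name and the small epsilon guard against a zero-variance source are this sketch's own, and the exact formula is an assumption consistent with the quantities defined in steps (5) to (7):

```python
import numpy as np

def transfer_channel(src, mu_ref, var_ref, feature):
    """Match src's mean/variance to (mu_ref, var_ref), then superpose
    the band feature (detail or difference image)."""
    mu_src = src.mean()
    std_src = src.std() + 1e-12           # guard against flat images
    transferred = np.sqrt(var_ref) / std_src * (src - mu_src) + mu_ref
    return transferred + feature
```

With a zero feature, the output has exactly the reference statistics; with a feature whose mean was folded into mu_ref in step (5), the channel mean ends up at the data median.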
Further, in step (9), the conversion formulas are:

F_R(i,j) = F_Y(i,j) + 1.402·[F_Cr(i,j) - Mid];
F_G(i,j) = F_Y(i,j) - 0.344·[F_Cb(i,j) - Mid] - 0.714·[F_Cr(i,j) - Mid];
F_B(i,j) = F_Y(i,j) + 1.772·[F_Cb(i,j) - Mid];

where F_R, F_G and F_B are the R channel, G channel and B channel images of the RGB color space respectively.
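A sketch of the final conversion, assuming the standard full-range BT.601 YCbCr-to-RGB coefficients (the patent does not reproduce its matrix, so the exact coefficients here are an assumption):

```python
import numpy as np

def ycbcr_to_rgb(F_Y, F_Cb, F_Cr, mid=128.0):
    """Full-range BT.601 YCbCr -> RGB, clipped to the 8-bit range."""
    F_R = F_Y + 1.402 * (F_Cr - mid)
    F_G = F_Y - 0.344136 * (F_Cb - mid) - 0.714136 * (F_Cr - mid)
    F_B = F_Y + 1.772 * (F_Cb - mid)
    return np.clip(np.stack([F_R, F_G, F_B], axis=-1), 0.0, 255.0)
```

When both chroma channels sit at the median (Cb = Cr = 128) the output is neutral gray at the Y level, which is the state the reference means of step (5) aim the fused channels toward in the absence of band features.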
In step (3), detail features are extracted from the medium wave source image; the detail extraction method used has low complexity and is easy to implement in engineering.
In step (4), difference features are extracted from the medium wave and long wave images. Borrowing the TNO idea of extracting inter-band difference features, the invention likewise uses a difference feature extraction method of low complexity that is easy to implement in engineering.
In step (6), the reference variances required for color transfer are determined from the detail and difference features of the two-band source images. The core idea is to adjust the variance expected of the fused output according to the difference between the medium wave and long wave source images: the larger the difference, the smaller the reference variance, and the smaller the difference, the larger the reference variance. The invention uses a linear mapping, which has low complexity and is easy to implement in engineering, but many mapping methods exist, including nonlinear ones, and the invention is not limited to the one adopted here.
In step (8), detail and difference features are superposed on the color-transferred images; to keep the mean of each superposed channel at the median value, the mean of each superposed feature is subtracted when the reference means are computed.
Compared with the prior art, the invention has the beneficial effects that:
(1) the invention needs no reference image and obtains a fused image without color cast that is comfortable for human observation in various complex scenes;
(2) the invention fully considers the infrared band characteristics: the medium wave image has richer detail features and the long wave image has more salient target features, and both are fully highlighted in the fused image;
(3) the invention keeps the brightness and contrast of the fused image appropriate in every scene, with a higher total information entropy.
The method was verified mainly by simulation comparison against widely applied existing color fusion algorithms, using a large number of diverse images actually acquired with a medium-long wave dual-waveband thermal infrared imager. Compared with other methods, the proposed method balances the saliency of the medium wave detail features and the long wave difference features; the image colors suit long-term human observation in every scene, and the method performs better in comprehensive performance and objective index evaluation.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a medium and long wave image actually acquired using a medium and long wave dual band thermal imager, wherein graph (a) is a long wave image and graph (b) is a medium wave image;
FIG. 3 shows the actual fusion effect of the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples.
The following examples merely illustrate the present invention and should not be construed as limiting its scope. Where the examples do not specify particular techniques or conditions, they follow the techniques or conditions described in the literature of the art or the product specifications. The source image acquisition equipment used in this embodiment is a common-optical-axis medium-long wave dual-waveband thermal infrared imager, so the images of the two bands need no registration or other image fusion preprocessing.
Figs. 2(a) and 2(b) are the long wave and medium wave images actually acquired with a medium-long wave dual-band thermal imager; these two source images, each of size 384×288, are used to explain this embodiment of the invention.
As shown in Fig. 1, the reference-image-free, color-transfer-based medium-long wave dual-waveband image fusion method of this embodiment comprises the following steps:
step (1) reading an image to be fused
Input the medium wave and long wave infrared images to be fused into the YCbCr color space: input the medium wave image into the Cb channel, input the long wave image into the Cr channel, and input the weighted average of the medium wave and long wave images into the Y channel. Record the medium wave image as F_MW, the long wave image as F_LW, and the weighted average image as F_AVG, calculated as:

F_AVG(i,j) = ω×F_MW(i,j) + (1 - ω)×F_LW(i,j);

where ω is the medium wave weight, 0 ≤ ω ≤ 1.
step (2), Y-channel processing
Perform single-scale Retinex enhancement on the weighted average image F_AVG input to the Y channel in step (1) to obtain the final Y-channel output image F_Y, calculated as:

F_Y(i,j) = ln F_AVG(i,j) - ln[F(i,j) * F_AVG(i,j)];

where * denotes convolution and F(i,j) is the surround function:

F(i,j) = K·exp(-(i² + j²)/c²);

where c is the standard deviation of the Gaussian function and K is chosen such that the surround function F(i,j) satisfies:

∫∫F(i,j) di dj = 1;

where i and j are the image coordinates, 1 ≤ i ≤ 384, 1 ≤ j ≤ 288, and the standard deviation c of the Gaussian function takes the value 80.
Step (3) extracting detail features of the medium wave image
Apply median filtering to the medium wave image obtained in step (1) and record the filtered image as B_MW. The medium wave detail feature D_MW is calculated as:

D_MW = F_MW - B_MW;

where the median filter template size is [15, 15].
Step (4) extracting difference characteristics of the medium wave image and the long wave image
From the long wave image F_LW and the medium wave image F_MW obtained in step (1), the long wave difference feature dif_LW and the medium wave difference feature dif_MW are calculated as:

dif_MW = F_MW - F_LW ∩ F_MW;
dif_LW = F_LW - F_LW ∩ F_MW;

where F_LW ∩ F_MW is the common component of the medium wave and long wave images, obtained by comparing the two images locally and taking the minimum:

F_LW(i,j) ∩ F_MW(i,j) = Min{F_LW(i,j), F_MW(i,j)};

where i and j are the image coordinates, 1 ≤ i ≤ 384, 1 ≤ j ≤ 288.
Step (5), calculating Cb channel and Cr channel reference mean value
From the medium wave detail feature D_MW obtained in step (3), the Cb channel reference mean μ_Cb is calculated; from the long wave difference feature dif_LW obtained in step (4), the Cr channel reference mean μ_Cr is calculated:

μ_Cb = 128 - mean(D_MW);
μ_Cr = 128 - mean(dif_LW);

where mean denotes averaging over the image.
Step (6) calculating reference variances of Cb channel and Cr channel
From the long wave difference feature dif_LW and the medium wave difference feature dif_MW obtained in step (4), the Cb channel reference variance σ_Cb² and the Cr channel reference variance σ_Cr² are calculated as:

σ_Cb² = -a×mean(dif_MW) + b;
σ_Cr² = -a×mean(dif_LW) + b;

where a takes the value 0.02 and b takes the value 0.007.
Step (7), calculating the mean value and the variance of the source image
The mean and variance of the medium wave image are recorded as μ_MW and σ_MW², and the mean and variance of the long wave image are recorded as μ_LW and σ_LW²; the calculation formulas are:

μ_MW = (1/(m×n)) Σ_{i=1..m} Σ_{j=1..n} F_MW(i,j);
σ_MW² = (1/(m×n)) Σ_{i=1..m} Σ_{j=1..n} [F_MW(i,j) - μ_MW]²;

and similarly for μ_LW and σ_LW² with F_LW; where 1 ≤ i ≤ 384, 1 ≤ j ≤ 288, m = 384 and n = 288.
Step (8), color transfer and superposition of features
From the medium wave detail feature D_MW obtained in step (3), the long wave difference feature dif_LW obtained in step (4), the Cb and Cr channel reference means obtained in step (5), the Cb and Cr channel reference variances obtained in step (6), and the means and variances of the medium wave and long wave images obtained in step (7), the final output images F_Cb and F_Cr of the Cb and Cr channels are calculated as:

F_Cb(i,j) = (σ_Cb/σ_MW)·[F_MW(i,j) - μ_MW] + μ_Cb + D_MW(i,j);
F_Cr(i,j) = (σ_Cr/σ_LW)·[F_LW(i,j) - μ_LW] + μ_Cr + dif_LW(i,j);

where 1 ≤ i ≤ 384 and 1 ≤ j ≤ 288.
Step (9) of outputting the final fused image
Convert the YCbCr color space output images obtained in step (2) and step (8) into the RGB color space for output; the conversion formulas are:

F_R(i,j) = F_Y(i,j) + 1.402·[F_Cr(i,j) - 128];
F_G(i,j) = F_Y(i,j) - 0.344·[F_Cb(i,j) - 128] - 0.714·[F_Cr(i,j) - 128];
F_B(i,j) = F_Y(i,j) + 1.772·[F_Cb(i,j) - 128];

where F_R, F_G and F_B are the R channel, G channel and B channel images of the RGB color space respectively, 1 ≤ i ≤ 384 and 1 ≤ j ≤ 288.
The final output fused image is shown in fig. 3.
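Putting the embodiment together, the sketch below strings steps (1) through (9) into one function on NumPy arrays. Several concrete choices are assumptions of this sketch rather than the patent's: equal Y-channel weights, a 3×3 mean filter standing in for the [15, 15] median filter, omission of the Retinex enhancement, and full-range BT.601 conversion coefficients:

```python
import numpy as np

def fuse(F_MW, F_LW, a=0.02, b=0.007, mid=128.0):
    """End-to-end sketch of the fusion pipeline, steps (1)-(9)."""
    F_MW = F_MW.astype(np.float64)
    F_LW = F_LW.astype(np.float64)
    F_AVG = 0.5 * (F_MW + F_LW)                 # step (1), equal weights assumed
    F_Y = F_AVG                                  # step (2), enhancement omitted here
    # step (3): detail feature via a crude 3x3 mean (median in the patent)
    pad = np.pad(F_MW, 1, mode="edge")
    blur = sum(np.roll(np.roll(pad, di, 0), dj, 1)[1:-1, 1:-1]
               for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9.0
    D_MW = F_MW - blur
    # step (4): difference features via pixelwise minimum
    common = np.minimum(F_MW, F_LW)
    dif_MW, dif_LW = F_MW - common, F_LW - common
    # steps (5)-(6): reference statistics
    mu_Cb, mu_Cr = mid - D_MW.mean(), mid - dif_LW.mean()
    var_Cb = -a * dif_MW.mean() + b
    var_Cr = -a * dif_LW.mean() + b
    # steps (7)-(8): color transfer plus feature superposition
    def transfer(src, mu_ref, var_ref, feat):
        scale = np.sqrt(max(var_ref, 0.0)) / (src.std() + 1e-12)
        return scale * (src - src.mean()) + mu_ref + feat
    F_Cb = transfer(F_MW, mu_Cb, var_Cb, D_MW)
    F_Cr = transfer(F_LW, mu_Cr, var_Cr, dif_LW)
    # step (9): BT.601 YCbCr -> RGB (assumed coefficients)
    F_R = F_Y + 1.402 * (F_Cr - mid)
    F_G = F_Y - 0.344 * (F_Cb - mid) - 0.714 * (F_Cr - mid)
    F_B = F_Y + 1.772 * (F_Cb - mid)
    return np.clip(np.stack([F_R, F_G, F_B], axis=-1), 0.0, 255.0)
```

When the two bands are identical and featureless, the detail and difference features vanish, both chroma channels settle at the median, and the output is neutral gray at the source level, consistent with the method's design goal of a cast-free baseline.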
The method used by the invention is not limited to 8-bit image data and can be used at any data bit width.
The method is not limited to the YCbCr color space; it is described in detail only in YCbCr, but it is equally effective in YUV, lαβ and other color spaces.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (9)
1. A medium-long wave dual-waveband infrared image fusion method, characterized by comprising the following steps:
step (1) reading an image to be fused
Input the medium wave and long wave infrared images to be fused into the YCbCr color space: input the medium wave image into the Cb channel, input the long wave image into the Cr channel, and input the weighted average of the medium wave and long wave images into the Y channel. Record the medium wave image as F_MW, the long wave image as F_LW, and the weighted average image as F_AVG;
Step (2), Y-channel processing
Perform single-scale Retinex enhancement on the weighted average image F_AVG input to the Y channel in step (1) to obtain the final Y-channel output image F_Y;
Step (3) extracting detail features of the medium wave image
Extract detail features from the medium wave image obtained in step (1), recording the medium wave detail features as D_MW;
Step (4) extracting difference characteristics of the medium wave image and the long wave image
From the long wave image F_LW and the medium wave image F_MW obtained in step (1), calculate the difference features of the long wave and medium wave images, recording the long wave difference feature as dif_LW and the medium wave difference feature as dif_MW;
Step (5), calculating Cb channel and Cr channel reference mean value
From the medium wave detail feature D_MW obtained in step (3), calculate the Cb channel reference mean μ_Cb; from the long wave difference feature dif_LW obtained in step (4), calculate the Cr channel reference mean μ_Cr;
Step (6) calculating reference variances of Cb channel and Cr channel
From the long wave difference feature dif_LW and the medium wave difference feature dif_MW obtained in step (4), calculate the Cb channel reference variance σ_Cb² and the Cr channel reference variance σ_Cr²;
Step (7), calculating the mean value and the variance of the source image
Record the mean and variance of the medium wave image as μ_MW and σ_MW², and the mean and variance of the long wave image as μ_LW and σ_LW²;
step (8), color transfer and superposition of features
Step (9) of outputting the final fused image
Convert the YCbCr color space output images obtained in step (2) and step (8) into the RGB color space for output.
2. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in the step (2), the specific method is as follows:
F_Y is calculated as follows:

F_Y(i,j) = ln F_AVG(i,j) - ln[F(i,j) * F_AVG(i,j)];

where * denotes convolution, F(i,j) is the surround function, and i, j are the image coordinates; the surround function is:

F(i,j) = K·exp(-(i² + j²)/c²);

where c is the standard deviation of the Gaussian function and K is chosen such that the surround function F(i,j) satisfies:

∫∫F(i,j) di dj = 1.
3. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (3), the detail feature of the medium wave source image is extracted as follows:
performing median filtering on the medium wave image obtained in step (1), the filtered image being recorded as B_MW; the detail feature D_MW of the medium wave image is calculated as:
D_MW = F_MW − B_MW.
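Claim 3 is a standard unsharp-style detail extraction with a median filter. A sketch using numpy's `sliding_window_view` (NumPy ≥ 1.20); the 3×3 window size and edge padding are assumptions, since the patent does not specify them:

```python
import numpy as np

def detail_feature(f_mw, k=3):
    """D_MW = F_MW - B_MW, where B_MW is the median-filtered image (k x k window assumed)."""
    pad = k // 2
    padded = np.pad(f_mw.astype(np.float64), pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    b_mw = np.median(windows, axis=(-2, -1))  # per-pixel k x k median
    return f_mw - b_mw
```

A flat region yields D_MW = 0, while an isolated bright pixel survives in full, which is exactly the small-scale detail the method maps into the Cb channel.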
4. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (4), the long wave image difference feature dif_LW and the medium wave image difference feature dif_MW are calculated as:
dif_MW = F_MW − F_LW∩F_MW;
dif_LW = F_LW − F_LW∩F_MW;
wherein F_LW∩F_MW is the common component of the medium wave image and the long wave image, obtained by comparing the two images pixel by pixel and taking the minimum:
(F_LW∩F_MW)(i,j) = Min{F_LW(i,j), F_MW(i,j)};
where (i,j) is the image coordinate position.
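The difference features of claim 4 can be sketched directly with a pixel-wise minimum (function names are illustrative):

```python
import numpy as np

def difference_features(f_mw, f_lw):
    """dif_MW and dif_LW per claim 4: subtract the pixel-wise common component."""
    common = np.minimum(f_lw, f_mw)  # F_LW ∩ F_MW
    dif_mw = f_mw - common           # medium-wave-only component
    dif_lw = f_lw - common           # long-wave-only component
    return dif_mw, dif_lw
```

By construction both features are non-negative and, at any given pixel, at most one of them is nonzero: each pixel is attributed either to the medium wave band or to the long wave band, never both.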
5. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (5), the calculation formulas are:
μ_Cb = Mid − mean(D_MW);
μ_Cr = Mid − mean(dif_LW);
wherein Mid is the median of the channel data range and mean denotes the image mean; Mid takes the value 128 when the data bit width is 8 bits.
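Claim 5's reference means offset the chroma midpoint by the average feature strength. A sketch, assuming Mid generalises to 2^(bits−1) for other bit widths (the patent only states the 8-bit case):

```python
import numpy as np

def reference_means(d_mw, dif_lw, bit_width=8):
    """mu_Cb = Mid - mean(D_MW); mu_Cr = Mid - mean(dif_LW)."""
    mid = 2 ** (bit_width - 1)  # 128 for 8-bit data, per claim 5
    mu_cb = mid - d_mw.mean()
    mu_cr = mid - dif_lw.mean()
    return mu_cb, mu_cr
```

When a feature is absent (all zeros), the corresponding reference mean stays at the neutral chroma value 128, so that band contributes no colour bias.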
6. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (6), according to the difference feature dif_LW of the long wave image and the difference feature dif_MW of the medium wave image obtained in step (4), the Cb channel reference variance σ_Cb² and the Cr channel reference variance σ_Cr² are calculated as:
σ_Cb² = −a × mean(dif_MW) + b;
σ_Cr² = −a × mean(dif_LW) + b;
wherein a and b are linear mapping parameters, each taking a value in the range 0 to 1.
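Claim 6 maps mean feature strength linearly (with negative slope) to a reference variance. A sketch, assuming the difference features are normalised to [0, 1] so the result stays non-negative (the patent does not state the normalisation; parameter defaults are illustrative):

```python
import numpy as np

def reference_variances(dif_mw, dif_lw, a=0.5, b=0.5):
    """sigma_Cb^2 = -a*mean(dif_MW) + b; sigma_Cr^2 = -a*mean(dif_LW) + b, a, b in [0, 1]."""
    var_cb = -a * dif_mw.mean() + b
    var_cr = -a * dif_lw.mean() + b
    return var_cb, var_cr
```

The negative slope means stronger band differences yield a smaller reference variance, i.e. more saturated but less spread-out chroma in the colour transfer step.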
7. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (7), the mean and variance of the medium wave image are recorded as μ_MW and σ_MW², and the mean and variance of the long wave image are recorded as μ_LW and σ_LW², calculated as:
μ = (1/(m×n)) Σ_i Σ_j F(i,j);
σ² = (1/(m×n)) Σ_i Σ_j [F(i,j) − μ]²;
wherein m and n are the numbers of image rows and columns, respectively, and (i,j) is the image coordinate position.
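The image statistics of claim 7 are the ordinary population mean and variance over all m×n pixels:

```python
import numpy as np

def image_stats(f):
    """mu = (1/mn) sum F(i,j); sigma^2 = (1/mn) sum (F(i,j) - mu)^2 (population variance)."""
    mu = f.mean()
    var = ((f - mu) ** 2).mean()  # identical to np.var(f) with ddof=0
    return mu, var
```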
8. The medium-long wave dual-band infrared image fusion method of claim 1, characterized in that: in step (8), according to the detail feature D_MW of the medium wave image obtained in step (3), the difference feature dif_LW of the long wave image obtained in step (4), the Cb channel and Cr channel reference means obtained in step (5), the Cb channel and Cr channel reference variances obtained in step (6), and the means and variances of the medium wave and long wave images obtained in step (7), the final output images F_Cb and F_Cr of the Cb channel and the Cr channel are calculated as:
F_Cb(i,j) = (σ_Cb/σ_MW)·[F_MW(i,j) − μ_MW] + μ_Cb + D_MW(i,j);
F_Cr(i,j) = (σ_Cr/σ_LW)·[F_LW(i,j) − μ_LW] + μ_Cr + dif_LW(i,j);
where (i,j) is the image coordinate position.
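The colour transfer in step (8) follows the Reinhard-style statistics transfer: rescale a source channel to a reference standard deviation and shift it to a reference mean. A sketch of the per-channel mapping only (the exact combination with the superposed features is the patent's claim; this helper and its guards against zero variance are assumptions):

```python
import numpy as np

def transfer_channel(src, mu_src, var_src, mu_ref, var_ref):
    """Reinhard-style transfer: output has mean mu_ref and variance var_ref."""
    sigma_src = np.sqrt(max(var_src, 1e-12))  # guard against a flat source channel
    sigma_ref = np.sqrt(max(var_ref, 0.0))
    return (sigma_ref / sigma_src) * (src - mu_src) + mu_ref
```

After the transfer the channel statistics match the references exactly, which is what makes the reference mean/variance of claims 5 and 6 act as colour controls.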
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010655336.7A CN111815548B (en) | 2020-07-07 | 2020-07-07 | Medium-long wave double-band infrared image fusion method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111815548A true CN111815548A (en) | 2020-10-23 |
CN111815548B CN111815548B (en) | 2023-11-03 |
Family
ID=72842035
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101339653A (en) * | 2008-01-30 | 2009-01-07 | 西安电子科技大学 | Infrared and colorful visual light image fusion method based on color transfer and entropy information |
CN101714251A (en) * | 2009-12-22 | 2010-05-26 | 上海电力学院 | Infrared and visual pseudo-color image fusion and enhancement method |
CN102609927A (en) * | 2012-01-12 | 2012-07-25 | 北京理工大学 | Foggy visible light/infrared image color fusion method based on scene depth |
CN105427268A (en) * | 2015-12-01 | 2016-03-23 | 中国航空工业集团公司洛阳电光设备研究所 | Medium-long-wave dual-waveband infrared image feature level color fusion method |
CN107103596A (en) * | 2017-04-27 | 2017-08-29 | 湖南源信光电科技股份有限公司 | A kind of color night vision image interfusion method based on yuv space |
CN109255774A (en) * | 2018-09-28 | 2019-01-22 | 中国科学院长春光学精密机械与物理研究所 | A kind of image interfusion method, device and its equipment |
CN109658367A (en) * | 2018-11-14 | 2019-04-19 | 国网新疆电力有限公司信息通信公司 | Image interfusion method based on Color transfer |
Non-Patent Citations (5)
Title |
---|
MA J et al.: "Infrared and visible image fusion methods and applications: A survey", Information Fusion *
ZHANG R et al.: "Adaptive infrared medium and long wave image fusion method based on color transfer algorithm", Seventh Symposium on Novel Photoelectronic Detection Technology and Applications *
ZHANG Runqi et al.: "Medium-short wave dual-band infrared image fusion method under weak environmental radiation", Infrared Technology *
SU Yulu et al.: "Research on color fusion methods for medium and long wave infrared images", Infrared Technology *
XU Tingfa et al.: "True color transfer dual-band image fusion", Chinese Optics *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113205470A (en) * | 2021-03-19 | 2021-08-03 | 昆明物理研究所 | Infrared medium-short wave double-color fusion method based on hue saturation mapping |
CN113205470B (en) * | 2021-03-19 | 2022-08-30 | 昆明物理研究所 | Infrared medium-short wave double-color fusion method based on hue saturation mapping |
CN114418905A (en) * | 2021-11-29 | 2022-04-29 | 昆明物理研究所 | Image fusion algorithm for pixel sampling center staggered overlapping arrangement dual-band detector |
CN114418905B (en) * | 2021-11-29 | 2024-04-16 | 昆明物理研究所 | Image fusion method for dual-band detector with staggered and overlapped pixel sampling centers |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |