WO2019196539A1 - Image fusion method and apparatus therefor - Google Patents
Image fusion method and apparatus therefor
- Publication number
- WO2019196539A1 (PCT/CN2019/073090, CN2019073090W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- brightness
- fusion
- detail
- pixel
- Prior art date
Classifications
- G06T (Physics; Computing; Image data processing or generation, in general)
- G06T5/00 Image enhancement or restoration; G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T7/00 Image analysis; G06T7/90 Determination of colour characteristics
- G06T2207/00 Indexing scheme for image analysis or image enhancement; G06T2207/10 Image acquisition modality; G06T2207/10048 Infrared image
Description
- The present application relates to image processing technologies, and in particular, to an image fusion method and an apparatus therefor.
- The human eye has weak or even no perception of infrared light, whereas the imaging system (including the lens and the sensor) of a monitoring device retains good imaging capability for near-infrared light. Capturing images under infrared fill light therefore avoids the light pollution caused by visible fill light. However, infrared images lack color and have poor layering.
- The present application provides an image fusion method and an apparatus therefor.
- According to a first aspect, an image fusion method is provided, applied to an image acquisition device. The method comprises: converting a visible light image collected by the image acquisition device into a red-green-blue (RGB) image; converting a near-infrared image collected by the image acquisition device into a first brightness image; converting the RGB image into a second brightness image; performing fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image; performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image; and performing RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain a fused RGB image as an output image of the image acquisition device.
- In an embodiment, before performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the method further comprises: performing detail calculation on the first brightness image to obtain a first detail image; and performing detail calculation on the second brightness image to obtain a second detail image.
- In this case, performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map comprises: performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image.
- Performing detail calculation on the first brightness image to obtain the first detail image comprises: performing mean filtering on the first brightness image to obtain a first mean image; taking the difference between the first brightness image and the first mean image to obtain a first difference image; and performing a cutoff operation on the first difference image to obtain the first detail image.
- Performing detail calculation on the second brightness image to obtain the second detail image comprises: performing mean filtering on the second brightness image to obtain a second mean image; taking the difference between the second brightness image and the second mean image to obtain a second difference image; and performing a cutoff operation on the second difference image to obtain the second detail image.
- The cutoff operation on a difference image is implemented by the following formula:
- Detail_p = CLIP(Diff_p * str / 128, deNirMin, deNirMax)
- where Detail_p is the brightness detail of pixel point P in the detail image, Diff_p is the value of pixel point P in the difference image, str is the detail intensity control parameter, [deNirMin, deNirMax] is the cutoff interval, and CLIP() is the cutoff calculation.
- In an embodiment, for a pixel point at any position, the fusion brightness Y of the pixel point is determined by the following formula:
- Y = CLIP(((Y_nir * wt + Y_LL * (256 - wt)) / 256 + Detail_nir + Detail_LL), 0, 255)
- where Y_nir is the luminance value of the pixel at that position in the first luminance image, wt is the fusion weight of the pixel at that position in the fusion weight map, Y_LL is the luminance value of the pixel at that position in the second luminance image, Detail_nir is the brightness detail of the pixel at that position in the first detail image, Detail_LL is the brightness detail of the pixel at that position in the second detail image, and CLIP() is the cutoff calculation.
- Performing the fusion weight calculation according to the first brightness image comprises: for any pixel point in the first brightness image, querying a preset brightness mapping model according to the brightness value of the pixel point to determine the fusion weight corresponding to that brightness value, wherein the brightness mapping model records the correspondence between brightness values and fusion weights.
- Performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map comprises: for a pixel point at any position, determining the fusion brightness Y of the pixel point by the following formula:
- Y = CLIP(((Y_nir * wt + Y_LL * (256 - wt)) / 256), 0, 255)
- where Y_nir is the luminance value of the pixel at that position in the first luminance image, wt is the fusion weight of the pixel at that position in the fusion weight map, and Y_LL is the luminance value of the pixel at that position in the second luminance image.
- Performing RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain the fused RGB image comprises: for a pixel point at any position, determining each of the R, G, and B channel values V_out of the pixel point by the following formulas:
- When Y_LL > 0: V_out = CLIP(V_in * Y / Y_LL, 0, 255)
- When Y_LL = 0: V_out = CLIP(Y, 0, 255)
- where V_in is the corresponding R, G, or B channel value of the pixel at that position in the RGB image, Y_LL is the luminance value of the pixel at that position in the second luminance image, Y is the luminance value of the pixel at that position in the brightness fusion image, and CLIP() is the cutoff calculation.
- According to a second aspect, an image fusion device is provided, applied to an image acquisition device. The device includes: a first visible light processing unit, configured to convert a visible light image collected by the image acquisition device into a red-green-blue (RGB) image; a first infrared processing unit, configured to convert a near-infrared image collected by the image acquisition device into a first brightness image; a second visible light processing unit, configured to convert the RGB image into a second brightness image; a second infrared processing unit, configured to perform fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image; a brightness fusion unit, configured to perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image; and an RGB fusion unit, configured to perform RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain a fused RGB image as an output image of the image acquisition device.
- In an embodiment, the second visible light processing unit is further configured to perform detail calculation on the second brightness image to obtain a second detail image;
- the second infrared processing unit is further configured to perform detail calculation on the first brightness image to obtain a first detail image; and
- the brightness fusion unit is specifically configured to perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image.
- In an embodiment, the second visible light processing unit is specifically configured to: perform mean filtering on the second brightness image to obtain a second mean image; take the difference between the second brightness image and the second mean image to obtain a second difference image; and perform a cutoff operation on the second difference image to obtain the second detail image.
- In an embodiment, the second infrared processing unit is specifically configured to: perform mean filtering on the first brightness image to obtain a first mean image; take the difference between the first brightness image and the first mean image to obtain a first difference image; and perform a cutoff operation on the first difference image to obtain the first detail image.
- The cutoff operation on the difference image is implemented by the following formula:
- Detail_p = CLIP(Diff_p * str / 128, deNirMin, deNirMax)
- where Detail_p is the brightness detail of pixel point P in the detail image, Diff_p is the value of pixel point P in the difference image, str is the detail intensity control parameter, [deNirMin, deNirMax] is the cutoff interval, and CLIP() is the cutoff calculation.
- In an embodiment, the brightness fusion unit is specifically configured to determine, for a pixel point at any position, the fusion brightness Y of the pixel point by the following formula:
- Y = CLIP(((Y_nir * wt + Y_LL * (256 - wt)) / 256 + Detail_nir + Detail_LL), 0, 255)
- where Y_nir is the luminance value of the pixel at that position in the first luminance image, wt is the fusion weight of the pixel at that position in the fusion weight map, Y_LL is the luminance value of the pixel at that position in the second luminance image, Detail_nir is the brightness detail of the pixel at that position in the first detail image, Detail_LL is the brightness detail of the pixel at that position in the second detail image, and CLIP() is the cutoff calculation.
- In an embodiment, the second infrared processing unit is configured to: for any pixel point in the first brightness image, query a preset brightness mapping model according to the brightness value of the pixel point to determine the fusion weight corresponding to that brightness value, wherein the brightness mapping model records the correspondence between brightness values and fusion weights.
- In an embodiment, the brightness fusion unit is specifically configured to determine, for a pixel point at any position, the fusion brightness Y of the pixel point by the following formula:
- Y = CLIP(((Y_nir * wt + Y_LL * (256 - wt)) / 256), 0, 255)
- where Y_nir is the luminance value of the pixel at that position in the first luminance image, wt is the fusion weight of the pixel at that position in the fusion weight map, and Y_LL is the luminance value of the pixel at that position in the second luminance image.
- In an embodiment, the RGB fusion unit is specifically configured to determine, for a pixel point at any position, each of the R, G, and B channel values V_out of the pixel point by the following formulas:
- When Y_LL > 0: V_out = CLIP(V_in * Y / Y_LL, 0, 255)
- When Y_LL = 0: V_out = CLIP(Y, 0, 255)
- where V_in is the corresponding R, G, or B channel value of the pixel at that position in the RGB image, Y_LL is the luminance value of the pixel at that position in the second luminance image, Y is the luminance value of the pixel at that position in the brightness fusion image, and CLIP() is the cutoff calculation.
- According to a third aspect, an image fusion apparatus is provided, comprising a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor,
- wherein the machine-executable instructions cause the processor to implement the image fusion method described in the first aspect of the embodiments of the present application.
- According to a fourth aspect, a machine-readable storage medium is provided, storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to: convert a visible light image collected by an image acquisition device into a red-green-blue (RGB) image; convert a near-infrared image collected by the image acquisition device into a first brightness image; convert the RGB image into a second brightness image; perform fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image; perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image; and perform RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain a fused RGB image as an output image of the image acquisition device.
- The image fusion method of the embodiments of the present application converts the visible light image collected by the image acquisition device into an RGB image, converts the near-infrared image collected by the image acquisition device into a first brightness image, and converts the RGB image into a second brightness image; performs fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image; performs brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image; and performs RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain a fused RGB image as an output image of the image acquisition device, thereby realizing fusion of the visible light image and the near-infrared image. This improves the brightness of the image in a low-illumination environment while maintaining the color information of the image, improving the quality of the fused image.
- FIG. 1 is a flowchart of an image fusion method according to an exemplary embodiment of the present application.
- FIG. 2 is a schematic diagram of a luminance mapping model according to an exemplary embodiment of the present application.
- FIG. 3 is a flowchart of an image fusion method according to still another exemplary embodiment of the present application.
- FIG. 4 is a schematic flow chart of a detail calculation shown in an exemplary embodiment of the present application.
- FIG. 5 is a schematic flowchart of a fusion weight map calculation according to an exemplary embodiment of the present application.
- FIG. 6 is a flowchart of an image fusion method according to still another exemplary embodiment of the present application.
- FIG. 7 is a schematic structural diagram of an image fusion device according to an exemplary embodiment of the present application.
- FIG. 8 is a schematic diagram showing the hardware structure of an image fusion device according to an exemplary embodiment of the present application.
- The image fusion method may be applied to an image acquisition device, such as a surveillance camera in a video surveillance scenario. As shown in FIG. 1, the image fusion method may include the following steps.
- Step S100: Convert the visible light image collected by the image acquisition device into an RGB image.
- Step S110: Convert the near-infrared image collected by the image acquisition device into a first brightness image.
- When the ambient brightness of the area where the image acquisition device is located is low, the image acquisition device can simultaneously collect a visible light image and a near-infrared image under infrared fill light, and improve image quality through the fusion of the visible light image and the near-infrared image.
- The image acquisition device may also collect infrared images for subsequent fusion processing; however, near-infrared images present more detail than infrared images, so the fused image can retain more detail.
- The visible light image may be converted into an RGB (red, green, blue) image, and the near-infrared image may be converted into an infrared brightness image (also referred to as an infrared Y-channel image, referred to herein as the first brightness image).
- The near-infrared image in the embodiments of the present application may be replaced with an infrared image; since the near-infrared image presents more image detail than the infrared image, the near-infrared image is preferred.
- In step S100, the image acquisition device can recover the color of the visible light image through AWB (Automatic White Balance) correction, perform denoising (DENOISE) processing, and then interpolate the visible light image into an initial RGB image through demosaicing (DEMOSAIC) processing; GAMMA correction is then performed on the initial RGB image to enhance image brightness, yielding the RGB image described in step S100 for use in subsequent steps.
- The present disclosure does not limit the order of the above AWB correction, DENOISE processing, DEMOSAIC processing, and GAMMA correction.
- In step S110, the image acquisition device can interpolate the near-infrared image into the RGB image corresponding to the near-infrared image through DEMOSAIC processing, perform GAMMA correction on that RGB image to enhance image brightness, and further convert that RGB image into a brightness image.
- The near-infrared image and the corresponding visible light image are then aligned pixel by pixel through a Y-channel registration process.
- There is no necessary timing relationship between step S100 and step S110: the operation in step S100 may be performed first and then the operation in step S110, or the operation in step S110 may be performed first and then the operation in step S100; the operations in steps S100 and S110 may also be performed concurrently.
- Step S120: Convert the RGB image into a second brightness image.
- After the image acquisition device converts the visible light image into an RGB image in step S100, the RGB image may be converted into a visible light luminance image (also referred to as a visible light Y-channel image, referred to herein as the second brightness image).
- The image acquisition device may convert the RGB image into the second brightness image by the following formula (1):
- where R_p, G_p, and B_p are respectively the R, G, and B three-channel values at pixel point P in the RGB image, y_p is the luminance channel value at pixel point P in the second luminance image, pixel point P is any pixel in the RGB image, and pixel point P in the RGB image is at the same position as pixel point P in the second luminance image.
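- Formula (1) itself is not reproduced in the available text. As a minimal sketch of such an RGB-to-luminance conversion, the following assumes the common integer BT.601 weighting (76, 150, 30)/256; the patent's actual coefficients may differ:

```python
import numpy as np

def rgb_to_luma(rgb: np.ndarray) -> np.ndarray:
    """Convert an 8-bit HxWx3 RGB image into an 8-bit luminance (Y-channel) image.

    The BT.601-style integer weights used here are an assumption standing in
    for formula (1), whose exact coefficients are not given in this text.
    """
    r = rgb[..., 0].astype(np.uint32)
    g = rgb[..., 1].astype(np.uint32)
    b = rgb[..., 2].astype(np.uint32)
    y = (76 * r + 150 * g + 30 * b) >> 8  # integer form of 0.299R + 0.587G + 0.114B
    return y.astype(np.uint8)
```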
- Step S130: Perform fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image.
- In step S130, the fusion weight calculation may be performed according to the first brightness image to determine the weight of the luminance value of each pixel for the brightness fusion of the visible light image and the near-infrared image, thereby obtaining a fusion weight map of the near-infrared image.
- The fusion weight map of the near-infrared image records the weight applied to the luminance value of each pixel of the near-infrared image when the visible light image and the near-infrared image are luminance-fused.
- For example, when the luminance value of a pixel is below a threshold (the threshold may be set according to the actual scene), the weight of the luminance value of that pixel of the near-infrared image may be increased to raise the brightness of the fused image; when the luminance value is above the threshold, the weight of the luminance value of that pixel of the near-infrared image may be reduced, so that more detail in the visible light image is preserved in the fused image.
- In an embodiment, performing the fusion weight calculation according to the first brightness image may include: for any pixel point in the first brightness image, querying a preset brightness mapping model according to the brightness value of the pixel point to determine the fusion weight corresponding to that brightness value.
- A brightness mapping model may be preset, recording the correspondence between brightness values and fusion weights.
- The brightness mapping model may then be queried according to the brightness value of each pixel point to obtain the fusion weight of each pixel point.
- FIG. 2 is a schematic diagram of a brightness mapping model according to an embodiment of the present application.
- The brightness mapping model is controlled by three parameters: min_wt, min_limit, and max_limit.
- The abscissa of the model is the luminance value and the ordinate is the fusion weight.
- For any pixel in the first luminance image, when its luminance value is less than min_limit, its fusion weight is 255; when its luminance value is greater than max_limit, its fusion weight is min_wt; when its luminance value lies in [min_limit, max_limit], the fusion weight gradually decreases as the luminance value increases, with the specific mapping between luminance value and fusion weight determined by the actual brightness mapping model.
- The values of the parameters min_wt, min_limit, and max_limit may be empirical values; for example, they may be 180, 200, and 250, respectively.
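- A minimal sketch of this mapping, assuming a linear ramp between min_limit and max_limit (the description only states that the weight gradually decreases there; the exact curve is defined by the actual model):

```python
import numpy as np

def fusion_weight_map(luma_nir: np.ndarray, min_wt: int = 180,
                      min_limit: int = 200, max_limit: int = 250) -> np.ndarray:
    """Map each NIR luminance value to a fusion weight, per the FIG. 2 model.

    Weight is 255 below min_limit and min_wt above max_limit; in between, a
    linear ramp is assumed. Default parameter values are the example values
    given in the description (180, 200, 250).
    """
    y = luma_nir.astype(np.float32)
    ramp = 255.0 + (min_wt - 255.0) * (y - min_limit) / float(max_limit - min_limit)
    wt = np.where(y < min_limit, 255.0,
                  np.where(y > max_limit, float(min_wt), ramp))
    return wt.astype(np.uint8)
```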
- In an embodiment, the fusion weight map may also be subjected to mean filtering; the specific implementation thereof is not described in detail herein.
- There is no necessary timing relationship between step S120 and step S130: the operation in step S120 may be performed first and then the operation in step S130, or the operation in step S130 may be performed first and then the operation in step S120; the operations in steps S120 and S130 may also be performed concurrently.
- Step S140: Perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image.
- In step S140, the first brightness image obtained in step S110 and the second brightness image obtained in step S120 may be subjected to brightness fusion processing according to the fusion weight map, and a brightness fusion image is obtained.
- In an embodiment, for a pixel at any position, the fusion luminance Y of the pixel is determined by the following formula (2):
- Y = CLIP(((Y_nir * wt + Y_LL * (256 - wt)) / 256), 0, 255)    (2)
- where Y_nir is the luminance value of the pixel at that position in the first luminance image, wt is the fusion weight of the pixel at that position in the fusion weight map, Y_LL is the luminance value of the pixel at that position in the second luminance image, and CLIP() is the cutoff calculation.
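- A minimal NumPy sketch of formula (2), vectorized over whole images rather than a single pixel (an implementation assumption; the description states only the per-pixel rule):

```python
import numpy as np

def fuse_luma(y_nir: np.ndarray, y_ll: np.ndarray, wt: np.ndarray) -> np.ndarray:
    """Formula (2): Y = CLIP((Y_nir*wt + Y_LL*(256 - wt)) / 256, 0, 255)."""
    y_nir32 = y_nir.astype(np.uint32)
    y_ll32 = y_ll.astype(np.uint32)
    wt32 = wt.astype(np.uint32)
    y = (y_nir32 * wt32 + y_ll32 * (256 - wt32)) // 256
    return np.clip(y, 0, 255).astype(np.uint8)
```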
- Step S150: Perform RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain the fused RGB image as an output image of the image acquisition device.
- In step S150, RGB fusion processing may be performed according to the brightness fusion image, the second brightness image obtained in step S120, and the RGB image obtained in step S100 to obtain the fused RGB image.
- In an embodiment, performing RGB fusion according to the second luminance image, the brightness fusion image, and the RGB image to obtain the fused RGB image may include: for a pixel at any position, determining each of the R, G, and B channel values V_out of the pixel by the following formulas (3)-(4):
- When Y_LL > 0: V_out = CLIP(V_in * Y / Y_LL, 0, 255)    (3)
- When Y_LL = 0: V_out = CLIP(Y, 0, 255)    (4)
- where V_in is the corresponding R, G, or B channel value of the pixel at that position in the RGB image (the RGB image before fusion), Y_LL is the luminance value of the pixel at that position in the second luminance image, Y is the luminance value of the pixel at that position in the brightness fusion image, and CLIP() is the cutoff calculation.
- That is, if V_in is the R channel value of the pixel at that position in the RGB image, then V_out is the R channel value of the pixel at that position in the fused RGB image; the same holds for the G and B channels.
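- A minimal NumPy sketch of formulas (3)-(4); the Y_LL = 0 branch serves as the division-by-zero fallback:

```python
import numpy as np

def fuse_rgb(rgb: np.ndarray, y_ll: np.ndarray, y_fused: np.ndarray) -> np.ndarray:
    """Formulas (3)-(4): scale each R/G/B channel by Y / Y_LL; where Y_LL == 0,
    fall back to the fused luminance Y directly."""
    v_in = rgb.astype(np.float32)
    y_ll_f = y_ll.astype(np.float32)[..., None]   # HxWx1, broadcast over channels
    y_f = y_fused.astype(np.float32)[..., None]
    with np.errstate(divide="ignore", invalid="ignore"):
        v_out = np.where(y_ll_f > 0, v_in * y_f / y_ll_f, y_f)
    return np.clip(v_out, 0, 255).astype(np.uint8)
```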
- It can be seen that the brightness information of the scene is acquired from the near-infrared image and the color information is obtained from the visible light image, so that the fusion of the visible light image and the near-infrared image combines brightness and color, enhancing the brightness of the image in a low-illumination environment (i.e., a scene with low ambient brightness) while improving the quality of the fused image.
- Moreover, the calculation involved in computing the fused image is simple, and the required processing time is short.
- FIG. 3 is a schematic flowchart of another image fusion method according to an embodiment of the present disclosure.
- The image fusion method may be applied to an image acquisition device, such as a surveillance camera in a video surveillance scenario. As shown in FIG. 3, the image fusion method can include the following steps:
- Step S300: Convert the visible light image collected by the image acquisition device into an RGB image.
- Step S310: Convert the near-infrared image collected by the image acquisition device into a first brightness image.
- Step S320: Convert the RGB image into a second brightness image.
- Step S330: Perform fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image.
- For step S300 to step S330 in this embodiment, reference may be made to the related description of step S100 to step S130; details are not repeated here.
- Step S340: Perform detail calculation on the second brightness image to obtain a second detail image.
- Step S350: Perform detail calculation on the first brightness image to obtain a first detail image.
- In this embodiment, detail calculation may further be performed on the first brightness image and the second brightness image to obtain the corresponding detail images. The detail image obtained by performing detail calculation on the first brightness image is referred to herein as the first detail image, and the detail image obtained by performing detail calculation on the second brightness image is referred to as the second detail image.
- In an embodiment, performing detail calculation on the first brightness image to obtain the first detail image may include: performing mean filtering on the first brightness image to obtain a first mean image; taking the difference between the first brightness image and the first mean image to obtain a first difference image; and performing a cutoff operation on the first difference image to obtain the first detail image.
- Specifically, the image acquisition device may perform mean filtering with radius r on the first brightness image (r is an empirical value, which may be set according to the actual scene) to obtain the corresponding mean image (referred to herein as the first mean image).
- The difference between the first luminance image and the first mean image may then be taken to obtain a signed difference image (referred to herein as the first difference image).
- Taking this difference widens the value range: since both the first luminance image and the first mean image are 8-bit images, their difference yields a 9-bit signed difference image. Therefore, after obtaining the first difference image, the image acquisition device also needs to perform a cutoff operation on it, clipping it to a specified interval to obtain the corresponding detail image (referred to herein as the first detail image).
- In an embodiment, the cutoff operation on the first difference image may be implemented by the following formula (5):
- Detail_p = CLIP(Diff_p * str / 128, deNirMin, deNirMax)    (5)
- where Detail_p is the brightness detail of pixel point P in the first detail image, Diff_p is the value of pixel point P in the first difference image, str is the detail intensity control parameter, [deNirMin, deNirMax] is the cutoff interval, and CLIP() is the cutoff calculation, which clips the input value to the cutoff interval.
- str, deNirMin, and deNirMax are all empirical values; for example, str may be 64, deNirMin may be -64, and deNirMax may be 32.
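- A minimal NumPy/OpenCV sketch of this detail calculation (mean filter, signed difference, cutoff per formula (5)), with the example parameter values above as defaults and the filter radius r as an assumed small value:

```python
import cv2
import numpy as np

def detail_image(luma: np.ndarray, r: int = 2, str_: int = 64,
                 de_nir_min: int = -64, de_nir_max: int = 32) -> np.ndarray:
    """Detail calculation on an 8-bit brightness image.

    Returns a signed detail image: Detail_p = CLIP(Diff_p * str / 128,
    deNirMin, deNirMax), where Diff is the 9-bit signed difference between
    the brightness image and its mean-filtered version.
    """
    mean = cv2.blur(luma, (2 * r + 1, 2 * r + 1))          # mean image, radius r
    diff = luma.astype(np.int16) - mean.astype(np.int16)   # 9-bit signed difference
    detail = np.clip(diff.astype(np.int32) * str_ // 128, de_nir_min, de_nir_max)
    return detail.astype(np.int16)
```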
- Similarly, the image acquisition device performs detail calculation on the second brightness image to obtain the second detail image; the specific implementation is analogous.
- There is no necessary timing relationship between step S340 and step S350: the operation in step S340 may be performed first and then the operation in step S350, or the operation in step S350 may be performed first and then the operation in step S340; the operations in steps S340 and S350 may also be performed concurrently.
- Step S360: Perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image to obtain a brightness fusion image.
- In step S360, the first brightness image obtained in step S310 and the second brightness image obtained in step S320 may be brightness-fused according to the first detail image, the second detail image, and the fusion weight map obtained in step S330.
- In an embodiment, performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image may include: for a pixel at any position, determining the fusion luminance Y of the pixel by the following formula (6):
- Y = CLIP(((Y_nir * wt + Y_LL * (256 - wt)) / 256 + Detail_nir + Detail_LL), 0, 255)    (6)
- where Y_nir is the luminance value of the pixel at that position in the first luminance image, wt is the fusion weight of the pixel at that position in the fusion weight map, Y_LL is the luminance value of the pixel at that position in the second luminance image, Detail_nir is the brightness detail of the pixel at that position in the first detail image, Detail_LL is the brightness detail of the pixel at that position in the second detail image, and CLIP() is the cutoff calculation.
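- Formula (6) is formula (2) plus the two signed detail terms before clipping; a minimal sketch, reusing the arrays produced by the earlier detail_image and fusion_weight_map sketches:

```python
import numpy as np

def fuse_luma_with_detail(y_nir: np.ndarray, y_ll: np.ndarray, wt: np.ndarray,
                          detail_nir: np.ndarray, detail_ll: np.ndarray) -> np.ndarray:
    """Formula (6): weighted luminance blend plus NIR and visible detail terms."""
    y_nir32 = y_nir.astype(np.int32)
    y_ll32 = y_ll.astype(np.int32)
    wt32 = wt.astype(np.int32)
    base = (y_nir32 * wt32 + y_ll32 * (256 - wt32)) // 256
    y = base + detail_nir.astype(np.int32) + detail_ll.astype(np.int32)
    return np.clip(y, 0, 255).astype(np.uint8)
```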
- Step S370: Perform RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain the fused RGB image as an output image of the image acquisition device.
- For step S370 in this embodiment, reference may be made to the related description of step S150; details are not repeated here.
- In the following example, the visible light image and the near-infrared image collected by the image acquisition device are both 8-bit images.
- Note that the visible light image and the near-infrared image collected by the image acquisition device are not limited to 8-bit images; they may also be, for example, 12-bit or 16-bit images.
- For the collected visible light image, AWB correction and the related processing described above may be performed to obtain an 8-bit visible light RGB image.
- For the collected near-infrared image, DEMOSAIC processing, GAMMA correction, RGB2Y processing (RGB-to-Y, i.e., converting an RGB image into a Y-channel luminance image), and Y-channel registration may be performed to obtain an 8-bit infrared luminance image.
- For the 8-bit visible light RGB image, the image acquisition device can convert it into an 8-bit visible light luminance image through RGB2Y processing, and perform detail calculation on that 8-bit luminance image to obtain an 8-bit detail image.
- The flowchart of the detail calculation performed by the image acquisition device on the 8-bit visible light brightness image can be seen in FIG. 4.
- The image acquisition device can perform mean filtering with radius r on the 8-bit visible light brightness image to obtain an 8-bit mean image, take the difference between the 8-bit visible light brightness image and the 8-bit mean image to obtain a 9-bit signed difference image, and then, through the cutoff operation, clip the 9-bit signed difference image to the specified interval ([deNirMin, deNirMax]) to obtain an 8-bit visible light detail map.
- For the 8-bit infrared brightness image, the image acquisition device can perform fusion weight calculation; the flowchart of this calculation may be as shown in FIG. 5.
- Specifically, the image acquisition device can query the preset brightness mapping model (as shown in FIG. 2) according to the brightness value of each pixel in the 8-bit infrared brightness image to obtain the fusion weight of each pixel, thereby obtaining an 8-bit fusion weight map, and then perform mean filtering on the 8-bit fusion weight map to obtain a filtered 8-bit fusion weight map.
- The image acquisition device can also perform detail calculation on the 8-bit infrared brightness image to obtain an 8-bit infrared detail image; for the specific implementation, refer to the description for the 8-bit visible light brightness image, which is not repeated here.
- The image acquisition device can then perform brightness fusion on the 8-bit visible light brightness image and the 8-bit infrared brightness image according to the 8-bit visible light detail image, the 8-bit infrared detail image, and the filtered 8-bit fusion weight map, and finally perform RGB fusion processing according to the fused luminance image, the 8-bit visible light luminance image, and the 8-bit visible light RGB image to obtain the fused RGB image, realizing the fusion of the visible light image and the near-infrared image; the flowchart thereof may be as shown in FIG. 6.
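- Putting the steps together, a minimal end-to-end sketch of the FIG. 6 flow, assuming pre-registered 8-bit inputs and the helper functions from the sketches above (rgb_to_luma, fusion_weight_map, detail_image, fuse_luma_with_detail, fuse_rgb); the weight-map filter kernel size is likewise an assumption:

```python
import cv2
import numpy as np

def fuse_images(rgb_vis: np.ndarray, luma_nir: np.ndarray) -> np.ndarray:
    """Fuse a visible 8-bit RGB image with a registered 8-bit NIR luminance image."""
    y_ll = rgb_to_luma(rgb_vis)                    # second brightness image
    wt = fusion_weight_map(luma_nir)               # fusion weight map (FIG. 5)
    wt = cv2.blur(wt, (5, 5))                      # mean-filter the weight map
    d_ll = detail_image(y_ll)                      # 8-bit visible detail image
    d_nir = detail_image(luma_nir)                 # 8-bit infrared detail image
    y = fuse_luma_with_detail(luma_nir, y_ll, wt, d_nir, d_ll)  # formula (6)
    return fuse_rgb(rgb_vis, y_ll, y)              # formulas (3)-(4)
```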
- In summary, the visible light image collected by the image acquisition device is converted into an RGB image, the near-infrared image collected by the image acquisition device is converted into the first brightness image, and the RGB image is converted into the second brightness image; fusion weight calculation is performed according to the first brightness image to obtain a fusion weight map of the near-infrared image; detail calculation is then performed on the first brightness image and the second brightness image, respectively, to obtain a first detail image and a second detail image; brightness fusion is performed on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image to obtain a brightness fusion image; and RGB fusion is performed according to the second brightness image, the brightness fusion image, and the RGB image to obtain the fused RGB image. This realizes the fusion of the visible light image and the near-infrared image, enhances the brightness of the image in a low-illumination environment while maintaining the color information of the image, and improves the quality of the fused image.
- FIG. 7 is a schematic structural diagram of an image fusion device according to an embodiment of the present disclosure. The image fusion device can be applied to the image acquisition device in the foregoing method embodiments. As shown in FIG. 7, the image fusion device can include the following units.
- The first visible light processing unit 710 is configured to convert the visible light image collected by the image acquisition device into a red-green-blue (RGB) image.
- The first infrared processing unit 720 is configured to convert the near-infrared image collected by the image acquisition device into a first brightness image.
- The second visible light processing unit 730 is configured to convert the RGB image into a second brightness image.
- The second infrared processing unit 740 is configured to perform fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image.
- The brightness fusion unit 750 is configured to perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image.
- The RGB fusion unit 760 is configured to perform RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain a fused RGB image as an output image of the image acquisition device.
- In an embodiment, the second visible light processing unit 730 is further configured to perform detail calculation on the second brightness image to obtain a second detail image; the second infrared processing unit 740 is further configured to perform detail calculation on the first brightness image to obtain a first detail image; and the brightness fusion unit 750 is specifically configured to perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image.
- In an embodiment, the second visible light processing unit 730 is specifically configured to: perform mean filtering on the second luminance image to obtain a second mean image; take the difference between the second luminance image and the second mean image to obtain a second difference image; and perform a cutoff operation on the second difference image to obtain the second detail image.
- In an embodiment, the second infrared processing unit 740 is specifically configured to: perform mean filtering on the first luminance image to obtain a first mean image; take the difference between the first luminance image and the first mean image to obtain a first difference image; and perform a cutoff operation on the first difference image to obtain the first detail image.
- In an embodiment, the cutoff operation on the difference image is implemented by formula (5).
- In an embodiment, the brightness fusion unit 750 is specifically configured to determine, for a pixel point at any position, the fusion brightness Y of the pixel point by formula (6).
- In an embodiment, the second infrared processing unit 740 is configured to: for any pixel point in the first brightness image, query a preset brightness mapping model according to the brightness value of the pixel point to determine the fusion weight corresponding to that brightness value, wherein the brightness mapping model records the correspondence between brightness values and fusion weights.
- In an embodiment, the RGB fusion unit 760 is specifically configured to determine, for a pixel point at any position, the R, G, and B three-channel values V_out of the pixel point by formulas (3)-(4).
- In an embodiment, the brightness fusion unit 750 is specifically configured to determine, for a pixel point at any position, the fusion brightness Y of the pixel point by formula (2).
- FIG. 8 is a schematic structural diagram of hardware of an image fusion device according to an embodiment of the present application.
- The image fusion device can include a processor 801 and a machine-readable storage medium 802 that stores machine-executable instructions.
- The processor 801 and the machine-readable storage medium 802 can communicate via a system bus 803, and by reading and executing the machine-executable instructions corresponding to the image fusion logic in the machine-readable storage medium 802, the processor 801 can perform the image fusion method described above.
- The machine-readable storage medium 802 referred to herein can be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions and data.
- For example, the machine-readable storage medium may be RAM (Random Access Memory), volatile memory, non-volatile memory, flash memory, a storage drive (such as a hard disk drive), a solid-state drive, any type of storage disk (such as an optical disc or DVD), a similar storage medium, or a combination thereof.
- The embodiments of the present application also provide a machine-readable storage medium including machine-executable instructions, such as the machine-readable storage medium 802 in FIG. 8; the machine-executable instructions may be executed by the processor 801 in the image fusion device to implement the image fusion method described above.
Claims (16)
- 1. An image fusion method, applied to an image acquisition device, comprising: converting a visible light image collected by the image acquisition device into a red-green-blue (RGB) image; converting a near-infrared image collected by the image acquisition device into a first brightness image; converting the RGB image into a second brightness image; performing fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image; performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image; and performing RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain a fused RGB image as an output image of the image acquisition device.
- 2. The method according to claim 1, wherein before performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the method further comprises: performing detail calculation on the first brightness image to obtain a first detail image; and performing detail calculation on the second brightness image to obtain a second detail image; and wherein performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map comprises: performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image.
- 3. The method according to claim 2, wherein performing detail calculation on the first brightness image to obtain the first detail image comprises: performing mean filtering on the first brightness image to obtain a first mean image; taking the difference between the first brightness image and the first mean image to obtain a first difference image; and performing a cutoff operation on the first difference image to obtain the first detail image.
- 4. The method according to claim 2, wherein performing detail calculation on the second brightness image to obtain the second detail image comprises: performing mean filtering on the second brightness image to obtain a second mean image; taking the difference between the second brightness image and the second mean image to obtain a second difference image; and performing a cutoff operation on the second difference image to obtain the second detail image.
- 5. The method according to claim 3 or 4, wherein the cutoff operation on the difference image is implemented by the following formula: Detail_p = CLIP(Diff_p * str / 128, deNirMin, deNirMax), where Detail_p is the brightness detail of pixel point P in the detail image, Diff_p is the value of pixel point P in the difference image, str is the detail intensity control parameter, [deNirMin, deNirMax] is the cutoff interval, and CLIP() is the cutoff calculation.
- 6. The method according to claim 2, wherein performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image comprises: for a pixel point at any position, determining the fusion brightness Y of the pixel point by the following formula: Y = CLIP(((Y_nir * wt + Y_LL * (256 - wt)) / 256 + Detail_nir + Detail_LL), 0, 255), where Y_nir is the luminance value of the pixel point at that position in the first luminance image, wt is the fusion weight of the pixel point at that position in the fusion weight map, Y_LL is the luminance value of the pixel point at that position in the second luminance image, Detail_nir is the brightness detail of the pixel point at that position in the first detail image, Detail_LL is the brightness detail of the pixel point at that position in the second detail image, and CLIP() is the cutoff calculation.
- 7. The method according to claim 1, wherein performing fusion weight calculation according to the first brightness image comprises: for any pixel point in the first brightness image, querying a preset brightness mapping model according to the brightness value of the pixel point to determine the fusion weight corresponding to that brightness value, wherein the brightness mapping model records the correspondence between brightness values and fusion weights.
- 8. The method according to claim 1, wherein performing brightness fusion on the first brightness image and the second brightness image according to the fusion weight map comprises: for a pixel point at any position, determining the fusion brightness Y of the pixel point by the following formula: Y = CLIP(((Y_nir * wt + Y_LL * (256 - wt)) / 256), 0, 255), where Y_nir is the luminance value of the pixel point at that position in the first luminance image, wt is the fusion weight of the pixel point at that position in the fusion weight map, Y_LL is the luminance value of the pixel point at that position in the second luminance image, and CLIP() is the cutoff calculation.
- 9. The method according to claim 1, wherein performing RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain the fused RGB image comprises: for a pixel point at any position, determining the R, G, and B three-channel values V_out of the pixel point by the following formulas: when Y_LL > 0, V_out = CLIP(V_in * Y / Y_LL, 0, 255); when Y_LL = 0, V_out = CLIP(Y, 0, 255), where V_in is the R, G, or B channel value of the pixel point at that position in the RGB image, Y_LL is the luminance value of the pixel point at that position in the second brightness image, Y is the luminance value of the pixel point at that position in the brightness fusion image, and CLIP() is the cutoff calculation.
- 10. An image fusion device, applied to an image acquisition device, the device comprising: a first visible light processing unit, configured to convert a visible light image collected by the image acquisition device into a red-green-blue (RGB) image; a first infrared processing unit, configured to convert a near-infrared image collected by the image acquisition device into a first brightness image; a second visible light processing unit, configured to convert the RGB image into a second brightness image; a second infrared processing unit, configured to perform fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image; a brightness fusion unit, configured to perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image; and an RGB fusion unit, configured to perform RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain a fused RGB image as an output image of the image acquisition device.
- 11. The device according to claim 10, wherein the second visible light processing unit is further configured to perform detail calculation on the second brightness image to obtain a second detail image; the second infrared processing unit is further configured to perform detail calculation on the first brightness image to obtain a first detail image; and the brightness fusion unit is specifically configured to perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map, the first detail image, and the second detail image.
- 12. The device according to claim 11, wherein the second visible light processing unit is specifically configured to: perform mean filtering on the second brightness image to obtain a second mean image; take the difference between the second brightness image and the second mean image to obtain a second difference image; and perform a cutoff operation on the second difference image to obtain the second detail image.
- 13. The device according to claim 11, wherein the second infrared processing unit is specifically configured to: perform mean filtering on the first brightness image to obtain a first mean image; take the difference between the first brightness image and the first mean image to obtain a first difference image; and perform a cutoff operation on the first difference image to obtain the first detail image.
- 14. The device according to claim 10, wherein the second infrared processing unit is specifically configured to: for any pixel point in the first brightness image, query a preset brightness mapping model according to the brightness value of the pixel point to determine the fusion weight corresponding to that brightness value, wherein the brightness mapping model records the correspondence between brightness values and fusion weights.
- 15. An image fusion apparatus, comprising a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor, wherein the machine-executable instructions cause the processor to implement the image fusion method according to any one of claims 1-9.
- 16. A machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to: convert a visible light image collected by an image acquisition device into a red-green-blue (RGB) image; convert a near-infrared image collected by the image acquisition device into a first brightness image; convert the RGB image into a second brightness image; perform fusion weight calculation according to the first brightness image to obtain a fusion weight map of the near-infrared image; perform brightness fusion on the first brightness image and the second brightness image according to the fusion weight map to obtain a brightness fusion image; and perform RGB fusion according to the second brightness image, the brightness fusion image, and the RGB image to obtain a fused RGB image as an output image of the image acquisition device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810320154.7 | 2018-04-11 | ||
CN201810320154.7A CN110363732A (zh) | 2018-04-11 | 2018-04-11 | Image fusion method and apparatus therefor
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019196539A1 (zh) | 2019-10-17 |
Family
ID=68163504
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/073090 WO2019196539A1 (zh) | 2018-04-11 | 2019-01-25 | Image fusion method and apparatus therefor |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110363732A (zh) |
WO (1) | WO2019196539A1 (zh) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112712485B (zh) * | 2019-10-24 | 2024-06-04 | Hangzhou Hikvision Digital Technology Co., Ltd. | Image fusion method and apparatus |
CN111161356B (zh) * | 2019-12-17 | 2022-02-15 | Dalian University of Technology | Infrared and visible light fusion method based on two-layer optimization |
CN111095919B (zh) * | 2019-12-17 | 2021-10-08 | Vtron Group Co., Ltd. | Video fusion method and apparatus, and storage medium |
CN111369486B (zh) * | 2020-04-01 | 2023-06-13 | Zhejiang Dahua Technology Co., Ltd. | Image fusion processing method and apparatus |
CN113763295B (zh) * | 2020-06-01 | 2023-08-25 | Hangzhou Hikvision Digital Technology Co., Ltd. | Image fusion method, and method and apparatus for determining image offset |
CN112767298B (zh) * | 2021-03-16 | 2023-06-13 | Hangzhou Hikvision Digital Technology Co., Ltd. | Method and apparatus for fusing a visible light image and an infrared image |
CN113421195B (zh) * | 2021-06-08 | 2023-03-21 | Hangzhou Hikvision Digital Technology Co., Ltd. | Image processing method, apparatus, and device |
CN114841904A (zh) * | 2022-03-03 | 2022-08-02 | Zhejiang Dahua Technology Co., Ltd. | Image fusion method, electronic device, and storage apparatus |
CN115239610B (zh) * | 2022-07-28 | 2024-01-26 | Axera Semiconductor (Shanghai) Co., Ltd. | Image fusion method, apparatus, system, and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015157058A1 (en) * | 2014-04-07 | 2015-10-15 | Bae Systems Information & Electronic Systems Integration Inc. | Contrast based image fusion |
CN105069768B (zh) * | 2015-08-05 | 2017-12-29 | Wuhan Guide Infrared Co., Ltd. | Visible light image and infrared image fusion processing system and fusion method |
CN106600572A (zh) * | 2016-12-12 | 2017-04-26 | Changchun University of Science and Technology | Adaptive low-illuminance visible light image and infrared image fusion method |
2018
- 2018-04-11 CN CN201810320154.7A patent/CN110363732A/zh active Pending
2019
- 2019-01-25 WO PCT/CN2019/073090 patent/WO2019196539A1/zh active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140169671A1 (en) * | 2012-12-14 | 2014-06-19 | Industry-Academic Cooperation Foundation, Yonsei University | Apparatus and method for color restoration |
CN104200452A (zh) * | 2014-09-05 | 2014-12-10 | Xidian University | Infrared and visible light image fusion method and apparatus based on spectral graph wavelet transform |
CN104268847A (zh) * | 2014-09-23 | 2015-01-07 | Xidian University | Infrared and visible light image fusion method based on interactive non-local means filtering |
CN107784642A (zh) * | 2016-08-26 | 2018-03-09 | Beihang University | Adaptive fusion method for infrared video and visible light video |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11250550B2 (en) * | 2018-02-09 | 2022-02-15 | Huawei Technologies Co., Ltd. | Image processing method and related device |
US20220044374A1 (en) * | 2019-12-17 | 2022-02-10 | Dalian University Of Technology | Infrared and visible light fusion method |
US11823363B2 (en) * | 2019-12-17 | 2023-11-21 | Dalian University Of Technology | Infrared and visible light fusion method |
CN112233079A (zh) * | 2020-10-12 | 2021-01-15 | Southeast University | Multi-sensor image fusion method and system |
CN112233079B (zh) * | 2020-10-12 | 2022-02-11 | Southeast University | Multi-sensor image fusion method and system |
EP4273794A4 (en) * | 2020-12-30 | 2024-06-19 | Hangzhou Hikmicro Sensing Technology Co., Ltd. | IMAGE FUSION METHOD AND APPARATUS, IMAGE PROCESSING DEVICE AND BINOCULAR SYSTEM |
Also Published As
Publication number | Publication date |
---|---|
CN110363732A (zh) | 2019-10-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019196539A1 (zh) | Image fusion method and apparatus therefor | |
WO2019119842A1 (zh) | Image fusion method and apparatus, electronic device, and computer-readable storage medium | |
US8363131B2 (en) | Apparatus and method for local contrast enhanced tone mapping | |
WO2019148912A1 (zh) | Image processing method and apparatus, electronic device, and storage medium | |
WO2017202061A1 (zh) | Image fog-penetration method and image acquisition device implementing image fog penetration | |
WO2021109620A1 (zh) | Exposure parameter adjustment method and apparatus | |
JP6394338B2 (ja) | Image processing apparatus, image processing method, and imaging system | |
US9426437B2 (en) | Image processor performing noise reduction processing, imaging apparatus equipped with the same, and image processing method for performing noise reduction processing | |
WO2021073140A1 (zh) | Monocular camera, image processing system, and image processing method | |
JP5996970B2 (ja) | Vehicle-mounted imaging apparatus | |
JP6559229B2 (ja) | Image processing apparatus, imaging apparatus, image processing method, and storage medium storing the image processing program of the image processing apparatus | |
WO2019105254A1 (zh) | Background blurring processing method, apparatus, and device | |
WO2017080348A2 (zh) | Scene-based photographing apparatus and method, and computer storage medium | |
JP2014107852A (ja) | Imaging apparatus | |
US20130120608A1 (en) | Light source estimation device, light source estimation method, light source estimation program, and imaging apparatus | |
KR20190116077A (ko) | Image processing | |
JP2016126750A (ja) | Image processing system, image processing apparatus, imaging apparatus, image processing method, program, and recording medium | |
JP2004219277A (ja) | Human body detection method and system, program, and recording medium | |
TWI542212B (zh) | Photographic system with visibility enhancement | |
JP2020509661A (ja) | Surveillance camera having a composite-filtering-based autofocusing function robust to changing visibility conditions, and video surveillance system employing the same | |
KR102336449B1 (ko) | Photographing apparatus and method of operating the same | |
US20120314044A1 (en) | Imaging device | |
US8743236B2 (en) | Image processing method, image processing apparatus, and imaging apparatus | |
JP2014209681A (ja) | Color tone adjustment apparatus and color tone adjustment method | |
CN112241735A (zh) | Image processing method, apparatus, and system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19785139; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19785139; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 10.05.2021) |