WO2018145575A1 - Image fusion device and image fusion method - Google Patents

Image fusion device and image fusion method

Info

Publication number
WO2018145575A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal
image
channel
value
pixel
Prior art date
Application number
PCT/CN2018/074093
Other languages
English (en)
French (fr)
Inventor
范蒙
俞海
浦世亮
Original Assignee
杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Priority to US 16/481,397 (granted as US11049232B2)
Priority to EP 18750832.0 (granted as EP3582490B1)
Publication of WO2018145575A1

Classifications

    • G06T3/4015 Image demosaicing, e.g. colour filter arrays [CFA] or Bayer patterns
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H04N23/10 Cameras or camera modules comprising electronic image sensors, for generating image signals from different wavelengths
    • H04N23/11 Cameras or camera modules, for generating image signals from visible and infrared light wavelengths
    • H04N23/45 Cameras or camera modules, for generating image signals from two or more image sensors being of different type or operating in different modes
    • H04N25/131 Arrangement of colour filter arrays [CFA] including elements passing infrared wavelengths
    • H04N25/533 Control of the integration time by using differing integration times for different sensor regions
    • H04N25/534 Control of the integration time by using differing integration times depending on the spectral component
    • H04N5/33 Transforming infrared radiation
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • the present application relates to the field of image processing technologies, and in particular, to an image fusion device and an image fusion method.
  • in some scenes, an image acquisition device based on light splitting and fusion is usually required to acquire the image.
  • the basic principle of a light-splitting-fusion image acquisition device is: collect a visible light image corresponding to the visible light signal and an infrared image corresponding to the infrared light signal, and merge the visible light image and the infrared image to obtain a fused image.
  • the fused image is a dual-band image, which conveys more image information than either the visible light image or the infrared image, each of which belongs to a single band.
  • the light-splitting-fusion image acquisition device comprises: a semi-reflective half mirror, a visible light sensor, an infrared light sensor, a registration unit and a fusion unit.
  • the half mirror is used to decompose incident light into visible light and infrared light; the visible light sensor is configured to form a visible light image by sensing the visible light; the infrared light sensor is configured to form an infrared image by sensing the infrared light.
  • the registration unit is configured to eliminate the positional deviation between the infrared image and the visible light image, and the fusion unit is used to perform weighted fusion of the position-corrected infrared image formed by the registration unit with the visible light image, wherein the fused image formed by the fusion is the output image of the image acquisition device.
  • although the light-splitting-fusion image acquisition device can obtain a fused image, the infrared image is mixed into the visible light image, so that the image processor, upon receiving the visible light image (i.e., the color signal), must remove the infrared component from it; the large amount of computation this requires leads to low fusion efficiency.
  • an object of the embodiments of the present application is to provide an image fusion device and an image fusion method, so as to reduce the amount of calculation required by the image processor to remove the infrared component from the color signal and to improve fusion efficiency.
  • the specific technical solutions are as follows:
  • in a first aspect, an embodiment of the present application provides an image fusion device, including:
  • a light collecting device, an image processor, and an image sensor having four types of photosensitive channels, where the four types of photosensitive channels include: red, green and blue RGB channels and an infrared IR channel;
  • the light collecting device is configured to block the spectrum of a first predetermined wavelength interval in the incident light to obtain target light; wherein the first predetermined wavelength interval is: a wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor in the infrared band is higher than a first predetermined threshold;
  • the image sensor is configured to convert the target light into an image signal through the RGB channels and the IR channel;
  • the image processor is configured to parse the image signal into a color signal and a brightness signal that is sensitive to the infrared band, and fuse the color signal and the brightness signal to obtain a fused image, wherein the color signal is a signal obtained by parsing the R channel, the G channel or the B channel.
  • in a second aspect, the embodiment of the present application further provides an image fusion device, including:
  • a light collecting device, an image processor, and an image sensor having four types of photosensitive channels, where the four types of photosensitive channels include: red, green and blue RGB channels and a full-band W channel;
  • the light collecting device is configured to block the spectrum of a first predetermined wavelength interval in the incident light to obtain target light, wherein the first predetermined wavelength interval is: a wavelength interval of the spectrum in which the response difference between the RGB channels and the W channel of the image sensor in the infrared band is higher than a first predetermined threshold;
  • the image sensor is configured to convert the target light into an image signal through the RGB channels and the W channel;
  • the image processor is configured to parse the image signal into a color signal and a brightness signal, and fuse the color signal and the brightness signal to obtain a fused image, wherein the color signal is a signal obtained by parsing the R channel, the G channel or the B channel.
  • in a third aspect, an embodiment of the present application further provides an image fusion method, applied to an image fusion device, where the image fusion device has four types of photosensitive channels, the four types of photosensitive channels including: red, green and blue RGB channels and an infrared IR channel; the method includes:
  • blocking the spectrum of a first predetermined wavelength interval in the incident light, wherein the first predetermined wavelength interval is: a wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor of the image fusion device in the infrared band is higher than a first predetermined threshold;
  • in a fourth aspect, the embodiment of the present application further provides an image fusion method, applied to an image fusion device, where the image fusion device has four types of photosensitive channels, the four types of photosensitive channels including: red, green and blue RGB channels and a full-band W channel; the method includes:
  • blocking the spectrum of a first predetermined wavelength interval in the incident light, wherein the first predetermined wavelength interval is: a wavelength interval of the spectrum in which the response difference between the RGB channels and the W channel of the image sensor of the image fusion device in the infrared band is higher than a first predetermined threshold;
  • the embodiment of the present application further provides a storage medium for storing executable code, where the executable code, when executed, performs the method steps of the image fusion method provided by the third aspect of the embodiment of the present application.
  • the embodiment of the present application further provides a storage medium for storing executable code, where the executable code, when executed, performs the method steps of the image fusion method provided by the fourth aspect of the embodiment of the present application.
  • the embodiment of the present application further provides an application program for performing, at runtime, the method steps of the image fusion method provided by the third aspect of the embodiment of the present application.
  • the embodiment of the present application further provides an application program for performing, at runtime, the method steps of the image fusion method provided by the fourth aspect of the embodiment of the present application.
  • in the embodiments of the present application, the light collecting device blocks the spectrum of the first predetermined wavelength interval in the incident light, so that the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly is filtered out, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency.
  • in addition, the image fusion device provided by the embodiments of the present application performs image acquisition by using a single image sensor having four types of photosensitive channels, so that infrared light and visible light are acquired simultaneously without a specially designed optical system; the structural complexity is greatly reduced, and the scope of use is thereby expanded.
  • it can be seen that the image fusion method provided by the embodiments of the present application can reduce the amount of calculation required to remove the infrared component from the color signal, improve fusion efficiency, and achieve the purpose of acquiring a dual-band image through a simple image fusion device.
  • FIG. 1 is a schematic structural diagram of an image fusion device according to the first aspect of the present application;
  • FIG. 2 is a schematic diagram of a Bayer array;
  • FIG. 3 is a schematic diagram of an array corresponding to an RGBIR image sensor;
  • FIG. 4 is a schematic diagram of interpolation results corresponding to IR channel interpolation;
  • FIG. 5 is a schematic diagram of the principle of spectral blocking;
  • FIG. 6 is another schematic structural diagram of an image fusion device according to the first aspect of the present application;
  • FIG. 7 is another schematic structural diagram of an image fusion device according to the first aspect of the present application;
  • FIG. 8 is a schematic structural diagram of an image fusion device according to the second aspect of the present application;
  • FIG. 9 is a schematic diagram of an array corresponding to an RGBW image sensor;
  • FIG. 10 is a schematic diagram of interpolation results corresponding to W channel interpolation;
  • FIG. 11 is another schematic structural diagram of an image fusion device according to the second aspect of the present application;
  • FIG. 12 is another schematic structural diagram of an image fusion device according to the second aspect of the present application;
  • FIG. 13 is another schematic diagram of spectral blocking;
  • FIG. 14 is a flowchart of an image fusion method applied to the image fusion device provided by the first aspect of the present application;
  • FIG. 15 is a flowchart of an image fusion method applied to the image fusion device provided by the second aspect of the present application.
  • in the first aspect, the embodiment of the present application provides an image fusion device.
  • as shown in FIG. 1, an image fusion device provided by an embodiment of the present application may include:
  • a light collecting device 110, an image processor 130, and an image sensor 120 having four types of photosensitive channels, where the four types of photosensitive channels include: red, green and blue RGB channels and an infrared IR channel;
  • the light collecting device 110 is configured to block the spectrum of a first predetermined wavelength interval in the incident light to obtain target light; wherein the first predetermined wavelength interval is: a wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor 120 in the infrared band is higher than a first predetermined threshold;
  • the image sensor 120 is configured to convert the target light into an image signal through the RGB channels and the IR channel;
  • the image processor 130 is configured to parse the image signal into a color signal and a brightness signal that is sensitive to the infrared band, and fuse the color signal and the brightness signal to obtain a fused image, wherein the color signal is a signal obtained by parsing the R channel, the G channel or the B channel.
  • in this way, the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly is filtered out, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency.
  • the specific value of the first predetermined threshold may be set according to actual conditions, and is not limited herein. It should be emphasized that, among the three differences between the value of the R channel, the G channel or the B channel and the value of the IR channel, as long as any one of the differences is higher than the first predetermined threshold, the light collecting device 110 may block the spectral portion of the corresponding wavelength interval.
  • optionally, the first predetermined wavelength interval may be [T1, T2], wherein the value of T1 lies in the interval [600 nm, 800 nm], and the value of T2 lies in the interval [750 nm, 1100 nm]. It can be understood that by blocking the spectrum of the predetermined wavelength interval in the incident light, the spectral region of the infrared band (650 nm–1100 nm) in which the responses of the RGB channels and the IR channel of the image sensor 120 differ greatly is filtered out.
  • in this way, the image signal formed by the image sensor 120 can be parsed by a simple operation into a precise color signal and a brightness signal sensitive to the infrared band (650 nm–1100 nm).
  • in FIG. 5, the gray portion is the spectral portion that needs to be filtered out; IR denotes the infrared signal, R the red light signal, G the green light signal, and B the blue light signal.
  • it should be noted that FIG. 5 is merely an example and does not constitute any limitation. Due to the manufacturing process and other factors, the actual filtering curve is usually not as steep as shown in FIG. 5, but has a certain slope.
  • a schematic diagram of the Bayer array (that is, the BAYER array) is shown in FIG. 2: the red, green and blue dot-matrix information is output in a mosaic manner. Since a Bayer-array-based image sensor has only the three RGB channels, it cannot capture the infrared spectrum. Therefore, in order to obtain a fused image of a visible light image and an infrared image, infrared light and visible light must be acquired simultaneously through prism splitting, different optical sensors, and the like, and the structure is complicated.
  • the image sensor utilized in the embodiments of the present application is an image sensor having four types of photosensitive channels, so that a color signal and a luminance signal sensitive to the infrared band can be parsed from the image signal collected by a single image sensor.
  • the image sensor having the red, green and blue RGB channels and the infrared IR channel provided by the embodiments of the present application is simply referred to as an RGBIR image sensor.
  • FIG. 3 shows an array diagram corresponding to the RGBIR image sensor.
  • as shown in FIG. 3, the RGBIR image sensor includes four types of photosensitive channels, namely the RGB channels and the IR channel; the IR channel is a channel sensitive to the infrared band, while the RGB channels can sense both visible light and infrared light but are primarily used to sense visible light.
  • it should be noted that the array shown in FIG. 3 is merely illustrative and should not be construed as limiting the embodiments of the present application; in practice, the arrays corresponding to RGBIR image sensors have various structures, all of which can be applied to the embodiments of the present application.
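The mosaic idea can be illustrated with a small sketch. The 4×4 RGB-IR layout below is an assumption for illustration only (the exact array of FIG. 3 is not reproduced in this text), as is the `channel_mask` helper:

```python
import numpy as np

# One possible 4x4 RGB-IR mosaic layout (assumed for illustration; the
# actual array of FIG. 3 may differ). Codes: 0 = R, 1 = G, 2 = B, 3 = IR.
RGBIR_4X4 = np.array([
    [2, 1, 0, 1],
    [1, 3, 1, 3],
    [0, 1, 2, 1],
    [1, 3, 1, 3],
])

def channel_mask(pattern, height, width, channel):
    """Boolean mask of the pixels on an H x W sensor that belong to `channel`."""
    tiled = np.tile(pattern, (height // pattern.shape[0] + 1,
                              width // pattern.shape[1] + 1))[:height, :width]
    return tiled == channel

mask_ir = channel_mask(RGBIR_4X4, 8, 8, 3)
print(mask_ir.sum())  # 16 of the 64 pixels are IR in this assumed layout
```

Each channel is thus only sparsely sampled, which is why the interpolation steps described below are needed to recover full-resolution planes.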
  • specifically, the color signal may be formed by subtracting, from the value of the R channel, the G channel or the B channel, the IR parameter value corresponding to the same pixel position, where the IR parameter value is the product of the value of the IR channel corresponding to that pixel position and a preset correction value.
  • it can be understood that subtracting the corresponding IR parameter value from each traversed R, G or B channel value, that is, removing the infrared component from the color signal, can avoid crosstalk of the infrared component into the three RGB signal components and enhance the image effect under low illumination.
  • it should be emphasized that the preset correction value can be set according to the actual situation; it can generally be set to 1, or set to any integer or fraction in [0, 1024] according to actual conditions, and those skilled in the art can understand that the value of the preset correction value is not limited thereto.
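As a minimal sketch of the subtraction just described (the `remove_infrared` helper name and the clipping to non-negative values are illustrative assumptions, not part of the patent text):

```python
import numpy as np

def remove_infrared(channel_value, ir_value, correction=1.0):
    """Subtract the IR parameter value from an R/G/B sample.

    color' = color - correction * IR; the result is clipped at zero
    (an added assumption to keep pixel values valid). The preset
    correction value defaults to 1, as suggested in the text.
    """
    return np.clip(channel_value - correction * ir_value, 0.0, None)

# Example: an R sample of 180 with a co-sited IR parameter value of 50
print(remove_infrared(180.0, 50.0))  # 130.0
print(remove_infrared(30.0, 50.0))   # 0.0 (clipped)
```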
  • specifically, the process in which the image processor 130 parses the image signal into a color signal and a brightness signal that is sensitive to the infrared band may include:
  • Step a1: performing an interpolation operation on the IR channel in the image signal to generate a luminance signal that has the same resolution as the input and is sensitive to the infrared band, wherein the input resolution is the resolution of the image signal;
  • Step a2: traversing the image signal, and subtracting from each traversed value of the R channel, the G channel or the B channel the IR parameter value corresponding to that pixel position, the IR parameter value being the product of the value of the IR channel corresponding to the pixel position and the preset correction value;
  • Step a3: performing interpolation on the R channel, the G channel and the B channel in the image signal respectively, to generate a color signal having the same resolution as the input.
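Steps a1–a3 can be sketched as follows, assuming per-channel boolean masks for the mosaic; the iterative neighbour-averaging here is only a crude stand-in for the bilinear or bicubic interpolation named in the text:

```python
import numpy as np

def interpolate_sparse(values, valid, iterations=8):
    """Fill pixels where `valid` is False by averaging valid 3x3 neighbours.
    A crude stand-in for bilinear/bicubic interpolation (with wrap-around)."""
    out = np.where(valid, values, 0.0).astype(float)
    filled = valid.astype(float)
    for _ in range(iterations):
        acc = np.zeros_like(out)
        cnt = np.zeros_like(out)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                acc += np.roll(np.roll(out, dy, 0), dx, 1)
                cnt += np.roll(np.roll(filled, dy, 0), dx, 1)
        est = acc / np.maximum(cnt, 1e-9)        # mean of already-filled neighbours
        out = np.where(filled > 0, out, est)     # keep original samples untouched
        filled = np.where(cnt > 0, 1.0, filled)  # newly estimated pixels become valid
    return out

def parse_rgbir(raw, masks, correction=1.0):
    """Steps a1-a3: interpolate the IR channel to full resolution (a1),
    subtract the IR parameter value from every colour sample (a2),
    then interpolate the R, G and B channels (a3)."""
    ir_full = interpolate_sparse(raw, masks["IR"])               # a1
    corrected = np.clip(raw - correction * ir_full, 0.0, None)   # a2
    color = {ch: interpolate_sparse(corrected, masks[ch])        # a3
             for ch in ("R", "G", "B")}
    return color, ir_full
```

The mask layout and helper names are hypothetical; a production pipeline would use a proper demosaicing kernel instead of the neighbour-averaging loop.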
  • in FIG. 3, each small square corresponds to one pixel, and the resolution of the image signal generated by the RGBIR image sensor is 8*8. Since interpolating the IR channel in the image signal generates a luminance signal with the same resolution as the input and sensitive to the infrared band, the resolution of the luminance signal generated for FIG. 3 is also 8*8; the interpolation result of the IR channel can be seen in FIG. 4. Moreover, the interpolation algorithm used for interpolating the IR channel in the image signal may be bilinear, bicubic, etc., and the specific interpolation process is not limited herein.
  • it should be noted that the resolution of the image signal generated by an RGBIR image sensor is related to its array structure; 8*8 is only the resolution corresponding to the RGBIR image sensor with the array structure shown in FIG. 3, and should not be construed as limiting the embodiments of the present application.
  • since each small square in FIG. 3 may be R, B or G, when traversing the image signal, the R channel, the G channel or the B channel will be traversed, and the IR parameter value corresponding to the pixel position is then subtracted from the traversed R, G or B channel value to remove the infrared component.
  • similarly, the interpolation algorithm used for interpolating the R channel, the G channel and the B channel in the image signal may be bilinear, bicubic, etc.; the interpolation algorithm used for the IR channel and that used for the RGB channels may be the same.
  • the above specific implementation in which the image processor 130 parses the image signal into a color signal and a luminance signal that is sensitive to the infrared band is merely exemplary and should not be construed as limiting the embodiments of the present application.
  • there are various specific implementations in which the image processor 130 fuses the color signal and the luminance signal to obtain the fused image; two of them are described in detail below.
  • in a first implementation, the process in which the image processor 130 fuses the color signal and the luminance signal to obtain the fused image may include:
  • computing, for each pixel, the luminance Y = (R*w1 + G*w2 + B*w3)/(w1 + w2 + w3), wherein R is the value of the R channel corresponding to the pixel, G is the value of the G channel corresponding to the pixel, B is the value of the B channel corresponding to the pixel, and w1, w2 and w3 are weight values;
  • Step b3: performing color noise reduction processing on the reference channel values K1, K2 and K3;
  • Step b4: fusing the luminance signal Y′ at the corresponding pixel with the noise-reduced reference channel values K1–K3, to generate the fused RGB three-channel values R′, G′ and B′ and obtain the fused image;
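A sketch of this first fusion path is given below. The excerpt omits steps b1–b2, so it is assumed here, purely for illustration, that the reference channel values are the per-pixel ratios K1 = R/Y, K2 = G/Y, K3 = B/Y, and the weights w1–w3 are set to BT.601-style values; the colour noise reduction of step b3 is elided:

```python
import numpy as np

def fuse_ratio(r, g, b, y_prime, w=(0.299, 0.587, 0.114), eps=1e-6):
    """Luminance-ratio fusion sketch for steps b1-b4.

    ASSUMPTION: the reference channel values K1..K3 are taken to be the
    per-pixel ratios channel/Y (the excerpt omits their definition), and
    the weights default to BT.601-style values. Step b3 (colour noise
    reduction on K1..K3) is elided for brevity.
    """
    w1, w2, w3 = w
    y = (r * w1 + g * w2 + b * w3) / (w1 + w2 + w3)   # per-pixel luminance Y
    k1 = r / (y + eps)                                 # assumed K1 = R/Y
    k2 = g / (y + eps)                                 # assumed K2 = G/Y
    k3 = b / (y + eps)                                 # assumed K3 = B/Y
    # Step b4: recombine the ratios with the infrared-sensitive luminance Y'.
    return k1 * y_prime, k2 * y_prime, k3 * y_prime
```

With this assumed definition, the fused image keeps the colour ratios of the visible signal while taking its brightness from the infrared-sensitive luminance signal Y′.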
  • in a second implementation, the process in which the image processor 130 fuses the color signal and the luminance signal to obtain the fused image may include:
  • Step c1: converting the color signal into a YUV signal, wherein the YUV signal comprises luminance and chrominance components;
  • Step c2: extracting the chrominance UV components from the YUV signal;
  • Step c3: combining the UV components with the luminance signal to form a new YUV signal;
  • Step c4: determining the newly formed YUV signal as the fused image, or converting the newly formed YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
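Steps c1–c4 can be sketched as follows; the BT.601-style full-range conversion matrix is an assumed choice, since the text allows any conversion algorithm in the related art:

```python
import numpy as np

# BT.601-style full-range RGB -> YUV matrix (an assumed choice; the patent
# only requires some related-art conversion algorithm).
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.169, -0.331,  0.500],
                    [ 0.500, -0.419, -0.081]])

def fuse_yuv(rgb, luminance):
    """Steps c1-c4: convert the colour signal to YUV (c1), keep its UV
    chrominance (c2), substitute the infrared-sensitive luminance for Y
    (c3), and convert the new YUV signal back to RGB (c4)."""
    yuv = rgb @ RGB2YUV.T                    # c1: colour signal -> YUV
    yuv[..., 0] = luminance                  # c2 + c3: original UV, new Y
    return yuv @ np.linalg.inv(RGB2YUV).T    # c4: new YUV -> RGB
```

For a neutral grey input the UV components are zero, so the output simply takes on the substituted luminance, which is a quick sanity check on the round trip.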
  • in the YUV format, "Y" denotes brightness (luminance or luma), that is, the gray-level value, while "U" and "V" denote color (chrominance or chroma).
  • in addition, after the UV components are extracted, noise reduction processing may be performed on them to remove color noise and thereby improve the image quality of the fused image, wherein the noise reduction method may include but is not limited to Gaussian filtering.
  • the mutual conversion between the YUV signal and the color signal can be implemented by any conversion algorithm in the related art, and the extraction of the UV components from the YUV signal and their combination with the luminance signal can likewise be implemented by techniques in the related art.
  • the above specific implementations in which the image processor 130 fuses the color signal and the luminance signal to obtain the fused image are only examples and should not be construed as limiting the embodiments of the present application.
  • in addition, the image processor 130 may first perform optimization processing on the color signal and the luminance signal, and then fuse the optimized color signal and the optimized luminance signal to obtain the fused image.
  • the optimization processing of the color signal may include: performing low-pass filtering on the color signal to reduce its noise; the optimization processing of the luminance signal may include: performing high-pass filtering on the luminance signal to enhance its edges.
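A minimal sketch of this optimization step, assuming a 3×3 box filter as the low-pass and the complementary residual as the high-pass (both filter choices and the `edge_gain` parameter are illustrative assumptions):

```python
import numpy as np

def box_blur(img):
    """3x3 box low-pass filter with wrap-around borders (illustrative stand-in
    for whatever low-pass filter an implementation actually uses)."""
    out = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += np.roll(np.roll(img, dy, 0), dx, 1)
    return out / 9.0

def preprocess(color, luminance, edge_gain=0.5):
    """Low-pass the colour signal (noise reduction) and add a scaled
    high-pass residual back to the luminance signal (edge enhancement)."""
    color_lp = box_blur(color)
    high_pass = luminance - box_blur(luminance)   # residual = high frequencies
    return color_lp, luminance + edge_gain * high_pass
```

On a flat region the high-pass residual is zero, so the luminance is unchanged; near edges the residual boosts local contrast before fusion.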
  • in the embodiments of the present application, the light collecting device blocks the spectrum of the first predetermined wavelength interval in the incident light, so that the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly is filtered out, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency.
  • in addition, the image fusion device provided by the embodiments of the present application performs image acquisition by using a single image sensor having four types of photosensitive channels, so that infrared light and visible light are acquired simultaneously without a specially designed optical system; the structural complexity is greatly reduced, and the scope of use is thereby expanded.
  • in an implementation, in order to block the spectrum of the first predetermined wavelength interval in the incident light, the light collecting device 110 may include: a band-stop filter and a first type of optical lens;
  • the band-stop filter is configured to block the spectrum of the first predetermined wavelength interval in the light transmitted by the first type of optical lens, to obtain the target light.
  • specifically, the band-stop filter may be a coating integrated on the first type of optical lens by a coating process; or the band-stop filter may be a patch disposed on the first type of optical lens, etc.
  • the first type of optical lens transmits the incident light in a full-spectrum transmission manner, where full-spectrum transmission means transmitting the full-band spectrum, that is, not blocking the spectrum of any band.
  • in this case, the band of the light transmitted by the first type of optical lens is the same as the band of the incident light, that is, the first type of optical lens does not block the spectrum of any band.
  • in another implementation, the light collecting device 110 includes a second type of optical lens capable of blocking the spectrum of the first predetermined wavelength interval.
  • it should be noted that the specific implementation of the light collecting device 110 is not limited by the embodiments of the present application.
  • optionally, in order to ensure image fusion quality by facilitating removal of the infrared component, the response differences between the RGB channels in a second predetermined wavelength interval of the infrared band may be lower than a second predetermined threshold, where the second predetermined wavelength interval is [T3, T4], T4 is greater than T3, T3 is greater than or equal to 750 nm, and T4 is less than or equal to 1100 nm.
  • the second predetermined threshold may be set according to the actual situation, and is not limited herein by the embodiments of the present application.
  • the specific structure of the image sensor 120 may take various forms, which is not limited by the embodiments of the present application; for example, specific optical components such as filters may be added to the image sensor.
  • optionally, the image sensor 120 may perform multiple exposure acquisitions within the same frame time, and single or multiple exposure can be set manually.
  • accordingly, the process in which the image processor 130 parses the image signal into a color signal and a luminance signal that is sensitive to the infrared band may include:
  • parsing the image signal formed by a first type of exposure to obtain the color signal, and parsing the image signal formed by a second type of exposure to obtain the luminance signal that is sensitive to the infrared band.
  • the first type of exposure and the second type of exposure may have the same exposure duration or different exposure durations; when they differ, the exposure duration of the first type of exposure may be less than that of the second type of exposure, or, of course, greater, both of which are reasonable.
  • the color signal may be parsed from the image signal formed by the first type of exposure;
  • parsing the image signal formed by the second type of exposure to obtain the luminance signal that is sensitive to the infrared band may be: performing interpolation on the IR channel in the image signal formed by the second type of exposure, to generate a luminance signal that has the same resolution as the input and is sensitive to the infrared band, wherein the input resolution is the resolution of the image signal.
  • the interpolation of any channel involved in the above process may adopt a bilinear or bicubic interpolation algorithm, though it is of course not limited thereto.
  • the image signal formed by the short exposure can be parsed to obtain the color signal, and the image signal formed by the long exposure can be parsed to obtain the luminance signal, so as to ensure image quality; in this case, the exposure duration of the first type of exposure is less than the exposure duration of the second type of exposure.
  • an image fusion device provided by an embodiment of the present application may further include:
  • a signal controller 140, configured to adjust the image sensor 120 so that it forms an image signal meeting a predetermined brightness requirement.
  • specifically, the signal controller 140 may perform brightness statistics on the image signal formed by the image sensor 120, and adjust the image sensor 120 according to the statistical result so that it forms an image signal meeting the predetermined brightness requirement.
• Specifically, the signal controller 140 may perform the following steps: (a) generating an initial brightness adjustment signal and transmitting it to the image sensor 120; (b) computing the average brightness of the image signal generated by the image sensor 120, that is, adding all the pixel values and dividing the sum by the number of pixels; (c) comparing the average brightness with a reference value: if the difference is within a predetermined range, the value of the current brightness control signal is kept unchanged; if the difference is outside the predetermined range and the average brightness is greater than the reference value, the value of the brightness control signal is lowered; if the difference is outside the predetermined range and the average brightness is less than the reference value, the value of the brightness control signal is raised.
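The feedback loop in steps (a) to (c) can be sketched as follows; the function name, the reference value, the tolerance and the adjustment step are illustrative assumptions, not values from the patent:

```python
# Hedged sketch of the brightness-feedback loop (steps (a)-(c) above).
# reference, tolerance and step are assumed illustrative values.

def adjust_brightness_control(image, control, reference=128.0, tolerance=10.0, step=1):
    """Return an updated brightness-control value for the sensor.

    image     -- 2-D nested sequence of pixel values (the current image signal)
    control   -- current brightness-control signal value
    reference -- target average brightness (assumed value)
    tolerance -- half-width of the acceptable range around the reference
    """
    pixels = [p for row in image for p in row]
    average = sum(pixels) / len(pixels)   # step (b): sum of all pixels / pixel count
    diff = average - reference
    if abs(diff) <= tolerance:            # step (c): within range -> keep unchanged
        return control
    if diff > 0:                          # too bright -> lower the control value
        return control - step
    return control + step                 # too dark -> raise the control value
```

The same loop would simply be re-run on each new frame until the average brightness settles inside the tolerance band.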
  • the signal controller 140 can periodically send a predetermined brightness control signal to the image sensor 120, and the predetermined brightness control signal is a control signal set based on a predetermined brightness requirement.
  • the signal controller 140 can also be used to control the image sensor 120 to switch between single exposure acquisition and multiple exposure acquisition. It is to be noted that the specific implementation of the signal controller 140 that is used to control the image sensor 120 to form an image signal that meets the predetermined brightness requirement is merely an example and should not be construed as limiting the embodiments of the present application.
  • an image fusion device provided by an embodiment of the present application may further include: an infrared light filling device 150;
• The signal controller 140 is further configured to control the infrared light filling device 150 to perform infrared fill light for the image sensor.
• Specifically, the signal controller 140 can detect the gain value g in the brightness adjustment signal: when g is greater than a threshold T1, the fill light control signal is set to 1 and the infrared fill light is turned on; when g is less than a threshold T2, the fill light control signal is set to 0 and the infrared fill light is turned off, wherein T1 is greater than T2.
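The threshold behaviour described here is a simple hysteresis: the fill light switches on above T1, off below T2, and holds its previous state in between. A minimal sketch (the threshold values are illustrative assumptions):

```python
# Hysteresis control of the infrared fill light, as described above.
# t1 and t2 are illustrative; the patent only requires T1 > T2.

def fill_light_control(g, previous_state, t1=48, t2=16):
    """Return 1 (fill light on) or 0 (fill light off) for gain value g."""
    if g > t1:
        return 1                # high gain -> scene is dark -> turn fill light on
    if g < t2:
        return 0                # low gain -> scene is bright -> turn fill light off
    return previous_state       # inside [T2, T1]: hold the current state
```

Keeping the state inside the [T2, T1] band prevents the light from flickering when the gain hovers near a single threshold.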
• The specific values of T1 and T2 may be set according to actual conditions and are not limited herein; in addition, the specific implementation given above, in which the signal controller 140 controls the infrared light filling device 150 to perform infrared fill light for the image sensor, is merely illustrative and should not be construed as limiting the embodiments of the present application.
  • the embodiment of the present application provides an image fusion device.
  • an image fusion device provided by an embodiment of the present application may include:
  • the light collecting device 210, the image processor 230, and the image sensor 220 having four types of photosensitive channels, the four types of photosensitive channels include: red, green and blue RGB channels and full-band W channels;
• The light collecting device 210 is configured to block the spectrum of a first predetermined wavelength interval in the incident light to obtain target light, wherein the first predetermined wavelength interval is: the wavelength interval of the spectrum in which the difference in response between the RGB channels and the W channel of the image sensor 220 in the infrared band is above a first predetermined threshold;
  • the image sensor 220 is configured to convert the target light into an image signal through the RGB channel and the W channel;
• The image processor 230 is configured to parse the image signal into a color signal and a luminance signal, and to fuse the color signal and the luminance signal to obtain a fused image, wherein the color signal is a signal obtained by analyzing the R channel, the G channel, or the B channel.
• In this way, the portion of the spectrum in which the quantum efficiencies of the channels differ greatly in the near-infrared band is filtered out, thereby simplifying the operation by which the image processor removes the infrared component from the color signal and improving the fusion efficiency.
• The specific value of the first predetermined threshold may be set according to actual conditions and is not limited herein. It should be emphasized that, among the three differences between the values of the R channel, the G channel, or the B channel and the value of the W channel, as long as any one difference is higher than the first predetermined threshold, the light collecting device 210 may block the spectral portion of the corresponding wavelength interval.
• Optionally, the first predetermined wavelength interval may be [T1, T2], wherein the value of T1 is in the interval [600 nm, 800 nm] and the value of T2 is in the interval [750 nm, 1100 nm]. It can be understood that, by adding the function of blocking the spectrum of the predetermined wavelength interval in the incident light, the spectral regions in which the responses of the RGB channels and the W channel of the image sensor 220 differ greatly within the infrared band (650 nm to 1100 nm) are filtered out, so that the image signal formed by the image sensor 220 can be restored to an accurate color signal and luminance signal by simple operations.
  • the gray portion is a partial spectrum that needs to be filtered out.
  • W represents a full-band spectral signal
  • R represents a red light signal
  • G represents a green light signal
  • B represents a blue light signal.
• FIG. 13 is merely an example and does not constitute any limitation. Moreover, due to the manufacturing process and the like, the actual filtering curve is usually not as steep as that shown in FIG. 13, but has a certain slope.
• A conventional image sensor is based on the Bayer array (that is, the BAYER array); a schematic diagram of the Bayer array can be seen in FIG. 2, in which the red, green, and blue dot-matrix information is output in a mosaic manner. Since a Bayer-array-based image sensor has only the RGB three channels, it cannot acquire the infrared spectrum; therefore, in order to obtain a fused image of a visible light image and an infrared image, infrared light and visible light must be acquired simultaneously through prism splitting, different optical sensors, and the like, and the structure is complicated.
  • the image sensor utilized in the embodiments of the present application is an image sensor having four types of photosensitive channels, so that an image signal collected by one image sensor can be parsed out of a color signal and a luminance signal.
  • the image sensor with red, green and blue RGB channels and full band channels provided by the embodiments of the present application is named as an RGBW image sensor.
  • FIG. 9 shows a schematic diagram of an array corresponding to the RGBW image sensor.
  • the RGBW image sensor includes four types of photosensitive channels, namely, RGB channels and W channels.
• The W channel is a full-band photosensitive channel. Since the W channel senses the full band, it is sensitive to the infrared band and can be used as the luminance channel; the RGB channels are sensitive to both the visible band and the infrared band but are mainly used for sensing the visible band.
  • the arrays shown in FIG. 9 are merely illustrative and should not be construed as limiting the embodiments of the present application. In addition, those skilled in the art can understand that, in practical applications, the arrays corresponding to the RGBW image sensors have various structures, and can be applied to the embodiments of the present application.
• Optionally, the color signal may be formed by subtracting, from the value of the R channel, the G channel, or the B channel, the IR parameter value corresponding to the corresponding pixel position, where the IR parameter value is the product of the IR value corresponding to that pixel position and the second preset correction value, and the IR value is calculated using a predetermined calculation formula. It can be understood that subtracting the corresponding IR parameter value from the traversed values of the R channel, the G channel, or the B channel removes the infrared component from the color signal, which avoids crosstalk of the infrared component into the RGB three types of signal components and enhances the image effect under low illumination.
• The first preset correction value and the second preset correction value may be set according to actual conditions. Generally, the first preset correction value can be set to 2; of course, it can also be set to any integer or decimal from 0 to 1024 according to actual conditions, and those skilled in the art can understand that its value is not limited thereto. Similarly, the second preset correction value can generally be set to 1, and can likewise be set to any integer or decimal from 0 to 1024 according to actual conditions, its value not being limited thereto either.
  • the process of the image processor 230 parsing the image signal into a color signal and a brightness signal includes:
  • Step d1 performing interpolation on the W channel in the image signal to generate a luminance signal having the same resolution as the input, wherein the input resolution is a resolution of the image signal;
• Step d2: respectively performing interpolation operations on the R channel, the G channel, and the B channel in the image signal to generate an image signal of each channel having the same resolution as the input;
• Step d3: the image signals of the channels are traversed, and the IR value of each pixel position is calculated by using a predetermined formula, wherein the predetermined formula is (R+G+B-W)/n, R is the value of the R channel corresponding to the pixel, G is the value of the G channel corresponding to the pixel, B is the value of the B channel corresponding to the pixel, W is the value of the W channel corresponding to the pixel, and n is the first preset correction value;
• Step d4: the image signals of the channels are traversed, and the IR parameter value corresponding to the corresponding pixel position is subtracted from the traversed value of the R channel, the G channel, or the B channel to generate the color signal, wherein the IR parameter value is the product of the IR value corresponding to that pixel position and the second preset correction value.
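Steps d3 and d4 can be sketched as follows for channel planes that have already been interpolated to full resolution in steps d1 and d2; NumPy and the function name are illustrative, and the defaults n=2 and m=1 follow the "generally set" correction values mentioned above:

```python
import numpy as np

# Sketch of steps d3-d4: estimate the per-pixel infrared component as
# IR = (R + G + B - W) / n, then subtract m * IR from each colour channel.
# n is the first preset correction value, m the second (defaults per the text).

def parse_color_signal(r, g, b, w, n=2.0, m=1.0):
    """Return (ir, r', g', b') with the infrared component removed."""
    r, g, b, w = (np.asarray(x, dtype=float) for x in (r, g, b, w))
    ir = (r + g + b - w) / n          # step d3: per-pixel IR estimate
    ir_term = m * ir                   # the "IR parameter value" = m * IR
    return ir, r - ir_term, g - ir_term, b - ir_term   # step d4
```

Intuitively, if each of R, G, B carries one unit of infrared contamination while W carries one, then R+G+B-W leaves 2 units, which is why n=2 recovers IR.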
• In FIG. 9, each small square corresponds to one pixel, and the resolution of the image signal generated by the RGBW image sensor is 8*8. Since the W channel in the image signal is interpolated to generate a luminance signal having the same resolution as the input, for FIG. 9 the resolution of the luminance signal generated by interpolating the W channel is also 8*8; FIG. 10 shows a schematic diagram of the interpolation result of the W channel. Moreover, the interpolation algorithm used for the W channel interpolation may be bilinear or bicubic, etc., and the specific interpolation process is not limited herein.
• The resolution of the image signal generated by the RGBW image sensor is related to the array structure, and 8*8 is only the resolution corresponding to an RGBW image sensor having the array structure shown in FIG. 9; it should not be construed as limiting the embodiments of the present application.
• Likewise, taking FIG. 9 as an example, the resolution of the image signal of each channel formed by separately interpolating the R channel, the G channel, and the B channel in the image signal is 8*8. The interpolation algorithm used for interpolating the R channel, the G channel, and the B channel may be bilinear or bicubic, etc.; the interpolation algorithms used for the W channel interpolation and the RGB channel interpolation may be the same or different, which is not limited herein.
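As one illustrative stand-in for the bilinear interpolation mentioned above, a sparse channel plane can be filled in by averaging the sampled neighbours in each 3*3 window; this toy version wraps around at the borders for brevity and is not the patent's exact algorithm:

```python
import numpy as np

# Neighbour-averaging fill-in for a mosaic channel plane: positions where
# mask == 1 carry samples, positions where mask == 0 are estimated from the
# sampled neighbours in the surrounding 3*3 window. Borders wrap (np.roll),
# which is acceptable for a sketch but would be handled explicitly in practice.

def bilinear_fill(plane, mask):
    plane = np.asarray(plane, dtype=float)
    mask = np.asarray(mask, dtype=float)
    weighted_sum = np.zeros_like(plane)
    sample_count = np.zeros_like(plane)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            weighted_sum += np.roll(np.roll(plane * mask, dy, axis=0), dx, axis=1)
            sample_count += np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    estimate = weighted_sum / np.maximum(sample_count, 1)  # avoid division by zero
    return np.where(mask > 0, plane, estimate)             # keep real samples
```

On a checkerboard mosaic this reduces to the classic bilinear average of the four diagonal (or four axial) neighbours.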
  • the specific implementation of the image processor 230 for parsing the image signal into a color signal and a luminance signal is merely exemplary and should not be construed as limiting the embodiments of the present application.
• The image processor 230 may fuse the color signal and the luminance signal to obtain the fused image in a variety of specific implementation manners; two of them are described in detail below.
• In a first implementation, the image processor 230 fuses the color signal and the luminance signal to obtain the fused image as follows:
• Step e1: the color signal is traversed, and the luminance value Y of each pixel is calculated as Y = (R*w1+G*w2+B*w3)/(w1+w2+w3), wherein R is the value of the R channel corresponding to the pixel, G is the value of the G channel corresponding to the pixel, B is the value of the B channel corresponding to the pixel, and w1, w2 and w3 are weight values;
• Step e2: the reference channel values K1, K2, and K3 of each pixel are calculated from the color signal and the luminance value Y;
• Step e3: color noise reduction processing is performed on the reference channel values K1, K2, and K3;
• Step e4: the luminance signal Y′ at the corresponding pixel is fused with the reference channel values K1-K3 after the color noise reduction to generate the fused RGB three-channel values R′, G′ and B′, thereby obtaining the fused image.
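The excerpt does not spell out how the reference channel values are formed; one common reading, assumed in the sketch below, is that they are the per-channel ratios K1 = R/Y, K2 = G/Y, K3 = B/Y, so that the final fusion reduces to R′ = K1*Y′, G′ = K2*Y′, B′ = K3*Y′. The colour noise reduction step is omitted for brevity:

```python
import numpy as np

# Hedged sketch of the ratio-based fusion. The K_i = channel / Y definition is
# an assumption, not stated in this excerpt; eps guards against division by zero.

def fuse_ratio(r, g, b, y_lum, w=(1.0, 1.0, 1.0), eps=1e-6):
    """Fuse colour planes r, g, b with a luminance plane y_lum."""
    r, g, b, y_lum = (np.asarray(x, dtype=float) for x in (r, g, b, y_lum))
    w1, w2, w3 = w
    y = (r * w1 + g * w2 + b * w3) / (w1 + w2 + w3)   # weighted luminance Y
    k1, k2, k3 = r / (y + eps), g / (y + eps), b / (y + eps)   # reference values
    return k1 * y_lum, k2 * y_lum, k3 * y_lum          # fused R', G', B'
```

The ratios preserve the hue of the colour signal while the brightness is taken entirely from the luminance plane Y′.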
• In a second implementation, the image processor 230 fuses the color signal and the luminance signal to obtain the fused image as follows:
• Step f1: the color signal is converted into a YUV signal, wherein the YUV signal is a luminance-and-chrominance signal;
• Step f2: the chrominance UV components in the YUV signal are extracted;
• Step f3: the UV components are combined with the luminance signal to form a new YUV signal;
• Step f4: the formed new YUV signal is determined as the fused image, or the formed new YUV signal is converted into an RGB signal and the converted RGB signal is determined as the fused image.
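Steps f1 to f4 can be sketched with NumPy, taking the BT.601 full-range RGB-to-YUV matrix as one assumed concrete choice of conversion (the text permits any related-art conversion); the optional UV noise reduction is omitted:

```python
import numpy as np

# BT.601 full-range RGB -> YUV matrix, used here as an assumed example; any
# related-art conversion would do per the text.
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.169, -0.331,  0.500],
                    [ 0.500, -0.419, -0.081]])

def fuse_yuv(rgb, y_lum):
    """rgb: (..., 3) colour signal; y_lum: luminance plane of shape rgb.shape[:-1]."""
    rgb = np.asarray(rgb, dtype=float)
    yuv = rgb @ RGB2YUV.T                    # step f1: colour signal -> YUV
    yuv[..., 0] = np.asarray(y_lum)          # steps f2-f3: keep UV, replace Y
    return yuv @ np.linalg.inv(RGB2YUV).T    # step f4: new YUV -> RGB
```

For a gray pixel the UV components are zero, so the fused output is simply the luminance value replicated across R, G and B.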
• Here, “Y” means brightness (Luminance or Luma), that is, the gray level value, while “U” and “V” mean color (Chrominance or Chroma).
• Optionally, the UV components may be subjected to noise reduction processing to remove color noise, thereby improving the image quality of the fused image, wherein the noise reduction processing method may include but is not limited to Gaussian filtering.
• The mutual conversion of the YUV signal and the color signal can be implemented by any conversion algorithm in the related art, and the extraction of the UV components from the YUV signal and the combination of the UV components with the luminance signal can likewise be implemented by any technique in the related art.
• The specific implementations described above, in which the image processor 230 fuses the color signal and the luminance signal to obtain the fused image, are merely examples and should not be construed as limiting the embodiments of the present application.
  • the image processor 230 can first optimize the color signal and the luminance signal, and then fuse the optimized color signal and the optimized processed luminance signal to obtain a fused image.
  • the optimization processing of the color signal may include: performing low-pass filtering processing on the color signal to implement noise reduction on the color signal; and optimizing processing of the luminance signal may include: performing high-pass filtering processing on the luminance signal to implement The edge of the luminance signal is enhanced.
• In summary, the light collecting device blocks the spectrum of the first predetermined wavelength interval in the incident light, so that the portion of the spectrum in which the quantum efficiencies of the channels differ greatly in the near-infrared band is filtered out, thereby simplifying the operation by which the image processor removes the infrared component from the color signal and improving the fusion efficiency.
• In addition, the image fusion device provided by the embodiment of the present invention performs image acquisition by using an image sensor having four types of photosensitive channels, avoiding the specially designed optical system otherwise required to acquire infrared light and visible light simultaneously; the structural complexity is thus greatly reduced and the scope of use is expanded.
• Specifically, in order to block the spectrum of the first predetermined wavelength interval in the incident light, the light collecting device 210 may include a band stop filter and a first type of optical lens; the band stop filter is configured to block the spectrum of the first predetermined wavelength interval in the light transmitted by the first type of optical lens to obtain the target light. Specifically, the band stop filter may be a coating integrated on the first type of optical lens by a coating process, or a patch disposed on the first type of optical lens, etc.
• The full-spectrum transmission mode is a mode in which the full-band spectrum is transmitted, that is, the spectrum of no band is blocked. Since the first type of optical lens transmits the incident light in the full-spectrum transmission manner, the band of the light transmitted by the first type of optical lens is the same as that of the incident light; in other words, the first type of optical lens does not block the spectrum of any band.
  • the light collecting device 210 may include a second type of optical lens capable of blocking the spectrum of the first predetermined wavelength interval.
  • the specific implementation of the above-mentioned light collecting device 210 is merely an example and should not be construed as limiting the embodiments of the present application.
• Further, in order to ensure accurate restoration of the color after the infrared component is removed and thereby improve the image fusion quality, the difference in response between the RGB channels within a second predetermined wavelength interval of the infrared band may be required to be lower than a second predetermined threshold, wherein the second predetermined wavelength interval is [T3, T4], T4 is greater than T3, T3 is greater than or equal to 750 nm, and T4 is less than or equal to 1100 nm.
  • the second predetermined threshold may be set according to the actual situation, and the embodiment of the present application is not limited herein.
• The specific structure of the image sensor 220 may take various forms, which is not limited in the embodiment of the present application, for example: adding specific optical components, such as filters, to the image sensor.
• Further, the image sensor 220 may perform multiple exposure acquisitions within the same frame time, wherein single exposure or multiple exposure can be manually set, though this is of course not limiting.
• At this time, the process in which the image processor 230 parses the image signal into the color signal and the luminance signal may include: the color signal is obtained by analyzing the image signal formed by a first type of exposure, and the luminance signal is obtained by analyzing the image signal formed by a second type of exposure.
• The first type of exposure and the second type of exposure may have the same exposure duration or different exposure durations. When they have different exposure durations, the exposure duration of the first type of exposure may be less than that of the second type of exposure; of course, the exposure duration of the first type of exposure may also be greater than that of the second type of exposure, which is equally reasonable.
• The manner of analyzing the color signal from the image signal formed by the first type of exposure may be: the image signals of the respective channels are traversed, and the IR value of each pixel position is calculated by a predetermined formula, wherein the predetermined formula is (R+G+B-W)/n, R is the value of the R channel corresponding to the pixel, G is the value of the G channel corresponding to the pixel, B is the value of the B channel corresponding to the pixel, W is the value of the W channel corresponding to the pixel, and n is the first preset correction value. The manner of analyzing the luminance signal may specifically include: the W channel in the image signal formed by the second type of exposure is interpolated to generate a luminance signal having the same resolution as the input, wherein the input resolution is the resolution of the image signal.
• The interpolation of any channel involved in the above process may adopt a bilinear or bicubic interpolation algorithm, but is of course not limited thereto.
  • an image fusion device provided by the embodiment of the present application may further include:
  • the signal controller 240 is configured to adjust the image sensor 220 to form an image signal that meets a predetermined brightness requirement.
• The signal controller 240 may be specifically configured to perform brightness statistics on the image signal formed by the image sensor 220 and, according to the statistical result, adjust the image sensor 220 to form an image signal that meets the predetermined brightness requirement. Specifically, the signal controller 240 may perform the following steps: (a) generating an initial brightness adjustment signal and transmitting it to the image sensor 220; (b) computing the average brightness of the image signal generated by the image sensor 220, that is, adding all the pixel values and dividing the sum by the number of pixels; (c) comparing the average brightness with a reference value: if the difference is within a predetermined range, the value of the current brightness control signal is kept unchanged; if the difference is outside the predetermined range and the average brightness is greater than the reference value, the value of the brightness control signal is lowered; if the difference is outside the predetermined range and the average brightness is less than the reference value, the value of the brightness control signal is raised.
  • the signal controller 240 can periodically send a predetermined brightness control signal to the image sensor 220, and the predetermined brightness control signal is a control signal set based on a predetermined brightness requirement.
  • the signal controller 240 can also be used to control the image sensor 220 to switch between single exposure acquisition and multiple exposure acquisition. It is to be noted that the specific implementation of the signal controller 240 that is used to control the image sensor 220 to form an image signal that meets the predetermined brightness requirement is merely an example and should not be construed as limiting the embodiments of the present application.
  • an image fusion device provided by an embodiment of the present application may further include: an infrared light filling device 250;
• The signal controller 240 is further configured to control the infrared light filling device 250 to perform infrared fill light for the image sensor.
• Specifically, the signal controller 240 can detect the gain value g in the brightness adjustment signal: when g is greater than a threshold T1, the fill light control signal is set to 1 and the infrared fill light is turned on; when g is less than a threshold T2, the fill light control signal is set to 0 and the infrared fill light is turned off, wherein T1 is greater than T2.
• The specific values of T1 and T2 may be set according to actual conditions and are not limited herein; in addition, the specific implementation given above, in which the signal controller 240 controls the infrared light filling device 250 to perform infrared fill light for the image sensor, is merely illustrative and should not be construed as limiting the embodiments of the present application.
• Corresponding to the image fusion device provided by the first aspect, the embodiment of the present application further provides an image fusion method, which is applied to the image fusion device provided by the first aspect of the embodiment of the present application, where the image sensor of the image fusion device has four types of photosensitive channels, the four types of photosensitive channels including the red, green and blue RGB channels and the infrared IR channel; as shown in FIG. 14, the method may include the following steps:
• S1401: blocking the spectrum of a first predetermined wavelength interval in the incident light to obtain target light, wherein the first predetermined wavelength interval is: the wavelength interval of the spectrum in which the difference in response between the RGB channels and the IR channel of the image sensor of the image fusion device in the infrared band is above a first predetermined threshold;
• The specific implementation of converting the target light into an image signal through the RGB channels and the IR channel may be any implementation in the related art, which is not limited herein.
  • the specific structure of the image fusion device may refer to the content described in the embodiment shown in the first aspect, and details are not described herein.
  • the first predetermined threshold may be set according to actual conditions, and is not limited herein.
• Optionally, the first predetermined wavelength interval may be [T1, T2], wherein the value of T1 is in the interval [600 nm, 800 nm] and the value of T2 is in the interval [750 nm, 1100 nm].
• Optionally, the color signal may be formed by subtracting, from the value of the R channel, the G channel, or the B channel, the IR parameter value corresponding to the corresponding pixel position, where the IR parameter value is the product of the value of the IR channel corresponding to that pixel position and the preset correction value. It can be understood that subtracting the corresponding IR parameter value from the traversed values of the R channel, the G channel, or the B channel removes the infrared component from the color signal, which avoids crosstalk of the infrared component into the RGB three types of signal components and enhances the image effect under low illumination. It should be emphasized that the preset correction value can be set according to the actual situation.
• Generally, the preset correction value can be set to 1; of course, it can also be set to any integer or decimal from 0 to 1024 according to actual conditions, and those skilled in the art can understand that the value of the preset correction value is not limited thereto.
  • the step of parsing the image signal into a color signal and a brightness signal that is sensitive to an infrared band may include:
• Step a1: performing an interpolation operation on the IR channel in the image signal to generate a luminance signal that has the same resolution as the input and is sensitive to the infrared band, wherein the input resolution is the resolution of the image signal;
• Step a2: the image signal is traversed, and the IR parameter value corresponding to the corresponding pixel position is subtracted from the traversed value of the R channel, the G channel, or the B channel, wherein the IR parameter value is the product of the value of the IR channel corresponding to that pixel position and the preset correction value;
• Step a3: the R channel, the G channel, and the B channel in the image signal are respectively interpolated to generate a color signal having the same resolution as the input.
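Step a2 can be sketched as follows for channel planes already interpolated to full resolution (steps a1 and a3); NumPy and the function name are illustrative, and the default m=1 follows the "generally set" preset correction value mentioned above:

```python
import numpy as np

# Sketch of step a2 for an RGB-IR sensor: here the IR component is measured
# directly by the IR channel, so each colour channel simply has m * IR
# subtracted, where m is the preset correction value.

def remove_infrared(r, g, b, ir, m=1.0):
    """Return (r', g', b') with the infrared crosstalk removed."""
    r, g, b, ir = (np.asarray(x, dtype=float) for x in (r, g, b, ir))
    ir_term = m * ir                  # the "IR parameter value"
    return r - ir_term, g - ir_term, b - ir_term
```

Unlike the RGBW case, no formula is needed to estimate IR: the dedicated IR channel supplies it per pixel.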
  • the step of performing the fusion processing on the color signal and the luminance signal to obtain the merged image may include:
• Step b1: the color signal is traversed, and the luminance value Y of each pixel is calculated as Y = (R*w1+G*w2+B*w3)/(w1+w2+w3), wherein R is the value of the R channel corresponding to the pixel, G is the value of the G channel corresponding to the pixel, B is the value of the B channel corresponding to the pixel, and w1, w2 and w3 are weight values;
• Step b2: the reference channel values K1, K2, and K3 of each pixel are calculated from the color signal and the luminance value Y;
• Step b3: color noise reduction processing is performed on the reference channel values K1, K2, and K3;
• Step b4: the luminance signal Y′ at the corresponding pixel is fused with the reference channel values K1-K3 after the color noise reduction to generate the fused RGB three-channel values R′, G′ and B′, thereby obtaining the fused image.
  • the step of performing the fusion processing on the color signal and the luminance signal to obtain a merged image includes:
• Step c1: the color signal is converted into a YUV signal, wherein the YUV signal is a luminance-and-chrominance signal;
• Step c2: the chrominance UV components in the YUV signal are extracted;
• Step c3: the UV components are combined with the luminance signal to form a new YUV signal;
• Step c4: the formed new YUV signal is determined as the fused image, or the formed new YUV signal is converted into an RGB signal and the converted RGB signal is determined as the fused image.
• Optionally, in the process of converting the target light into an image signal, the image fusion device may perform multiple exposure acquisitions within the same frame time, wherein single exposure or multiple exposure may be manually set, though this is of course not limiting.
• At this time, the process of parsing the image signal into a color signal and a luminance signal that is sensitive to the infrared band may include: the color signal is obtained by analyzing the image signal formed by a first type of exposure, and the luminance signal that is sensitive to the infrared band is obtained by analyzing the image signal formed by a second type of exposure.
• The first type of exposure and the second type of exposure may have the same exposure duration or different exposure durations. When they have different exposure durations, the exposure duration of the first type of exposure may be less than that of the second type of exposure; of course, the exposure duration of the first type of exposure may also be greater than that of the second type of exposure, which is equally reasonable.
• The manner of analyzing the color signal from the image signal formed by the first type of exposure may be as described above, while the image signal formed by the second type of exposure is used to obtain the luminance signal that is sensitive to the infrared band. Specifically, the IR channel in the image signal formed by the second type of exposure is interpolated to generate a luminance signal that has the same resolution as the input and is sensitive to the infrared band, wherein the input resolution is the resolution of the image signal.
• The interpolation of any channel involved in the above process may adopt a bilinear or bicubic interpolation algorithm, but is of course not limited thereto.
• To ensure image quality, the image signal formed by the short exposure can be used to obtain the color signal, and the image signal formed by the long exposure can be used to obtain the luminance signal; that is, the exposure duration of the first type of exposure is less than the exposure duration of the second type of exposure.
• The image fusion method provided by the embodiment of the present invention can reduce the amount of calculation required to remove the infrared component from the color signal, improve the fusion efficiency, and achieve the purpose of acquiring a dual-band image with a structurally simple image fusion device.
• Corresponding to the image fusion device provided by the second aspect, the embodiment of the present application further provides an image fusion method, which is applied to the image fusion device provided by the second aspect of the embodiment of the present application, where the image sensor of the image fusion device has four types of photosensitive channels, the four types of photosensitive channels including the red, green and blue RGB channels and the full-band W channel; as shown in FIG. 15, the method may include the following steps:
• The conversion of the target light into an image signal through the RGB channels and the W channel may be implemented in any implementation manner of the related art, which is not limited herein.
• S1503: parsing the image signal into a color signal and a luminance signal, and fusing the color signal and the luminance signal to obtain a fused image, wherein the color signal is a signal obtained by analyzing the R channel, the G channel, or the B channel.
• In this way, the portion of the spectrum in which the quantum efficiencies of the channels differ greatly in the near-infrared band is filtered out, thereby simplifying the operation by which the image processor removes the infrared component from the color signal and improving the fusion efficiency.
  • the specific value of the first predetermined threshold may be set according to actual conditions, and is not limited herein.
• Optionally, the first predetermined wavelength interval may be [T1, T2], wherein the value of T1 is in the interval [600 nm, 800 nm] and the value of T2 is in the interval [750 nm, 1100 nm].
  • the specific structure of the image fusion device can refer to the content described in the embodiment shown in the second aspect, and details are not described herein.
  • It should be noted that the color signal may be formed by subtracting, from the value of the R channel, the G channel, or the B channel, the IR parameter value corresponding to the pixel position, where the IR parameter value is the product of the IR value at that pixel position and the second preset correction value, and the IR value is a value calculated using a predetermined formula.
  • The first preset correction value and the second preset correction value may be set according to actual conditions. For example, the first preset correction value may typically be set to 2; of course, it may be set to any integer or decimal from 0 to 1024 according to actual conditions, and those skilled in the art will understand that its value is not limited thereto. Similarly, the second preset correction value may typically be set to 1, or to any integer or decimal from 0 to 1024, but those skilled in the art will understand that its value is not limited thereto.
  • Optionally, in a specific implementation, the step of parsing the image signal into a color signal and a brightness signal includes:
  • Step d1: performing an interpolation operation on the W channel in the image signal to generate a brightness signal having the same resolution as the input resolution, where the input resolution is the resolution of the image signal;
  • Step d2: performing interpolation operations on the R channel, the G channel, and the B channel in the image signal respectively, to generate per-channel image signals having the same resolution as the input resolution;
  • Step d3: traversing the per-channel image signals and calculating the IR value at each pixel position using a predetermined formula, where the predetermined formula is (R+G+B-W)/n, R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, W is the value of the W channel at the pixel, and n is the first preset correction value;
  • Step d4: traversing the per-channel image signals and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position to generate the color signal, where the IR parameter value is the product of the IR value at that pixel position and the second preset correction value.
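Steps d3-d4 above amount to a simple per-pixel computation once the channel planes have been interpolated to full resolution. The sketch below is illustrative only: the function name `remove_infrared`, the default n=2 and correction m=1, and the clamping at zero are assumptions rather than part of the patent text, and the interpolation of steps d1-d2 is taken as already done.

```python
def remove_infrared(r, g, b, w, n=2.0, m=1.0):
    """Steps d3-d4 of the RGBW parse: per pixel, IR = (R + G + B - W) / n,
    then subtract m * IR from each color channel (clamped at 0 here).
    The r, g, b, w planes are assumed already interpolated to full
    resolution (steps d1-d2); n and m are the preset correction values."""
    height, width = len(r), len(r[0])
    # Step d3: IR value at every pixel position.
    ir = [[(r[y][x] + g[y][x] + b[y][x] - w[y][x]) / n for x in range(width)]
          for y in range(height)]

    # Step d4: subtract the IR parameter value (m * IR) from each color plane.
    def subtract(plane):
        return [[max(plane[y][x] - m * ir[y][x], 0.0) for x in range(width)]
                for y in range(height)]

    return subtract(r), subtract(g), subtract(b), ir
```

With a single pixel where R=100, G=80, B=60, W=200 and n=2, the IR value is (240-200)/2 = 20, and the color channels become 80, 60, and 40.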
  • In a specific implementation, the step of fusing the color signal and the brightness signal to obtain the fused image may include:
  • Step e1: calculating, from the color signal, an auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, and w1, w2, and w3 are weight values;
  • Step e2: calculating the ratio of each channel value in the color signal to the auxiliary value Y, to obtain reference channel values K1, K2, and K3 for each pixel, where K1=R/Y, K2=G/Y, and K3=B/Y;
  • Step e3: performing color noise reduction on the reference channel values K1, K2, and K3;
  • Step e4: fusing the brightness signal Y' at the corresponding pixel with the noise-reduced reference channel values K1-K3 to generate fused RGB three-channel values R', G', and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y', and B'=K3*Y'.
  • In another specific implementation, the step of fusing the color signal and the brightness signal to obtain the fused image may include:
  • Step f1: converting the color signal into a YUV signal, where the YUV signal is a luminance-and-chrominance signal;
  • Step f2: extracting the chrominance UV components from the YUV signal;
  • Step f3: combining the UV components with the brightness signal to form a new YUV signal;
  • Step f4: determining the formed new YUV signal as the fused image, or converting the formed new YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
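Steps f1-f4 can be illustrated per pixel with an off-the-shelf RGB-to-YUV conversion: the Y of the color signal is discarded, its UV components are kept, and the brightness signal takes the place of Y. The BT.601 full-range matrix below is one common choice that the text does not prescribe, and the function names are hypothetical; a minimal sketch:

```python
def rgb_to_yuv(r, g, b):
    # BT.601 full-range conversion (one common choice; the text fixes none).
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.5 * b
    v = 0.5 * r - 0.419 * g - 0.081 * b
    return y, u, v

def yuv_to_rgb(y, u, v):
    # Inverse of the conversion above.
    return y + 1.402 * v, y - 0.344 * u - 0.714 * v, y + 1.772 * u

def fuse_yuv(rgb_pixel, luma):
    """Steps f1-f4 for one pixel: take UV from the color signal, replace Y
    with the brightness signal, and convert the new YUV back to RGB."""
    _, u, v = rgb_to_yuv(*rgb_pixel)   # f1-f2: keep only the UV components
    return yuv_to_rgb(luma, u, v)      # f3-f4: new YUV -> fused RGB
```

For a neutral gray color pixel (50, 50, 50), U and V are zero, so the fused pixel simply takes the brightness value in all three channels.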
  • In addition, in the process of converting the target light into an image signal, the image fusion device may perform multiple exposure acquisitions within the same frame time, where single or multiple exposure may be set manually, though this is of course not limiting.
  • Moreover, for an image fusion device performing multiple exposure acquisitions within the same frame time, the process of parsing the image signal into a color signal and a brightness signal may include:
  • parsing the image signal formed by the first type of exposure to obtain the color signal;
  • parsing the image signal formed by the second type of exposure to obtain the brightness signal.
  • The first type of exposure and the second type of exposure may have the same or different exposure durations; when the durations differ, the exposure duration of the first type of exposure may be less than that of the second type of exposure, or, equally reasonably, greater than it.
  • Specifically, parsing the image signal formed by the first type of exposure to obtain the color signal may include:
  • interpolating the R channel, the G channel, and the B channel in the image signal formed by the first type of exposure respectively, to generate per-channel image signals having the same resolution as the input resolution;
  • traversing the per-channel image signals and calculating the IR value at each pixel position using a predetermined formula, where the predetermined formula is (R+G+B-W)/n, R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, W is the value of the W channel at the pixel, and n is the first preset correction value;
  • traversing the per-channel image signals and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position to generate the color signal, where the IR parameter value is the product of the IR value at that pixel position and the second preset correction value.
  • Correspondingly, parsing the image signal formed by the second type of exposure to obtain the brightness signal may specifically include:
  • interpolating the W channel in the image signal formed by the second type of exposure to generate a brightness signal having the same resolution as the input resolution, where the input resolution is the resolution of the image signal.
  • The interpolation of any channel involved in the above process may adopt a bilinear or bicubic interpolation algorithm, though it is of course not limited thereto.
  • The image fusion method provided by the embodiments of the present application can reduce the amount of computation required to remove the infrared component from the color signal, improve fusion efficiency, and achieve the purpose of acquiring dual-band images with a structurally simple image fusion device.
  • Corresponding to the image fusion method provided in the third aspect, an embodiment of the present application further provides a storage medium for storing executable program code, the executable program code being run to perform the method steps of the image fusion method provided by any one of the method embodiments of the third aspect.
  • After the executable program code stored in the storage medium is executed by the image fusion device provided in the first aspect, the amount of computation for removing the infrared component from the color signal can be reduced and fusion efficiency improved, while achieving the purpose of acquiring dual-band images with a structurally simple image fusion device.
  • Corresponding to the image fusion method provided in the fourth aspect, an embodiment of the present application provides a storage medium for storing executable code, the executable code being run to perform the method steps of the image fusion method provided by any one of the method embodiments of the fourth aspect.
  • After the executable code stored in the storage medium is executed by the image fusion device provided in the second aspect, the amount of computation for removing the infrared component from the color signal can be reduced and fusion efficiency improved, while achieving the purpose of acquiring dual-band images with a structurally simple image fusion device.
  • Corresponding to the image fusion method provided in the third aspect, an embodiment of the present application provides an application program which, when run, performs the method steps of the image fusion method provided by any one of the method embodiments of the third aspect.
  • The application program provided by the embodiments of the present application, when running on the image fusion device provided in the first aspect, can reduce the amount of computation for removing the infrared component from the color signal, improve fusion efficiency, and achieve the purpose of acquiring dual-band images with a structurally simple image fusion device.
  • Corresponding to the image fusion method provided in the fourth aspect, an embodiment of the present application provides an application program which, when run, performs the method steps of the image fusion method provided by any one of the method embodiments of the fourth aspect.
  • The application program provided by the embodiments of the present application, when running on the image fusion device provided in the second aspect, can reduce the amount of computation for removing the infrared component from the color signal, improve fusion efficiency, and achieve the purpose of acquiring dual-band images with a structurally simple image fusion device.
  • In addition, the terms "first type", "second type", "third type", "fourth type", "first", and "second" involved in the embodiments of the present application are used merely to distinguish objects of the same type by name, so that references to them are more convenient and clear, and carry no limiting meaning.


Abstract

Embodiments of the present application provide an image fusion device and an image fusion method. The image fusion device includes a light collection apparatus, an image processor, and an image sensor having red-green-blue RGB channels and an infrared IR channel. The light collection apparatus blocks the spectrum of a first predetermined wavelength interval in incident light to obtain target light; the first predetermined wavelength interval is the wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor in the infrared band is higher than a first predetermined threshold. The image sensor converts the target light into an image signal through the RGB channels and the IR channel. The image processor parses the image signal into a color signal and a brightness signal sensitive to the infrared band, and fuses the color signal and the brightness signal to obtain a fused image. This solution simplifies the image processor's operation of removing the infrared component from the color signal and improves fusion efficiency.

Description

Image fusion device and image fusion method
This application claims priority to Chinese patent application No. 201710074203.9, filed with the China Patent Office on February 10, 2017 and entitled "Image fusion device and image fusion method", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of image processing, and in particular to an image fusion device and an image fusion method.
Background
In low-illumination scenes, to ensure that the captured image carries as much image information as possible, a light-splitting-and-fusion image acquisition device is usually required. The basic principle of such a device is to capture a visible-light image corresponding to the visible-light signal and an infrared image corresponding to the infrared signal, and to fuse the visible-light image and the infrared image to obtain a fused image; the fused image is a dual-band image and carries more image information than either of the single-band visible-light and infrared images.
A light-splitting-and-fusion image acquisition device specifically includes a half-reflecting half-transmitting mirror, a visible-light sensor, an infrared sensor, a registration unit, and a fusion unit. Specifically, the half mirror splits incident light into visible light and infrared light; the visible-light sensor senses the visible light to form a visible-light image; the infrared sensor senses the infrared light to form an infrared image; the registration unit eliminates the positional deviation between the infrared image and the visible-light image; and the fusion unit performs weighted fusion of the position-corrected infrared image and the visible-light image, the resulting fused image being the output image of the acquisition device.
Although such a device can obtain a fused image, the visible-light image is contaminated with an infrared component, so when the image processor receives the visible-light image (i.e., the color signal), the amount of computation needed to remove the infrared component from it is large, which makes fusion efficiency low.
Summary
The purpose of the embodiments of the present application is to provide an image fusion device and an image fusion method, so as to reduce the amount of computation required by the image processor to remove the infrared component from the color signal and to improve fusion efficiency. The specific technical solutions are as follows:
In a first aspect, an embodiment of the present application provides an image fusion device, including:
a light collection apparatus, an image processor, and an image sensor having four types of photosensitive channels, the four types being red-green-blue RGB channels and an infrared IR channel;
the light collection apparatus is configured to block the spectrum of a first predetermined wavelength interval in incident light to obtain target light, where the first predetermined wavelength interval is the wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor in the infrared band is higher than a first predetermined threshold;
the image sensor is configured to convert the target light into an image signal through the RGB channels and the IR channel;
the image processor is configured to parse the image signal into a color signal and a brightness signal sensitive to the infrared band, and to fuse the color signal and the brightness signal to obtain a fused image, where the color signal is a signal parsed from the R channel, the G channel, or the B channel.
In a second aspect, an embodiment of the present application further provides an image fusion device, including:
a light collection apparatus, an image processor, and an image sensor having four types of photosensitive channels, the four types being red-green-blue RGB channels and a full-band W channel;
the light collection apparatus is configured to block the spectrum of a first predetermined wavelength interval in incident light to obtain target light, the first predetermined wavelength interval being the wavelength interval of the spectrum in which the response difference between the RGB channels and the W channel of the image sensor in the infrared band is higher than a first predetermined threshold;
the image sensor is configured to convert the target light into an image signal through the RGB channels and the W channel;
the image processor is configured to parse the image signal into a color signal and a brightness signal, and to fuse the color signal and the brightness signal to obtain a fused image, where the color signal is a signal parsed from the R channel, the G channel, or the B channel.
In a third aspect, an embodiment of the present application further provides an image fusion method, applied to an image fusion device having four types of photosensitive channels: red-green-blue RGB channels and an infrared IR channel; the method includes:
blocking the spectrum of a first predetermined wavelength interval in incident light to obtain target light, where the first predetermined wavelength interval is the wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor of the image fusion device in the infrared band is higher than a first predetermined threshold;
converting the target light into an image signal through the RGB channels and the IR channel;
parsing the image signal into a color signal and a brightness signal sensitive to the infrared band, and fusing the color signal and the brightness signal to obtain a fused image, where the color signal is a signal parsed from the R channel, the G channel, or the B channel.
In a fourth aspect, an embodiment of the present application further provides an image fusion method, applied to an image fusion device having four types of photosensitive channels: red-green-blue RGB channels and a full-band W channel; the method includes:
blocking the spectrum of a first predetermined wavelength interval in incident light to obtain target light, the first predetermined wavelength interval being the wavelength interval of the spectrum in which the response difference between the RGB channels and the W channel of the image sensor of the image fusion device in the infrared band is higher than a first predetermined threshold;
converting the target light into an image signal through the RGB channels and the W channel;
parsing the image signal into a color signal and a brightness signal, and fusing the color signal and the brightness signal to obtain a fused image, where the color signal is a signal parsed from the R channel, the G channel, or the B channel.
In a fifth aspect, an embodiment of the present application provides a storage medium for storing executable code which, when run, performs the method steps of the image fusion method provided by the third aspect of the embodiments of the present application.
In a sixth aspect, an embodiment of the present application provides a storage medium for storing executable code which, when run, performs the method steps of the image fusion method provided by the fourth aspect of the embodiments of the present application.
In a seventh aspect, an embodiment of the present application provides an application program which, when run, performs the method steps of the image fusion method provided by the third aspect of the embodiments of the present application.
In an eighth aspect, an embodiment of the present application provides an application program which, when run, performs the method steps of the image fusion method provided by the fourth aspect of the embodiments of the present application.
It can be seen that, in the embodiments of the present application, the light collection apparatus blocks the spectrum of the first predetermined wavelength interval in the incident light, so that the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly is filtered out, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency. Moreover, the image fusion device provided by the embodiments of the present application acquires images through an image sensor having four types of photosensitive channels; compared with optical systems that require a special design to acquire infrared and visible light simultaneously, the structural complexity is greatly reduced, so that the range of use can be extended.
In addition, the image fusion method provided by the embodiments of the present application can reduce the amount of computation for removing the infrared component from the color signal, improve fusion efficiency, and achieve the purpose of acquiring dual-band images with a structurally simple image fusion device.
Brief Description of the Drawings
To describe the embodiments and technical solutions of the present application more clearly, the drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of an image fusion device provided by the first aspect of the embodiments of the present application;
FIG. 2 is a schematic diagram of a Bayer array;
FIG. 3 is a schematic diagram of the array corresponding to an RGBIR image sensor;
FIG. 4 is a schematic diagram of the interpolation result corresponding to IR-channel interpolation;
FIG. 5 is a schematic diagram of the principle of spectral blocking;
FIG. 6 is another schematic structural diagram of an image fusion device provided by the first aspect of the embodiments of the present application;
FIG. 7 is another schematic structural diagram of an image fusion device provided by the first aspect of the embodiments of the present application;
FIG. 8 is a schematic structural diagram of an image fusion device provided by the second aspect of the embodiments of the present application;
FIG. 9 is a schematic diagram of the array corresponding to an RGBW image sensor;
FIG. 10 is a schematic diagram of the interpolation result corresponding to W-channel interpolation;
FIG. 11 is another schematic structural diagram of an image fusion device provided by the second aspect of the embodiments of the present application;
FIG. 12 is another schematic structural diagram of an image fusion device provided by the second aspect of the embodiments of the present application;
FIG. 13 is another schematic diagram of the principle of spectral blocking;
FIG. 14 is a flowchart of an image fusion method applied to the image fusion device provided by the first aspect of the embodiments of the present application;
FIG. 15 is a flowchart of an image fusion method applied to the image fusion device provided by the second aspect of the embodiments of the present application.
Detailed Description
To make the purposes, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the drawings and embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
In a first aspect, to solve the problem that the image processor's large amount of computation for removing the infrared component from the color signal leads to low fusion efficiency, an embodiment of the present application provides an image fusion device.
As shown in FIG. 1, an image fusion device provided by an embodiment of the present application may include:
a light collection apparatus 110, an image processor 130, and an image sensor 120 having four types of photosensitive channels, the four types being red-green-blue RGB channels and an infrared IR channel;
the light collection apparatus 110 is configured to block the spectrum of a first predetermined wavelength interval in incident light to obtain target light, where the first predetermined wavelength interval is the wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor 120 in the infrared band is higher than a first predetermined threshold;
the image sensor 120 is configured to convert the target light into an image signal through the RGB channels and the IR channel;
the image processor 130 is configured to parse the image signal into a color signal and a brightness signal sensitive to the infrared band, and to fuse the color signal and the brightness signal to obtain a fused image, where the color signal is a signal parsed from the R channel, the G channel, or the B channel.
It should be emphasized that blocking the spectrum of the first predetermined wavelength interval in the incident light filters out the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency. The specific value of the first predetermined threshold may be set according to actual conditions and is not limited herein. It should also be emphasized that, among the three differences between the values of the R, G, and B channels and the value of the IR channel, as long as any one difference is higher than the first predetermined threshold, the light collection apparatus 110 may block the spectral portion of the corresponding wavelength interval.
Optionally, to allow the infrared and visible bands to pass, the first predetermined wavelength interval may be [T1, T2], where the value of T1 lies in the interval [600nm, 800nm] and the value of T2 lies in the interval [750nm, 1100nm]. It can be understood that adding the function of blocking the spectrum of a predetermined wavelength interval of the incident light filters out the spectral region in which the RGB channels and the IR channel of the image sensor 120 respond very differently in the infrared band (650nm-1100nm), so that the image signal formed by the image sensor 120 can be restored by simple operations into an accurate color signal and a brightness signal sensitive to the infrared band (650-1100nm). As shown in FIG. 5, the gray part is the portion of the spectrum to be blocked and filtered out; in FIG. 5, IR denotes the infrared signal, R the red signal, G the green signal, and B the blue signal. It should be emphasized that FIG. 5 is merely an example without any limiting meaning; due to manufacturing processes and other factors, the actual filtering curve is usually not as steep as shown in FIG. 5 but has a slope. Those skilled in the art can understand that the Bayer array, i.e., the BAYER array, is a data format of image sensors; a schematic diagram of the Bayer array can be seen in FIG. 2, i.e., red, green, and blue dot-matrix information is output in a mosaic manner. Since a Bayer-array-based image sensor has only the three RGB channels and cannot obtain the infrared spectrum, obtaining a fused image of a visible-light image and an infrared image requires acquiring infrared light and visible light simultaneously through prism splitting, different optical sensors, etc., which makes the structure relatively complex. To reduce structural complexity, the image sensor used in the embodiments of the present application is one having four types of photosensitive channels, so that the image signal captured by a single image sensor can be parsed into a color signal and a brightness signal sensitive to the infrared band. For convenience of reference, the image sensor provided by the embodiments of the present application that has red-green-blue RGB channels and an infrared IR channel is referred to as an RGBIR image sensor for short.
Specifically, FIG. 3 gives a schematic diagram of the array corresponding to the RGBIR image sensor. As shown in FIG. 3, the RGBIR image sensor includes four types of photosensitive channels, namely the RGB channels and the IR channel; specifically, the IR channel is a channel sensitive to the infrared band, while the RGB channels can sense both the visible band and the infrared band and are mainly used to sense the visible band. It should be emphasized that the array given in FIG. 3 is merely an illustrative example and should not be construed as limiting the embodiments of the present application. In addition, those skilled in the art can understand that, in practical applications, the arrays corresponding to RGBIR image sensors have various structures, all of which can be applied to the embodiments of the present application.
It should be noted that the color signal may specifically be formed by subtracting, from the value of the R channel, the G channel, or the B channel, the IR parameter value corresponding to the pixel position, where the IR parameter value is the product of the IR channel value at that pixel position and a preset correction value. It can be understood that subtracting the corresponding IR parameter value from each traversed R, G, or B channel value, i.e., removing the infrared component from the color signal, avoids crosstalk between the infrared component and the three RGB signal components in the color signal and improves the image quality under low illumination. It should be emphasized that the preset correction value may be set according to actual conditions; for example, it may typically be set to 1, or, of course, to any integer or decimal from 0 to 1024 according to actual conditions, and those skilled in the art will understand that its value is not limited thereto. Optionally, in a specific implementation, the process by which the image processor 130 parses the image signal into a color signal and a brightness signal sensitive to the infrared band includes:
Step a1: performing an interpolation operation on the IR channel in the image signal to generate a brightness signal that has the same resolution as the input resolution and is sensitive to the infrared band, where the input resolution is the resolution of the image signal;
Step a2: traversing the image signal and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position, the IR parameter value being the product of the IR channel value at that pixel position and the preset correction value;
Step a3: interpolating the R channel, the G channel, and the B channel in the image signal respectively, to generate a color signal with the same resolution as the input resolution.
Taking the array shown in FIG. 3 as an example again, each small square corresponds to one pixel, and the resolution of the image signal generated by the RGBIR image sensor is 8*8. Since interpolating the IR channel in the image signal generates a brightness signal with the same resolution as the input and sensitive to the infrared band, for FIG. 3 the resolution of the brightness signal generated after interpolating the IR channel is also 8*8, and the interpolation result of the IR channel can be seen in FIG. 4. Moreover, the interpolation algorithm used for the IR channel may be bilinear, bicubic, etc.; the specific interpolation process is not limited herein. It should be emphasized that the resolution of the image signal generated by an RGBIR image sensor is related to the array structure; 8*8 is merely the resolution corresponding to the RGBIR image sensor with the array structure shown in FIG. 3 and should not be construed as limiting the embodiments of the present application.
In addition, as shown in FIG. 3, since each small square may be R, B, or G, traversing the image signal will reach an R channel, a G channel, or a B channel, and the traversed R, G, or B channel value is then reduced by the IR parameter value corresponding to the pixel position so as to remove the infrared component. Moreover, the interpolation algorithm used for the R, G, and B channels may be bilinear, bicubic, etc., and the algorithms used for the IR channel and for the RGB channels may be the same or different, which is not limited herein. Similarly, since the R, G, and B channels in the image signal are interpolated respectively to generate a color signal with the same resolution as the input, taking FIG. 3 as an example, the resolution of the color signal generated after interpolating the RGB channels is also 8*8.
It should be emphasized that the above specific implementation by which the image processor 130 parses the image signal into a color signal and a brightness signal sensitive to the infrared band is merely an example and should not be construed as limiting the embodiments of the present application.
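As a rough sketch of steps a1-a2 above, the fragment below fills a sparsely sampled channel to full resolution and subtracts the scaled IR value from a color sample. Nearest-neighbor filling stands in for the bilinear or bicubic interpolation the text mentions, and the function names and the clamping at zero are assumptions of this sketch:

```python
def interpolate_channel(mosaic, mask):
    """Step a1 (simplified): fill every pixel of a sparsely sampled channel
    with the value of the nearest sampled pixel (Manhattan distance).
    Bilinear or bicubic interpolation would be used in practice."""
    h, w = len(mosaic), len(mosaic[0])
    samples = [(y, x) for y in range(h) for x in range(w) if mask[y][x]]
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = min(samples, key=lambda p: abs(p[0] - y) + abs(p[1] - x))
            out[y][x] = mosaic[sy][sx]
    return out

def subtract_ir(color_value, ir_value, correction=1.0):
    """Step a2: color channel value minus (IR value * preset correction),
    clamped at 0 in this sketch."""
    return max(color_value - correction * ir_value, 0.0)
```

With a 2*2 mosaic whose only IR sample is 40, every interpolated position takes the value 40, and a color sample of 100 at that position becomes 60 after the infrared component is removed.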
It can be understood that, after obtaining the color signal and the brightness signal, the image processor 130 may fuse them to obtain the fused image in a variety of specific ways. Two of them are described in detail below.
In a specific implementation, the process by which the image processor 130 fuses the color signal and the brightness signal to obtain the fused image may include:
Step b1: calculating, from the color signal, an auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, and w1, w2, and w3 are weight values;
Step b2: calculating the ratio of each channel value in the color signal to the auxiliary value Y, to obtain reference channel values K1, K2, and K3 for each pixel, where K1=R/Y, K2=G/Y, and K3=B/Y;
Step b3: performing color noise reduction on the reference channel values K1, K2, and K3;
Step b4: fusing the brightness signal Y' at the corresponding pixel with the noise-reduced reference channel values K1-K3 to generate fused RGB three-channel values R', G', and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y', and B'=K3*Y'.
The embodiments of the present application do not limit the values of the weights w1, w2, and w3; color noise reduction may adopt, but is not limited to, Gaussian filtering. For example, assuming w1=1, w2=1, and w3=1, then Y=(R+G+B)/3.
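For a single pixel, steps b1, b2, and b4 reduce to a few arithmetic operations (the color noise reduction of step b3, e.g. Gaussian filtering over K1-K3, is omitted here). A minimal sketch with hypothetical names, assuming the auxiliary value Y is positive:

```python
def fuse_pixel(r, g, b, luma, weights=(1.0, 1.0, 1.0)):
    """Ratio-based fusion of one pixel: compute the auxiliary value Y (b1),
    the reference channel ratios K1-K3 (b2), and scale them by the
    brightness signal Y' (b4). Step b3 (noise reduction) is omitted."""
    w1, w2, w3 = weights
    y = (r * w1 + g * w2 + b * w3) / (w1 + w2 + w3)   # b1: auxiliary value Y
    k1, k2, k3 = r / y, g / y, b / y                  # b2: reference ratios
    return k1 * luma, k2 * luma, k3 * luma            # b4: R'=K1*Y', etc.
```

With equal weights, a pixel (60, 30, 0) has Y=30 and ratios (2, 1, 0); fusing with a brightness value of 90 keeps the hue while raising the overall level.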
In another specific implementation, the process by which the image processor 130 fuses the color signal and the brightness signal to obtain the fused image may include:
Step c1: converting the color signal into a YUV signal, the YUV signal being a luminance-and-chrominance signal;
Step c2: extracting the chrominance UV components from the YUV signal;
Step c3: combining the UV components with the brightness signal to form a new YUV signal;
Step c4: determining the formed new YUV signal as the fused image, or converting the formed new YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
It can be understood that, in the YUV format, "Y" denotes luminance (Luminance or Luma), i.e., the grayscale value, while "U" and "V" denote chrominance (Chrominance or Chroma), which describe the color and saturation of the image and specify the color of a pixel. After the UV components are extracted and before they are combined with the brightness signal, noise reduction may be performed on the UV components to remove color noise and thereby improve the picture quality of the fused image; the noise reduction may adopt, but is not limited to, Gaussian filtering. It should be emphasized that the mutual conversion between the YUV signal and the color signal may be implemented by any conversion algorithm in the related art, as may extracting the UV components from the YUV signal and combining the UV components with the brightness signal.
It should be emphasized that the above specific implementations by which the image processor 130 fuses the color signal and the brightness signal to obtain the fused image are merely examples and should not be construed as limiting the embodiments of the present application. In addition, it can be understood that the image processor 130 may first optimize the color signal and the brightness signal, and then fuse the optimized color signal and the optimized brightness signal to obtain the fused image; for example, optimizing the color signal may include low-pass filtering the color signal for noise reduction, while optimizing the brightness signal may include high-pass filtering the brightness signal for edge enhancement.
It can be seen that, in the embodiments of the present application, the light collection apparatus blocks the spectrum of the first predetermined wavelength interval in the incident light, so that the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly is filtered out, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency. Moreover, the image fusion device provided by the embodiments of the present application acquires images through an image sensor having four types of photosensitive channels; compared with optical systems that require a special design to acquire infrared and visible light simultaneously, the structural complexity is greatly reduced, so that the range of use can be extended.
To block the spectrum of the first predetermined wavelength interval in the incident light, in a specific implementation the light collection apparatus 110 may include a band-stop filter and a first-type optical lens;
the first-type optical lens is configured to transmit incident light to the band-stop filter in a full-spectrum transmission manner;
the band-stop filter is configured to block the spectrum of the first predetermined wavelength interval in the light transmitted by the first-type optical lens, so as to obtain the target light.
Specifically, the band-stop filter may be a coating integrated on the first-type optical lens by film plating, or a patch attached to the first-type optical lens, etc. It should be noted that the full-spectrum transmission manner is one in which the spectrum of all bands is transmitted, i.e., no band of the spectrum is blocked; since the first-type optical lens transmits the incident light to the band-stop filter in a full-spectrum transmission manner, the band of the light transmitted by the first-type optical lens is the same as that of the incident light, i.e., the first-type optical lens does not block the spectrum of any band.
To block the spectrum of the first predetermined wavelength interval in the incident light, in another implementation, the light collection apparatus 110 includes a second-type optical lens capable of blocking the spectrum of the first predetermined wavelength interval.
It should be emphasized that the above specific implementations of the light collection apparatus 110 for blocking the spectrum of the first predetermined wavelength interval in the incident light are merely examples and should not be construed as limiting the embodiments of the present application.
Optionally, in a specific implementation, when the image sensor 120 converts the target light into an image signal, the response difference among the RGB channels within a second predetermined wavelength interval of the infrared band is lower than a second predetermined threshold, so as to ensure accurate color restoration after the infrared component is removed and thereby improve fusion quality, where the second predetermined wavelength interval is [T3, T4], T4 being greater than T3, T3 being greater than or equal to 750nm, and T4 being less than or equal to 1100nm. For example, as shown in FIG. 5, when the image sensor 120 converts the target light into an image signal, the responses of the RGB channels in the band to the right of the gray region satisfy a certain constraint. It should be emphasized that the second predetermined threshold may be set according to actual conditions and is not limited in the embodiments of the present application.
It should be noted that, in order for the response difference among the RGB channels within the second predetermined wavelength interval of the infrared band to be lower than the second predetermined threshold, the specific structure of the image sensor 120 may vary and is not limited in the embodiments of the present application; for example, specific optical elements, such as filters, may be added to the image sensor.
Optionally, in the process of converting the target light into an image signal, the image sensor 120 may perform multiple exposure acquisitions within the same frame time. For the image sensor 120, in one implementation, single or multiple exposure may be set manually.
Moreover, when the image sensor 120 performs multiple exposure acquisitions within the same frame time, the process by which the image processor 130 parses the image signal into a color signal and a brightness signal sensitive to the infrared band may include:
parsing the image signal formed by the first type of exposure to obtain the color signal;
parsing the image signal formed by the second type of exposure to obtain the brightness signal sensitive to the infrared band.
The first type of exposure and the second type of exposure may have the same or different exposure durations; when the durations differ, the exposure duration of the first type of exposure may be less than that of the second type, or, equally reasonably, greater than it.
Specifically, parsing the image signal formed by the first type of exposure to obtain the color signal may include:
performing an interpolation operation on the IR channel in the image signal formed by the first type of exposure to obtain the IR channel value at each pixel position;
traversing the image signal formed by the first type of exposure and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position, the IR parameter value being the product of the IR channel value at that pixel position and the preset correction value;
interpolating the R channel, the G channel, and the B channel in the image signal formed by the first type of exposure respectively, to generate a color signal with the same resolution as the input resolution.
Correspondingly, parsing the image signal formed by the second type of exposure to obtain the brightness signal sensitive to the infrared band may include:
performing an interpolation operation on the IR channel in the image signal formed by the second type of exposure to generate a brightness signal that has the same resolution as the input resolution and is sensitive to the infrared band, where the input resolution is the resolution of the image signal.
The interpolation of any channel involved in the above process may adopt a bilinear or bicubic interpolation algorithm, though it is of course not limited thereto.
Moreover, to ensure a sufficient infrared signal, the image signal formed by the short exposure may be used to parse out the color signal, while the image signal formed by the long exposure is used to parse out the brightness signal, so as to ensure image quality; in this case, the exposure duration of the first type of exposure is less than that of the second type of exposure.
Optionally, on the basis of the embodiment shown in FIG. 1, as shown in FIG. 6, an image fusion device provided by an embodiment of the present application may further include:
a signal controller 140, configured to regulate the image sensor 120 so that it forms an image signal meeting a predetermined brightness requirement.
Specifically, in one implementation, the signal controller may be specifically configured to perform brightness statistics on the image signal formed by the image sensor 120 and, according to the statistical result, regulate the image sensor 120 to form an image signal meeting the predetermined brightness requirement. Specifically, the signal controller 140 may perform the following steps: (a) generating an initial brightness control signal and sending it to the image sensor 120; (b) computing the average brightness of the image signal generated by the image sensor 120, i.e., summing all pixel values and averaging; (c) comparing the average brightness with a reference value: if the difference is within a predetermined range, keeping the current value of the brightness control signal unchanged; if the difference is outside the predetermined range and the average brightness is greater than the reference value, lowering the value of the brightness control signal; and if the difference is outside the predetermined range and the average brightness is less than the reference value, raising the value of the brightness control signal.
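One iteration of the feedback loop in steps (a)-(c) might look as follows; the reference value, tolerance band, and adjustment step below are illustrative assumptions, since the text leaves the predetermined range unspecified:

```python
def adjust_control(control, pixels, reference=128.0, tolerance=8.0, step=1.0):
    """One iteration of steps (b)-(c): average all pixel values, keep the
    brightness control signal if the average lies within the tolerance band
    around the reference, lower it when the image is too bright, and raise
    it when the image is too dark. reference/tolerance/step are illustrative."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)           # (b) average brightness
    if abs(mean - reference) <= tolerance:
        return control                      # (c) within range: unchanged
    return control - step if mean > reference else control + step
```

Repeatedly applying this update drives the sensor's average brightness toward the reference value, which is the behavior the controller's loop describes.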
Of course, in another implementation, the signal controller 140 may periodically send a predetermined brightness control signal to the image sensor 120, the predetermined brightness control signal being a control signal set based on the predetermined brightness requirement.
It can be understood that the signal controller 140 may also be configured to control the image sensor 120 to switch between single-exposure acquisition and multiple-exposure acquisition. It should be emphasized that the above specific implementation by which the signal controller 140 regulates the image sensor 120 to form an image signal meeting the predetermined brightness requirement is merely an example and should not be construed as limiting the embodiments of the present application.
Optionally, in a low-illumination environment, to ensure that the fused image has ideal brightness and a high signal-to-noise ratio, supplementary lighting, i.e., supplementary infrared light, may be applied to the environment. Based on this idea, as shown in FIG. 7, an image fusion device provided by an embodiment of the present application may further include an infrared fill light 150;
the signal controller 140 is further configured to control the infrared fill light 150 to provide infrared supplementary light for the image sensor.
Specifically, in a specific implementation, the signal controller 140 may detect the gain value g in the brightness control signal; when g is greater than a threshold T1, the fill-light control signal is set to 1 and the infrared fill light is turned on, and when g is less than a threshold T2, the fill-light control signal is set to 0 and the infrared fill light is turned off, where T1 is greater than T2.
It should be emphasized that the specific values of T1 and T2 may be set according to actual conditions and are not limited herein; in addition, the above specific implementation by which the signal controller 140 controls the infrared fill light 150 to provide infrared supplementary light for the image sensor is merely illustrative and should not be construed as limiting the embodiments of the present application.
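The two-threshold scheme above is a classic hysteresis control: between T2 and T1 the previous state is kept, which prevents the fill light from flickering when the gain hovers near a single threshold. A sketch with illustrative threshold values (the text does not fix T1 and T2):

```python
def fill_light_state(gain, state, t_on=64.0, t_off=32.0):
    """Hysteresis control for the infrared fill light: on when the gain g
    in the brightness control signal exceeds T1 (t_on), off when it drops
    below T2 (t_off), T1 > T2; otherwise keep the previous state.
    The threshold values are illustrative."""
    if gain > t_on:
        return 1       # set fill-light control signal to 1: fill light on
    if gain < t_off:
        return 0       # set fill-light control signal to 0: fill light off
    return state       # inside the hysteresis band: keep previous state
```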
In a second aspect, to solve the problem that the image processor's large amount of computation for removing the infrared component from the color signal leads to low fusion efficiency, an embodiment of the present application provides an image fusion device.
As shown in FIG. 8, an image fusion device provided by an embodiment of the present application may include:
a light collection apparatus 210, an image processor 230, and an image sensor 220 having four types of photosensitive channels, the four types being red-green-blue RGB channels and a full-band W channel;
the light collection apparatus 210 is configured to block the spectrum of a first predetermined wavelength interval in incident light to obtain target light, the first predetermined wavelength interval being the wavelength interval of the spectrum in which the response difference between the RGB channels and the W channel of the image sensor 220 in the infrared band is higher than a first predetermined threshold;
the image sensor 220 is configured to convert the target light into an image signal through the RGB channels and the W channel;
the image processor 230 is configured to parse the image signal into a color signal and a brightness signal, and to fuse the color signal and the brightness signal to obtain a fused image, where the color signal is a signal parsed from the R channel, the G channel, or the B channel.
It should be emphasized that blocking the spectrum of the first predetermined wavelength interval in the incident light filters out the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency. The specific value of the first predetermined threshold may be set according to actual conditions and is not limited herein. It should also be emphasized that, among the three differences between the values of the R, G, and B channels and the value of the W channel, as long as any one difference is higher than the first predetermined threshold, the light collection apparatus 210 may block the spectral portion of the corresponding wavelength interval.
Optionally, to allow the infrared and visible bands to pass, the first predetermined wavelength interval may be [T1, T2], where the value of T1 lies in the interval [600nm, 800nm] and the value of T2 lies in the interval [750nm, 1100nm]. It can be understood that adding the function of blocking the spectrum of a predetermined wavelength interval of the incident light filters out the spectral region in which the RGB channels and the W channel of the image sensor 220 respond very differently in the infrared band (650nm-1100nm), so that the image signal formed by the image sensor 220 can be restored by simple operations into an accurate color signal and a brightness signal. As shown in FIG. 13, the gray part is the portion of the spectrum to be blocked and filtered out; in FIG. 13, W denotes the full-band spectral signal, R the red signal, G the green signal, and B the blue signal. It should be emphasized that FIG. 13 is merely an example without any limiting meaning; due to manufacturing processes and other factors, the actual filtering curve is usually not as steep as shown in FIG. 13 but has a slope.
Those skilled in the art can understand that the Bayer array, i.e., the BAYER array, is a data format of image sensors; a schematic diagram of the Bayer array can be seen in FIG. 2, i.e., red, green, and blue dot-matrix information is output in a mosaic manner. Since a Bayer-array-based image sensor has only the three RGB channels and cannot obtain the infrared spectrum, obtaining a fused image of a visible-light image and an infrared image requires acquiring infrared light and visible light simultaneously through prism splitting, different optical sensors, etc., which makes the structure relatively complex. To reduce structural complexity, the image sensor used in the embodiments of the present application is one having four types of photosensitive channels, so that the image signal captured by a single image sensor can be parsed into a color signal and a brightness signal. For convenience of reference, the image sensor provided by the embodiments of the present application that has red-green-blue RGB channels and a full-band channel is named an RGBW image sensor.
FIG. 9 gives a schematic diagram of the array corresponding to the RGBW image sensor. As shown in FIG. 9, the RGBW image sensor includes four types of photosensitive channels, namely the RGB channels and the W channel; specifically, the W channel is a channel sensitive to the full band, and since it senses the full band it can sense the infrared band and can serve as the brightness channel, while the RGB channels can sense both the visible band and the infrared band and are mainly used to sense the visible band. It should be emphasized that the array given in FIG. 9 is merely an illustrative example and should not be construed as limiting the embodiments of the present application. In addition, those skilled in the art can understand that, in practical applications, the arrays corresponding to RGBW image sensors have various structures, all of which can be applied to the embodiments of the present application.
It should be noted that the color signal may specifically be formed by subtracting, from the value of the R channel, the G channel, or the B channel, the IR parameter value corresponding to the pixel position, the IR parameter value being the product of the IR value at that pixel position and a second preset correction value, where the IR value is a value calculated using a predetermined formula. It can be understood that subtracting the corresponding IR parameter value from each traversed R, G, or B channel value, i.e., removing the infrared component from the color signal, avoids crosstalk between the infrared component and the three RGB signal components in the color signal and improves the image quality under low illumination. It should also be emphasized that the first preset correction value and the second preset correction value may be set according to actual conditions. For example, the first preset correction value may typically be set to 2, or, of course, to any integer or decimal from 0 to 1024 according to actual conditions, and those skilled in the art will understand that its value is not limited thereto; similarly, the second preset correction value may typically be set to 1, or to any integer or decimal from 0 to 1024, but those skilled in the art will understand that its value is not limited thereto. Optionally, in a specific implementation, the process by which the image processor 230 parses the image signal into a color signal and a brightness signal includes:
Step d1: performing an interpolation operation on the W channel in the image signal to generate a brightness signal with the same resolution as the input resolution, where the input resolution is the resolution of the image signal;
Step d2: performing interpolation operations on the R channel, the G channel, and the B channel in the image signal respectively, to generate per-channel image signals with the same resolution as the input resolution;
Step d3: traversing the per-channel image signals and calculating the IR value at each pixel position using a predetermined formula, where the predetermined formula is (R+G+B-W)/n, R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, W is the value of the W channel at the pixel, and n is the first preset correction value;
Step d4: traversing the per-channel image signals and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position to generate the color signal, the IR parameter value being the product of the IR value at that pixel position and the second preset correction value.
Taking the array shown in FIG. 9 as an example again, each small square corresponds to one pixel, and the resolution of the image signal generated by the RGBW image sensor is 8*8. Since performing an interpolation operation on the W channel in the image signal generates a brightness signal with the same resolution as the input, for FIG. 9 the resolution of the brightness signal generated after interpolating the W channel is also 8*8; FIG. 10 gives a schematic diagram of the interpolation result of the W channel. Moreover, the interpolation algorithm used for the W channel may be bilinear, bicubic, etc.; the specific interpolation process is not limited herein. It should be emphasized that the resolution of the image signal generated by an RGBW image sensor is related to the array structure; 8*8 is merely the resolution corresponding to the RGBW image sensor with the array structure shown in FIG. 9 and should not be construed as limiting the embodiments of the present application.
In addition, similarly, since interpolation operations are performed on the R, G, and B channels in the image signal respectively to generate per-channel image signals with the same resolution as the input, taking FIG. 9 as an example, the resolution of the per-channel image signals formed after interpolating the R, G, and B channels is 8*8; moreover, the interpolation algorithm used for the R, G, and B channels may be bilinear, bicubic, etc., and the algorithms used for the W channel and for the RGB channels may be the same or different, which is not limited herein.
It should be emphasized that the above specific implementation by which the image processor 230 parses the image signal into a color signal and a brightness signal is merely an example and should not be construed as limiting the embodiments of the present application.
It can be understood that, after obtaining the color signal and the brightness signal, the image processor 230 may fuse them to obtain the fused image in a variety of specific ways. Two of them are described in detail below.
In a specific implementation, the process by which the image processor 230 fuses the color signal and the brightness signal to obtain the fused image includes:
Step e1: calculating, from the color signal, an auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, and w1, w2, and w3 are weight values;
Step e2: calculating the ratio of each channel value in the color signal to the auxiliary value Y, to obtain reference channel values K1, K2, and K3 for each pixel, where K1=R/Y, K2=G/Y, and K3=B/Y;
Step e3: performing color noise reduction on the reference channel values K1, K2, and K3;
Step e4: fusing the brightness signal Y' at the corresponding pixel with the noise-reduced reference channel values K1-K3 to generate fused RGB three-channel values R', G', and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y', and B'=K3*Y'.
The embodiments of the present application do not limit the values of the weights w1, w2, and w3; color noise reduction may adopt, but is not limited to, Gaussian filtering. For example, assuming w1=1, w2=1, and w3=1, then Y=(R+G+B)/3.
In another specific implementation, the process by which the image processor 230 fuses the color signal and the brightness signal to obtain the fused image includes:
Step f1: converting the color signal into a YUV signal, the YUV signal being a luminance-and-chrominance signal;
Step f2: extracting the chrominance UV components from the YUV signal;
Step f3: combining the UV components with the brightness signal to form a new YUV signal;
Step f4: determining the formed new YUV signal as the fused image, or converting the formed new YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
It can be understood that, in the YUV format, "Y" denotes luminance (Luminance or Luma), i.e., the grayscale value, while "U" and "V" denote chrominance (Chrominance or Chroma), which describe the color and saturation of the image and specify the color of a pixel. After the UV components are extracted and before they are combined with the brightness signal, noise reduction may be performed on the UV components to remove color noise and thereby improve the picture quality of the fused image; the noise reduction may adopt, but is not limited to, Gaussian filtering. It should be emphasized that the mutual conversion between the YUV signal and the color signal may be implemented by any conversion algorithm in the related art, as may extracting the UV components from the YUV signal and combining the UV components with the brightness signal.
It should be emphasized that the above specific implementations by which the image processor 230 fuses the color signal and the brightness signal to obtain the fused image are merely examples and should not be construed as limiting the embodiments of the present application. In addition, it can be understood that the image processor 230 may first optimize the color signal and the brightness signal, and then fuse the optimized color signal and the optimized brightness signal to obtain the fused image; for example, optimizing the color signal may include low-pass filtering the color signal for noise reduction, while optimizing the brightness signal may include high-pass filtering the brightness signal for edge enhancement.
It can be seen that, in the embodiments of the present application, the light collection apparatus blocks the spectrum of the first predetermined wavelength interval in the incident light, so that the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly is filtered out, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency. Moreover, the image fusion device provided by the embodiments of the present application acquires images through an image sensor having four types of photosensitive channels; compared with optical systems that require a special design to acquire infrared and visible light simultaneously, the structural complexity is greatly reduced, so that the range of use can be extended.
To block the spectrum of the first predetermined wavelength interval in the incident light, in a specific implementation the light collection apparatus 210 includes a band-stop filter and a first-type optical lens;
the first-type optical lens is configured to transmit incident light to the band-stop filter in a full-spectrum transmission manner;
the band-stop filter is configured to block the spectrum of the first predetermined wavelength interval in the light transmitted by the first-type optical lens, so as to obtain the target light.
Specifically, the band-stop filter may be a coating integrated on the first-type optical lens by film plating, or a patch attached to the first-type optical lens, etc. It should be noted that the full-spectrum transmission manner is one in which the spectrum of all bands is transmitted, i.e., no band of the spectrum is blocked; since the first-type optical lens transmits the incident light to the band-stop filter in a full-spectrum transmission manner, the band of the light transmitted by the first-type optical lens is the same as that of the incident light, i.e., the first-type optical lens does not block the spectrum of any band. To block the spectrum of the first predetermined wavelength interval in the incident light, in another implementation, the light collection apparatus 210 may include a second-type optical lens capable of blocking the spectrum of the first predetermined wavelength interval.
It should be emphasized that the above specific implementations of the light collection apparatus 210 for blocking the spectrum of the first predetermined wavelength interval in the incident light are merely examples and should not be construed as limiting the embodiments of the present application.
Optionally, in a specific implementation, when the image sensor 220 converts the target light into an image signal, the response difference among the RGB channels within a second predetermined wavelength interval of the infrared band is lower than a second predetermined threshold, so as to ensure accurate color restoration after the infrared component is removed and thereby improve fusion quality, where the second predetermined wavelength interval is [T3, T4], T4 being greater than T3, T3 being greater than or equal to 750nm, and T4 being less than or equal to 1100nm. For example, as shown in FIG. 13, when the image sensor 220 converts the target light into an image signal, the responses of the RGB channels in the band to the right of the gray region satisfy a certain constraint. It should be emphasized that the second predetermined threshold may be set according to actual conditions and is not limited in the embodiments of the present application.
It should be noted that, in order for the response difference among the RGB channels within the second predetermined wavelength interval of the infrared band to be lower than the second predetermined threshold, the specific structure of the image sensor 220 may vary and is not limited in the embodiments of the present application; for example, specific optical elements, such as filters, may be added to the image sensor.
Optionally, in the process of converting the target light into an image signal, the image sensor 220 performs multiple exposure acquisitions within the same frame time. For the image sensor 220, in one implementation, single or multiple exposure may be set manually.
Moreover, when the image sensor 220 performs multiple exposure acquisitions within the same frame time, the process by which the image processor 230 parses the image signal into a color signal and a brightness signal may include:
parsing the image signal formed by the first type of exposure to obtain the color signal;
parsing the image signal formed by the second type of exposure to obtain the brightness signal.
The first type of exposure and the second type of exposure may have the same or different exposure durations; when the durations differ, the exposure duration of the first type of exposure may be less than that of the second type, or, equally reasonably, greater than it.
Specifically, parsing the image signal formed by the first type of exposure to obtain the color signal may include:
interpolating the R channel, the G channel, and the B channel in the image signal formed by the first type of exposure respectively, to generate per-channel image signals with the same resolution as the input resolution;
traversing the per-channel image signals and calculating the IR value at each pixel position using a predetermined formula, where the predetermined formula is (R+G+B-W)/n, R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, W is the value of the W channel at the pixel, and n is the first preset correction value;
traversing the per-channel image signals and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position to generate the color signal, where the IR parameter value is the product of the IR value at that pixel position and the second preset correction value.
Correspondingly, parsing the image signal formed by the second type of exposure to obtain the brightness signal may include:
interpolating the W channel in the image signal formed by the second type of exposure to generate a brightness signal with the same resolution as the input resolution, where the input resolution is the resolution of the image signal.
The interpolation of any channel involved in the above process may adopt a bilinear or bicubic interpolation algorithm, though it is of course not limited thereto.
Optionally, on the basis of the embodiment shown in FIG. 8, as shown in FIG. 11, an image fusion device provided by an embodiment of the present application may further include:
a signal controller 240, configured to regulate the image sensor 220 so that it forms an image signal meeting a predetermined brightness requirement.
Specifically, in one implementation, the signal controller 240 may be specifically configured to perform brightness statistics on the image signal formed by the image sensor 220 and, according to the statistical result, regulate the image sensor 220 to form an image signal meeting the predetermined brightness requirement. Specifically, the signal controller 240 may perform the following steps: (a) generating an initial brightness control signal and sending it to the image sensor 220; (b) computing the average brightness of the image signal generated by the image sensor 220, i.e., summing all pixel values and averaging; (c) comparing the average brightness with a reference value: if the difference is within a predetermined range, keeping the current value of the brightness control signal unchanged; if the difference is outside the predetermined range and the average brightness is greater than the reference value, lowering the value of the brightness control signal; and if the difference is outside the predetermined range and the average brightness is less than the reference value, raising the value of the brightness control signal.
Of course, in another implementation, the signal controller 240 may periodically send a predetermined brightness control signal to the image sensor 220, the predetermined brightness control signal being a control signal set based on the predetermined brightness requirement.
It can be understood that the signal controller 240 may also be configured to control the image sensor 220 to switch between single-exposure acquisition and multiple-exposure acquisition. It should be emphasized that the above specific implementation by which the signal controller 240 regulates the image sensor 220 to form an image signal meeting the predetermined brightness requirement is merely an example and should not be construed as limiting the embodiments of the present application.
Optionally, in a low-illumination environment, to ensure that the fused image has ideal brightness and a high signal-to-noise ratio, supplementary lighting, i.e., supplementary infrared light, may be applied to the environment. Based on this idea, as shown in FIG. 12, an image fusion device provided by an embodiment of the present application may further include an infrared fill light 250;
the signal controller 240 is further configured to control the infrared fill light 250 to provide infrared supplementary light for the image sensor.
Specifically, in a specific implementation, the signal controller 240 may detect the gain value g in the brightness control signal; when g is greater than a threshold T1, the fill-light control signal is set to 1 and the infrared fill light is turned on, and when g is less than a threshold T2, the fill-light control signal is set to 0 and the infrared fill light is turned off, where T1 is greater than T2.
It should be emphasized that the specific values of T1 and T2 may be set according to actual conditions and are not limited herein; in addition, the above specific implementation by which the signal controller 240 controls the infrared fill light 250 to provide infrared supplementary light for the image sensor is merely illustrative and should not be construed as limiting the embodiments of the present application.
In a third aspect, based on the image fusion device provided in the first aspect, an embodiment of the present application further provides an image fusion method, applied to the image fusion device provided by the first aspect of the embodiments of the present application, where the image fusion device has four types of photosensitive channels: red-green-blue RGB channels and an infrared IR channel. As shown in FIG. 14, the method may include the following steps:
S1401: blocking the spectrum of a first predetermined wavelength interval in incident light to obtain target light, where the first predetermined wavelength interval is the wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor of the image fusion device in the infrared band is higher than a first predetermined threshold;
S1402: converting the target light into an image signal through the RGB channels and the IR channel;
The specific manner of converting the target light into an image signal through the RGB channels and the IR channel may adopt any implementation in the related art, which is not limited herein.
S1403: parsing the image signal into a color signal and a brightness signal sensitive to the infrared band, and fusing the color signal and the brightness signal to obtain a fused image, where the color signal is a signal parsed from the R channel, the G channel, or the B channel.
For the specific structure of the image fusion device, reference may be made to the content described in the embodiment of the first aspect, and details are not repeated herein.
It should be emphasized that blocking the spectrum of the first predetermined wavelength interval in the incident light filters out the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency. The specific value of the first predetermined threshold may be set according to actual conditions and is not limited herein. Optionally, to allow the infrared and visible bands to pass, the first predetermined wavelength interval may be [T1, T2], where the value of T1 lies in the interval [600nm, 800nm] and the value of T2 lies in the interval [750nm, 1100nm].
It should be noted that the color signal may specifically be formed by subtracting, from the value of the R channel, the G channel, or the B channel, the IR parameter value corresponding to the pixel position, where the IR parameter value is the product of the IR channel value at that pixel position and a preset correction value. It can be understood that subtracting the corresponding IR parameter value from each traversed R, G, or B channel value, i.e., removing the infrared component from the color signal, avoids crosstalk between the infrared component and the three RGB signal components in the color signal and improves the image quality under low illumination. It should be emphasized that the preset correction value may be set according to actual conditions; for example, it may typically be set to 1, or, of course, to any integer or decimal from 0 to 1024 according to actual conditions, and those skilled in the art will understand that its value is not limited thereto.
Specifically, in a specific implementation, the step of parsing the image signal into a color signal and a brightness signal sensitive to the infrared band may include:
Step a1: performing an interpolation operation on the IR channel in the image signal to generate a brightness signal that has the same resolution as the input resolution and is sensitive to the infrared band, where the input resolution is the resolution of the image signal;
Step a2: traversing the image signal and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position, the IR parameter value being the product of the IR channel value at that pixel position and the preset correction value;
Step a3: interpolating the R channel, the G channel, and the B channel in the image signal respectively, to generate a color signal with the same resolution as the input resolution.
For the specific description of steps a1-a3 of this embodiment, reference may be made to the corresponding description given when the image fusion device is introduced in the embodiment provided in the first aspect, and details are not repeated herein.
It should be noted that there are multiple specific implementations for fusing the color signal and the brightness signal to obtain the fused image; two of them are described in detail below.
Specifically, in a specific implementation, the step of fusing the color signal and the brightness signal to obtain the fused image may include:
Step b1: calculating, from the color signal, an auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, and w1, w2, and w3 are weight values;
Step b2: calculating the ratio of each channel value in the color signal to the auxiliary value Y, to obtain reference channel values K1, K2, and K3 for each pixel, where K1=R/Y, K2=G/Y, and K3=B/Y;
Step b3: performing color noise reduction on the reference channel values K1, K2, and K3;
Step b4: fusing the brightness signal Y' at the corresponding pixel with the noise-reduced reference channel values K1-K3 to generate fused RGB three-channel values R', G', and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y', and B'=K3*Y'.
For the specific description of steps b1-b4 of this embodiment, reference may be made to the corresponding description given when the image fusion device is introduced in the embodiment provided in the first aspect, and details are not repeated herein.
Specifically, in another specific implementation, the step of fusing the color signal and the brightness signal to obtain the fused image includes:
Step c1: converting the color signal into a YUV signal, the YUV signal being a luminance-and-chrominance signal;
Step c2: extracting the chrominance UV components from the YUV signal;
Step c3: combining the UV components with the brightness signal to form a new YUV signal;
Step c4: determining the formed new YUV signal as the fused image, or converting the formed new YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
For the specific description of steps c1-c4 of this embodiment, reference may be made to the corresponding description given when the image fusion device is introduced in the embodiment provided in the first aspect, and details are not repeated herein.
In addition, in the process of converting the target light into an image signal, the image fusion device may perform multiple exposure acquisitions within the same frame time, where single or multiple exposure may be set manually, though this is of course not limiting.
Moreover, for an image fusion device performing multiple exposure acquisitions within the same frame time, the process of parsing the image signal into a color signal and a brightness signal sensitive to the infrared band may include:
parsing the image signal formed by the first type of exposure to obtain the color signal;
parsing the image signal formed by the second type of exposure to obtain the brightness signal sensitive to the infrared band.
The first type of exposure and the second type of exposure may have the same or different exposure durations; when the durations differ, the exposure duration of the first type of exposure may be less than that of the second type, or, equally reasonably, greater than it.
Specifically, parsing the image signal formed by the first type of exposure to obtain the color signal may include:
performing an interpolation operation on the IR channel in the image signal formed by the first type of exposure to obtain the IR channel value at each pixel position;
traversing the image signal formed by the first type of exposure and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position, the IR parameter value being the product of the IR channel value at that pixel position and the preset correction value;
interpolating the R channel, the G channel, and the B channel in the image signal formed by the first type of exposure respectively, to generate a color signal with the same resolution as the input resolution.
Correspondingly, parsing the image signal formed by the second type of exposure to obtain the brightness signal sensitive to the infrared band may include:
performing an interpolation operation on the IR channel in the image signal formed by the second type of exposure to generate a brightness signal that has the same resolution as the input resolution and is sensitive to the infrared band, where the input resolution is the resolution of the image signal.
The interpolation of any channel involved in the above process may adopt a bilinear or bicubic interpolation algorithm, though it is of course not limited thereto.
Moreover, to ensure a sufficient infrared signal, the image signal formed by the short exposure may be used to parse out the color signal, while the image signal formed by the long exposure is used to parse out the brightness signal, so as to ensure image quality; in this case, the exposure duration of the first type of exposure is less than that of the second type of exposure.
The image fusion method provided by the embodiments of the present application can reduce the amount of computation for removing the infrared component from the color signal, improve fusion efficiency, and achieve the purpose of acquiring dual-band images with a structurally simple image fusion device.
In a fourth aspect, based on the image fusion device provided in the second aspect, an embodiment of the present application further provides an image fusion method, applied to the image fusion device provided by the second aspect of the embodiments of the present application, where the image fusion device has four types of photosensitive channels: red-green-blue RGB channels and a full-band W channel. As shown in FIG. 15, the method may include the following steps:
S1501: blocking the spectrum of a first predetermined wavelength interval in incident light to obtain target light, the first predetermined wavelength interval being the wavelength interval of the spectrum in which the response difference between the RGB channels and the W channel of the image sensor of the image fusion device in the infrared band is higher than a first predetermined threshold;
S1502: converting the target light into an image signal through the RGB channels and the W channel;
The specific manner of converting the target light into an image signal through the RGB channels and the W channel may adopt any implementation in the related art, which is not limited herein.
S1503: parsing the image signal into a color signal and a brightness signal, and fusing the color signal and the brightness signal to obtain a fused image, where the color signal is a signal parsed from the R channel, the G channel, or the B channel.
It should be emphasized that blocking the spectrum of the first predetermined wavelength interval in the incident light filters out the portion of the near-infrared band in which the quantum efficiencies of the channels differ greatly, thereby simplifying the image processor's operation of removing the infrared component from the color signal and improving fusion efficiency. The specific value of the first predetermined threshold may be set according to actual conditions and is not limited herein.
Optionally, to allow the infrared and visible bands to pass, the first predetermined wavelength interval may be [T1, T2], where the value of T1 lies in the interval [600nm, 800nm] and the value of T2 lies in the interval [750nm, 1100nm].
For the specific structure of the image fusion device, reference may be made to the content described in the embodiment of the second aspect, and details are not repeated herein.
It should be noted that the color signal may specifically be formed by subtracting, from the value of the R channel, the G channel, or the B channel, the IR parameter value corresponding to the pixel position, the IR parameter value being the product of the IR value at that pixel position and a second preset correction value, where the IR value is a value calculated using a predetermined formula. Subtracting the corresponding IR parameter value from each traversed R, G, or B channel value, i.e., removing the infrared component from the color signal, avoids crosstalk between the infrared component and the three RGB signal components in the color signal and improves the image quality under low illumination. It should also be emphasized that the first preset correction value and the second preset correction value may be set according to actual conditions. For example, the first preset correction value may typically be set to 2, or, of course, to any integer or decimal from 0 to 1024 according to actual conditions, and those skilled in the art will understand that its value is not limited thereto; similarly, the second preset correction value may typically be set to 1, or to any integer or decimal from 0 to 1024, but those skilled in the art will understand that its value is not limited thereto.
Optionally, in a specific implementation, the step of parsing the image signal into a color signal and a brightness signal includes:
Step d1: performing an interpolation operation on the W channel in the image signal to generate a brightness signal with the same resolution as the input resolution, where the input resolution is the resolution of the image signal;
Step d2: performing interpolation operations on the R channel, the G channel, and the B channel in the image signal respectively, to generate per-channel image signals with the same resolution as the input resolution;
Step d3: traversing the per-channel image signals and calculating the IR value at each pixel position using a predetermined formula, where the predetermined formula is (R+G+B-W)/n, R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, W is the value of the W channel at the pixel, and n is the first preset correction value;
Step d4: traversing the per-channel image signals and subtracting, from each traversed R channel, G channel, or B channel value, the IR parameter value corresponding to the pixel position to generate the color signal, the IR parameter value being the product of the IR value at that pixel position and the second preset correction value.
For the specific description of steps d1-d4 of this embodiment, reference may be made to the corresponding description given when the image fusion device is introduced in the embodiment provided in the second aspect, and details are not repeated herein.
It should be noted that there are multiple specific implementations for fusing the color signal and the brightness signal to obtain the fused image; two of them are described in detail below.
Specifically, in a specific implementation, the step of fusing the color signal and the brightness signal to obtain the fused image may include:
Step e1: calculating, from the color signal, an auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R is the value of the R channel at the pixel, G is the value of the G channel at the pixel, B is the value of the B channel at the pixel, and w1, w2, and w3 are weight values;
Step e2: calculating the ratio of each channel value in the color signal to the auxiliary value Y, to obtain reference channel values K1, K2, and K3 for each pixel, where K1=R/Y, K2=G/Y, and K3=B/Y;
Step e3: performing color noise reduction on the reference channel values K1, K2, and K3;
Step e4: fusing the brightness signal Y' at the corresponding pixel with the noise-reduced reference channel values K1-K3 to generate fused RGB three-channel values R', G', and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y', and B'=K3*Y'.
For the specific description of steps e1-e4 of this embodiment, reference may be made to the corresponding description given when the image fusion device is introduced in the embodiment provided in the second aspect, and details are not repeated herein.
Specifically, in another specific implementation, the step of fusing the color signal and the brightness signal to obtain the fused image may include:
Step f1: converting the color signal into a YUV signal, the YUV signal being a luminance-and-chrominance signal;
Step f2: extracting the chrominance UV components from the YUV signal;
Step f3: combining the UV components with the brightness signal to form a new YUV signal;
Step f4: determining the formed new YUV signal as the fused image, or converting the formed new YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
For the specific description of steps f1-f4 of this embodiment, reference may be made to the corresponding description given when the image fusion device is introduced in the embodiment provided in the second aspect, and details are not repeated herein.
In addition, while converting the target light into the image signal, the image fusion device may perform multiple exposure acquisitions within the same frame time; single or multiple exposure may be set manually, although this is of course not a limitation.
Moreover, when the image fusion device performs multiple exposure acquisitions within the same frame time, the process of parsing the image signal into a color signal and a luminance signal sensitive to the infrared band may include:
parsing the image signal formed by a first type of exposure to obtain the color signal; and
parsing the image signal formed by a second type of exposure to obtain the luminance signal.
The first and second types of exposure may have the same or different exposure durations. When they differ, the exposure duration of the first type may be shorter than that of the second type; of course, it may equally well be longer. Either is reasonable.
Specifically, parsing the image signal formed by the first type of exposure to obtain the color signal may include:
interpolating the R, G and B channels of the image signal formed by the first type of exposure respectively, to generate per-channel image signals at the input resolution;
traversing the per-channel image signals and computing the IR value at each pixel position with a predetermined formula, where the predetermined formula is (R+G+B-W)/n, R, G, B and W are the values of the pixel's R, G, B and W channels, and n is the first preset correction value; and
traversing the per-channel image signals and subtracting, from each traversed R, G or B channel value, the IR parameter value for the corresponding pixel position, to generate the color signal, the IR parameter value being the product of the IR value at that pixel position and the second preset correction value.
Correspondingly, parsing the image signal formed by the second type of exposure to obtain the luminance signal may include:
interpolating the W channel of the image signal formed by the second type of exposure, to generate a luminance signal at the input resolution, the input resolution being the resolution of the image signal.
Any channel interpolation involved in the above process may use a bilinear or bicubic interpolation algorithm, although it is of course not limited to these.
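As a concrete illustration of such channel up-sampling, the following sketch fills the missing positions of a sparsely sampled channel plane by averaging the valid 4-neighbors, a simplified bilinear-style scheme. This is an assumption for illustration only; a real implementation would use proper bilinear or bicubic kernels aligned to the sensor's mosaic pattern.

```python
import numpy as np

def bilinear_fill(plane, mask):
    """Fill the missing samples of a channel plane up to the input resolution.

    plane: float array with valid samples where mask is True, zeros elsewhere.
    mask:  boolean array marking which pixels carry a real sample.
    Each missing pixel receives the mean of its in-bounds valid 4-neighbors.
    """
    out = plane.astype(float).copy()
    h, w = plane.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j]:
                continue                      # keep real samples unchanged
            vals = [plane[i2, j2]
                    for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= i2 < h and 0 <= j2 < w and mask[i2, j2]]
            if vals:
                out[i, j] = sum(vals) / len(vals)
    return out
```

On a checkerboard sampling pattern with a constant valid value, every missing pixel is reconstructed to that same value, as expected of any interpolation scheme.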
The image fusion method provided by the embodiments of this application reduces the amount of computation needed to remove the infrared component from the color signal and improves fusion efficiency, while achieving the goal of acquiring dual-band images with a structurally simple image fusion device.
Corresponding to the image fusion method provided by the third aspect above, an embodiment of this application further provides a storage medium for storing executable program code, the executable program code being run to perform the method steps of the image fusion method provided by any method embodiment of the third aspect.
After the executable program code stored in this storage medium is executed by the image fusion device provided by the first aspect, the amount of computation needed to remove the infrared component from the color signal can be reduced and fusion efficiency improved, while dual-band images can be acquired with a structurally simple image fusion device.
Corresponding to the image fusion method provided by the fourth aspect above, an embodiment of this application provides a storage medium for storing executable code, the executable code being run to perform the method steps of the image fusion method provided by any method embodiment of the fourth aspect.
After the executable code stored in this storage medium is executed by the image fusion device provided by the second aspect, the amount of computation needed to remove the infrared component from the color signal can be reduced and fusion efficiency improved, while dual-band images can be acquired with a structurally simple image fusion device.
Corresponding to the image fusion method provided by the third aspect above, an embodiment of this application provides an application program configured to perform, at runtime, the method steps of the image fusion method provided by any method embodiment of the third aspect.
When this application program runs on the image fusion device provided by the first aspect, the amount of computation needed to remove the infrared component from the color signal can be reduced and fusion efficiency improved, while dual-band images can be acquired with a structurally simple image fusion device.
Corresponding to the image fusion method provided by the fourth aspect above, an embodiment of this application provides an application program configured to perform, at runtime, the method steps of the image fusion method provided by any method embodiment of the fourth aspect.
When this application program runs on the image fusion device provided by the second aspect, the amount of computation needed to remove the infrared component from the color signal can be reduced and fusion efficiency improved, while dual-band images can be acquired with a structurally simple image fusion device.
In addition, terms such as "first type", "second type", "third type", "fourth type", "first" and "second" used in the embodiments of this application serve only to distinguish objects of the same kind by name, so that references to them are clearer and more convenient, and carry no limiting meaning.
It should be noted that relational terms such as first and second are used herein only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between them. Moreover, the terms "comprise", "include" or any of their variants are intended to cover a non-exclusive inclusion, such that a process, method, article or device including a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes it.
The embodiments in this specification are described in a related manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, refer to the partial description of the method embodiments.
The above are only preferred embodiments of this application and are not intended to limit it; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall fall within its scope of protection.

Claims (47)

  1. An image fusion device, comprising:
    a light collection apparatus, an image processor, and an image sensor having four types of photosensitive channels, the four types of photosensitive channels comprising red-green-blue RGB channels and an infrared IR channel;
    the light collection apparatus being configured to block the spectrum in a first predetermined wavelength interval of incident light to obtain target light, wherein the first predetermined wavelength interval is the wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor in the infrared band is higher than a first predetermined threshold;
    the image sensor being configured to convert the target light into an image signal through the RGB channels and the IR channel; and
    the image processor being configured to parse the image signal into a color signal and a luminance signal sensitive to the infrared band, and to fuse the color signal with the luminance signal to obtain a fused image, wherein the color signal is a signal parsed on the basis of the R, G or B channel.
  2. The image fusion device according to claim 1, wherein the color signal is formed by subtracting, from the value of the R, G or B channel, the IR parameter value for the corresponding pixel position, the IR parameter value being the product of the IR channel value at that pixel position and a preset correction value.
  3. The image fusion device according to claim 2, wherein the process by which the image processor parses the image signal into the color signal and the luminance signal comprises:
    performing an interpolation operation on the IR channel of the image signal to generate a luminance signal that is at the input resolution and sensitive to the infrared band, the input resolution being the resolution of the image signal;
    traversing the image signal and subtracting, from each traversed R, G or B channel value, the IR parameter value for the corresponding pixel position, the IR parameter value being the product of the IR channel value at that pixel position and the preset correction value; and
    interpolating the R, G and B channels of the image signal respectively, to generate a color signal at the input resolution.
  4. The image fusion device according to claim 1, wherein the process by which the image processor fuses the color signal with the luminance signal to obtain the fused image comprises:
    computing, from the color signal, the auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R, G and B are the values of the pixel's R, G and B channels, and w1, w2 and w3 are weight values;
    computing the ratio of each channel value of the color signal to the auxiliary value Y, to obtain the reference channel values K1, K2 and K3 for each pixel, where K1=R/Y, K2=G/Y and K3=B/Y;
    performing chroma noise reduction on the reference channel values K1, K2 and K3; and
    fusing the luminance signal Y' at the corresponding pixel with the denoised reference channel values K1-K3 to generate the fused RGB channel values R', G' and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y' and B'=K3*Y'.
  5. The image fusion device according to claim 1, wherein the process by which the image processor fuses the color signal with the luminance signal to obtain the fused image comprises:
    converting the color signal into a YUV signal, the YUV signal being a luma-and-chroma signal;
    extracting the chrominance UV components of the YUV signal;
    combining the UV components with the luminance signal to form a new YUV signal; and
    determining the newly formed YUV signal as the fused image, or converting the newly formed YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
  6. The image fusion device according to any one of claims 1-5, wherein the light collection apparatus comprises a band-stop filter and a first type of optical lens;
    the first type of optical lens being configured to transmit incident light to the band-stop filter in a full-spectrum transmission manner; and
    the band-stop filter being configured to block the spectrum in the first predetermined wavelength interval of the light transmitted through the first type of optical lens, to obtain the target light.
  7. The image fusion device according to any one of claims 1-5, wherein the light collection apparatus comprises a second type of optical lens capable of blocking the spectrum in the first predetermined wavelength interval.
  8. The image fusion device according to any one of claims 1-5, wherein the first predetermined wavelength interval is [T1, T2], where the value of T1 lies in the interval [600 nm, 800 nm] and the value of T2 lies in the interval [750 nm, 1100 nm].
  9. The image fusion device according to any one of claims 1-5, wherein, when the image sensor converts the target light into the image signal, the response difference between the RGB channels within a second predetermined wavelength interval of the infrared band is lower than a second predetermined threshold.
  10. The image fusion device according to claim 9, wherein the second predetermined wavelength interval is [T3, T4], where T4 is greater than T3, T3 is greater than or equal to 750 nm, and T4 is less than or equal to 1100 nm.
  11. The image fusion device according to any one of claims 1-5, wherein, in converting the target light into the image signal, the image sensor performs multiple exposure acquisitions within the same frame time.
  12. The image fusion device according to claim 11, wherein the process by which the image processor parses the image signal into the color signal and the luminance signal sensitive to the infrared band comprises:
    parsing the image signal formed by a first type of exposure to obtain the color signal; and
    parsing the image signal formed by a second type of exposure to obtain the luminance signal sensitive to the infrared band.
  13. The image fusion device according to claim 12, wherein the exposure duration of the first type of exposure is shorter than that of the second type of exposure.
  14. The image fusion device according to any one of claims 1-5, further comprising a signal controller configured to regulate the image sensor to form an image signal meeting a predetermined brightness requirement.
  15. The image fusion device according to claim 14, wherein the signal controller is specifically configured to perform brightness statistics on the image signal formed by the image sensor and, according to the statistical result, to regulate the image sensor to form an image signal meeting the predetermined brightness requirement.
  16. The image fusion device according to claim 14, wherein the signal controller is further configured to switch the image sensor between single-exposure acquisition and multiple-exposure acquisition.
  17. The image fusion device according to claim 14, further comprising an infrared fill light;
    the signal controller being further configured to control the infrared fill light to provide infrared illumination for the image sensor.
  18. An image fusion device, comprising:
    a light collection apparatus, an image processor, and an image sensor having four types of photosensitive channels, the four types of photosensitive channels comprising red-green-blue RGB channels and a full-band W channel;
    the light collection apparatus being configured to block the spectrum in a first predetermined wavelength interval of incident light to obtain target light, the first predetermined wavelength interval being the wavelength interval of the spectrum in which the response difference between the RGB channels and the W channel of the image sensor in the infrared band is higher than a first predetermined threshold;
    the image sensor being configured to convert the target light into an image signal through the RGB channels and the W channel; and
    the image processor being configured to parse the image signal into a color signal and a luminance signal, and to fuse the color signal with the luminance signal to obtain a fused image, wherein the color signal is a signal parsed on the basis of the R, G or B channel.
  19. The image fusion device according to claim 18, wherein the color signal is formed by subtracting, from the value of the R, G or B channel, the IR parameter value for the corresponding pixel position, the IR parameter value being the product of the IR value at that pixel position and a second preset correction value, the IR value being a value computed with a predetermined calculation formula.
  20. The image fusion device according to claim 19, wherein the process by which the image processor parses the image signal into the color signal and the luminance signal comprises:
    performing an interpolation operation on the W channel of the image signal to generate a luminance signal at the input resolution, the input resolution being the resolution of the image signal;
    performing interpolation operations on the R, G and B channels of the image signal respectively, to generate per-channel image signals at the input resolution;
    traversing the per-channel image signals and computing the IR value at each pixel position with a predetermined formula, where the predetermined formula is (R+G+B-W)/n, R, G, B and W are the values of the pixel's R, G, B and W channels, and n is a first preset correction value; and
    traversing the per-channel image signals and subtracting, from each traversed R, G or B channel value, the IR parameter value for the corresponding pixel position, to generate the color signal, the IR parameter value being the product of the IR value at that pixel position and the second preset correction value.
  21. The image fusion device according to claim 18, wherein the process by which the image processor fuses the color signal with the luminance signal to obtain the fused image comprises:
    computing, from the color signal, the auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R, G and B are the values of the pixel's R, G and B channels, and w1, w2 and w3 are weight values;
    computing the ratio of each channel value of the color signal to the auxiliary value Y, to obtain the reference channel values K1, K2 and K3 for each pixel, where K1=R/Y, K2=G/Y and K3=B/Y;
    performing chroma noise reduction on the reference channel values K1, K2 and K3; and
    fusing the luminance signal Y' at the corresponding pixel with the denoised reference channel values K1-K3 to generate the fused RGB channel values R', G' and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y' and B'=K3*Y'.
  22. The image fusion device according to claim 18, wherein the process by which the image processor fuses the color signal with the luminance signal to obtain the fused image comprises:
    converting the color signal into a YUV signal, the YUV signal being a luma-and-chroma signal;
    extracting the chrominance UV components of the YUV signal;
    combining the UV components with the luminance signal to form a new YUV signal; and
    determining the newly formed YUV signal as the fused image, or converting the newly formed YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
  23. The image fusion device according to any one of claims 18-22, wherein the light collection apparatus comprises a band-stop filter and a first type of optical lens;
    the first type of optical lens being configured to transmit incident light to the band-stop filter in a full-spectrum transmission manner; and
    the band-stop filter being configured to block the spectrum in the first predetermined wavelength interval of the light transmitted through the first type of optical lens, to obtain the target light.
  24. The image fusion device according to any one of claims 18-22, wherein the light collection apparatus comprises a second type of optical lens capable of blocking the spectrum in the first predetermined wavelength interval.
  25. The image fusion device according to any one of claims 18-22, wherein the first predetermined wavelength interval is [T1, T2], where the value of T1 lies in the interval [600 nm, 800 nm] and the value of T2 lies in the interval [750 nm, 1100 nm].
  26. The image fusion device according to any one of claims 18-22, wherein, when the image sensor converts the target light into the image signal, the response difference between the RGB channels within a second predetermined wavelength interval of the infrared band is lower than a second predetermined threshold.
  27. The image fusion device according to claim 26, wherein the second predetermined wavelength interval is [T3, T4], where T4 is greater than T3, T3 is greater than or equal to 750 nm, and T4 is less than or equal to 1100 nm.
  28. The image fusion device according to any one of claims 18-22, wherein, in converting the target light into the image signal, the image sensor performs multiple exposure acquisitions within the same frame time.
  29. The image fusion device according to claim 28, wherein the process by which the image processor parses the image signal into the color signal and the luminance signal comprises:
    parsing the image signal formed by a first type of exposure to obtain the color signal; and
    parsing the image signal formed by a second type of exposure to obtain the luminance signal.
  30. The image fusion device according to any one of claims 18-22, further comprising a signal controller configured to regulate the image sensor to form an image signal meeting a predetermined brightness requirement.
  31. The image fusion device according to claim 30, wherein the signal controller is specifically configured to perform brightness statistics on the image signal formed by the image sensor and, according to the statistical result, to regulate the image sensor to form an image signal meeting the predetermined brightness requirement.
  32. The image fusion device according to claim 30, wherein the signal controller is further configured to switch the image sensor between single-exposure acquisition and multiple-exposure acquisition.
  33. The image fusion device according to claim 30, further comprising an infrared fill light; the signal controller being further configured to control the infrared fill light to provide infrared illumination for the image sensor.
  34. An image fusion method, applied to an image fusion device having four types of photosensitive channels, the four types of photosensitive channels comprising red-green-blue RGB channels and an infrared IR channel, the method comprising:
    blocking the spectrum in a first predetermined wavelength interval of incident light to obtain target light, wherein the first predetermined wavelength interval is the wavelength interval of the spectrum in which the response difference between the RGB channels and the IR channel of the image sensor of the image fusion device in the infrared band is higher than a first predetermined threshold; converting the target light into an image signal through the RGB channels and the IR channel; and
    parsing the image signal into a color signal and a luminance signal sensitive to the infrared band, and fusing the color signal with the luminance signal to obtain a fused image, wherein the color signal is a signal parsed on the basis of the R, G or B channel.
  35. The method according to claim 34, wherein the color signal is formed by subtracting, from the value of the R, G or B channel, the IR parameter value for the corresponding pixel position, the IR parameter value being the product of the IR channel value at that pixel position and a preset correction value.
  36. The method according to claim 35, wherein the step of parsing the image signal into the color signal and the luminance signal sensitive to the infrared band comprises:
    performing an interpolation operation on the IR channel of the image signal to generate a luminance signal that is at the input resolution and sensitive to the infrared band, the input resolution being the resolution of the image signal;
    traversing the image signal and subtracting, from each traversed R, G or B channel value, the IR parameter value for the corresponding pixel position, the IR parameter value being the product of the IR channel value at that pixel position and the preset correction value; and
    interpolating the R, G and B channels of the image signal respectively, to generate a color signal at the input resolution.
  37. The method according to claim 34, wherein the step of fusing the color signal with the luminance signal to obtain the fused image comprises:
    computing, from the color signal, the auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R, G and B are the values of the pixel's R, G and B channels, and w1, w2 and w3 are weight values;
    computing the ratio of each channel value of the color signal to the auxiliary value Y, to obtain the reference channel values K1, K2 and K3 for each pixel, where K1=R/Y, K2=G/Y and K3=B/Y;
    performing chroma noise reduction on the reference channel values K1, K2 and K3; and
    fusing the luminance signal Y' at the corresponding pixel with the denoised reference channel values K1-K3 to generate the fused RGB channel values R', G' and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y' and B'=K3*Y'.
  38. The method according to claim 34, wherein the step of fusing the color signal with the luminance signal to obtain the fused image comprises:
    converting the color signal into a YUV signal, the YUV signal being a luma-and-chroma signal;
    extracting the chrominance UV components of the YUV signal;
    combining the UV components with the luminance signal to form a new YUV signal; and
    determining the newly formed YUV signal as the fused image, or converting the newly formed YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
  39. An image fusion method, applied to an image fusion device having four types of photosensitive channels, the four types of photosensitive channels comprising red-green-blue RGB channels and a full-band W channel, the method comprising:
    blocking the spectrum in a first predetermined wavelength interval of incident light to obtain target light, the first predetermined wavelength interval being the wavelength interval of the spectrum in which the response difference between the RGB channels and the W channel of the image sensor of the image fusion device in the infrared band is higher than a first predetermined threshold;
    converting the target light into an image signal through the RGB channels and the W channel; and
    parsing the image signal into a color signal and a luminance signal, and fusing the color signal with the luminance signal to obtain a fused image, wherein the color signal is a signal parsed on the basis of the R, G or B channel.
  40. The method according to claim 39, wherein the color signal is formed by subtracting, from the value of the R, G or B channel, the IR parameter value for the corresponding pixel position, the IR parameter value being the product of the IR value at that pixel position and a preset correction value, the IR value being a value computed with a predetermined calculation formula.
  41. The method according to claim 40, wherein the step of parsing the image signal into the color signal and the luminance signal comprises:
    performing an interpolation operation on the W channel of the image signal to generate a luminance signal at the input resolution, the input resolution being the resolution of the image signal;
    performing interpolation operations on the R, G and B channels of the image signal respectively, to generate per-channel image signals at the input resolution;
    traversing the per-channel image signals and computing the IR value at each pixel position with a predetermined formula, where the predetermined formula is (R+G+B-W)/n, R, G, B and W are the values of the pixel's R, G, B and W channels, and n is a first preset correction value; and
    traversing the per-channel image signals and subtracting, from each traversed R, G or B channel value, the IR parameter value for the corresponding pixel position, to generate the color signal, the IR parameter value being the product of the IR value at that pixel position and a second preset correction value.
  42. The method according to claim 39, wherein the step of fusing the color signal with the luminance signal to obtain the fused image comprises:
    computing, from the color signal, the auxiliary value Y for each pixel, where Y=(R*w1+G*w2+B*w3)/(w1+w2+w3), R, G and B are the values of the pixel's R, G and B channels, and w1, w2 and w3 are weight values;
    computing the ratio of each channel value of the color signal to the auxiliary value Y, to obtain the reference channel values K1, K2 and K3 for each pixel, where K1=R/Y, K2=G/Y and K3=B/Y;
    performing chroma noise reduction on the reference channel values K1, K2 and K3; and
    fusing the luminance signal Y' at the corresponding pixel with the denoised reference channel values K1-K3 to generate the fused RGB channel values R', G' and B', obtaining the fused image, where R'=K1*Y', G'=K2*Y' and B'=K3*Y'.
  43. The method according to claim 39, wherein the step of fusing the color signal with the luminance signal to obtain the fused image comprises:
    converting the color signal into a YUV signal, the YUV signal being a luma-and-chroma signal;
    extracting the chrominance UV components of the YUV signal;
    combining the UV components with the luminance signal to form a new YUV signal; and
    determining the newly formed YUV signal as the fused image, or converting the newly formed YUV signal into an RGB signal and determining the converted RGB signal as the fused image.
  44. A storage medium for storing executable program code, the executable program code being run to perform the image fusion method according to any one of claims 34-38.
  45. A storage medium for storing executable program code, the executable program code being run to perform the image fusion method according to any one of claims 39-43.
  46. An application program configured to perform, at runtime, the image fusion method according to any one of claims 34-38.
  47. An application program configured to perform, at runtime, the image fusion method according to any one of claims 39-43.
PCT/CN2018/074093 2017-02-10 2018-01-25 Image fusion apparatus and image fusion method WO2018145575A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/481,397 US11049232B2 (en) 2017-02-10 2018-01-25 Image fusion apparatus and image fusion method
EP18750832.0A EP3582490B1 (en) 2017-02-10 2018-01-25 Image fusion apparatus and image fusion method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710074203.9 2017-02-10
CN201710074203.9A CN108419062B (zh) 2017-02-10 2017-02-10 Image fusion apparatus and image fusion method

Publications (1)

Publication Number Publication Date
WO2018145575A1 true WO2018145575A1 (zh) 2018-08-16

Family

ID=63107165

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/074093 WO2018145575A1 (zh) 2017-02-10 2018-01-25 图像融合设备和图像融合方法

Country Status (4)

Country Link
US (1) US11049232B2 (zh)
EP (1) EP3582490B1 (zh)
CN (2) CN111988587B (zh)
WO (1) WO2018145575A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112163627A (zh) * 2020-10-09 2021-01-01 北京环境特性研究所 Fusion image generation method, apparatus, and system for a target object
CN112819738A (zh) * 2021-01-19 2021-05-18 合肥英睿系统技术有限公司 Infrared image fusion method and apparatus, computer device, and storage medium
CN114697584A (zh) * 2020-12-31 2022-07-01 杭州海康威视数字技术股份有限公司 Image processing system and image processing method
CN114936174A (zh) * 2022-06-09 2022-08-23 中国兵器工业计算机应用技术研究所 Image processing and fusion computation method based on an unmanned ground platform
CN115297268A (zh) * 2020-01-22 2022-11-04 杭州海康威视数字技术股份有限公司 Imaging system and image processing method

Families Citing this family (43)

Publication number Priority date Publication date Assignee Title
JP2017099616A (ja) * 2015-12-01 2017-06-08 ソニー株式会社 Surgical control device, surgical control method, program, and surgical system
WO2017149932A1 (ja) * 2016-03-03 2017-09-08 ソニー株式会社 Medical image processing device, system, method, and program
CN111988587B (zh) * 2017-02-10 2023-02-07 杭州海康威视数字技术股份有限公司 Image fusion apparatus and image fusion method
CN108419061B (zh) * 2017-02-10 2020-10-02 杭州海康威视数字技术股份有限公司 Multispectral-based image fusion device and method, and image sensor
CN109508588A (zh) * 2017-09-15 2019-03-22 杭州海康威视数字技术股份有限公司 Monitoring method, apparatus and system, electronic device, and computer-readable storage medium
KR102412278B1 (ko) * 2017-11-06 2022-06-24 삼성전자 주식회사 Camera module including a complementary-color filter array and electronic device including the same
US11641441B2 (en) * 2018-05-25 2023-05-02 Fluke Corporation Optical gas imaging systems and method compatible with uncooled thermal imaging cameras
TWI694721B (zh) * 2018-10-08 2020-05-21 瑞昱半導體股份有限公司 Infrared crosstalk compensation method and apparatus thereof
CN109377468A (zh) * 2018-10-09 2019-02-22 湖南源信光电科技股份有限公司 Multi-feature-based pseudo-color fusion method for infrared radiation and polarization images
CN111433810A (zh) 2018-12-04 2020-07-17 深圳市大疆创新科技有限公司 Target image acquisition method, photographing device, and unmanned aerial vehicle
CN110493532B (zh) * 2018-12-12 2021-06-29 杭州海康威视数字技术股份有限公司 Image processing method and system
CN111951200B (zh) * 2019-05-15 2023-11-14 杭州海康威视数字技术股份有限公司 Imaging device, image fusion method and apparatus, and storage medium
CN110493494B (zh) * 2019-05-31 2021-02-26 杭州海康威视数字技术股份有限公司 Image fusion apparatus and image fusion method
CN110490041B (zh) * 2019-05-31 2022-03-15 杭州海康威视数字技术股份有限公司 Face image acquisition apparatus and method
CN110493491B (zh) * 2019-05-31 2021-02-26 杭州海康威视数字技术股份有限公司 Image acquisition apparatus and imaging method
CN110493492B (zh) 2019-05-31 2021-02-26 杭州海康威视数字技术股份有限公司 Image acquisition apparatus and image acquisition method
CN110493537B (zh) * 2019-06-14 2022-08-02 杭州海康威视数字技术股份有限公司 Image acquisition apparatus and image acquisition method
CN110519489B (zh) * 2019-06-20 2021-04-06 杭州海康威视数字技术股份有限公司 Image acquisition method and apparatus
CN110290370B (zh) * 2019-07-05 2020-09-18 上海富瀚微电子股份有限公司 Image processing method and apparatus
CN112217962B (zh) * 2019-07-10 2022-04-05 杭州海康威视数字技术股份有限公司 Camera and image generation method
CN112241668A (zh) * 2019-07-18 2021-01-19 杭州海康威视数字技术股份有限公司 Image processing method, apparatus, and device
CN110574367A (zh) * 2019-07-31 2019-12-13 华为技术有限公司 Image sensor and image sensing method
CN110507283A (zh) * 2019-09-17 2019-11-29 福州鑫图光电有限公司 Retinal camera and implementation method thereof
CN111402306A (zh) * 2020-03-13 2020-07-10 中国人民解放军32801部队 Deep-learning-based low-light/infrared image color fusion method and system
CN113469160B (zh) * 2020-03-30 2024-03-12 浙江宇视科技有限公司 Method, apparatus, storage medium, and device for removing an invisible-light component
CN113940052B (zh) * 2020-04-29 2023-01-20 华为技术有限公司 Camera and image acquisition method
CN111507930B (zh) * 2020-06-18 2023-10-10 杭州海康威视数字技术股份有限公司 Image fusion method and apparatus, storage medium, and computer device
CN111741277B (zh) * 2020-07-13 2022-04-29 深圳市汇顶科技股份有限公司 Image processing method and image processing apparatus
WO2022027469A1 (zh) * 2020-08-06 2022-02-10 深圳市汇顶科技股份有限公司 Image processing method, apparatus, and storage medium
CN114143443B (zh) * 2020-09-04 2024-04-05 聚晶半导体股份有限公司 Dual-sensor imaging system and imaging method thereof
US11689822B2 (en) * 2020-09-04 2023-06-27 Altek Semiconductor Corp. Dual sensor imaging system and privacy protection imaging method thereof
WO2022103429A1 (en) * 2020-11-12 2022-05-19 Innopeak Technology, Inc. Image fusion with base-detail decomposition and flexible color and details adjustment
CN114612369A (zh) * 2020-12-04 2022-06-10 深圳超多维科技有限公司 Image fusion method and apparatus, and electronic device
CN114697586B (zh) * 2020-12-31 2023-12-29 杭州海康威视数字技术股份有限公司 Image processing system and near-infrared fill-light control method and apparatus
CN112884688B (zh) * 2021-02-03 2024-03-29 浙江大华技术股份有限公司 Image fusion method, apparatus, device, and medium
CN113115012B (zh) * 2021-04-06 2022-09-13 展讯通信(上海)有限公司 Image processing method and related apparatus
US20230034109A1 (en) * 2021-07-15 2023-02-02 Samsung Electronics Co., Ltd. Apparatus and method for interband denoising and sharpening of images
CN113781326A (zh) * 2021-08-11 2021-12-10 北京旷视科技有限公司 Demosaicing method and apparatus, electronic device, and storage medium
CN113824936B (zh) * 2021-09-23 2024-02-09 合肥埃科光电科技股份有限公司 Color interpolation method, apparatus, and device for a color-filter-array line-scan camera
CN114693580B (zh) * 2022-05-31 2022-10-18 荣耀终端有限公司 Image processing method and related device
CN114782502B (zh) * 2022-06-16 2022-11-04 浙江宇视科技有限公司 Multispectral multi-sensor cooperative processing method and apparatus, and storage medium
CN115409754B (zh) * 2022-11-02 2023-03-24 深圳深知未来智能有限公司 Multi-exposure image fusion method and system based on image-region validity
CN117115061B (zh) * 2023-09-11 2024-04-09 北京理工大学 Multimodal image fusion method, apparatus, device, and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
US8619143B2 (en) * 2010-03-19 2013-12-31 Pixim, Inc. Image sensor including color and infrared pixels
CN105323567A (zh) * 2014-07-11 2016-02-10 恒景科技股份有限公司 Color processing system and apparatus
CN105514132A (zh) * 2014-10-08 2016-04-20 全视技术有限公司 Dual-mode image sensor with a signal-separating color filter array and method thereof
CN105704463A (zh) * 2014-12-09 2016-06-22 意法半导体(R&D)有限公司 Image sensor using pixels with combined RGB and IR sensing

Family Cites Families (19)

Publication number Priority date Publication date Assignee Title
JP4695550B2 (ja) * 2006-06-22 2011-06-08 富士フイルム株式会社 Solid-state imaging device and driving method thereof
US8520970B2 (en) * 2010-04-23 2013-08-27 Flir Systems Ab Infrared resolution and contrast enhancement with fusion
CN102687502B (zh) * 2009-08-25 2015-07-08 双光圈国际株式会社 Reducing noise in a color image
US8408821B2 (en) * 2010-10-12 2013-04-02 Omnivision Technologies, Inc. Visible and infrared dual mode imaging system
JP2014515587A (ja) * 2011-06-01 2014-06-30 ザ ボード オブ トラスティーズ オブ ザ レランド スタンフォード ジュニア ユニバーシティー Learning an image processing pipeline for digital imaging devices
WO2013090922A1 (en) * 2011-12-16 2013-06-20 Tenebraex Corporation Systems and methods for creating full-color image in low light
US9173570B2 (en) * 2012-04-12 2015-11-03 Thomas Nathan Millikan Viewing and processing multispectral images
US20140218598A1 (en) * 2013-02-07 2014-08-07 Power Lens Technology Inc. Lens assembly applicable to an image sensor
US9497429B2 (en) * 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
US9692992B2 (en) * 2013-07-01 2017-06-27 Omnivision Technologies, Inc. Color and infrared filter array patterns to reduce color aliasing
CN108769502B (zh) * 2014-06-24 2020-10-30 麦克赛尔株式会社 Imaging processing device and imaging processing method
CN106664378B (zh) * 2014-08-20 2020-05-19 松下知识产权经营株式会社 Solid-state imaging device and camera
JP6568719B2 (ja) 2014-08-29 2019-08-28 株式会社 日立産業制御ソリューションズ Imaging method and imaging device
CN104683767B (zh) * 2015-02-10 2018-03-06 浙江宇视科技有限公司 Fog-penetrating image generation method and device
KR102287944B1 (ko) * 2015-12-22 2021-08-09 삼성전자주식회사 Image output apparatus and image output method
EP3433816A1 (en) * 2016-03-22 2019-01-30 URU, Inc. Apparatus, systems, and methods for integrating digital media content into other digital media content
WO2018120238A1 (zh) * 2016-12-30 2018-07-05 华为技术有限公司 Device, method, and graphical user interface for processing documents
CN111988587B (zh) 2017-02-10 2023-02-07 杭州海康威视数字技术股份有限公司 Image fusion apparatus and image fusion method
CN108419061B (zh) * 2017-02-10 2020-10-02 杭州海康威视数字技术股份有限公司 Multispectral-based image fusion device and method, and image sensor

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
US8619143B2 (en) * 2010-03-19 2013-12-31 Pixim, Inc. Image sensor including color and infrared pixels
CN105323567A (zh) * 2014-07-11 2016-02-10 恒景科技股份有限公司 Color processing system and apparatus
CN105514132A (zh) * 2014-10-08 2016-04-20 全视技术有限公司 Dual-mode image sensor with a signal-separating color filter array and method thereof
CN105704463A (zh) * 2014-12-09 2016-06-22 意法半导体(R&D)有限公司 Image sensor using pixels with combined RGB and IR sensing

Non-Patent Citations (1)

Title
See also references of EP3582490A4

Cited By (10)

Publication number Priority date Publication date Assignee Title
CN115297268A (zh) * 2020-01-22 2022-11-04 杭州海康威视数字技术股份有限公司 Imaging system and image processing method
CN115297268B (zh) * 2020-01-22 2024-01-05 杭州海康威视数字技术股份有限公司 Imaging system and image processing method
CN112163627A (zh) * 2020-10-09 2021-01-01 北京环境特性研究所 Fusion image generation method, apparatus, and system for a target object
CN112163627B (zh) * 2020-10-09 2024-01-23 北京环境特性研究所 Fusion image generation method, apparatus, and system for a target object
CN114697584A (zh) * 2020-12-31 2022-07-01 杭州海康威视数字技术股份有限公司 Image processing system and image processing method
CN114697584B (zh) * 2020-12-31 2023-12-26 杭州海康威视数字技术股份有限公司 Image processing system and image processing method
CN112819738A (zh) * 2021-01-19 2021-05-18 合肥英睿系统技术有限公司 Infrared image fusion method and apparatus, computer device, and storage medium
CN112819738B (zh) * 2021-01-19 2024-01-02 合肥英睿系统技术有限公司 Infrared image fusion method and apparatus, computer device, and storage medium
CN114936174A (zh) * 2022-06-09 2022-08-23 中国兵器工业计算机应用技术研究所 Image processing and fusion computation method based on an unmanned ground platform
CN114936174B (zh) * 2022-06-09 2024-01-30 中国兵器工业计算机应用技术研究所 Image processing and fusion computation method based on an unmanned ground platform

Also Published As

Publication number Publication date
US20190378258A1 (en) 2019-12-12
CN108419062A (zh) 2018-08-17
CN111988587A (zh) 2020-11-24
EP3582490A1 (en) 2019-12-18
CN108419062B (zh) 2020-10-02
CN111988587B (zh) 2023-02-07
EP3582490A4 (en) 2020-02-26
US11049232B2 (en) 2021-06-29
EP3582490B1 (en) 2021-07-07

Similar Documents

Publication Publication Date Title
WO2018145575A1 (zh) Image fusion apparatus and image fusion method
WO2018145576A1 (zh) Multispectral-based image fusion device and method, and image sensor
US11252345B2 (en) Dual-spectrum camera system based on a single sensor and image processing method
WO2019119842A1 (zh) Image fusion method and apparatus, electronic device, and computer-readable storage medium
KR102287944B1 (ko) Image output apparatus and image output method
WO2017202061A1 (zh) Image defogging method and image acquisition device implementing image defogging
JP4346634B2 (ja) Target object detection device
CN110493532B (zh) Image processing method and system
US8248496B2 (en) Image processing apparatus, image processing method, and image sensor
CN110493583B (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
US9936172B2 (en) Signal processing device, signal processing method, and signal processing program for performing color reproduction of an image
US9131199B2 (en) Imaging apparatus capable of generating an appropriate color image
EP3688977B1 (en) Generating a monochrome image
KR20180118432A (ko) Image processing apparatus and method for sensitivity improvement
US8704925B2 (en) Image sensing apparatus including a single-plate image sensor having five or more brands
Rush et al. X3 sensor characteristics
JP7299762B2 (ja) Image processing apparatus and method, imaging apparatus, and program
US9185377B1 (en) Color processing system and apparatus
WO2024016288A1 (zh) Imaging apparatus and control method
JP3976562B2 (ja) Color correction matrix determination method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18750832

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018750832

Country of ref document: EP

Effective date: 20190910