WO2022036539A1 - Multi-camera color consistency correction method and apparatus

Multi-camera color consistency correction method and apparatus

Info

Publication number
WO2022036539A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
image
parameter
color mapping
determining
Prior art date
Application number
PCT/CN2020/109722
Other languages
English (en)
French (fr)
Inventor
王月红
郑炳坤
尹玄武
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to PCT/CN2020/109722 (WO2022036539A1)
Priority to EP20949751.0A (EP4195662A4)
Priority to CN202080103950.0A (CN116158087A)
Publication of WO2022036539A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/56 Processing of colour picture signals
    • H04N1/60 Colour correction or control
    • H04N1/603 Colour correction or control controlled by characteristics of the picture signal generator or the picture reproducer
    • H04N1/6052 Matching two or more picture signal generators or two or more picture reproducers
    • H04N1/6077 Colour balance, e.g. colour cast correction
    • H04N1/6083 Colour correction or control controlled by factors external to the apparatus
    • H04N1/6086 Colour correction or control controlled by scene illuminant, i.e. conditions at the time of picture capture, e.g. flash, optical filter used, evening, cloud, daylight, artificial lighting, white point measurement, colour temperature
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88 Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Definitions

  • the present application relates to the field of image processing, and in particular, to a multi-camera color consistency correction method and device.
  • multiple cameras are often used to shoot the same scene, or more and more products and systems are equipped with multiple cameras to shoot images.
  • multiple cameras with different focal lengths and different characteristics are mounted on terminal devices such as mobile phones, which can provide users with high-quality images.
  • However, the colors of the images actually obtained by each camera may differ.
  • The degree of color difference varies with changes in lighting and shooting scenes. Therefore, in practical applications, the colors of the images obtained by the multiple cameras need to be adjusted in real time according to changes in lighting, shooting scene and so on, so that the colors remain consistent.
  • the present application provides a multi-camera color consistency correction method and device, which can adjust the color consistency of images obtained by multiple cameras under different illuminations.
  • In a first aspect, a multi-camera color consistency correction method is provided, comprising: acquiring a first image captured by a first camera and a second image captured by a second camera; determining at least one color mapping parameter from N color mapping parameters according to image information indicated by the first image, where the image information includes at least one of the color information of the first image and the ambient light source of the first image, and a color mapping parameter indicates the color conversion relationship between an image captured by the first camera and an image captured by the second camera;
  • the N color mapping parameters are in one-to-one correspondence with the N standard light sources, and N is a positive integer; and performing color consistency correction on the second image according to the at least one color mapping parameter to obtain a corrected image.
  • the color mapping parameters are calibrated under different standard light sources, and the color mapping parameters corresponding to each standard light source can be obtained.
  • color consistency correction is performed on the second image in combination with at least one color mapping parameter, which can realize real-time color correction under different lighting conditions and improve the accuracy of color consistency correction.
  • before determining the at least one color mapping parameter from the N color mapping parameters, the method further includes: determining a first calibration image and a second calibration image under each standard light source, where the first calibration image and the second calibration image are color card images generated according to the spectral response curves of the first camera and the second camera respectively; and determining the color mapping parameter corresponding to each standard light source according to the first calibration image and the second calibration image.
  • calibration images under different light source conditions can be simulated and generated, which reduces the time cost of calibration data shooting, reduces the unstable factors introduced by shooting, and improves the stability and accuracy of color mapping parameter calibration.
  • performing color consistency correction on the second image according to the at least one color mapping parameter includes: determining a common image area of the first image and the second image; determining a color compensation parameter according to the common image area and the at least one color mapping parameter; determining a white balance compensation parameter according to the common image area; and performing color consistency correction on the second image according to the white balance compensation parameter and the color compensation parameter.
  • the gray area of the second image can be corrected according to the white balance compensation parameter, and the color part of the second image can be corrected according to the color compensation parameter.
  • By combining white balance compensation and color compensation, both the gray areas and the colored areas of the second image are corrected for color consistency, improving the effect of color correction.
  • determining the common image area of the first image and the second image includes: determining a search area according to the relative positions and fields of view of the first camera and the second camera; and determining the common image area according to the search area.
  • the range of image matching can be determined in combination with the calibration information of the cameras, so as to improve the accuracy of image matching and the search efficiency.
  • determining the color compensation parameter according to the common image area and the at least one color mapping parameter includes: respectively applying the N color mapping parameters to the common image area in the second image to obtain N third images; respectively calculating the color difference between the common image area in the first image and each third image; determining the at least one color mapping parameter according to the color differences, where the at least one color mapping parameter is the color mapping parameter corresponding to the at least one third image with the smallest color difference; determining a target color mapping parameter according to the at least one color mapping parameter, where the target color mapping parameter is a weighted value of the at least one color mapping parameter; and determining the color compensation parameter according to the target color mapping parameter.
  • determining the color compensation parameter according to the common image area and the at least one color mapping parameter includes: determining the ambient light source according to the white balance gain of the common image area in the first image; determining at least one color mapping parameter corresponding to at least one standard light source according to the ambient light source, where the difference between the at least one standard light source and the ambient light source is the smallest; determining a target color mapping parameter according to the at least one color mapping parameter, where the target color mapping parameter is a weighted value of the at least one color mapping parameter; and determining the color compensation parameter according to the target color mapping parameter.
  • the color compensation parameters can be determined in various ways, and a plurality of related color mapping parameters can be fused according to actual lighting conditions, and the target color mapping parameters determined in this way are more accurate.
  • the color compensation parameter is the target color mapping parameter, or the color compensation parameter is the product of the target color mapping parameter, the white balance gain of the first image, and the color restoration parameter.
  • determining the white balance compensation parameter according to the common image area includes: respectively determining a weighted average value or a weighted color histogram, in the three color channels, of the pixels of the common image area in the first image; respectively determining a weighted average value or a weighted color histogram, in the three color channels, of the pixels of the common image area in the second image; and determining the white balance compensation parameter according to the weighted average values or the weighted color histograms of the three color channels.
  • before determining the white balance compensation parameter according to the common image area, the method further includes: dividing the common image area into M blocks according to the spatial position, color similarity and edge information of the common image area, where M is a positive integer. In this case, determining the white balance compensation parameter according to the common image area includes: respectively determining the weighted average value or weighted color histogram, in the three color channels, of the image blocks of the common image area in the first image; respectively determining the weighted average value or weighted color histogram, in the three color channels, of the image blocks of the common image area in the second image; and determining the white balance compensation parameter according to the weighted average values or weighted color histograms of the three color channels.
  • In a second aspect, a multi-camera color consistency correction device is provided, comprising: an acquisition module, configured to acquire a first image captured by a first camera and a second image captured by a second camera; a determining module, configured to determine at least one color mapping parameter from N color mapping parameters according to image information indicated by the first image, where the image information includes at least one of the color information of the first image and the ambient light source of the first image, a color mapping parameter indicates the color conversion relationship between an image captured by the first camera and an image captured by the second camera, the N color mapping parameters are in one-to-one correspondence with N standard light sources, and N is a positive integer; and a correction module, configured to perform color consistency correction on the second image according to the at least one color mapping parameter to obtain a corrected image.
  • the color mapping parameters are calibrated under different standard light sources, and the color mapping parameters corresponding to each standard light source can be obtained.
  • color consistency correction is performed on the second image in combination with at least one color mapping parameter, which can realize real-time color correction under different lighting conditions and improve the accuracy of color consistency correction.
  • before determining the at least one color mapping parameter from the N color mapping parameters, the determining module is specifically configured to: determine a first calibration image and a second calibration image under each standard light source, where the first calibration image and the second calibration image are color card images generated according to the spectral response curves of the first camera and the second camera respectively; and determine the color mapping parameter corresponding to each standard light source according to the first calibration image and the second calibration image.
  • calibration images under different light source conditions can be simulated and generated, which reduces the time cost of calibration data shooting, reduces the unstable factors introduced by shooting, and improves the stability and accuracy of color mapping parameter calibration.
  • the determining module is specifically configured to: determine a common image area of the first image and the second image; determine a color compensation parameter according to the common image area and the at least one color mapping parameter; and determine a white balance compensation parameter according to the common image area. The correction module is specifically configured to: perform color consistency correction on the second image according to the white balance compensation parameter and the color compensation parameter.
  • White balance compensation can correct the gray areas of the second image, and color compensation can correct the colored areas of the second image. By combining white balance compensation and color compensation, both the gray areas and the colored areas of the second image can be corrected for color consistency, improving the effect of color correction.
  • the determining module is specifically configured to: determine the search area according to the relative position and field of view of the first camera and the second camera; and determine the common image area according to the search area.
  • the range of image matching can be determined in combination with the calibration information of the cameras, so as to improve the accuracy of image matching and the search efficiency.
  • the determining module is specifically configured to: respectively apply the N color mapping parameters to the common image area in the second image to obtain N third images; respectively calculate the color difference between the common image area in the first image and each third image; determine the at least one color mapping parameter according to the color differences, where the at least one color mapping parameter is the color mapping parameter corresponding to the at least one third image with the smallest color difference; determine a target color mapping parameter according to the at least one color mapping parameter, where the target color mapping parameter is a weighted value of the at least one color mapping parameter; and determine the color compensation parameter according to the target color mapping parameter.
  • the determining module is specifically configured to: determine the ambient light source according to the white balance gain of the common image area in the first image; determine at least one color mapping parameter corresponding to at least one standard light source according to the ambient light source, where the difference between the at least one standard light source and the ambient light source is the smallest; determine a target color mapping parameter according to the at least one color mapping parameter, where the target color mapping parameter is a weighted value of the at least one color mapping parameter; and determine the color compensation parameter according to the target color mapping parameter.
  • the color compensation parameters can be determined in various ways, and the relevant color mapping parameters can be fused according to the actual lighting conditions, and the target color mapping parameters determined in this way are more accurate.
  • the color compensation parameter is the target color mapping parameter, or the color compensation parameter is the product of the target color mapping parameter, the white balance gain of the first image, and the color restoration parameter.
  • the determining module is specifically configured to: respectively determine the weighted average value or the weighted color histogram, in the three color channels, of the pixels in the common image area in the first image; respectively determine the weighted average value or the weighted color histogram, in the three color channels, of the pixels in the common image area in the second image; and determine the white balance compensation parameter according to the weighted average values or the weighted color histograms of the three color channels.
  • the determining module is further configured to: divide the common image area into M blocks according to the spatial position, color similarity and edge information of the common image area, where M is a positive integer; respectively determine the weighted average value or weighted color histogram, in the three color channels, of the image blocks of the common image area in the first image; respectively determine the weighted average value or weighted color histogram, in the three color channels, of the image blocks of the common image area in the second image; and determine the white balance compensation parameter according to the weighted average values or weighted color histograms of the three color channels.
  • In a third aspect, a computer-readable medium is provided, which stores program code for execution by a device, the program code being used to execute the color consistency correction method in the first aspect or any implementation of the first aspect.
  • In a fourth aspect, a computer program product is provided, comprising computer program code which, when run on a computer, causes the computer to execute the color consistency correction method in the first aspect or any implementation of the first aspect.
  • In a fifth aspect, a chip is provided, the chip comprising a processor and a data interface, where the processor reads instructions stored in a memory through the data interface and executes the color consistency correction method in the first aspect or any implementation of the first aspect.
  • Optionally, the chip may further include a memory storing instructions, and the processor is configured to execute the instructions stored in the memory; when the instructions are executed, the processor is configured to execute the color consistency correction method in the first aspect or any implementation of the first aspect.
  • In a sixth aspect, an apparatus is provided, comprising a processor and a memory, where the memory is used to store computer program code, and when the computer program code is executed on the processor, the apparatus is caused to perform the color consistency correction method in the first aspect or any implementation of the first aspect.
  • FIG. 2 is a schematic flowchart of a multi-camera color consistency correction method according to an embodiment of the present application
  • FIG. 3 is a schematic flowchart of a method for determining a color mapping parameter according to an embodiment of the present application
  • FIG. 4 is a schematic flowchart of another method for determining color mapping parameters according to an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of a second image color correction method according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of image matching according to an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of another second image color correction method according to an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a multi-camera color consistency correction device according to an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of another multi-camera color consistency correction apparatus according to an embodiment of the present application.
  • the image acquired by one camera in the multi-camera is usually used as the main image (for example, the first image), and the other images are used as auxiliary images (for example, the second image).
  • In some existing techniques, the parameters of the three color channels of the second image are adjusted according to the mapping relationship between the parameters of the three color channels of the first image and the parameters of the three color channels of the second image. However, this adjustment handles the parameters of each color channel individually.
  • the compensation component of the red (red, R) channel is calculated and applied to the R channel of the second image.
  • The influence of the green (green, G) and blue (blue, B) channels is often not considered. When the spectral differences between different cameras are large, single-channel adjustment is less effective for image color consistency.
  • In addition, the color mapping parameters between different cameras are often different; a color mapping parameter indicates the color conversion relationship between the image obtained by one camera (for example, the first camera) and the image obtained by another camera (for example, the second camera).
  • In existing multi-camera color correction technology, although different lighting conditions are considered when calibrating the color mapping parameters, only one parameter is used when the color consistency of the image is adjusted; when the lighting changes, color consistency adjustment of multiple images cannot be achieved.
  • Embodiments of the present application provide a multi-camera color consistency correction method and device, which can realize real-time correction of color consistency of images acquired by multiple cameras under different lighting environments.
  • the embodiments of the present application may be applied to a multi-camera (or multi-camera) scenario, where multiple cameras may be located on the same device, for example, a terminal device such as a mobile phone equipped with multiple cameras, a vehicle equipped with multiple cameras, etc.; The cameras may also be located on different devices, for example, multiple cameras on different devices are used to shoot the same scene, etc., which is not limited in this embodiment of the present application.
  • The following uses FIG. 1 as an example to introduce an application scenario of an embodiment of the present application.
  • one camera among the plurality of cameras is selected as the main camera (for example, referred to as "first camera"), and the other cameras are selected as auxiliary cameras (for example, referred to as "second camera”).
  • The light reflected by the scene is projected onto the two image sensors through the camera lenses to form digital image signals, and the two image signals pass through their respective preprocessing modules to perform operations such as white balance gain calculation and color restoration parameter calculation; then the two images and the related parameters enter the color correction module.
  • Taking the image of the first camera as the target image effect, the color correction module calculates the color compensation parameters and white balance compensation parameters of the second image obtained by the second camera, in combination with the parameters of the first image and the calibrated color mapping parameters. The second image is then adjusted according to the calculated color compensation parameters and white balance compensation parameters, and a corrected image whose color is consistent with that of the first image can be obtained.
  • Optionally, post-processing such as gamma transformation and dynamic adjustment may be performed on the two images respectively, so as to obtain a first image and a corrected image with better display effects.
  • FIG. 2 is a schematic flowchart of a multi-camera color consistency correction method according to an embodiment of the present application. As shown in FIG. 2 , the multi-camera color consistency correction method according to the embodiment of the present application includes steps S210 to S220.
  • the N standard light sources correspond to the N color mapping parameters one-to-one.
  • the color map parameter may represent the conversion relationship of the colors of two images (eg, image 1 and image 2).
  • the description methods of the color mapping parameters may include linear transformation matrices, high-order polynomials, neural networks, and the like.
  • the image in the embodiment of the present application adopts the RGB color mode, and each pixel point in the image is described by the values of three color channels of R, G, and B.
  • For example, the color mapping parameter can be a 3×3 linear transformation matrix.
  • the color mapping parameter in matrix form is shown in formula (1):
  • (R_{1i}, G_{1i}, B_{1i})^T = T · (R_{2i}, G_{2i}, B_{2i})^T    (1)
  • where (R_{1i}, G_{1i}, B_{1i}) represents the values of the three color channels R, G and B of the i-th pixel in image 1, (R_{2i}, G_{2i}, B_{2i}) represents the values of the three color channels R, G and B of the i-th pixel in image 2, and the 3×3 matrix T is the color mapping parameter, which represents the color mapping relationship between image 1 and image 2.
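  • As an illustrative sketch only (not part of the original disclosure), the following Python code shows how a 3×3 color mapping parameter T of the kind in formula (1) could be applied to the RGB pixels of image 2 to approximate the colors of image 1; the function name, array shapes and the example matrix values are assumptions.

```python
import numpy as np

def apply_color_mapping(image2: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Apply a 3x3 color mapping matrix T to an H x W x 3 RGB image.

    Per formula (1), each output pixel is T @ (R2, G2, B2)^T.
    """
    h, w, _ = image2.shape
    pixels = image2.reshape(-1, 3).astype(np.float64)   # each row is (R2, G2, B2)
    mapped = pixels @ T.T                                # row-vector form of T @ pixel
    return mapped.reshape(h, w, 3)

# Hypothetical usage: T calibrated under one standard light source.
T = np.array([[1.02, 0.01, -0.03],
              [0.00, 0.98,  0.02],
              [-0.01, 0.03, 0.99]])
image2 = np.random.rand(4, 4, 3)        # stand-in for the second camera's image
image1_like = apply_color_mapping(image2, T)
```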
  • For example, the standard light sources may include an American window spotlight (A), simulated sunlight (D50), simulated blue sky daylight (D65), simulated northern average sunlight (D75), simulated horizontal daylight (H), a simulated American warm white store light source (U30), a European/Japanese/Chinese commercial light source (TL84), an American cool white fluorescent light source (CWF) and other light sources.
  • Mode 1: perform color mapping parameter calibration according to the first calibration image and the second calibration image after white balance, to determine the first color mapping parameter.
  • Mode 2: perform color mapping parameter calibration according to the first calibration image and the second calibration image before white balance, to determine the second color mapping parameter.
  • FIG. 3 is a schematic flowchart of determining the first color mapping parameter in Mode 1.
  • The spectral response curve describes the spectral sensitivity of the camera, usually with the wavelength as the abscissa and the corresponding spectral response as the ordinate.
  • the calibration image is a color card image used to calibrate the color mapping parameters.
  • The calibration images can be simulated based on the imaging model, the spectral response curves, the light source spectrum and the reflectance spectrum of a 24-color card, to generate the color card images of the two cameras or other reference images.
  • The first calibration image and the second calibration image are generated according to the spectral response curves of the first camera and the second camera respectively.
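  • The following is a rough Python sketch of how such calibration images might be simulated under a simple linear imaging model from the camera spectral response curves, the light source spectrum and the patch reflectance spectra; the wavelength sampling, array layouts and random stand-in data are assumptions, not values from the application.

```python
import numpy as np

def simulate_color_card(spectral_response: np.ndarray,
                        light_spectrum: np.ndarray,
                        reflectances: np.ndarray) -> np.ndarray:
    """Simulate raw RGB patch values of a color card for one camera.

    spectral_response: (num_wavelengths, 3) R/G/B sensitivity of the camera
    light_spectrum:    (num_wavelengths,)   power of the standard light source
    reflectances:      (24, num_wavelengths) reflectance spectra of the 24 patches
    Returns a (24, 3) array of simulated RGB patch values.
    """
    # Per-wavelength radiance reflected by each patch under this light source.
    radiance = reflectances * light_spectrum[None, :]          # (24, W)
    # Integrate against the camera's spectral response (simple linear model).
    rgb = radiance @ spectral_response                         # (24, 3)
    return rgb / rgb.max()                                     # normalize for convenience

# Hypothetical usage: the same light source imaged by the two cameras gives
# the first and second calibration images used for color mapping calibration.
wavelengths = np.arange(400, 701, 10)
resp_cam1 = np.random.rand(len(wavelengths), 3)   # stand-in spectral response curves
resp_cam2 = np.random.rand(len(wavelengths), 3)
light_d65 = np.random.rand(len(wavelengths))      # stand-in D65 spectrum
patches = np.random.rand(24, len(wavelengths))    # stand-in 24-color card reflectances
calib1 = simulate_color_card(resp_cam1, light_d65, patches)
calib2 = simulate_color_card(resp_cam2, light_d65, patches)
```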
  • In step S320, the method for calibrating the first color mapping parameter is introduced by taking a 24-color card as an example. In other embodiments, other color cards, or other items containing multiple colors that can be used as the reference image, may also be used to perform the calibration of the color mapping parameters.
  • S330: perform white balance processing on the first calibration image and the second calibration image respectively.
  • the first calibration image and the second calibration image are respectively generated according to the spectral response curves of the first camera and the second camera, and the corresponding white balance gains are calculated according to the color values of the gray blocks in the two images respectively.
  • the reference value of the light source may also be calculated by using the information of the current light source, and the reference value of the light source may be applied to the two calibration images to perform white balance processing.
  • S340: calculate the first color mapping parameter. From the two calibration images after white balance, the values of the three RGB color channels corresponding to each pixel in each image are obtained respectively. Exemplarily, (R^1_{1i}, G^1_{1i}, B^1_{1i}) represents the RGB value of the i-th pixel of the first calibration image after white balance, and (R^1_{2i}, G^1_{2i}, B^1_{2i}) represents the RGB value of the i-th pixel of the second calibration image after white balance.
  • The least squares method is used to calculate the first color mapping parameter T_1 corresponding to the light source.
  • other regression methods can also be used to calculate the first color mapping parameter corresponding to the light source.
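  • A minimal Python sketch of this least-squares step, assuming the white-balanced patch colors of the two calibration images are available as (24, 3) arrays; the helper name and data layout are illustrative.

```python
import numpy as np

def fit_color_mapping(calib1_wb: np.ndarray, calib2_wb: np.ndarray) -> np.ndarray:
    """Least-squares fit of a 3x3 matrix T1 such that calib2_wb @ T1.T ~= calib1_wb.

    calib1_wb, calib2_wb: (num_patches, 3) white-balanced RGB values of the
    first and second calibration images under one standard light source.
    """
    # Solve min_T || calib2_wb @ T^T - calib1_wb ||^2 for all three channels at once.
    T_transposed, *_ = np.linalg.lstsq(calib2_wb, calib1_wb, rcond=None)
    return T_transposed.T

# One first color mapping parameter is calibrated per standard light source (N in total), e.g.:
# color_maps = {light: fit_color_mapping(calib1[light], calib2[light]) for light in lights}
```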
  • In other embodiments, the above steps S310 and S320 may not be performed; instead, images captured by the cameras may be used directly as the first calibration image and the second calibration image, and the above steps S330 and S340 may be performed on them to calibrate the first color mapping parameter.
  • In step S210 of the present application, for each of the N standard light sources, the calibration of the color mapping parameter is performed under that standard light source according to steps S310 to S340 above, so that the N first color mapping parameters corresponding to the N standard light sources can be determined.
  • FIG. 4 is a schematic flowchart of determining the second color mapping parameter in Mode 2.
  • In Mode 2, the first calibration image and the second calibration image are obtained directly according to the spectral responses, and it is not necessary to perform white balance processing on the calibration images.
  • The specific implementation of step S410 is similar to the above step S310, and the specific implementation of step S420 is similar to the above step S320, which are not repeated here.
  • In step S430, the least squares method or another regression method is used to obtain the second color mapping parameter T_2 corresponding to the light source.
  • In other embodiments, the above steps S410 and S420 may not be performed; instead, images captured by the cameras may be used directly as the first calibration image and the second calibration image, and the calibration of the second color mapping parameter in step S430 may be performed.
  • In Mode 1, the first color mapping parameter is determined according to the white-balanced calibration images of the first camera and the second camera, so the first color mapping parameter does not include the white balance adjustment component.
  • In Mode 2, the second color mapping parameter is determined according to the calibration images before white balance, so the second color mapping parameter includes the white balance component.
  • When color mapping parameter calibration is performed using color cards or other reference data, calibration images under different light source conditions can be simulated and generated according to the spectra of the different light sources and the spectral response parameters of the cameras, which reduces the time cost of capturing calibration data, reduces the unstable factors introduced by shooting, and improves the stability and accuracy of color mapping parameter calibration.
  • The process of calibrating the N color mapping parameters introduced in step S210 may be performed offline. After the calibration of the N color mapping parameters is completed, the correspondence between the N standard light sources and the N color mapping parameters can be preset in the image processing device; when the image processing device performs color consistency correction on images obtained by multiple cameras, the calibrated color mapping parameters can be used directly without repeated calibration.
  • At least one color mapping parameter can be selected according to the image information indicated by the first image to perform color consistency correction on the second image, so that the color of the corrected image obtained after correction remains consistent with that of the first image.
  • Correcting the second image according to at least one of the N color mapping parameters mainly includes image matching and performing color compensation and white balance compensation on the second image. Since there are two ways to calibrate the color mapping parameters in step S210, there are correspondingly two ways to perform color compensation and white balance compensation on the second image.
  • FIG. 5 is a schematic diagram of the second image color correction process corresponding to color mapping parameter calibration Mode 1.
  • S510: image matching. The purpose of image matching is to determine the common image area of the first image and the second image. According to the common image area in the two images, the color difference between the first image and the second image can be determined, and the color compensation parameter and the white balance compensation parameter calculated according to this color difference are used to perform color correction on the second image.
  • Various methods can determine the common image area of the two images, such as a method based on feature point matching, a method based on three-dimensional space projection, a method based on template matching, and a method based on machine learning, which is not limited in this embodiment of the present application.
  • FIG. 6 is a schematic diagram of one manner of performing image matching on the first image and the second image.
  • stable and fast image matching can be obtained in combination with the calibration of the cameras.
  • From the camera calibration information, the scaling factors of the two images, the offset of the common image area in the two images, and so on are obtained.
  • The two images are scaled according to the above parameters, and the approximate search range of the common image area is determined according to the offset.
  • The search start point and the search end point can be determined on the first image, image matching is performed within the range between the search start point and the search end point, and finally the common image area of the first image and the second image is obtained.
  • the search area of image matching can be narrowed in combination with the calibration information of the cameras, so as to realize fast image matching.
  • In addition, template matching can also achieve a good image matching effect in small-image scenarios, which meets the real-time requirements of color consistency correction.
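  • The following Python sketch illustrates one possible calibration-guided template matching of this kind, using OpenCV's matchTemplate within a restricted search window; the scale factor, offsets, margin and return format are placeholders rather than the application's actual parameters.

```python
import cv2
import numpy as np

def find_common_area(first_img: np.ndarray, second_img: np.ndarray,
                     scale: float, x_off: int, y_off: int, margin: int = 32):
    """Locate the second image's field of view inside the first image.

    scale, x_off, y_off come from camera calibration (relative position and
    field of view) and define an approximate search window with a margin.
    Returns the top-left corner of the best match in the first image and the
    matched template size.
    """
    # Scale the second image so that both images have comparable resolution.
    template = cv2.resize(second_img, None, fx=scale, fy=scale)
    th, tw = template.shape[:2]

    # Restrict the search to a window around the calibration-predicted offset.
    y0, y1 = max(0, y_off - margin), min(first_img.shape[0], y_off + th + margin)
    x0, x1 = max(0, x_off - margin), min(first_img.shape[1], x_off + tw + margin)
    search = first_img[y0:y1, x0:x1]

    result = cv2.matchTemplate(search, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    return (x0 + max_loc[0], y0 + max_loc[1]), (tw, th)
```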
  • S520 Calculate a white balance compensation parameter of the second image relative to the first image.
  • The white balance compensation parameter can be calculated according to the color information of the common image area of the two images. The purpose of this step is to calculate the color difference between the common image areas of the two images, and to calculate the white balance compensation parameter of the second image relative to the first image according to that color difference.
  • Step 1: apply the white balance gains to the two images to obtain the two white-balance-processed images.
  • the first image acquired by the first camera and the second image acquired by the second camera enter the preprocessing module for preprocessing after passing through the image sensor.
  • the preprocessing module can calculate the white balance gain for each image, and the color correction module performs white balance processing on the image according to the white balance gain.
  • Subsequent steps 2 to 5 calculate the white balance compensation parameter based on the two white-balance-processed images obtained in step 1.
  • Step 2 Divide the common image area of the white balance processed first image and the second image into blocks.
  • the common image area obtained by image matching in step S510 may be divided into blocks.
  • the image in the common image area can be divided into M blocks according to the spatial position, color similarity, edge information, semantic information, etc. of the image, where M is a positive integer, and subsequent calculations are performed in block units.
  • the embodiments of the present application do not limit the manner of dividing the image into blocks.
  • After the division, the color value of each image block can be represented by the average value of the three color channels over all the pixels of the image block.
  • the image block includes 10 pixels, and the color values of the R, G, and B channels of the image block can be respectively represented by the R average value, the G average value, and the B average value of the 10 pixel points.
  • the mode and median of the three-channel color values of all pixels in the image block may also be used to represent the color value of the image block, which is not limited in this embodiment of the present application.
  • the common image area image may not be divided into blocks.
  • Step 2 may not be performed in this case.
  • Step 3: assign a confidence (or "weight") to the image of the common image area.
  • A confidence can be assigned to the image blocks of the common image area according to information such as the brightness, saturation and color characteristics of each image block. For example, overexposed and oversaturated image blocks may be discarded, or may be assigned a smaller confidence (or weight), and image blocks that are close to gray may be assigned a larger confidence (or weight).
  • If the image of the common image area is not divided into blocks in step 2 (or step 2 is not performed), a confidence (or weight) can be assigned to the pixels of the image in the common image area according to information such as brightness, saturation and color characteristics. For example, overexposed and oversaturated pixels may be discarded, or may be assigned a smaller confidence (or weight), and pixels that are close to gray may be assigned a larger confidence (or weight).
  • all image blocks in the common image area or all pixels in the common image area may be assigned the same weight.
  • the embodiments of the present application do not limit the assignment method of the confidence level (or the weight) and the size of the weight.
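  • As one possible illustration of such a weighting scheme (the thresholds and the weighting curve below are arbitrary assumptions, not values from the application), overexposed or oversaturated blocks can be given zero weight and near-gray blocks larger weights:

```python
import numpy as np

def block_weights(block_means: np.ndarray,
                  overexposed: float = 0.95, oversaturated: float = 0.8) -> np.ndarray:
    """Assign a confidence (weight) to each image block.

    block_means: (M, 3) mean RGB of each block, values in [0, 1].
    Overexposed or oversaturated blocks get weight 0; blocks close to gray
    (small spread between channels) get larger weights.
    """
    brightness = block_means.max(axis=1)
    saturation = (block_means.max(axis=1) - block_means.min(axis=1)) / (
        block_means.max(axis=1) + 1e-6)
    weights = np.ones(len(block_means))
    weights[(brightness > overexposed) | (saturation > oversaturated)] = 0.0
    # Favor near-gray blocks: weight decreases as the channel spread grows.
    weights *= 1.0 / (1.0 + 10.0 * saturation)
    return weights
```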
  • Step 4 Calculate the color feature of the common image area of the two images.
  • the weighted average value of the image blocks in the common image area of the first image and the second image in the three color channels can be calculated respectively;
  • the R channel color feature of the common image area of the first image is the weighted average of the R values of each image block
  • the G channel color feature of the common image area of the first image is the weighted average of the G values of each image block
  • the B channel color feature of the common image area of the first image is the weighted average of the B values of each image block.
  • the color feature of the common image area of the second image can also be calculated in a similar manner.
  • the weighted color histogram of the image blocks in the common image area of the first image and the second image can be calculated separately; the color histogram can represent the frequency of occurrence of a certain color value in the image.
  • Weights are assigned to the three-channel color values of each image block. Taking the common image area of the first image as an example, in the R channel color histogram, the frequency corresponding to each R value is the weighted count of the image blocks whose R value equals that value.
  • the weighted average of the three color channels of the pixels in the common image area of the first image and the second image may be calculated separately; the calculation method is similar to the calculation method after the above-mentioned division, and will not be repeated.
  • the weighted color histograms of the pixels in the common image area of the first image and the second image may be calculated separately, and the calculation method is similar to the calculation method after the above-mentioned division, and will not be repeated.
  • Step 5 Calculate the white balance compensation parameter of the second image for the first image.
  • the color difference can be calculated according to the color feature of the common image area of the two images extracted in step 4, so as to obtain the white balance compensation parameter of the second image relative to the first image.
  • the weighted average values of the three color channels of the common image areas of the two images obtained in step 4 can be compared, and the white balance compensation parameter of the second image relative to the first image can be calculated;
  • the white balance compensation parameter of the second image relative to the first image can be calculated according to the weighted color histogram feature of the common image area of the two images obtained in step 4;
  • the white balance compensation parameter of the second image relative to the first image may be calculated by means of histogram matching.
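  • A minimal sketch of one of the options in step 5, computing the white balance compensation parameter as the ratio of the weighted channel averages of the two common image areas; representing the parameter as three per-channel gains is an assumption.

```python
import numpy as np

def wb_compensation(blocks1: np.ndarray, blocks2: np.ndarray,
                    weights: np.ndarray) -> np.ndarray:
    """White balance compensation gains for the second image relative to the first.

    blocks1, blocks2: (M, 3) mean RGB of the M common-area blocks in the
    first and second (white-balanced) images; weights: (M,) confidences.
    Returns per-channel gains so that blocks2 * gains matches blocks1 on average.
    """
    w = weights / (weights.sum() + 1e-6)
    mean1 = (blocks1 * w[:, None]).sum(axis=0)   # weighted R/G/B averages, first image
    mean2 = (blocks2 * w[:, None]).sum(axis=0)   # weighted R/G/B averages, second image
    return mean1 / (mean2 + 1e-6)
```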
  • S530: calculate, according to the first color mapping parameters, the color compensation parameter of the second image relative to the first image.
  • The color compensation parameter can be calculated according to the color information of the common image area of the two images and the color mapping parameters. Since the first color mapping parameter determined by Mode 1 in step S210 does not include the white balance compensation component, in the process of color correction of the second image in step S220 it is necessary to first calculate the white balance compensation parameter of the second image relative to the first image (step S520), and then calculate the color compensation parameter of the second image relative to the first image. Specifically, in this step, the color mapping parameter applicable to the scene is determined according to the at least one first color mapping parameter, so that the color difference between the two images after color compensation is minimized.
  • For example, a global search method may be used to find, among the N first color mapping parameters, an appropriate first color mapping parameter as the first target color mapping parameter, so that the difference between the third image, generated by transforming the common image area of the second image with that first color mapping parameter, and the common image area of the first image is minimal. For example, the search can be performed according to formula (2):
  • T* = argmin_{T_i^1, 1≤i≤N} Σ_{j=1}^{M} Dis(B_{1j}, T_i^1 · B_{2j})    (2)
  • where N represents the number of standard light sources;
  • T_i^1 represents the first color mapping parameter corresponding to the i-th standard light source among the N standard light sources;
  • M represents the number of image blocks (or the number of pixels) in the common image area of the first image and the second image, and B_{1j} and B_{2j} denote the color values of the j-th image block (or pixel) of the common image area in the first image and the second image respectively;
  • the Dis() function is used to calculate the difference between images; for example, the difference can be measured by the absolute value distance, the Euclidean distance or another color difference measure, or by converting the image colors from the RGB space to another color space and measuring the difference there.
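  • The following Python sketch implements a search of the kind in formula (2), using the Euclidean distance as the Dis() measure over block-mean colors and optionally fusing the k best candidates by inverse-difference weighting; the weighting rule is illustrative, not the preset rule of the application.

```python
import numpy as np

def search_target_mapping(blocks1: np.ndarray, blocks2: np.ndarray,
                          mappings: list[np.ndarray], k: int = 1) -> np.ndarray:
    """Select (and optionally fuse) color mapping parameters per formula (2).

    blocks1, blocks2: (M, 3) block colors of the common area in the first and
    second images; mappings: the N calibrated 3x3 color mapping parameters.
    Returns the target color mapping parameter (weighted over the k best).
    """
    # Dis() here is the Euclidean distance summed over the M blocks.
    diffs = np.array([np.linalg.norm(blocks2 @ T.T - blocks1, axis=1).sum()
                      for T in mappings])
    best = np.argsort(diffs)[:k]
    # Weight the k best candidates inversely to their color difference.
    w = 1.0 / (diffs[best] + 1e-6)
    w /= w.sum()
    return sum(wi * mappings[i] for wi, i in zip(w, best))
```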
  • For example, one first color mapping parameter can be selected from the N first color mapping parameters as the first target color mapping parameter, and the first target color mapping parameter can be used as the color compensation parameter of the second image, so that the difference between the second image after the color compensation parameter is applied and the first image is the smallest.
  • Alternatively, multiple first color mapping parameters may be selected, namely the first color mapping parameters that make the difference between the common image areas of the third images and the first image smallest, and the multiple color mapping parameters are fused according to a preset rule to obtain the first target color mapping parameter, which is used as the color compensation parameter of the second image.
  • the weighted values of the plurality of first color mapping parameters may be used as the first target color mapping parameters, and the first target color mapping parameters may be used as the color compensation parameters of the second image.
  • Alternatively, the ambient light source information of the first image may be determined according to the white balance gain of the first image in step S520; then, according to the estimated light source information, the standard light source closest to the ambient light source indicated by the first image is selected from the N standard light sources, the first color mapping parameter corresponding to that standard light source is used as the first target color mapping parameter, and the first target color mapping parameter is the color compensation parameter of the second image.
  • the first target color mapping parameter can be used as the color compensation parameter of the second image.
  • In Mode 1, the first color mapping parameter is determined according to the white-balanced calibration images of the first camera and the second camera; therefore, the first target color mapping parameter does not include a white balance component.
  • the second image is color corrected.
  • the white balance compensation parameters calculated in step S520 and the color compensation parameters calculated in step S530 are applied to the second image to obtain a color-corrected corrected image.
  • the color of the corrected image obtained after the second image is color corrected may be consistent with the color of the first image.
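  • A minimal sketch of this correction step under Mode 1, assuming the white balance compensation is a per-channel gain and the color compensation parameter is the 3×3 first target color mapping matrix; applying the white balance compensation before the color compensation is an assumption.

```python
import numpy as np

def correct_second_image(second_img: np.ndarray,
                         wb_comp: np.ndarray,
                         color_comp: np.ndarray) -> np.ndarray:
    """Apply white balance compensation and color compensation (Mode 1 sketch).

    second_img: H x W x 3 white-balanced second image;
    wb_comp:    per-channel white balance compensation gains, shape (3,);
    color_comp: 3x3 color compensation parameter (first target color mapping).
    """
    balanced = second_img * wb_comp[None, None, :]            # gray-area correction
    h, w, _ = balanced.shape
    corrected = balanced.reshape(-1, 3) @ color_comp.T        # colored-area correction
    return np.clip(corrected.reshape(h, w, 3), 0.0, 1.0)
```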
  • FIG. 7 is a schematic diagram of the second image color correction process corresponding to color mapping parameter calibration Mode 2.
  • S710: image matching.
  • the purpose of image matching is to determine the common image area of the two images for subsequent calculation of color compensation parameters and white balance compensation parameters.
  • the image matching process in this step is similar to the image matching process in the above-mentioned step S510, and details are not repeated here.
  • S720 Calculate a color compensation parameter of the second image relative to the first image.
  • the process of calculating the color compensation parameters consists of two steps:
  • Step 1 Calculate the second target color mapping parameter
  • Step 2 Calculate the color compensation parameter
  • When calculating the second target color mapping parameter in step 1, according to the scheme described in step S530 and formula (2), one of the N second color mapping parameters is selected as the second target color mapping parameter, or several second color mapping parameters are selected and fused as the second target color mapping parameter. For brevity, the detailed description is omitted.
  • Step 2 Calculate the color compensation parameters.
  • The color compensation parameter of the second image can be calculated according to formula (3):
  • G = CC_mat · WB_gain · T_2^*    (3)
  • where T_2^* represents the second target color mapping parameter, WB_gain and CC_mat respectively represent the white balance gain and the color restoration parameter of the first image, which can be obtained from the preprocessing step shown in FIG. 1, and G represents the color compensation parameter, which includes the white balance gain component and the color restoration parameter component.
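  • A sketch of formula (3) in code, treating the white balance gain as a diagonal matrix so that the product of the three factors is well defined; the exact multiplication order is an assumption chosen to be consistent with how the common image areas are processed in step S730.

```python
import numpy as np

def color_compensation_mode2(T2_target: np.ndarray,
                             wb_gain_first: np.ndarray,
                             cc_mat_first: np.ndarray) -> np.ndarray:
    """Color compensation parameter G per formula (3), as a 3x3 matrix.

    T2_target:     second target color mapping parameter (3x3);
    wb_gain_first: white balance gains of the first image, shape (3,);
    cc_mat_first:  color restoration parameter of the first image (3x3).
    G then carries both the white balance gain and color restoration components.
    """
    return cc_mat_first @ np.diag(wb_gain_first) @ T2_target
```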
  • S730 Calculate a white balance compensation parameter of the second image relative to the first image.
  • the calculation of white balance compensation is performed on the second image after color compensation to cope with complex and changeable scenes.
  • Step 1: apply the white balance gain and the color restoration parameter CC_mat of the first image to the common image area of the first image.
  • the white balance gain and color restoration parameters of the first image can be calculated from the preprocessing steps shown in FIG. 1 .
  • Step 2 Apply the color compensation parameter G obtained in step S720 to the common image area of the second image.
  • Step 3 Calculate the white balance compensation parameter according to the common area of the processed first image obtained in step 1 and the common area of the second image after processing obtained in step 2.
  • the calculation method of the white balance compensation is similar to the method described in step S520, and will not be described in detail in order to avoid repetition.
  • this step S730 may also be implemented by the following steps:
  • Step 2 Calculate the white balance compensation parameter according to the common image area of the first image before white balance and the common image area of the processed second image obtained in step 1. By comparing the overall color difference between the common area of the first image before white balance and the common area of the second image after color mapping parameter conversion, the amount of white balance compensation in the scene can be dynamically adjusted.
  • the calculation method of the white balance compensation parameter is similar to the method described in step S520, and will not be described in detail in order to avoid repetition.
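  • The following sketch illustrates the first variant of step S730: the first image's common area is processed with its own white balance gain and CC_mat, the second image's common area with the color compensation parameter G from step S720, and per-channel compensation gains are then computed from the two results; block weighting is omitted for brevity and the helper names are illustrative.

```python
import numpy as np

def wb_compensation_mode2(common1: np.ndarray, common2: np.ndarray,
                          wb_gain_first: np.ndarray, cc_mat_first: np.ndarray,
                          G: np.ndarray) -> np.ndarray:
    """White balance compensation of the second image relative to the first (Mode 2).

    common1, common2: (M, 3) block colors of the common image area in the
    first and second images before correction.
    """
    # Step 1: apply the first image's white balance gain and color restoration.
    ref = common1 @ (cc_mat_first @ np.diag(wb_gain_first)).T
    # Step 2: apply the color compensation parameter G to the second image's area.
    mapped = common2 @ G.T
    # Step 3: per-channel gains that align the two processed common areas.
    return ref.mean(axis=0) / (mapped.mean(axis=0) + 1e-6)
```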
  • the second image is color corrected.
  • the color compensation parameters calculated in step S720 and the white balance compensation parameters calculated in step S730 are applied to the second image to obtain a color-corrected corrected image.
  • the color of the corrected image obtained after the second image is color corrected may be consistent with the color of the first image.
  • the color consistency correction for the second image includes color compensation and white balance compensation, and the white balance compensation and color compensation may be performed according to white balance compensation parameters and color compensation parameters, respectively.
  • White balance compensation can correct the consistency of gray areas in the image
  • color compensation can correct the consistency of color areas in the image. Combining the two corrections, the gray area and color area of the image can be corrected better.
  • FIG. 8 is a schematic diagram of a multi-camera color consistency correction device according to an embodiment of the present application.
  • the multi-camera color consistency correction apparatus includes an acquisition module 810 , a determination module 820 and a correction module 830 .
  • the acquiring module 810 is configured to acquire the first image captured by the first camera and the second image captured by the second camera.
  • the determining module 820 is configured to determine at least one color mapping parameter from the N color mapping parameters according to the image information indicated by the first image, where the image information includes at least one of the color information of the first image and the ambient light source of the first image, the color The mapping parameters indicate the color conversion relationship between the image captured by the first camera and the image captured by the second camera, and the N color mapping parameters are in one-to-one correspondence with the N standard light sources.
  • the correction module 830 is configured to perform color consistency correction on the second image according to at least one color mapping parameter to obtain a corrected image, and the corrected image is consistent with the color of the first image.
  • the determining module 820 and the correcting module 830 may implement the function of determining color mapping parameters in step S220 in the above method 200, and performing color consistency correction on the second image according to the color mapping parameters. Specifically, it can be used to implement the color consistency correction method 500 and the method 700 shown in FIGS. 5 to 7 .
  • For the specific functions of the determination module 820 and the correction module 830, reference may be made to the descriptions in the above methods, which are not repeated here for brevity.
  • the color consistency correction apparatus shown in FIG. 8 can implement the functions of the correction module shown in FIG. 1 .
  • the determining module 820 can also be used to determine the color mapping parameters in the application scenario shown in FIG. 1 , that is, the function of determining the color mapping parameters in step S210 in the method 200 can be implemented.
  • Specifically, the determination module 820 can be used to implement the various steps in the color mapping parameter calibration method 300 shown in FIG. 3 and the various steps in the color mapping parameter calibration method 400 shown in FIG. 4.
  • For the specific functions of the determination module 820, reference may be made to the description in the above methods, which will not be described in detail for the sake of brevity.
  • the color consistency correction apparatus 800 shown in FIG. 8 only includes an acquisition module 810, a determination module 820 and a correction module 830. In other embodiments, the color consistency correction apparatus may also include other modules or components, such as shown in FIG.
  • the preprocessing module, the postprocessing module, etc. shown in 1 are not limited in this embodiment of the present application.
  • FIG. 9 is a schematic diagram of another multi-camera color consistency correction apparatus according to an embodiment of the present application.
  • the color consistency correction apparatus 900 shown in FIG. 9 includes a memory 910 , a processor 920 , a communication interface 930 and a bus 940 .
  • the memory 910 , the processor 920 , and the communication interface 930 are connected to each other through the bus 940 for communication.
  • the memory 910 may be a read only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM).
  • The memory 910 may store a program, and when the program stored in the memory 910 is executed by the processor 920, the processor 920 is configured to execute the steps of the multi-camera color consistency correction method of the embodiments of the present application, for example, the steps shown in FIG. 2 to FIG. 7.
  • It should be understood that the color consistency correction apparatus shown in the embodiments of the present application may be a server, for example, a cloud server, or a chip configured in a cloud server; alternatively, the color consistency correction apparatus may be an intelligent terminal, or a chip configured in an intelligent terminal.
  • the color consistency correction method disclosed in the above embodiments of the present application may be applied to the processor 920 or implemented by the processor 920 .
  • the processor 920 may be an integrated circuit chip with signal processing capability.
  • each step of the above-mentioned color consistency correction method may be completed by an integrated logic circuit of hardware in the processor 920 or instructions in the form of software.
  • The above-mentioned processor 920 may be a central processing unit (CPU), an image signal processor (ISP), a graphics processing unit (GPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present application.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the steps of the method disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • A software module may be located in a storage medium well-established in the art, such as a random access memory (RAM), a flash memory, a read-only memory (ROM), a programmable read-only memory, an electrically erasable programmable memory, or a register.
  • The storage medium is located in the memory 910; the processor 920 reads the instructions in the memory 910 and, in combination with its hardware, completes the steps of the color consistency correction method shown in FIG. 2 to FIG. 7 in the embodiments of the present application.
  • the communication interface 930 implements communication between the apparatus 900 and other devices or a communication network using a transceiving device such as, but not limited to, a transceiver.
  • Bus 940 may include a pathway for communicating information between various components of color consistency correction apparatus 900 (eg, memory 910, processor 920, communication interface 930).
  • It should be noted that, although the above color consistency correction apparatus 900 shows only a memory, a processor, and a communication interface, in a specific implementation process, those skilled in the art should understand that the color consistency correction apparatus 900 may further include other devices necessary for normal operation. Meanwhile, those skilled in the art should understand that, according to specific needs, the color consistency correction apparatus 900 may further include hardware devices implementing other additional functions. In addition, those skilled in the art should understand that the color consistency correction apparatus 900 may also include only the devices necessary for implementing the embodiments of the present application, and does not necessarily include all the devices shown in FIG. 9.
  • The embodiments of the present application also provide a computer-readable medium, where the computer-readable medium stores a computer program (also referred to as code or instructions); when the computer program runs on a computer, the computer is caused to execute the color consistency correction method in any of the foregoing method embodiments.
  • An embodiment of the present application also provides a chip system, including a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device or apparatus on which the chip system is installed performs the method in any of the above method embodiments.
  • the chip system may include an input circuit or interface for sending information or data, and an output circuit or interface for receiving information or data.
  • The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form of a computer program product in whole or in part. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • The computer instructions may be stored in a storage medium or transmitted from one storage medium to another storage medium; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner.
  • The storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more available media.
  • The available media may be magnetic media (for example, floppy disks, hard disks, or magnetic tapes), optical media (for example, high-density digital video discs (DVDs)), or semiconductor media (for example, solid-state drives (SSDs)).
  • a component may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a computing device and the computing device may be components.
  • One or more components may reside within a process and/or thread of execution, and a component may be localized on one computer and/or distributed between 2 or more computers.
  • these components can execute from various computer readable media having various data structures stored thereon.
  • A component may communicate through local and/or remote processes, for example, according to a signal having one or more data packets (such as data from two components interacting with another component in a local system, in a distributed system, and/or across a network such as the Internet interacting with other systems by way of the signal).
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • The division of the modules is only a division by logical function; in actual implementation, there may be other division manners.
  • For example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • modules described as separate components may or may not be physically separated, and the components shown as modules may or may not be physical modules, that is, may be located in one place, or may be distributed to multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional module in each embodiment of the present application may be integrated in one processing unit, or each module may exist physically alone, or two or more modules may be integrated in one unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Of Color Television Signals (AREA)
  • Color Television Image Signal Generators (AREA)
  • Image Processing (AREA)

Abstract

The present application provides a multi-camera color consistency correction method and apparatus. The method includes: acquiring a first image captured by a first camera and a second image captured by a second camera; determining at least one color mapping parameter from N color mapping parameters according to image information indicated by the first image, where the image information includes at least one of color information of the first image and an ambient light source of the first image, the color mapping parameter indicates a color conversion relationship between an image captured by the first camera and an image captured by the second camera, the N color mapping parameters are in one-to-one correspondence with N standard light sources, and N is a positive integer; and performing color consistency correction on the second image according to the at least one color mapping parameter to obtain a corrected image. The multi-camera color consistency correction method of the embodiments of the present application can achieve real-time color consistency correction of the second image under different illumination conditions.

Description

A multi-camera color consistency correction method and apparatus
Technical Field
The present application relates to the field of image processing, and in particular, to a multi-camera color consistency correction method and apparatus.
Background
To meet users' needs for high-quality imaging, panoramas, image stitching, recognition, and the like, multiple cameras are often used to capture the same scene, and more and more products and systems are equipped with multiple cameras for image capture. For example, equipping a terminal device such as a mobile phone with multiple cameras of different focal lengths and characteristics can provide users with high-quality images. However, due to differences in the devices themselves, differences in the devices' fields of view, tuning styles, and other reasons, the colors of the images actually acquired by the cameras may differ. The degree of color difference varies with illumination and shooting scene, so in practical applications the colors of the images acquired by the multiple cameras need to be adjusted in real time according to changes in illumination and scene, so that the colors of the images of the multiple cameras remain consistent.
发明内容
本申请提供一种多相机色彩一致性校正方法和装置,能够实现不同光照下对多个相机获取的图像进行色彩一致性调整。
第一方面,提供了一种多相机色彩一致性校正方法,该方法包括:获取第一相机拍摄的第一图像和第二相机拍摄的第二图像;根据第一图像指示的图像信息从N个色彩映射参数中确定至少一个色彩映射参数,图像信息包括第一图像的色彩信息和第一图像的环境光源中的至少一个,色彩映射参数指示了第一相机拍摄的图像与第二相机拍摄的图像的色彩转换关系,N个色彩映射参数与确定N个标准光源一一对应,N为正整数;根据至少一个色彩映射参数对第二图像进行色彩一致性校正,以获得校正图像。
在不同的标准光源下进行色彩映射参数标定,可以获得与每个标准光源相对应的色彩映射参数。在色彩一致性校正的过程中,结合至少一个色彩映射参数对第二图像进行色彩一致性校正,可以实现在不同的光照条件下的实时色彩校正,提高色彩一致性校正的准确性。
在一种可能的实现方式中,在从N个色彩映射参数中确定至少一个色彩映射参数之前,该方法还包括:确定每个标准光源下的第一标定图像和第二标定图像,该第一标定图像和第二标定图像分别是根据第一相机和第二相机的光谱响应曲线生成的色卡图像;根据第一标定图像和第二标定图像确定每个标准光源对应的色彩映射参数。
根据相机的光谱响应,可以模拟生成不同光源条件下的标定图像,降低了标定数据拍摄的时间成本,同时减少了因拍摄引入的不稳定因素,提高了色彩映射参数标定的稳定性和准确性。
在另一种可能的实现方式中,根据至少一个色彩映射参数对第二图像进行色彩一致性校正包括:确定第一图像和第二图像的公共图像区域;根据公共图像区域和至少一个色彩 映射参数确定色彩补偿参数;根据公共图像区域确定白平衡补偿参数;根据白平衡参数和色彩补偿参数对第二图像进行色彩一致性校正。
根据白平衡补偿参数可以对第二图像的灰色区域进行校正,根据色彩补偿参数可以对第二图像的彩色部分进行校正,通过白平衡补偿和色彩补偿相结合,可以实现第二图像的灰色区域和彩色区域都得到色彩一致性校正,提高了色彩校正的效果。
在另一种可能的实现方式中,确定第一图像和第二图像的公共图像区域包括:根据第一相机和第二相机的相对位置和视野范围,确定搜索区域;根据搜索区域确定公共图像区域。
在多个相机的位置信息等标定信息已知的情况下,可以结合相机的标定信息确定图像匹配的范围,提高图像匹配的准确度与搜索效率。
在另一种可能的实现方式中,根据公共图像区域和至少一个色彩映射参数确定色彩补偿参数包括:分别将N个色彩映射参数施加到第二图像中的公共图像区域,得到N个第三图像;分别计算第一图像中的公共图像区域与每个第三图像的色彩差异;根据色彩差异确定至少一个色彩映射参数,该至少一个色彩映射参数是至少一个色彩差异最小的第三图像对应的色彩映射参数;根据至少一个色彩映射参数确定目标色彩映射参数该目标色彩映射参数是至少一个色彩映射参数的加权值;根据目标色彩映射参数确定色彩补偿参数。
在另一种可能的实现方式中,根据公共图像区域和至少一个色彩映射参数确定色彩补偿参数包括:根据第一图像中的公共图像区域的白平衡增益确定环境光源;根据环境光源确定至少一个标准光源对应的至少一个色彩映射参数,该至少一个标准光源与环境光源的差异最小;根据至少一个色彩映射参数确定目标色彩映射参数,该目标色彩映射参数是至少一个色彩映射参数的加权值;根据目标色彩映射参数确定色彩补偿参数。
可以根据多种方式确定色彩补偿参数,并且可以根据实际的光照条件将多个相关的色彩映射参数进行融合,采用这种方式确定的目标色彩映射参数更加准确。
在另一种可能的实现方式中,色彩补偿参数是目标色彩映射参数,或者,色彩补偿参数是目标色彩映射参数与第一图像的白平衡增益以及色彩还原参数的乘积。
在另一种可能的实现方式中,根据公共图像区域确定白平衡补偿参数包括:分别确定第一图像中的公共图像区域的像素在三个颜色通道的加权平均值或者加权颜色直方图;分别确定第二图像中的公共图像区域的像素在三个颜色通道的加权平均值或者加权颜色直方图;根据三个颜色通道的加权平均值或者加权颜色直方图,确定白平衡补偿参数。
在另一种可能的实现方式中,在根据公共图像区域确定白平衡补偿参数之前,该方法还包括:根据公共图像区域的空间位置、色彩相似性和边缘信息,将公共图像区域划分为M块,M为正整数;根据公共图像区域确定白平衡补偿参数包括:分别确定第一图像中的公共图像区域的图像块在三个颜色通道的加权平均值或者加权颜色直方图;分别确定第二图像中的公共图像区域的图像块在三个颜色通道的加权平均值或者加权颜色直方图;根据三个颜色通道的加权平均值或者加权颜色直方图,确定白平衡补偿参数。
通过对图像分块,可以简化计算,提高色彩一致性校正的效率。
第二方面,提供了一种多相机色彩一致性校正装置,该装置包括:获取模块,用于获取第一相机拍摄的第一图像和第二相机拍摄的第二图像;确定模块,用于根据第一图像指示的图像信息从N个色彩映射参数中确定至少一个色彩映射参数,该图像信息包括第一图 像的色彩信息和第一图像的环境光源中的至少一个,色彩映射参数指示了第一相机拍摄的图像和第二相机拍摄的图像的色彩转换关系,N个色彩映射参数与N个标准光源一一对应N为正整数;校正模块,用于根据至少一个色彩映射参数对第二图像进行色彩一致性校正,以获得校正图像。
在不同的标准光源下进行色彩映射参数标定,可以获得与每个标准光源相对应的色彩映射参数。在色彩一致性校正的过程中,结合至少一个色彩映射参数对第二图像进行色彩一致性校正,可以实现在不同的光照条件下的实时色彩校正,提高色彩一致性校正的准确性。
在一种可能的实现方式中,在从N个色彩映射参数中确定至少一个色彩映射参数之前,确定模块具体用于:确定每个标准光源下的第一标定图像和第二标定图像,该第一标定图像和第二标定图像分别是根据第一相机和第二相机的光谱响应曲线生成的色卡图像;根据第一标定图像和第二标定图像确定每个标准光源对应的色彩映射参数。
根据相机的光谱响应,可以模拟生成不同光源条件下的标定图像,降低了标定数据拍摄的时间成本,同时减少了因拍摄引入的不稳定因素,提高了色彩映射参数标定的稳定性和准确性。
在另一种可能的实现方式中,确定模块具体用于:确定第一图像和第二图像的公共图像区域;根据公共图像区域和至少一个色彩映射参数确定色彩补偿参数;根据公共图像区域确定白平衡补偿参数;校正模块具体用于:根据白平衡参数和色彩补偿参数对第二图像进行色彩一致性校正。
白平衡补偿可以对第二图像的灰色区域进行校正,色彩补偿可以对第二图像的彩色部分进行校正,通过白平衡补偿和色彩补偿相结合,可以实现第二图像的灰色区域和彩色区域都得到色彩一致性校正,提高了色彩校正的效果。
在另一种可能的实现方式中,确定模块具体用于:根据第一相机和第二相机的相对位置和视野范围,确定搜索区域;根据搜索区域确定公共图像区域。
在多个相机的位置信息等标定信息已知的情况下,可以结合相机的标定信息确定图像匹配的范围,提高图像匹配的准确度与搜索效率。
在另一种可能的实现方式中,确定模块具体用于:分别将N个色彩映射参数施加到第二图像中的公共图像区域,得到N个第三图像;分别计算第一图像中的公共图像区域与每个第三图像的色彩差异;根据该色彩差异确定至少一个色彩映射参数,至少一个色彩映射参数是至少一个色彩差异最小的第三图像对应的色彩映射参数;根据至少一个色彩映射参数确定目标色彩映射参数,该目标色彩映射参数是至少一个色彩映射参数的加权值;根据目标色彩映射参数确定色彩补偿参数。
在另一种可能的实现方式中,确定模块具体用于:根据第一图像中的公共图像区域的白平衡结果增益确定环境光源;根据该环境光源确定至少一个标准光源对应的至少一个色彩映射参数,该至少一个标准光源与环境光源的差异最小;根据至少一个色彩映射参数确定目标色彩映射参数,该目标色彩映射参数是至少一个色彩映射参数的加权值;根据目标色彩映射参数确定色彩补偿参数。
可以根据多种方式确定色彩补偿参数,并且可以根据实际的光照条件将相关的色彩映射参数进行融合,采用这种方式确定的目标色彩映射参数更加准确。
在另一种可能的实现方式中,色彩补偿参数是目标色彩映射参数,或者,色彩补偿参数是目标色彩映射参数与第一图像的白平衡增益以及色彩还原参数的乘积。
在另一种可能的实现方式中,确定模块具体用于:分别确定第一图像中的公共图像区域的像素在三个颜色通道的加权平均值或者加权颜色直方图;分别确定第二图像中的公共图像区域的像素在三个颜色通道的加权平均值或者加权颜色直方图;根据三个颜色通道的加权值或者加权颜色直方图,确定白平衡补偿参数。
在另一种可能的实现方式中,在根据公共图像区域确定白平衡补偿参数之前,该确定模块还用于:根据公共图像区域的空间位置、色彩相似性和边缘信息,将公共图像区域划分为M块,M为正整数;分别确定第一图像中的公共图像区域的图像块在三个颜色通道的加权平均值或者加权颜色直方图;分别确定第二图像中的公共图像区域的图像块在三个颜色通道的加权平均值或者加权颜色直方图;根据三个颜色通道的加权值或者加权颜色直方图,确定白平衡补偿参数。
通过对图像分块,可以简化计算,提高色彩一致性校正的效率。
第三方面,提供一种计算机可读介质,该计算机可读介质存储用于设备执行的程序代码,该程序代码包括用于执行第一方面或者第一方面的任意一种实现方式中的色彩一致性校正方法。
第四方面,提供了一种计算机程序产品,该计算机程序产品包括:计算机程序代码,当所述计算机程序代码在计算机上运行时,使得计算机执行第一方面或者第一方面的任意一种实现方式中的色彩一致性校正方法。
第五方面,提供一种芯片,该芯片包括处理器与数据接口,处理器通过所述数据接口读取存储器上存储的指令,执行上述第一方面或第一方面中的任意一种实现方式中的色彩一致性校正方法。
可选地,作为一种实现方式,该芯片还可以包括存储器,存储器中存储有指令,处理器用于执行存储器上存储的指令,当指令被执行时,处理器用于执行第一方面或者第一方面中的任意一种实现方式中的色彩一致性校正方法。
第六方面,提供了一种装置,包括:处理器和存储器,存储器用于存储所述计算机程序代码,当所述计算机程序代码在所述处理器上运行时,使得该装置执行第一方面或者第一方面的任意一种实现方式中的色彩一致性校正方法。
Brief Description of the Drawings
FIG. 1 is an application scenario of an embodiment of the present application;
FIG. 2 is a schematic flowchart of a multi-camera color consistency correction method according to an embodiment of the present application;
FIG. 3 is a schematic flowchart of a color mapping parameter determination method according to an embodiment of the present application;
FIG. 4 is a schematic flowchart of another color mapping parameter determination method according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of a second-image color correction method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of image matching according to an embodiment of the present application;
FIG. 7 is a schematic flowchart of another second-image color correction method according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a multi-camera color consistency correction apparatus according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of another multi-camera color consistency correction apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. It can be understood that the described embodiments are some, rather than all, of the embodiments of the present application.
To meet users' needs for high-quality imaging, panoramas, image stitching, recognition, and the like, the images captured by multiple cameras usually need to be processed; for example, mobile phones, vehicles, computers, surveillance systems, and the like are usually equipped with multiple camera modules (or multiple cameras) to meet these needs. Due to differences in the devices themselves, differences in fields of view, tuning styles, and other reasons, the colors of the images actually acquired by the cameras may differ. The degree of color difference varies with illumination and scene, so real-time adjustment according to changes in illumination and scene is required in practical applications.
In existing multi-camera color correction techniques, the image acquired by one of the cameras is usually taken as the main-path image (for example, the first image), and the other images are taken as auxiliary-path images (for example, the second image). The parameters of the three color channels of the second image are adjusted separately according to the mapping relationship between the parameters of the three color channels of the first image and those of the second image. However, this adjustment processes the parameter of each color channel independently. For example, a compensation component of the red (R) channel is computed and applied to the R channel of the second image. When color compensation is performed on the R channel of the second image, the influence of the green (G) and blue (B) channels is usually not considered. When the spectral differences between cameras are large, such single-channel adjustment handles image color consistency poorly.
In addition, under different illumination conditions, the color mapping parameters between different cameras are usually different. A color mapping parameter indicates the color conversion relationship between an image obtained by one camera (for example, the first camera) and an image obtained by another camera (for example, the second camera). In existing multi-camera color correction techniques, although different illumination conditions are considered when the color mapping parameters are calibrated, only one parameter is considered when the color consistency of images is adjusted; when the illumination changes, color consistency adjustment of multiple images cannot be achieved.
The embodiments of the present application provide a multi-camera color consistency correction method and apparatus, which can achieve real-time correction of the color consistency of images acquired by multiple cameras under different illumination environments.
The embodiments of the present application can be applied to multi-camera scenarios, where the multiple cameras may be located on the same device, for example, a terminal device such as a mobile phone equipped with multiple cameras, or a vehicle equipped with multiple cameras; the multiple cameras may also be located on different devices, for example, cameras on multiple different devices may be used to shoot the same scene, which is not limited in the embodiments of the present application.
The application scenario of the embodiments of the present application is described below by taking FIG. 1 as an example. As shown in FIG. 1, one of the multiple cameras is selected as the main-path camera (for example, referred to as the "first camera"), and the other cameras serve as auxiliary-path cameras (for example, referred to as the "second camera"). Light reflected by the scene is projected through the camera lenses onto the two image sensors to form digital image signals. The two image signals pass through their respective preprocessing modules, where operations such as defective pixel correction, black level compensation, shading correction, white balance gain calculation, and color reproduction parameter calculation can be performed, and then the two images and the related parameters enter the color correction module. Taking the image of the first camera as the target image effect, the color correction module computes the color compensation parameter and the white balance compensation parameter of the second image acquired by the second camera by combining the parameters of the first image and the calibrated color mapping parameters. Then, the second image is adjusted according to the computed color compensation parameter and white balance compensation parameter, so that a corrected image whose color is consistent with that of the first image can be obtained. In some embodiments, for the first image and the color-corrected second image, post-processing such as gamma transformation and dynamic adjustment can also be performed on the two images in the post-processing module, so as to obtain a first image and a corrected image with better display effects.
FIG. 2 is a schematic flowchart of the multi-camera color consistency correction method according to an embodiment of the present application. As shown in FIG. 2, the multi-camera color consistency correction method of the embodiment of the present application includes steps S210 to S220.
S210: Determine N color mapping parameters corresponding to N standard light sources.
The N standard light sources are in one-to-one correspondence with the N color mapping parameters.
A color mapping parameter can represent the color conversion relationship between two images (for example, image 1 and image 2). The color mapping parameter can be described by a linear transformation matrix, a higher-order polynomial, a neural network, and so on. Exemplarily, the images in the embodiments of the present application use the RGB color mode, and each pixel in an image is described by the values of the three color channels R, G, and B. In this case, the color mapping parameter may be a 3×3 linear transformation matrix. A color mapping parameter in matrix form is shown in formula (1):

    (R_1i, G_1i, B_1i)^T = T · (R_2i, G_2i, B_2i)^T    (1)

where (R_1i, G_1i, B_1i) denotes the values of the R, G, and B color channels of the i-th pixel in image 1; (R_2i, G_2i, B_2i) denotes the values of the R, G, and B color channels of the i-th pixel in image 2; and the 3×3 matrix T is the color mapping parameter, representing the color mapping relationship between image 1 and image 2.
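For illustration only, a matrix-form color mapping parameter of this kind can be applied to an image as in the following minimal Python sketch; the function and variable names are chosen here for illustration and are not part of the published method.

    import numpy as np

    def apply_color_mapping(image_rgb, T):
        """Apply a 3x3 color mapping matrix T to every pixel of an RGB image.

        image_rgb: H x W x 3 array of RGB values.
        T:         3 x 3 color mapping matrix, as in formula (1).
        """
        h, w, _ = image_rgb.shape
        pixels = image_rgb.reshape(-1, 3)   # one row per pixel
        mapped = pixels @ T.T               # equivalent to T @ p for each pixel p
        return mapped.reshape(h, w, 3)

    # Example: an identity mapping leaves the image unchanged.
    img = np.random.rand(4, 4, 3)
    assert np.allclose(apply_color_mapping(img, np.eye(3)), img)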
When the color mapping parameter takes the matrix form of the above formula (1), for N standard light source conditions, the embodiments of the present application determine N matrices for the N light sources, respectively, as the color mapping parameters under the different light source conditions. Specifically, the standard light sources may include illuminants such as A (American shop-window spotlight), D50 (simulated sunlight), D65 (simulated blue-sky daylight), D75 (simulated average north-sky daylight), H (simulated horizon daylight), U30 (American warm-white store light source), TL84 (store light source used in Europe, Japan, and China), and CWF (American cool white fluorescent store light source).
In the embodiments of the present application, since the color mapping parameters are calibrated for different standard light sources, when color consistency correction is performed, an appropriate color mapping parameter can be selected according to the illumination conditions to correct the second image, which provides good dynamic adaptability.
The calibration method of the color mapping parameters is described in detail below with reference to FIG. 3 and FIG. 4. In the embodiments of the present application, there are two ways of calibrating the color mapping parameters. Way 1: calibrate the color mapping parameter according to the first calibration image and the second calibration image after white balance, and determine a first color mapping parameter. Way 2: calibrate the color mapping parameter according to the first calibration image and the second calibration image before white balance, and determine a second color mapping parameter.
FIG. 3 is a schematic flowchart of determining the first color mapping parameter in way 1.
S310: Measure the spectral response curves of the two cameras. The spectral response curve describes the spectral sensitivity of a camera, usually with wavelength as the horizontal axis and the corresponding spectral response as the vertical axis.
S320: Obtain calibration images. A calibration image is a color chart image used for calibrating the color mapping parameters. Specifically, the calibration images may be color chart images of the two cameras, or other reference images, generated by simulation according to the imaging model, the spectral response curves, the light source spectrum, and the reflectance spectra of a 24-patch color chart. For example, the first calibration image and the second calibration image are generated according to the spectral response curves of the first camera and the second camera, respectively.
It should be understood that in the above step S320, the 24-patch color chart is used as an example to describe the calibration method of the first color mapping parameter; in other embodiments, other color charts, or other objects containing multiple colors that can serve as reference images, can also be used to calibrate the color mapping parameters.
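A simulation of this kind might be sketched as follows; the spectra below are placeholders, and the simple per-wavelength product-and-sum imaging model is an assumption made for illustration rather than the published imaging model.

    import numpy as np

    def simulate_patch_rgb(illuminant, reflectance, sensor_rgb):
        """Simulate the raw RGB response of one color-chart patch.

        illuminant:  (K,) light source spectral power over K wavelength samples.
        reflectance: (K,) reflectance spectrum of the patch.
        sensor_rgb:  (K, 3) spectral response curves of the camera's R, G, B channels.
        """
        radiance = illuminant * reflectance   # light reflected by the patch
        return radiance @ sensor_rgb          # integrate against each channel response

    # Hypothetical data: 31 wavelength samples, 24 patches, two cameras.
    K = 31
    illuminant = np.ones(K)                   # placeholder light source spectrum
    chart = np.random.rand(24, K)             # placeholder 24-patch reflectance spectra
    cam1_resp = np.random.rand(K, 3)
    cam2_resp = np.random.rand(K, 3)

    calib_img_1 = np.array([simulate_patch_rgb(illuminant, r, cam1_resp) for r in chart])
    calib_img_2 = np.array([simulate_patch_rgb(illuminant, r, cam2_resp) for r in chart])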
S330: Perform white balance processing on the first calibration image and the second calibration image, respectively. For example, the first calibration image and the second calibration image are generated in the above step S320 according to the spectral response curves of the first camera and the second camera; for each of the two images, the corresponding white balance gain is computed according to the color values of the gray patches in the image, and white balance processing is performed on the first calibration image and the second calibration image. In other embodiments, a light source reference value may also be computed using the information of the current light source, and this reference value may be applied to the two calibration images for white balance processing.
S340: Compute the first color mapping parameter. From the two white-balanced calibration images, the values of the three RGB color channels corresponding to each pixel in each image are obtained. Exemplarily, (R_1i^1, G_1i^1, B_1i^1) denotes the RGB values of the i-th pixel of the white-balanced first calibration image, and (R_2i^1, G_2i^1, B_2i^1) denotes the RGB values of the i-th pixel of the white-balanced second calibration image.
In some embodiments, according to (R_1i^1, G_1i^1, B_1i^1) and (R_2i^1, G_2i^1, B_2i^1), the least squares method is used to compute the first color mapping parameter T_1 corresponding to this light source. In other embodiments, according to (R_1i^1, G_1i^1, B_1i^1) and (R_2i^1, G_2i^1, B_2i^1), other regression methods can also be used to compute the first color mapping parameter corresponding to this light source.
It should be understood that in some embodiments, when no spectral response measurement instrument is available, the above steps S310 and S320 may be omitted, and image data captured by the cameras may be used directly as the first calibration image and the second calibration image for the calibration of the first color mapping parameter in the above steps S330 and S340.
In step S210 of the present application, for each of the N standard light sources, the color mapping parameter is calibrated under that standard light source according to the method of the above steps S310 to S340, so that the N first color mapping parameters respectively corresponding to the N standard light sources can be determined.
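One way to carry out the least-squares computation of step S340 is sketched below; it assumes the patch values of the two white-balanced calibration images have already been collected into two N x 3 arrays, and fits a 3×3 matrix that maps the second camera's patch values toward the first camera's. The mapping direction and the use of numpy's least-squares solver are illustrative assumptions.

    import numpy as np

    def fit_color_mapping(patches_cam1, patches_cam2):
        """Least-squares fit of a 3x3 matrix T such that patches_cam2 @ T.T ~ patches_cam1.

        patches_cam1, patches_cam2: N x 3 arrays of white-balanced patch RGB values.
        Returns the 3x3 first color mapping parameter for one standard light source.
        """
        T_transposed, *_ = np.linalg.lstsq(patches_cam2, patches_cam1, rcond=None)
        return T_transposed.T

    # Repeating the fit under each of the N standard light sources yields the
    # N first color mapping parameters described in step S210.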
FIG. 4 is a schematic flowchart of determining the second color mapping parameter in way 2. In way 2, the first calibration image and the second calibration image are obtained directly according to the spectral responses, and no white balance processing of the calibration images is required.
S410: Measure the spectral response curves of the two cameras.
S420: Obtain calibration images.
The specific implementation of step S410 is similar to that of the above step S310, and the specific implementation of step S420 is similar to that of the above step S320; they are not repeated here.
S430: Compute the second color mapping parameter. Specifically, for the first calibration image and the second calibration image obtained by the two cameras, the values of the three RGB color channels corresponding to each pixel in each image are obtained. Exemplarily, (R_1i^2, G_1i^2, B_1i^2) denotes the RGB values of the i-th pixel of the first calibration image, and (R_2i^2, G_2i^2, B_2i^2) denotes the RGB values of the i-th pixel of the second calibration image.
In some embodiments, according to (R_1i^2, G_1i^2, B_1i^2) and (R_2i^2, G_2i^2, B_2i^2), the least squares method or another regression method is used to obtain the second color mapping parameter T_2 corresponding to this light source.
It should be understood that in some embodiments, when no spectral response measurement instrument is available, similarly to the above method 300, the above steps S410 and S420 may be omitted, and images captured by the cameras may be used directly as the first calibration image and the second calibration image for the calibration of the second color mapping parameter in step S430.
It should also be understood that in the above method 300, the first camera and the second camera determine the first color mapping parameter according to the white-balanced calibration images, so the first color mapping parameter does not include a white balance adjustment component. In the above method 400, the first camera and the second camera determine the second color mapping parameter according to the calibration images before white balance, so the second color mapping parameter includes a white balance component.
In the process of calibrating the color mapping parameters, using a color chart or other reference data, calibration images under different light source conditions can be generated by simulation according to the spectra of the different light sources and the spectral response parameters of the cameras, which reduces the time cost of capturing calibration data, reduces the unstable factors introduced by shooting, and improves the stability and accuracy of the color mapping parameter calibration.
It should be understood that the process of calibrating the N color mapping parameters described in the above step S210 may be performed offline. After the N color mapping parameters are calibrated, the correspondence between the N standard light sources and the N color mapping parameters can be preset in the image processing device; when the image processing device performs color consistency correction on the images acquired by the multiple cameras, the calibrated color mapping parameters can be used directly without repeated calibration.
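The one-to-one correspondence between standard light sources and calibrated parameters might be preset in a device as a simple lookup structure; the illuminant names and the dictionary layout below are only an illustrative assumption.

    import numpy as np

    # Hypothetical preset: one calibrated 3x3 matrix per standard light source.
    STANDARD_ILLUMINANTS = ["A", "D50", "D65", "D75", "H", "U30", "TL84", "CWF"]

    calibrated_params = {
        name: np.eye(3)   # placeholder; a device would store the fitted matrices
        for name in STANDARD_ILLUMINANTS
    }

    def lookup_color_mapping(illuminant_name):
        """Return the preset color mapping parameter for a standard light source."""
        return calibrated_params[illuminant_name]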
S220: Correct the second image according to at least one color mapping parameter.
Specifically, since the first image is used as the target image, at least one color mapping parameter can be selected according to the image information indicated by the first image to perform color consistency correction on the second image, so that the corrected image obtained after correction is consistent in color with the first image.
In the embodiments of the present application, correcting the second image according to at least one of the N color mapping parameters mainly includes image matching and performing color compensation and white balance compensation on the second image. Since there are two ways of calibrating the color mapping parameters in the process of determining the color mapping parameters in step S210, there are correspondingly two ways in the process of performing color compensation and white balance compensation on the second image.
FIG. 5 is a schematic diagram of the second-image color correction process corresponding to color mapping parameter calibration way 1.
S510: Image matching. The purpose of image matching is to determine the common image region of the first image and the second image. According to the common image regions in the two images, the color difference between the first image and the second image can be determined, and the color compensation parameter and the white balance compensation parameter are computed from this color difference to perform color correction on the second image. The common image region of the two images can be determined by various methods, such as methods based on feature point matching, three-dimensional space projection, template matching, or machine learning, which are not limited in the embodiments of the present application.
FIG. 6 is a schematic diagram of one way of performing image matching between the first image and the second image. In some embodiments, when the relative positions of the two cameras are fixed, stable and fast image matching can be obtained in combination with the camera calibration. As shown in FIG. 6, in the image matching process, parameters such as the scaling factors of the two images and the offsets of the common image region on the two images are obtained according to the calibration of the multiple cameras (for example, the positions and focal lengths of the cameras). The two images are scaled according to the above parameters, and the approximate search range of the common image region is determined according to the offsets. Exemplarily, as shown in (a) of FIG. 6, a search start point and a search end point can be determined on the first image, image matching is performed within the range between the search start point and the search end point, and the common image region of the first image and the second image is finally obtained.
In other embodiments, considering that the two images are not perfectly synchronized, in order to obtain more accurate image matching, methods based on feature points, brightness, or edge template matching can also be used to achieve more precise matching near the search range.
When the positions of the multiple cameras are relatively fixed, the search area for image matching can be narrowed in combination with the calibration information of the cameras, achieving fast image matching. In addition, template matching can also achieve a good image matching effect in small-image scenarios, meeting the real-time requirement of color consistency correction.
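A calibration-guided search of this kind could look like the sketch below, which takes an initial offset from calibration and refines it with a small sum-of-absolute-differences search; the offsets, window size, search radius, and the assumption that the common region starts at the top-left corner of the second image are all illustrative.

    import numpy as np

    def refine_common_region(img1, img2, offset_xy, size_hw, radius=8):
        """Locate the common image region of img2 inside img1.

        img1, img2: H x W x 3 arrays from the first and second cameras, already
                    scaled to a common resolution using the calibrated scale factors.
        offset_xy:  (x, y) initial offset of the common region in img1, from calibration.
        size_hw:    (height, width) of the common region taken from img2.
        radius:     half-width of the search window around the initial offset.
        """
        h, w = size_hw
        template = img2[:h, :w].mean(axis=2)   # luma-like template from the second image
        best, best_xy = np.inf, offset_xy
        x0, y0 = offset_xy
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                x, y = x0 + dx, y0 + dy
                if x < 0 or y < 0 or y + h > img1.shape[0] or x + w > img1.shape[1]:
                    continue
                candidate = img1[y:y + h, x:x + w].mean(axis=2)
                sad = np.abs(candidate - template).mean()   # mean absolute difference
                if sad < best:
                    best, best_xy = sad, (x, y)
        return best_xy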
S520: Compute the white balance compensation parameter of the second image relative to the first image. The white balance compensation parameter can be computed according to the color information of the common image regions of the two images. The purpose of this step is to compute the color difference of the common image regions of the two images, and to compute, according to this color difference, the white balance compensation parameter required by the second image relative to the first image. The white balance compensation parameter computation includes the following steps:
Step 1: Apply white balance gains to the two images to obtain two white-balanced images. Specifically, as shown in FIG. 1, in some embodiments, the first image acquired by the first camera and the second image acquired by the second camera pass through the image sensors and then enter the preprocessing module for preprocessing. The preprocessing module can compute the white balance gain for each image, and the color correction module performs white balance processing on that image according to the white balance gain. Subsequent steps 2 to 5 compute the white balance compensation parameter based on the two white-balanced images obtained in step 1.
Step 2: Divide the common image regions of the white-balanced first image and second image into blocks.
In some embodiments, in order to reduce the amount of computation in subsequent calculations and improve the efficiency of image processing, the common image region obtained by image matching in step S510 can be divided into blocks. Exemplarily, the common image region can be divided into M blocks according to the spatial position, color similarity, edge information, semantic information, and so on of the image, where M is a positive integer, and subsequent computation is performed in units of blocks. The embodiments of the present application do not limit the way the image is divided into blocks. The color value of each block after division can be represented by the average of the three-channel colors of all the pixels of the block. Exemplarily, if a block includes 10 pixels, the R, G, and B three-channel color values of the block can be represented by the R average, G average, and B average of the 10 pixels, respectively. The color value of a block can also be represented by the mode, median, or the like of the three-channel color values of all the pixels of the block, which is not limited in the embodiments of the present application.
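For illustration, a block representation of this kind can be computed by averaging each channel over a regular grid; the grid size below is an assumed choice, and the region is assumed to be at least as large as the grid.

    import numpy as np

    def block_means(region_rgb, blocks_y=8, blocks_x=8):
        """Represent a common image region by per-block mean RGB values.

        region_rgb: H x W x 3 array (a common image region).
        Returns a (blocks_y * blocks_x) x 3 array, one mean RGB triple per block.
        """
        h, w, _ = region_rgb.shape
        ys = np.linspace(0, h, blocks_y + 1, dtype=int)
        xs = np.linspace(0, w, blocks_x + 1, dtype=int)
        means = [region_rgb[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].reshape(-1, 3).mean(axis=0)
                 for i in range(blocks_y) for j in range(blocks_x)]
        return np.asarray(means)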
In other embodiments, the common image region may not be divided into blocks; in this case, M = 1, and subsequent computation is performed in units of pixels. In this case, step 2 may be skipped.
Step 3: Assign confidence (or "weight") values to the common image region.
In some embodiments, after the common image region is divided into blocks in step 2, confidence (or weight) values can be assigned to the blocks of the common image region according to information such as the brightness, saturation, and color characteristics of each block. For example, overexposed and oversaturated blocks can be discarded; for example, smaller confidences (or weights) can be assigned to overexposed and oversaturated blocks, and larger confidences (or weights) can be assigned to blocks close to gray.
In other embodiments, the common image region is not divided into blocks in step 2 (or step 2 is not performed); in this case, confidence (or weight) values can be assigned to the pixels of the common image region according to information such as the brightness, saturation, and color characteristics of each pixel. For example, overexposed and oversaturated pixels can be discarded; for example, smaller confidences (or weights) can be assigned to overexposed and oversaturated pixels, and larger confidences (or weights) can be assigned to pixels close to gray.
In other embodiments, all blocks of the common image region, or all pixels of the common image region, may also be assigned the same weight.
The embodiments of the present application do not limit the way of assigning the confidence (or weight) or the magnitude of the weight.
Step 4: Compute the color features of the common image regions of the two images.
For example, the weighted averages over the three color channels of the blocks of the common image regions of the first image and the second image can be computed separately. Specifically, in step 3, weights are assigned to the three-channel color values of each block; therefore, the R-channel color feature of the common image region of the first image is the weighted average of the R values of the blocks, the G-channel color feature is the weighted average of the G values of the blocks, and the B-channel color feature is the weighted average of the B values of the blocks. The color features of the common image region of the second image can be computed in a similar manner.
For example, the weighted color histograms of the blocks of the common image regions of the first image and the second image can be computed separately. A color histogram can represent the frequency with which a color value appears in an image. Specifically, in step 3, weights are assigned to the three-channel color values of each block. Taking the common image region of the first image as an example, in the R-channel color histogram, the frequency corresponding to each R value is the weighted sum of the occurrences of that R value over the blocks. For example, if the common image region of the first image includes 2 blocks, block 1 has a weight of 1 and an R value of 255, and block 2 has a weight of 3 and an R value of 255, then in the weighted color histogram of the R channel of the image, the frequency corresponding to the value 255 is 1×1 + 1×3 = 4.
For example, the weighted averages over the three color channels of the pixels of the common image regions of the first image and the second image can be computed separately; the computation is similar to the block-based computation described above and is not repeated here.
For example, the weighted color histograms of the pixels of the common image regions of the first image and the second image can be computed separately; the computation is similar to the block-based computation described above and is not repeated here.
Step 5: Compute the white balance compensation parameter of the second image with respect to the first image. Specifically, the color difference can be computed according to the color features of the common image regions of the two images extracted in step 4, so as to obtain the white balance compensation parameter of the second image relative to the first image.
Exemplarily, the weighted average values of the three color channels of the common image regions of the two images obtained in step 4 can be compared to compute the white balance compensation parameter of the second image relative to the first image;
Exemplarily, the white balance compensation parameter of the second image relative to the first image can be computed according to the weighted color histogram features of the common image regions of the two images obtained in step 4;
Exemplarily, the white balance compensation parameter of the second image relative to the first image can be computed by histogram matching according to the weighted color histogram features of the common image regions of the two images obtained in step 4.
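A concrete instance of steps 3 to 5, using per-block weights and weighted channel averages (only one of the options listed above), might look like the following sketch; the particular weighting rule and thresholds are illustrative assumptions.

    import numpy as np

    def block_weights(blocks_rgb, overexposed=0.95, max_weight=4.0):
        """Smaller weights for over-exposed or saturated blocks, larger for grayish ones."""
        brightness = blocks_rgb.max(axis=1)
        saturation = blocks_rgb.max(axis=1) - blocks_rgb.min(axis=1)
        weights = np.where(brightness > overexposed, 0.1, 1.0)
        weights = weights * (1.0 + (max_weight - 1.0) * (1.0 - np.clip(saturation * 4, 0, 1)))
        return weights

    def wb_compensation(blocks1, blocks2, weights):
        """White balance compensation gains for the second image relative to the first.

        blocks1, blocks2: M x 3 block mean RGB values of the two common image regions.
        Returns per-channel gains that move the weighted means of blocks2 onto blocks1.
        """
        mean1 = np.average(blocks1, axis=0, weights=weights)   # weighted R, G, B means
        mean2 = np.average(blocks2, axis=0, weights=weights)
        return mean1 / mean2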
S530: Compute the color compensation parameter of the second image relative to the first image according to the first color mapping parameters. The color compensation parameter can be computed according to the color information of the common image regions of the two images and the color mapping parameters. Since the first color mapping parameter determined by way 1 in step S210 does not include a white balance compensation component, in the second-image color correction process of step S220, the white balance compensation parameter of the second image with respect to the first image needs to be computed first (step S520), and then the color compensation parameter of the second image with respect to the first image is computed. Specifically, in this step, the color mapping parameter applicable to the scene is determined according to at least one first color mapping parameter, so that the color difference between the two images after color compensation is minimized.
In some embodiments, a global search can be used to find, among the N first color mapping parameters, a suitable first color mapping parameter as the first target color mapping parameter, such that the difference between the third image, generated by transforming the common image region of the second image with this first color mapping parameter, and the common image region of the first image is minimized. For example, the search can be performed according to formula (2):

    T_target^1 = argmin over T_i^1 (i = 1, ..., N) of  Σ_{m=1}^{M} Dis( (R_1m, G_1m, B_1m), T_i^1 · (R_2m, G_2m, B_2m)^T )    (2)

where N denotes the number of standard light sources; T_i^1 denotes the first color mapping parameter corresponding to the i-th standard light source among the N standard light sources; M denotes the number of blocks, or the number of pixels, of the common image regions of the first image and the second image; (R_1m, G_1m, B_1m) and (R_2m, G_2m, B_2m) denote the three-channel values of the m-th block (or the m-th pixel) of the common image region of the first image and of the common image region of the second image, respectively; T_i^1 · (R_2m, G_2m, B_2m)^T denotes the three-channel values of the m-th block (or the m-th pixel) of the third image generated after the color mapping parameter T_i^1 is applied to the common image region of the second image; and the Dis() function is used to compute the image difference. Exemplarily, the absolute-value distance, the Euclidean distance, or another color difference metric can be used; exemplarily, the image colors can also be converted from the RGB space to another color space to measure the image difference.
Through the above formula (2), one first color mapping parameter can be selected from the N first color mapping parameters as the first target color mapping parameter, and the first target color mapping parameter is used as the color compensation parameter of the second image, so that the difference between the second image acted on by this color compensation parameter and the first image is minimized.
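A direct reading of formula (2) as a global search over the N calibrated parameters can be sketched as follows; the Euclidean distance is used here as the Dis() metric, which is only one of the options the text allows, and the block values are assumed to have been extracted already.

    import numpy as np

    def select_target_mapping(blocks1, blocks2, candidate_params, weights=None):
        """Pick the color mapping parameter that minimizes the color difference, per formula (2).

        blocks1, blocks2:  M x 3 block values of the common regions of the two images.
        candidate_params:  list of N calibrated 3x3 matrices, one per standard light source.
        Returns (best_matrix, per_candidate_costs).
        """
        if weights is None:
            weights = np.ones(len(blocks1))
        costs = []
        for T in candidate_params:
            mapped = blocks2 @ T.T                          # "third image" block values
            diff = np.linalg.norm(mapped - blocks1, axis=1) # Euclidean color difference
            costs.append(np.sum(weights * diff))
        costs = np.asarray(costs)
        return candidate_params[int(costs.argmin())], costs

The returned cost vector also makes it straightforward to take the few lowest-cost candidates and fuse them with a weighted average, as described in the next paragraph.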
In other embodiments, multiple first color mapping parameters can also be selected; the multiple first color mapping parameters are those that make the difference between the above third images and the common image region of the first image smallest, and the multiple color mapping parameters are fused according to a preset rule to obtain the first target color mapping parameter, which is used as the color compensation parameter of the second image. For example, a weighted value of the multiple first color mapping parameters can be used as the first target color mapping parameter, and this first target color mapping parameter is used as the color compensation parameter of the second image.
In other embodiments, the ambient light source information of the first image can be determined according to the white balance gain of the first image in step S520; then, according to the estimated light source information, the light source closest to the ambient light source indicated by the first image is selected from the N standard light sources, and the first color mapping parameter corresponding to this standard light source is used as the first target color mapping parameter, which is the color compensation parameter of the second image. Alternatively, multiple light sources closest to the ambient light source indicated by the first image are selected from the N standard light sources according to the estimated light source information, and the multiple first color mapping parameters corresponding to these standard light sources are fused (for example, as a weighted average of the multiple first color mapping parameters) into the first target color mapping parameter. The first target color mapping parameter can be used as the color compensation parameter of the second image.
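The light-source-guided variant could be illustrated as below: the ambient light source is characterized by the first image's white balance gains, the closest standard light sources are found by comparing gains, and their parameters are blended. The per-illuminant reference gain table and the inverse-distance weighting are illustrative assumptions.

    import numpy as np

    def blend_by_illuminant(wb_gain_first, illuminant_gains, illuminant_params, k=2):
        """Blend the parameters of the k standard light sources whose reference white
        balance gains are closest to the first image's gains.

        wb_gain_first:     (3,) white balance gains of the first image (R, G, B).
        illuminant_gains:  dict illuminant name -> (3,) reference gains.
        illuminant_params: dict illuminant name -> 3x3 calibrated matrix.
        """
        names = list(illuminant_gains)
        dists = np.array([np.linalg.norm(wb_gain_first - illuminant_gains[n]) for n in names])
        nearest = np.argsort(dists)[:k]
        w = 1.0 / (dists[nearest] + 1e-6)   # inverse-distance weights
        w = w / w.sum()
        blended = sum(wi * illuminant_params[names[i]] for wi, i in zip(w, nearest))
        return blended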
It should be understood that, since in way 1 the first color mapping parameter is determined from the white-balanced first and second images, the first target color mapping parameter T_target^1 does not include a white balance component.
S540: Second-image color correction. Specifically, in this step, the white balance compensation parameter computed in step S520 and the color compensation parameter computed in step S530 are applied to the second image to obtain the color-corrected corrected image. The corrected image obtained after color correction of the second image can be kept consistent in color with the first image.
FIG. 7 is a schematic diagram of the second-image color correction process corresponding to color mapping parameter calibration way 2.
S710: Image matching. The purpose of image matching is to determine the common image region of the two images, which is used for the subsequent computation of the color compensation parameter and the white balance compensation parameter. The image matching process of this step is similar to that in the above step S510 and is not repeated here.
S720: Compute the color compensation parameter of the second image relative to the first image. The process of computing the color compensation parameter includes two steps:
Step 1: Compute the second target color mapping parameter. Step 2: Compute the color compensation parameter.
When computing the second target color mapping parameter in step 1, according to the scheme described in step S530 and formula (2), one of the N second color mapping parameters can be selected as the second target color mapping parameter, or multiple second color mapping parameters can be selected and fused as the second target color mapping parameter. For brevity, this is not detailed again.
Step 2: Compute the color compensation parameter. The color compensation parameter of the second image can be computed according to formula (3):

    G = CC_mat · WB_1 · T_target^2    (3)

where T_target^2 denotes the second target color mapping parameter obtained in step 1; WB_1 and CC_mat denote the white balance gain and the color reproduction parameter of the first image, respectively, which can be obtained from the preprocessing step shown in FIG. 1; and G denotes the color compensation parameter, which includes a white balance gain component and a color reproduction parameter component.
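Under the column-vector convention used for formula (1) above, the product in formula (3) might be composed as in the following sketch; treating the white balance gains as a diagonal matrix and this particular multiplication order are assumptions made for illustration.

    import numpy as np

    def color_compensation_parameter(T_target, wb_gain_first, cc_mat_first):
        """Compose the color compensation parameter G of formula (3).

        T_target:      3x3 second target color mapping parameter.
        wb_gain_first: (3,) white balance gains of the first image.
        cc_mat_first:  3x3 color reproduction (color correction) matrix of the first image.
        """
        return cc_mat_first @ np.diag(wb_gain_first) @ T_target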
S730: Compute the white balance compensation parameter of the second image relative to the first image. White balance compensation is computed for the color-compensated second image to cope with complex and changeable scenes.
Step 1: Apply the white balance gain WB_1 and the color reproduction parameter CC_mat to the common image region of the first image. The white balance gain and color reproduction parameter of the first image can be computed in the preprocessing step shown in FIG. 1.
Step 2: Apply the color compensation parameter G obtained in step S720 to the common image region of the second image.
Step 3: Compute the white balance compensation parameter according to the processed common region of the first image obtained in step 1 and the processed common region of the second image obtained in step 2. The computation of the white balance compensation is similar to the method described in step S520 and, to avoid repetition, is not detailed again.
In other embodiments, step S730 can also be implemented by the following steps:
Step 1: Apply the color mapping parameter (the second target color mapping parameter obtained in step S720) to the common image region of the second image.
Step 2: Compute the white balance compensation parameter according to the common image region of the first image before white balance and the processed common image region of the second image obtained in step 1. By comparing the overall color difference between the common region of the first image before white balance and the common region of the second image transformed by the color mapping parameter, the amount of white balance compensation in the scene can be adjusted dynamically. The computation of the white balance compensation parameter is similar to the method described in step S520 and, to avoid repetition, is not detailed again.
S740: Second-image color correction. Specifically, in this step, the color compensation parameter computed in step S720 and the white balance compensation parameter computed in step S730 are applied to the second image to obtain the color-corrected corrected image. The corrected image obtained after color correction of the second image can be kept consistent in color with the first image.
In the embodiments of the present application, the color consistency correction of the second image includes color compensation and white balance compensation, and the white balance compensation and the color compensation can be performed according to the white balance compensation parameter and the color compensation parameter, respectively. White balance compensation corrects the consistency of the gray regions of the image, and color compensation corrects the consistency of the colored regions of the image; by combining the two corrections, both the gray regions and the colored regions of the image can achieve a good correction effect.
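Putting the two compensations together, the correction of the second image might be sketched as follows; applying the white balance compensation after the color compensation mirrors the order of method 700, and the clipping range assumes normalized pixel values.

    import numpy as np

    def correct_second_image(image2, color_comp, wb_comp):
        """Apply color compensation and white balance compensation to the second image.

        image2:     H x W x 3 second image.
        color_comp: 3x3 color compensation parameter G.
        wb_comp:    (3,) white balance compensation gains (e.g. from step S730 or S520).
        """
        h, w, _ = image2.shape
        out = image2.reshape(-1, 3) @ color_comp.T   # color compensation
        out = out * wb_comp                          # white balance compensation
        return np.clip(out, 0.0, 1.0).reshape(h, w, 3)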
图8是本申请实施例的一种多相机色彩一致性校正装置的示意图。如图8所示,本申请实施例的多相机色彩一致性校正装置包括获取模块810、确定模块820和校正模块830。
获取模块810,用于获取第一相机的拍摄的第一图像和第二相机拍摄的第二图像。
确定模块820,用于根据第一图像指示的图像信息从N个色彩映射参数中确定至少一个色彩映射参数,图像信息包括第一图像的色彩信息和第一图像的环境光源中的至少一个,色彩映射参数指示了第一相机拍摄的图像与第二相机拍摄的图像的色彩转换关系,N个色彩映射参数与N个标准光源一一对应。
校正模块830,用于根据至少一个色彩映射参数对第二图像进行色彩一致性校正,以获得校正图像,校正图像与第一图像的色彩保持一致。
在一些实施例中,确定模块820和校正模块830可以实现上述方法200中的步骤S220中确定色彩映射参数,并根据色彩映射参数对第二图像进行色彩一致性校正的功能。具体地,可以用于实现图5至图7所示的色彩一致性校正方法500和方法700。确定模块820和校正模块830的具体功能和有益效果可以参见上述方法中的描述,为了简洁,在此不再赘述。
在一些实施例中,图8所示的色彩一致性校正装置可以实现图1所示的校正模块的功能。
在一些实施例中,确定模块820还可以用于确定图1所示的应用场景中的色彩映射参 数,即可以实现方法200中的步骤S210中确定色彩映射参数的功能。具体来说,确定模块810可以用于实现图3所示的色彩映射参数标定方法300中的各个步骤,以及实现图4所示的色彩映射参数标定方法400中的各个步骤。在这种情况下,确定模块820的具体功能和有益效果可以参见上述方法中的描述,为了简洁不再详述。
应理解,图8所示的色彩一致性校正装置800仅仅包括获取模块810、确定模块820和校正模块830,在其他实施例中,该色彩一致性校正装置还可以包括其他模块或部件,例如图1所示的预处理模块、后处理模块等,本申请实施例对此不做限定。
图9是本申请实施例的另一种多相机色彩一致性校正装置的示意图。如图9所示的色彩一致性校正装置900包括存储器910、处理器920、通信接口930以及总线940。其中,存储器910、处理器920、通信接口930通过总线940实现彼此之间的通信连接。
存储器910可以是只读存储器(read only memory,ROM),静态存储设备,动态存储设备或者随机存取存储器(random access memory,RAM)。存储器910可以存储程序,当存储器910中存储的程序被处理器920执行时,处理器920用于执行本申请实施例的多相机色彩一致性校正方法的各个步骤,例如,执行图2至图7所示的各个步骤。
应理解,本申请实施例所示的色彩一致性校正装置可以是服务器,例如,可以是云端的服务器,或者,也可以是配置于云端的服务器中的芯片;或者,本申请实施例所示的色彩一致性校正装置可以是智能终端,也可以是配置于智能终端中的芯片。
上述本申请实施例揭示的色彩一致性校正方法可以应用于处理器920中,或者由处理器920实现。处理器920可能是一种集成电路芯片,具有信号的处理能力。在实现过程中,上述色彩一致性校正方法的各步骤可以通过处理器920中的硬件的集成逻辑电路或者软件形式的指令完成。
上述的处理器920可以是中央处理器(central processing unit,CPU)、图像信号处理器(image signal processor,ISP)、图形处理器(graphics processing unit,GPU)、通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存取存储器(random access memory,RAM)、闪存、只读存储器(read-only memory,ROM)、可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器910,处理器920读取存储器910中的指令,结合其硬件完成本申请实施中图2至图7所示的色彩一致性校正方法的各个步骤。
通信接口930使用例如但不限于收发器一类的收发装置,来实现装置900与其他设备或通信网络之间的通信。
总线940可包括在色彩一致性校正装置900各个部件(例如,存储器910、处理器920、通信接口930)之间传送信息的通路。
应注意,尽管上述色彩一致性校正装置900仅仅示出了存储器、处理器、通信接口,但是在具体实现过程中,本领域的技术人员应当理解,色彩一致性校正装置900还可以包 括实现正常运行所必须的其他器件。同时,根据具体需要本领域的技术人员应当理解,上述色彩一致性校正装置900还可包括实现其他附加功能的硬件器件。此外,本领域的技术人员应当理解,上述色彩一致性校正装置900也可仅仅包括实现本申请实施例所必须的器件,而不必包括图9中所示的全部器件。
本申请实施例还提供了一种计算机可读介质,该计算机可读介质存储有计算机程序(也可以称为代码,或指令)当其在计算机上运行时,使得计算机执行上述任一方法实施例中的色彩一致性校正方法。
本申请实施例还提供了一种芯片系统,包括存储器和处理器,该存储器用于存储计算机程序,该处理器用于从存储器中调用并运行该计算机程序,使得安装有该芯片系统的设备或装置执行上述任一方法实施例中的方法。
其中,该芯片系统可以包括用于发送信息或数据的输入电路或者接口,以及用于接收信息或数据的输出电路或者接口。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在存储介质中,或者从一个存储介质向另一个存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述存储介质可以是计算机能够存取的任何可用介质或者是包括一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,高密度数字视频光盘(digital video disc,DVD))、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
应理解,说明书通篇中提到的“一些实施例”或“一实施例”意味着与实施例有关的特定特征、结构或特性包括在本申请的至少一个实施例中。因此,在整个说明书各处出现的“在一些实施例中”或“在一实施例中”未必一定指相同的实施例。此外,这些特定的特征、结构或特性可以任意适合的方式结合在一个或多个实施例中。应理解,在本申请的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
在本说明书中使用的术语“部件”、“模块”、“系统”等用于表示计算机相关的实体、硬件、固件、硬件和软件的组合、软件、或执行中的软件。例如,部件可以是但不限于,在处理器上运行的进程、处理器、对象、可执行文件、执行线程、程序和/或计算机。通过图示,在计算设备上运行的应用和计算设备都可以是部件。一个或多个部件可驻留在进程和/或执行线程中,部件可位于一个计算机上和/或分布在2个或更多个计算机之间。此外,这些部件可从在上面存储有各种数据结构的各种计算机可读介质执行。部件可例如根据具有一个或多个数据分组(例如来自与本地系统、分布式系统和/或网络间的另一部件交互的二个部件的数据,例如通过信号与其它系统交互的互联网)的信号通过本地和/或远程进程来通信。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各种说明性逻辑块(illustrative logical block)和步骤(step),能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的装置和模块的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,可以理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个模块或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。
所述作为分离部件说明的模块可以是或者也可以不是物理上分开的,作为模块显示的部件可以是或者也可以不是物理模块,即可以位于一个地方,或者也可以分布到多个网络模块上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能模块可以集成在一个处理单元中,也可以是各个模块单独物理存在,也可以两个或两个以上模块集成在一个单元中。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (20)

  1. 一种多相机色彩一致性校正方法,其特征在于,所述方法包括:
    获取第一相机拍摄的第一图像和第二相机拍摄的第二图像;
    根据所述第一图像指示的图像信息从N个色彩映射参数中确定至少一个所述色彩映射参数,所述图像信息包括所述第一图像的色彩信息和所述第一图像的环境光源中的至少一个,所述色彩映射参数指示了所述第一相机拍摄的图像与所述第二相机拍摄的图像的色彩转换关系,所述N个色彩映射参数与N个标准光源一一对应,N为正整数;
    根据所述至少一个所述色彩映射参数对所述第二图像进行色彩一致性校正,以获得校正图像。
  2. 根据权利要求1所述的方法,其特征在于,在所述从N个色彩映射参数中确定至少一个所述色彩映射参数之前,所述方法还包括:
    确定每个所述标准光源下的第一标定图像和第二标定图像,所述第一标定图像和所述第二标定图像分别是根据所述第一相机和所述第二相机的光谱响应曲线生成的色卡图像;
    根据所述第一标定图像和所述第二标定图像确定每个所述标准光源对应的所述色彩映射参数。
  3. 根据权利要求1或2所述的方法,其特征在于,所述根据至少一个所述色彩映射参数对第二图像进行色彩一致性校正包括:
    确定所述第一图像和所述第二图像的公共图像区域;
    根据所述公共图像区域和所述至少一个色彩映射参数确定色彩补偿参数;
    根据所述公共图像区域确定白平衡补偿参数;
    根据所述白平衡补偿参数和所述色彩补偿参数对所述第二图像进行色彩一致性校正。
  4. 根据权利要求3所述的方法,其特征在于,所述确定所述第一图像和所述第二图像的公共图像区域包括:
    根据所述第一相机和所述第二相机的相对位置和视野范围,确定搜索区域;
    根据所述搜索区域确定所述公共图像区域。
  5. 根据权利要求3或4所述的方法,其特征在于,所述根据所述公共图像区域和所述至少一个色彩映射参数确定色彩补偿参数包括:
    分别将N个所述色彩映射参数施加到所述第二图像中的所述公共图像区域,得到N个第三图像;
    分别计算所述第一图像中的所述公共图像区域与每个所述第三图像的色彩差异;
    根据所述色彩差异确定至少一个所述色彩映射参数,所述至少一个色彩映射参数是至少一个所述色彩差异最小的第三图像对应的色彩映射参数;
    根据所述至少一个色彩映射参数确定目标色彩映射参数,所述目标色彩映射参数是所述至少一个色彩映射参数的加权值;
    根据所述目标色彩映射参数确定所述色彩补偿参数。
  6. 根据权利要求3或4所述的方法,其特征在于,所述根据所述公共图像区域和所述至少一个色彩映射参数确定色彩补偿参数包括:
    根据所述第一图像中的所述公共图像区域的白平衡增益确定环境光源;
    根据所述环境光源确定至少一个所述标准光源对应的至少一个所述色彩映射参数,所述至少一个标准光源与所述环境光源的差异最小;
    根据所述至少一个所述色彩映射参数确定目标色彩映射参数,所述目标色彩映射参数是所述至少一个色彩映射参数的加权值;
    根据所述目标色彩映射参数确定所述色彩补偿参数。
  7. 根据权利要求5或6所述的方法,其特征在于,所述色彩补偿参数是所述目标色彩映射参数,或者,所述色彩补偿参数是所述目标色彩映射参数与所述第一图像的白平衡增益以及色彩还原参数的乘积。
  8. 根据权利要求3-7中任一项所述的方法,其特征在于,所述根据所述公共图像区域确定白平衡补偿参数包括:
    分别确定第一图像中的所述公共图像区域的像素在三个颜色通道的加权平均值或者加权颜色直方图;
    分别确定第二图像中的所述公共图像区域的像素在三个颜色通道的加权平均值或者加权颜色直方图;
    根据所述三个颜色通道的加权平均值或者所述加权颜色直方图,确定所述白平衡补偿参数。
  9. 根据权利要求3-7中任一项所述的方法,其特征在于,在确定所述白平衡补偿参数之前,所述方法还包括:
    根据所述公共图像区域的空间位置、色彩相似性和边缘信息,将所述公共图像区域划分为M块,M为正整数;
    所述根据所述公共图像区域确定白平衡补偿参数包括:
    分别确定第一图像中的所述公共图像区域的图像块在三个颜色通道的加权平均值或者加权颜色直方图;
    分别确定第二图像中的所述公共图像区域的图像块在三个颜色通道的加权平均值或者加权颜色直方图;
    根据所述三个颜色通道的加权平均值或者所述加权颜色直方图,确定所述白平衡补偿参数。
  10. 一种多相机色彩一致性校正装置,其特征在于,所述装置包括:
    获取模块,用于获取第一相机拍摄的第一图像和第二相机拍摄的第二图像;
    确定模块,用于根据所述第一图像指示的图像信息从N个色彩映射参数中确定至少一个所述色彩映射参数,所述图像信息包括所述第一图像的色彩信息和所述第一图像的环境光源中的至少一个,所述色彩映射参数指示了所述第一相机拍摄的图像与所述第二相机拍摄的图像的色彩转换关系,所述N个色彩映射参数与N个标准光源一一对应,N为正整数;
    校正模块,用于根据所述至少一个所述色彩映射参数对第二图像进行色彩一致性校正,以获得校正图像。
  11. 根据权利要求10所述的装置,其特征在于,在所述从N个色彩映射参数中确定至少一个所述色彩映射参数之前,所述确定模块具体用于:
    确定每个所述标准光源下的第一标定图像和第二标定图像,所述第一标定图像和所述第二标定图像分别是根据所述第一相机和所述第二相机的光谱响应曲线生成的色卡图像;
    根据所述第一标定图像和所述第二标定图像确定每个所述标准光源对应的所述色彩映射参数。
  12. 根据权利要求10或11所述的装置,其特征在于,所述确定模块具体用于:
    确定所述第一图像和所述第二图像的公共图像区域;
    根据所述公共图像区域和所述至少一个所述色彩映射参数确定色彩补偿参数;
    根据所述公共图像区域确定白平衡补偿参数;
    所述校正模块具体用于:
    根据所述白平衡补偿参数和所述色彩补偿参数对所述第二图像进行色彩一致性校正。
  13. 根据权利要求12所述的方法,其特征在于,所述确定模块具体用于:
    根据所述第一相机和所述第二相机的相对位置和视野范围,确定搜索区域;
    根据所述搜索区域确定所述公共图像区域。
  14. 根据权利要求12或13所述的装置,其特征在于,所述确定模块具体用于:
    分别将N个所述色彩映射参数施加到第二图像中的所述公共图像区域,得到N个第三图像;
    分别计算所述第一图像中的公共图像区域与每个所述第三图像的色彩差异;
    根据所述色彩差异确定至少一个色彩映射参数,所述至少一个色彩映射参数是至少一个所述色彩差异最小的第三图像对应的色彩映射参数;
    根据所述至少一个色彩映射参数确定目标色彩映射参数,所述目标色彩映射参数是所述至少一个色彩映射参数的加权值;
    根据所述目标色彩映射参数确定所述色彩补偿参数。
  15. 根据权利要求12或13所述的装置,其特征在于,所述确定模块具体用于:
    根据所述第一图像中的所述公共图像区域的白平衡增益确定环境光源;
    根据所述环境光源确定至少一个所述标准光源对应的至少一个所述色彩映射参数,所述至少一个标准光源与所述环境光源的差异最小;
    根据所述至少一个所述色彩映射参数确定目标色彩映射参数,所述目标色彩映射参数是所述至少一个色彩映射参数的加权值;
    根据所述目标色彩映射参数确定所述色彩补偿参数。
  16. 根据权利要求14或15所述的装置,其特征在于,所述色彩补偿参数是所述目标色彩映射参数,或者,所述色彩补偿参数是所述目标色彩映射参数与所述第一图像的白平衡增益以及色彩还原参数的乘积。
  17. 根据权利要求12-16中任一项所述的装置,其特征在于,所述确定模块具体用于:
    分别确定第一图像中的所述公共图像区域的像素在三个颜色通道的加权平均值或者加权颜色直方图;
    分别确定第二图像中的所述公共图像区域的像素在三个颜色通道的加权平均值或者加权颜色直方图;
    根据所述三个颜色通道的加权平均值或者所述加权颜色直方图,确定所述白平衡补偿参数。
  18. 根据权利要求12-16中任一项所述的装置,其特征在于,在确定所述白平衡补偿参数之前,所述确定模块还用于:
    根据所述公共图像区域的空间位置、色彩相似性和边缘信息,将所述公共图像区域划分为M块,M为正整数;
    分别确定第一图像中的所述公共图像区域的图像块在三个颜色通道的加权平均值或者加权颜色直方图;
    分别确定第二图像中的所述公共图像区域的图像块在三个颜色通道的加权平均值或者加权颜色直方图;
    根据所述三个颜色通道的加权平均值或者所述加权颜色直方图,确定所述白平衡补偿参数。
  19. 一种多相机色彩一致性校正装置,其特征在于,包括:处理器和存储器,所述存储器用于存储程序,所述处理器用于从存储器中调用并运行所述程序以执行权利要求1至9中任一项所述的方法。
  20. 一种计算机可读存储介质,其特征在于,包括计算机程序,当所述计算机程序在计算机上运行时,使得所述计算机执行权利要求1至9中任一项所述的方法。
PCT/CN2020/109722 2020-08-18 2020-08-18 一种多相机色彩一致性校正方法和装置 WO2022036539A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2020/109722 WO2022036539A1 (zh) 2020-08-18 2020-08-18 一种多相机色彩一致性校正方法和装置
EP20949751.0A EP4195662A4 (en) 2020-08-18 2020-08-18 METHOD AND DEVICE FOR COLOR CONSISTENCY CORRECTION FOR MULTIPLE CAMERAS
CN202080103950.0A CN116158087A (zh) 2020-08-18 2020-08-18 一种多相机色彩一致性校正方法和装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/109722 WO2022036539A1 (zh) 2020-08-18 2020-08-18 一种多相机色彩一致性校正方法和装置

Publications (1)

Publication Number Publication Date
WO2022036539A1 true WO2022036539A1 (zh) 2022-02-24

Family

ID=80323324

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/109722 WO2022036539A1 (zh) 2020-08-18 2020-08-18 一种多相机色彩一致性校正方法和装置

Country Status (3)

Country Link
EP (1) EP4195662A4 (zh)
CN (1) CN116158087A (zh)
WO (1) WO2022036539A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115409953A (zh) * 2022-11-02 2022-11-29 汉斯夫(杭州)医学科技有限公司 基于多相机颜色一致性的颌面重建方法、设备及介质
CN117478802A (zh) * 2023-10-30 2024-01-30 神力视界(深圳)文化科技有限公司 图像处理方法、装置及电子设备
WO2024055793A1 (zh) * 2022-09-15 2024-03-21 海信视像科技股份有限公司 投影设备及投影画质调整方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007323587A (ja) * 2006-06-05 2007-12-13 Matsushita Electric Ind Co Ltd 車載カメラの画像合成装置および画像合成方法
US20090147100A1 (en) * 2007-12-11 2009-06-11 Canon Kabushiki Kaisha Camera control apparatus, camera control method, and camera system
CN103796003A (zh) * 2014-01-21 2014-05-14 深圳市掌网立体时代视讯技术有限公司 一种立体摄像的图像修正方法及系统
CN105979238A (zh) * 2016-07-05 2016-09-28 深圳市德赛微电子技术有限公司 一种多摄像头全局成像一致性控制方法
CN106131527A (zh) * 2016-07-26 2016-11-16 深圳众思科技有限公司 双摄像头颜色同步方法、装置及终端
CN109218561A (zh) * 2018-11-30 2019-01-15 豪威科技(上海)有限公司 多摄像头的同步方法及装置

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201839717A (zh) * 2017-04-19 2018-11-01 睿緻科技股份有限公司 影像拼接方法及其影像拼接裝置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007323587A (ja) * 2006-06-05 2007-12-13 Matsushita Electric Ind Co Ltd 車載カメラの画像合成装置および画像合成方法
US20090147100A1 (en) * 2007-12-11 2009-06-11 Canon Kabushiki Kaisha Camera control apparatus, camera control method, and camera system
CN103796003A (zh) * 2014-01-21 2014-05-14 深圳市掌网立体时代视讯技术有限公司 一种立体摄像的图像修正方法及系统
CN105979238A (zh) * 2016-07-05 2016-09-28 深圳市德赛微电子技术有限公司 一种多摄像头全局成像一致性控制方法
CN106131527A (zh) * 2016-07-26 2016-11-16 深圳众思科技有限公司 双摄像头颜色同步方法、装置及终端
CN109218561A (zh) * 2018-11-30 2019-01-15 豪威科技(上海)有限公司 多摄像头的同步方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4195662A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024055793A1 (zh) * 2022-09-15 2024-03-21 海信视像科技股份有限公司 投影设备及投影画质调整方法
CN115409953A (zh) * 2022-11-02 2022-11-29 汉斯夫(杭州)医学科技有限公司 基于多相机颜色一致性的颌面重建方法、设备及介质
CN117478802A (zh) * 2023-10-30 2024-01-30 神力视界(深圳)文化科技有限公司 图像处理方法、装置及电子设备

Also Published As

Publication number Publication date
CN116158087A8 (zh) 2024-05-21
EP4195662A4 (en) 2023-07-26
EP4195662A1 (en) 2023-06-14
CN116158087A (zh) 2023-05-23

Similar Documents

Publication Publication Date Title
WO2022036539A1 (zh) 一种多相机色彩一致性校正方法和装置
CN108702437B (zh) 计算深度图的方法、系统、设备和存储介质
US11798147B2 (en) Image processing method and device
WO2022100242A1 (zh) 图像处理方法、装置、电子设备和计算机可读存储介质
WO2019085792A1 (en) Image processing method and device, readable storage medium and electronic device
KR102346522B1 (ko) 영상 처리 장치 및 그것의 자동 화이트 밸런싱 방법
EP3888345B1 (en) Method for generating image data for machine learning based imaging algorithms
KR20170019359A (ko) 국부적 적응형 히스토그램 등화
WO2020038255A1 (en) Image processing method, electronic apparatus, and computer-readable storage medium
WO2019011154A1 (zh) 白平衡处理方法和装置
CN109685853B (zh) 图像处理方法、装置、电子设备和计算机可读存储介质
WO2022257396A1 (zh) 图像中的色边像素点的确定方法、确定装置和计算机设备
WO2021008052A1 (zh) 3d摄影模组镜头精度的标定方法、装置及设备
WO2019029573A1 (zh) 图像虚化方法、计算机可读存储介质和计算机设备
WO2019062633A1 (zh) 颜色阴影校正的方法和装置
US20210321069A1 (en) Electronic device which adjusts white balance of image according to attributes of object in image and method for processing image by electronic device
US20190052860A1 (en) Multi-Image Color-refinement with Application to Disparity Estimation
WO2020093653A1 (zh) 色彩调整方法、色彩调整装置、电子设备及计算机可读存储介质
WO2023130922A1 (zh) 图像处理方法与电子设备
US11457189B2 (en) Device for and method of correcting white balance of image
US12015835B2 (en) Multi-sensor imaging color correction
WO2021051307A1 (zh) 图像的颜色校正方法、拍摄设备、图像的颜色校正系统
CN113542709B (zh) 投影图像亮度调整方法、装置、存储介质及投影设备
US9854218B2 (en) Electronic system and image processing method
US20230328396A1 (en) White balance correction method and apparatus, device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20949751

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020949751

Country of ref document: EP

Effective date: 20230306

NENP Non-entry into the national phase

Ref country code: DE