WO2023015993A9 - Method for determining chromaticity information and related electronic device - Google Patents

Method for determining chromaticity information and related electronic device

Info

Publication number
WO2023015993A9
WO2023015993A9 · PCT/CN2022/091965
Authority
WO
WIPO (PCT)
Prior art keywords
color temperature
electronic device
correlated color
chromaticity
cct
Prior art date
Application number
PCT/CN2022/091965
Other languages
English (en)
French (fr)
Other versions
WO2023015993A8 (zh)
WO2023015993A1 (zh)
Inventor
钱彦霖
郗东苗
金萌
王梓蓉
Original Assignee
荣耀终端有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司
Priority to US18/042,274 (US20230342977A1)
Priority to EP22850697.8A (EP4181510A4)
Publication of WO2023015993A1
Publication of WO2023015993A9
Publication of WO2023015993A8

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction

Definitions

  • the present application relates to the field of image processing, in particular to a method for determining chromaticity information and related electronic equipment.
  • White balance is an adjustment that digital camera equipment or associated software can make to a captured image to ensure that the whites in the image properly reflect the actual whites in the real-world scene in which the image was taken.
  • White balance is related to color temperature, which is a measure of the quality of light, measured in Kelvin, based on the ratio of the amount of blue light to the amount of red light in images and scenes. Images or scenes with a higher color temperature have more blue than those with a lower color temperature. Thus, "cooler” light has a higher color temperature and “hotter” light has a lower color temperature.
  • The human eye and brain can adapt to different color temperatures. For example, whether in sunlight or under various artificial lights, the human eye perceives white objects as white; that is, the human eye has color constancy. However, the charge-coupled device (Charge-coupled Device, CCD) circuit or CMOS circuit used in a camera to convert the optical signal into an electrical signal cannot correct for the color change of the light source the way the human eye does. Therefore, the chromaticity of the light source of the captured image needs to be estimated through a white balance algorithm, and the color of the image adjusted according to the estimated light source chromaticity, so that the color of the adjusted image is consistent with the color actually observed by the human eye. How to improve the accuracy of the white balance algorithm is a problem that receives more and more attention from engineers.
  • The embodiment of the present application provides a method for determining chromaticity information, which solves the problem that the chromaticity information calculated by a fusion of the AWB and AI AWB algorithms is not sufficiently accurate.
  • The embodiment of the present application provides a method for determining chromaticity information, which is applied to an electronic device. The electronic device includes a multispectral sensor, an automatic white balance module and an AI automatic white balance module, and the automatic white balance module includes multiple algorithms.
  • The method includes: the electronic device starts a camera application; the multispectral sensor acquires a first channel value, where the multispectral sensor includes a first channel and the first channel value is the value acquired by the first channel; the first channel value is sent to the automatic white balance module; the automatic white balance module selects a target algorithm from the multiple algorithms according to the first channel value; and the electronic device determines target chromaticity information according to the target algorithm.
  • In this way, the automatic white balance module can select the target algorithm according to the first channel value.
  • The target algorithm is the AWB algorithm that matches the current shooting environment, so the target chromaticity information calculated by the electronic device based on the target algorithm is more accurate, and when the electronic device uses the target chromaticity information to adjust the white balance of the image, the adjustment effect is better.
  • The method further includes: the multispectral sensor acquires a first correlated color temperature; the first correlated color temperature is sent to the AI automatic white balance module; the AI automatic white balance module determines the difference between the first correlated color temperature and a second correlated color temperature, where the second correlated color temperature is the correlated color temperature value obtained by the AI automatic white balance module according to the image collected by the camera; when the difference is greater than a preset threshold, the confidence level output by the AI automatic white balance module is adjusted to a first confidence level; and the electronic device determines the target chromaticity information according to the first confidence level.
  • In this way, the electronic device adjusts the confidence level output by the AI automatic white balance module according to the difference between the first correlated color temperature and the second correlated color temperature to obtain the first confidence level, which improves the accuracy of the first confidence level. As a result, the target chromaticity information calculated on the basis of this confidence level is more accurate, and when the electronic device uses the target chromaticity information to adjust the white balance of the image, the adjustment effect is better.
  • The method further includes: when the startup is a cold start, the multispectral sensor acquires first chromaticity information, and the electronic device determines that the target chromaticity information is the first chromaticity information. In this way, the electronic device can use the first chromaticity information to adjust the white balance of the first frame or the first few frames of images output by the camera, so that no color cast occurs in the first frame or the first few frames.
  • The method further includes: when the automatic white balance module or the AI automatic white balance module determines that the image captured by the camera is a solid-color image, the electronic device determines that the target chromaticity information is second chromaticity information, where the second chromaticity information is the chromaticity information acquired by the multispectral sensor.
  • In this way, in a solid-color scene the electronic device can call the more accurate second chromaticity information to adjust the white balance of the image, thereby solving the color cast problem of the image.
  • the first channel value includes one or more of the following: visible light channel value, NIR channel value, and Clear channel value.
  • The electronic device determines the target chromaticity information according to the first confidence level, which specifically includes: the automatic white balance module processes the image collected by the camera through the target algorithm to obtain a third correlated color temperature and a first chromaticity distance; the AI automatic white balance module obtains a fourth correlated color temperature and a second chromaticity distance based on the image collected by the camera; the electronic device corrects the first confidence level to obtain a second confidence level; the electronic device fuses the third correlated color temperature with the fourth correlated color temperature according to the second confidence level to obtain a fifth correlated color temperature; the electronic device fuses the first chromaticity distance with the second chromaticity distance according to the second confidence level to obtain a third chromaticity distance; and the electronic device obtains the target chromaticity information based on the fifth correlated color temperature and the third chromaticity distance.
  • The automatic white balance module uses the target algorithm to calculate the first chromaticity distance and the third correlated color temperature, so the calculated first chromaticity distance and third correlated color temperature are highly accurate.
  • The electronic device adjusts the confidence level output by the AI automatic white balance module according to the difference between the first correlated color temperature and the second correlated color temperature to obtain the first confidence level, which improves the accuracy of the first confidence level; because the second confidence level is obtained by correcting the first confidence level, the accuracy of the second confidence level is also improved.
  • The third chromaticity distance is obtained by fusing the first chromaticity distance with the second chromaticity distance, and the fifth correlated color temperature is obtained by fusing the third correlated color temperature with the fourth correlated color temperature, both according to the second confidence level, so the accuracy of the third chromaticity distance and the fifth correlated color temperature is also improved, and in turn the accuracy of the target chromaticity information obtained from them is improved.
  • When the electronic device uses the target chromaticity information to adjust the white balance of the image, it can therefore solve the color cast problem of the image more effectively.
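  • As a non-authoritative illustration of this fusion step, the Python sketch below assumes a simple linear blend in which the second confidence level weights the AI AWB estimate against the AWB estimate; the application only states that the values are fused "according to the second confidence degree" and does not give the exact formula, so the function name and the blending rule are illustrative assumptions.

```python
def fuse_by_confidence(cct_awb, duv_awb, cct_ai, duv_ai, confidence):
    """Blend the AWB estimate (third CCT, first Duv) with the AI AWB estimate
    (fourth CCT, second Duv) using the second confidence level as the weight.

    NOTE: linear blending is assumed for illustration only.
    """
    w = max(0.0, min(1.0, confidence))             # clamp weight to [0, 1]
    cct_fused = w * cct_ai + (1.0 - w) * cct_awb   # fifth correlated color temperature
    duv_fused = w * duv_ai + (1.0 - w) * duv_awb   # third chromaticity distance
    return cct_fused, duv_fused

# Example: AWB says 5200 K / +0.002, AI AWB says 4800 K / -0.001, confidence 0.6
print(fuse_by_confidence(5200.0, 0.002, 4800.0, -0.001, 0.6))
```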
  • The electronic device obtains the target chromaticity information based on the fifth correlated color temperature and the third chromaticity distance, which specifically includes: adjusting the fifth correlated color temperature to obtain a sixth correlated color temperature; performing tendency adjustment on the third chromaticity distance to obtain a fourth chromaticity distance; and obtaining the target chromaticity information based on the sixth correlated color temperature and the fourth chromaticity distance.
  • In this way, the electronic device can obtain highly accurate target chromaticity information according to the sixth correlated color temperature and the fourth chromaticity distance, and when the electronic device uses the target chromaticity information to adjust the white balance of the image, it can solve the color cast problem of the image more effectively.
  • The AI automatic white balance module obtains the fourth correlated color temperature and the second chromaticity distance based on the image collected by the camera, which specifically includes: the AI automatic white balance module outputs the second correlated color temperature and an initial chromaticity distance; the second correlated color temperature and the initial chromaticity distance are corrected to obtain a corrected second correlated color temperature and a corrected initial chromaticity distance; and time-series filtering is performed based on the corrected second correlated color temperature and the corrected initial chromaticity distance to obtain the fourth correlated color temperature and the second chromaticity distance. In this way, the accuracy of the fourth correlated color temperature and the second chromaticity distance obtained through time-series filtering is higher.
  • Performing time-series filtering based on the corrected second correlated color temperature and the corrected initial chromaticity distance to obtain the fourth correlated color temperature and the second chromaticity distance specifically includes: updating a first covariance matrix according to a first formula to obtain an updated first covariance matrix, where Sigma' 1 is the updated first covariance matrix, Sigma 1 is the first covariance matrix, λ 1 is a first parameter, and the first covariance matrix is the covariance matrix output by the AI automatic white balance module according to the image collected by the camera or a covariance matrix calculated based on the second confidence level; updating a second covariance matrix according to a second formula to obtain an updated second covariance matrix, where the second covariance matrix is the covariance matrix of a second image, the second image is the last frame of image collected by the camera, Sigma' 2 is the updated second covariance matrix, Sigma 2 is the second covariance matrix, and λ 2 is a second parameter; and obtaining the fourth correlated color temperature and the second chromaticity distance based on the corrected second correlated color temperature, the corrected initial chromaticity distance and the updated covariance matrices.
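  • The text above names the quantities but not the update formulas themselves, so the following Python sketch is an illustrative reconstruction rather than the patented computation: it assumes a simple inflation Sigma' = λ · Sigma for the two covariance updates and then fuses the current AI AWB estimate with the previous frame's estimate by inverse-covariance (Kalman-style) weighting to yield the fourth correlated color temperature and the second chromaticity distance.

```python
import numpy as np

def temporal_filter(est_current, sigma1, est_previous, sigma2, lam1=1.1, lam2=1.1):
    """Minimal sketch of the time-series filtering step, under assumptions.

    ASSUMPTION: Sigma' = lambda * Sigma for both covariance updates, followed by
    inverse-covariance weighting of the current estimate against the previous
    frame's estimate. est_current / est_previous are 2-vectors (CCT, Duv).
    """
    sigma1p = lam1 * sigma1                  # "updated first covariance matrix"
    sigma2p = lam2 * sigma2                  # "updated second covariance matrix"
    w1 = np.linalg.inv(sigma1p)
    w2 = np.linalg.inv(sigma2p)
    fused_cov = np.linalg.inv(w1 + w2)
    fused = fused_cov @ (w1 @ est_current + w2 @ est_previous)
    return fused, fused_cov                  # -> fourth CCT and second Duv, with covariance

# Example with hypothetical numbers.
cur = np.array([4800.0, -0.001]); prev = np.array([5000.0, 0.000])
s1 = np.diag([200.0 ** 2, 0.002 ** 2]); s2 = np.diag([150.0 ** 2, 0.0015 ** 2])
print(temporal_filter(cur, s1, prev, s2))
```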
  • In this way, the electronic device can obtain highly accurate target chromaticity information according to the fifth correlated color temperature and the third chromaticity distance, and when the electronic device uses the target chromaticity information to adjust the white balance of the image, it can solve the color cast problem of the image more effectively.
  • the image color can be more in line with the user's visual habit on the basis of solving the image color cast.
  • The third chromaticity distance is adjusted to obtain the fourth chromaticity distance; the adjustment is determined according to the brightness value of the image collected by the camera, the fifth correlated color temperature and the third chromaticity distance.
  • the image color can be more in line with the user's visual habit on the basis of solving the image color cast.
  • An embodiment of the present application provides an electronic device, which includes one or more processors and a memory; the memory is coupled to the one or more processors and is used to store computer program code.
  • The computer program code includes computer instructions that are invoked by the one or more processors to cause the electronic device to execute: the electronic device starts a camera application; the multispectral sensor acquires a first channel value, where the multispectral sensor includes a first channel and the first channel value is the value acquired by the first channel; the first channel value is sent to the automatic white balance module; the automatic white balance module selects a target algorithm from multiple algorithms according to the first channel value; and the electronic device determines target chromaticity information according to the target algorithm.
  • The one or more processors are further configured to call the computer instructions to cause the electronic device to execute: the multispectral sensor acquires the first correlated color temperature; the first correlated color temperature is sent to the AI automatic white balance module; the AI automatic white balance module determines the difference between the first correlated color temperature and the second correlated color temperature, where the second correlated color temperature is the correlated color temperature value obtained by the AI automatic white balance module according to the image collected by the camera; when the difference is greater than the preset threshold, the confidence level output by the AI automatic white balance module is adjusted to the first confidence level; and the electronic device determines the target chromaticity information according to the first confidence level.
  • the one or more processors are further configured to call the computer instruction to make the electronic device execute: when the startup is a cold startup, the multispectral sensor acquires the first chromaticity information ; The electronic device determines that the target chromaticity information is the first chromaticity information.
  • The one or more processors are further configured to call the computer instructions to cause the electronic device to execute: when the automatic white balance module or the AI automatic white balance module determines that the image captured by the camera is a solid-color image, the electronic device determines that the target chromaticity information is the second chromaticity information, where the second chromaticity information is the chromaticity information acquired by the multispectral sensor.
  • The one or more processors are further configured to call the computer instructions to cause the electronic device to execute: the automatic white balance module calculates the image collected by the camera through the target algorithm to obtain the third correlated color temperature and the first chromaticity distance; the AI automatic white balance module obtains the fourth correlated color temperature and the second chromaticity distance based on the image collected by the camera; the electronic device corrects the first confidence level to obtain the second confidence level; the electronic device fuses the third correlated color temperature with the fourth correlated color temperature according to the second confidence level to obtain the fifth correlated color temperature; the electronic device fuses the first chromaticity distance with the second chromaticity distance according to the second confidence level to obtain the third chromaticity distance; and the electronic device obtains the target chromaticity information based on the fifth correlated color temperature and the third chromaticity distance.
  • The one or more processors are further configured to call the computer instructions to cause the electronic device to execute: adjusting the fifth correlated color temperature to obtain the sixth correlated color temperature; performing tendency adjustment on the third chromaticity distance to obtain the fourth chromaticity distance; and obtaining the target chromaticity information based on the sixth correlated color temperature and the fourth chromaticity distance.
  • The one or more processors are further configured to call the computer instructions to cause the electronic device to execute: the AI automatic white balance module outputs the second correlated color temperature and the initial chromaticity distance; the second correlated color temperature and the initial chromaticity distance are corrected to obtain the corrected second correlated color temperature and the corrected initial chromaticity distance; and time-series filtering is performed based on the corrected second correlated color temperature and the corrected initial chromaticity distance to obtain the fourth correlated color temperature and the second chromaticity distance.
  • The one or more processors are further configured to call the computer instructions to cause the electronic device to execute: updating the first covariance matrix according to the first formula to obtain the updated first covariance matrix, where Sigma' 1 is the updated first covariance matrix, Sigma 1 is the first covariance matrix, λ 1 is the first parameter, and the first covariance matrix is the covariance matrix output by the AI automatic white balance module according to the image collected by the camera or a covariance matrix calculated based on the second confidence level; and updating the second covariance matrix according to the second formula to obtain the updated second covariance matrix, where the second covariance matrix is the covariance matrix of the second image, the second image is the last frame of image collected by the camera, Sigma' 2 is the updated second covariance matrix, Sigma 2 is the second covariance matrix, and λ 2 is the second parameter.
  • The one or more processors are further configured to call the computer instructions to cause the electronic device to execute: determining a first correlated color temperature adjustment value Delta_CCT' according to the brightness value of the image captured by the camera, the fifth correlated color temperature and the third chromaticity distance; and obtaining the sixth correlated color temperature according to the formula CCT 6 = CCT 5 + Delta_CCT', where CCT 5 is the fifth correlated color temperature, CCT 6 is the sixth correlated color temperature, and Delta_CCT' is the first correlated color temperature adjustment value.
  • The one or more processors are further configured to call the computer instructions to cause the electronic device to execute: determining the adjustment of the third chromaticity distance according to the brightness value of the image captured by the camera, the fifth correlated color temperature and the third chromaticity distance.
  • An embodiment of the present application provides an electronic device, including a touch screen, a camera, one or more processors, and one or more memories; the one or more processors are coupled to the touch screen, the camera and the one or more memories, the one or more memories are used to store computer program code, and the computer program code includes computer instructions. When the one or more processors execute the computer instructions, the electronic device executes the method described in the first aspect or any possible implementation manner of the first aspect.
  • An embodiment of the present application provides a chip system, which is applied to an electronic device. The chip system includes one or more processors, and the processors are used to call computer instructions so that the electronic device executes the method described in the first aspect or any possible implementation manner of the first aspect.
  • The embodiment of the present application provides a computer program product containing instructions.
  • When the computer program product is run on an electronic device, the electronic device is made to execute the method described in the first aspect or any possible implementation manner of the first aspect.
  • The embodiment of the present application provides a computer-readable storage medium including instructions; when the instructions are run on the electronic device, the electronic device executes the method described in the first aspect or any possible implementation manner of the first aspect.
  • FIG. 1 is a schematic diagram of a hardware structure of an electronic device 100 provided in an embodiment of the present application
  • Figures 2A-2C are a set of shooting interface diagrams provided by the embodiment of the present application.
  • FIG. 2D is a comparison effect diagram of the image adjusted by the automatic white balance algorithm provided by the embodiment of the present application.
  • FIG. 3 is a flow chart of a method for determining chromaticity information provided by an embodiment of the present application
  • Fig. 4 is a photographing interface diagram provided by the embodiment of the present application.
  • Fig. 5 is a corresponding diagram of the preview image and its RGB adjustment value provided by the embodiment of the present application.
  • FIG. 6A is a CCT conversion table provided by the embodiment of the present application.
  • Fig. 6B is a three-dimensional spatial coordinate system of a CCT Shift Table provided by the embodiment of the present application;
  • FIG. 7A is a D uv conversion table provided by the embodiment of the present application.
  • Fig. 7B is a three-dimensional space coordinate system of a D uv Shift Table provided by the embodiment of the present application.
  • FIG. 8A is a Confidence conversion table provided by the embodiment of the present application.
  • Fig. 8B is a three-dimensional spatial coordinate system of a Confidence Shift Table provided by the embodiment of the present application.
  • Fig. 9 is a schematic diagram of converting (CCT, D uv ) into chromaticity information provided by the embodiment of the present application;
  • Fig. 10 is a kind of CCT fusion table provided by the embodiment of the present application.
  • FIG. 11 is a D uv fusion table provided by the embodiment of the present application.
  • Fig. 12 is a kind of CCT tendency adjustment table provided by the embodiment of the present application.
  • Fig. 13 is a Duv tendency adjustment table provided by the embodiment of the present application.
  • A unit may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a program distributed between two or more computers.
  • these units can execute from various computer readable media having various data structures stored thereon.
  • A unit may, for example, communicate through local and/or remote processes based on a signal having one or more data packets (e.g., data from one unit interacting with another unit in a local system, in a distributed system, and/or across a network such as the Internet by way of the signal interacting with other systems).
  • Planckian locus: an object that neither reflects nor transmits radiation but absorbs all the radiation falling on it is called a black body or a complete radiator.
  • When the black body is continuously heated, the maximum of its relative spectral power distribution shifts toward shorter wavelengths, and the corresponding light color changes in the order of red, yellow, white, and blue; that is, at different temperatures the light color corresponding to the black body changes.
  • The arc-shaped locus that these colors form on the chromaticity diagram is called the black body locus or Planckian locus.
  • Correlated color temperature (CCT) refers to the temperature of the black body radiator whose color is closest to that of a light stimulus of the same brightness. It is expressed in kelvin (K) and is used as a measure to describe the color of light located near the Planckian locus.
  • Light sources other than thermal radiation sources have line spectra, and their radiation characteristics differ considerably from those of black body radiation; therefore, the light color of these light sources may not fall exactly on the black body locus of the chromaticity diagram.
  • CCT is usually used to describe the color characteristics of the light source.
  • Chromaticity distance (D uv) refers to the distance from the chromaticity value (u, v) of the test light source to the nearest point on the Planckian locus; D uv characterizes how far, and in which direction (toward green or toward pink), the chromaticity value (u, v) of the test light source deviates from the Planckian locus.
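  • The following Python sketch illustrates how CCT and D uv can be read off from a chromaticity value and a tabulated Planckian locus by a nearest-point search; the locus table, the sign convention (positive above the locus, toward green) and the snapping to a single locus point are simplifying assumptions for illustration, not part of the application.

```python
import math

def cct_duv_from_uv(u, v, locus):
    """Given a chromaticity (u, v) in the CIE 1960 UCS and a tabulated
    Planckian locus `locus` = [(cct_kelvin, u_bb, v_bb), ...], return
    (CCT, Duv): the CCT of the nearest locus point and the signed distance
    to it (assumed positive above the locus, negative below).
    """
    best = min(locus, key=lambda p: (u - p[1]) ** 2 + (v - p[2]) ** 2)
    cct, ub, vb = best
    dist = math.hypot(u - ub, v - vb)
    sign = 1.0 if v > vb else -1.0
    return cct, sign * dist

# Usage with a coarse SYNTHETIC locus (placeholder values, not real blackbody
# chromaticities); a real implementation would tabulate the locus from
# Planck's law and the CIE colour-matching functions and interpolate.
locus = [(2500 + 100 * i, 0.30 - 0.002 * i, 0.36 - 0.001 * i) for i in range(40)]
print(cct_duv_from_uv(0.26, 0.35, locus))
```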
  • The brightness value LV is calculated from the following quantities: Exposure is the exposure time; Aperture is the aperture size; Iso is the sensitivity; Luma is the average value of Y in the XYZ color space of the image.
  • XYZ space: the RGB values in the embodiments of this application are DeviceRGB values. The DeviceRGB color space is a device-dependent color space, that is, different devices interpret the same RGB value differently, so DeviceRGB is not suitable for calculating parameters such as brightness values. Calculating LV requires converting the DeviceRGB color space to the device-independent XYZ space, that is, converting RGB values to XYZ values.
  • A common method of converting the RGB color space to the XYZ space is to calibrate a 3*3 color correction matrix (Color Correction Matrix, CCM) under different light source environments (typical light sources include A, H, U30, TL84, D50, D65, D75, etc.), store the CCMs of the different light sources in the memory of the electronic device, and convert RGB values to XYZ values through the corresponding CCM.
  • In practice, the corresponding light source is often matched according to the white balance reference point in the image, and the CCM corresponding to that light source is selected. If the RGB value of the white balance reference point lies between two light sources (for example, the RGB value of the image falls between D50 and D65), the CCM can be obtained by bilinear interpolation between D50 and D65.
  • Suppose the color correction matrix of D50 is CCM 1 and its correlated color temperature is CCT 1 , the color correction matrix of D65 is CCM 2 and its correlated color temperature is CCT 2 , and the correlated color temperature of the image light source is CCT a . The electronic device can then calculate the CCM of the image by interpolating between CCM 1 and CCM 2 according to the corresponding formula.
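  • The interpolation formula is not reproduced in the text above, so the sketch below assumes a plain linear blend of the two calibrated matrices weighted by where CCT a sits between CCT 1 and CCT 2 (production pipelines often interpolate in reciprocal CCT instead); the matrices used in the example are hypothetical.

```python
import numpy as np

def interpolate_ccm(cct_a, cct_1, ccm_1, cct_2, ccm_2):
    """Blend the calibrated matrices CCM_1 (at CCT_1, e.g. D50) and CCM_2
    (at CCT_2, e.g. D65) for an image light source at CCT_a.

    ASSUMPTION: plain linear interpolation in CCT, for illustration only.
    """
    w = (cct_a - cct_1) / (cct_2 - cct_1)   # 0 at CCT_1, 1 at CCT_2
    w = min(max(w, 0.0), 1.0)
    return (1.0 - w) * np.asarray(ccm_1) + w * np.asarray(ccm_2)

# Hypothetical 3x3 matrices for illustration only.
ccm_d50 = np.eye(3) * 1.05
ccm_d65 = np.eye(3) * 0.95
ccm = interpolate_ccm(cct_a=5800.0, cct_1=5000.0, ccm_1=ccm_d50,
                      cct_2=6500.0, ccm_2=ccm_d65)
xyz = ccm @ np.array([0.4, 0.5, 0.3])       # RGB -> XYZ for one pixel
print(xyz)
```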
  • FIG. 1 is a schematic diagram of a hardware structure of an electronic device 100 provided in an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (Universal Serial Bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, and an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone jack 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194, and A subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, memory, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU) wait. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active matrix organic light emitting diode or an active matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), flexible light-emitting diode (flex light-emitting diode, FLED), Miniled, MicroLed, Micro-oLed, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during the use of the electronic device 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • The angular velocity of the electronic device 100 around three axes (i.e., the x, y and z axes) can be determined through the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • The touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • Because the CCD circuit or CMOS circuit in electronic devices such as digital cameras or mobile phones cannot correct the color change of the light source, white balance processing must be performed on the captured image to prevent problems such as color casts caused by the color temperature of the light source.
  • At present, the most widely used white balance algorithms include the traditional automatic white balance (Automatic White Balance, AWB) algorithm and the artificial intelligence automatic white balance (AI AWB) algorithm.
  • The AWB algorithm includes algorithms such as the gray world algorithm, and the AI AWB algorithm usually calculates a color description value of the white point of the image light source through a trained AI AWB model (for example, the RGB value of the white point of the image light source, its R/G and B/G values, or its chromaticity value).
  • the AI AWB model can be a neural network model, a fast Fourier color constancy (Fast Fourier Color Constancy, FFCC) model, or other models with the function of calculating the color description value of the white point of the image light source.
  • the color of the white point of the image light source is the color of the image light source.
  • After the RGB value or the R/G and B/G values of the white point are calculated, the RGB values of the image pixels can be adjusted based on the RGB value or the R/G and B/G values of the white point, so as to realize the white balance adjustment of the image.
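  • As a minimal sketch of this adjustment step, assuming the common gain scheme in which the red and blue channels are scaled so that the estimated white point becomes neutral relative to its green channel (the application does not prescribe a specific gain formula):

```python
import numpy as np

def apply_white_balance(image_rgb, white_point_rgb):
    """Adjust image pixels using the estimated light-source white point.

    ASSUMPTION: per-channel gains G/R and G/B, so the white point maps to a
    neutral grey; this is a common convention, not necessarily the patented one.
    """
    r_w, g_w, b_w = white_point_rgb
    gains = np.array([g_w / r_w, 1.0, g_w / b_w])   # (R gain, G gain, B gain)
    return np.clip(image_rgb * gains, 0.0, 1.0)

# Example: a slightly warm cast (white point R/G > 1, B/G < 1).
img = np.ones((2, 2, 3)) * np.array([0.8, 0.7, 0.5])
balanced = apply_white_balance(img, white_point_rgb=(1.1, 1.0, 0.8))
print(balanced[0, 0])
```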
  • Using the AWB algorithm to white balance the image has the advantages of a small amount of calculation and a wide range of application scenarios.
  • However, the accuracy of the RGB value of the white point of the image light source calculated by the AWB algorithm is not high, so white balancing the image based on that RGB value has little effect.
  • For the AI AWB algorithm, because the AI AWB model it uses performs complex calculations on the input image, the accuracy of the RGB value of the white point of the image light source calculated by the AI AWB model is very high in shooting scenes similar to the training scenes. However, because the training samples used to train the AI AWB model are limited, the model cannot be applied to all shooting environments. When the user takes pictures in a shooting scene to which the model generalizes poorly, the accuracy of the RGB value of the white point of the image light source calculated by the AI AWB model is extremely low, and when the ISP adjusts the white balance of the image based on that RGB value, the adjustment effect is not obvious.
  • Therefore, the embodiment of the present application provides a method for determining chromaticity information. The principle of the method is as follows: the multispectral sensor sends the first chromaticity information it calculates for the white point of the light source of the image to be processed to the memory. Because the multispectral sensor starts up quickly, during a cold start of the camera the electronic device can use the first chromaticity information to adjust the white balance of the first frame or the first few frames of images, so that no color cast occurs in those frames.
  • In addition, when the electronic device calculates the chromaticity information of the image from the raw image data through the AWB algorithm and the AI AWB algorithm, if the number of white points of the image light source counted in the statistics is less than a certain threshold, the electronic device judges that the current shooting scene is a solid-color scene. In a solid-color scene, the chromaticity information of the white point of the image light source calculated through the AWB algorithm and the AI AWB algorithm is inaccurate, so the electronic device may also call the multispectral sensor and adjust the white balance of the image based on the first chromaticity information it calculated.
  • The electronic device processes the raw data of the image to be processed output by the camera, the spectral data output by the multispectral sensor and the correlated color temperature (CCT) through the AWB algorithm to obtain the RGB value of the light source white point of the image. Based on the RGB value, the ratios R 11 /G 11 and B 11 /G 11 of the image light source white point can be obtained.
  • The electronic device calculates and processes the raw data of the image to be processed output by the camera, the spectral data output by the multispectral sensor and the CCT through the artificial intelligence automatic white balance algorithm (AI AWB algorithm), and obtains the RGB value of the white point of the image light source. Based on the RGB value, the ratio R 21 /G 21 of the R (red) color channel to the G (green) color channel of the image light source white point and the ratio B 21 /G 21 of the B (blue) color channel to the G (green) color channel can be obtained.
  • The electronic device calculates the correlated color temperature CCT 11 and chromaticity distance D uv11 of the white point of the image light source based on R 11 /G 11 and B 11 /G 11 , and calculates the correlated color temperature CCT 21 and chromaticity distance D uv21 of the white point of the image light source based on R 21 /G 21 and B 21 /G 21 . Then, the electronic device fuses CCT 11 and CCT 21 to obtain a fused CCT, and fuses D uv11 and D uv21 to obtain a fused D uv .
  • The electronic device obtains the fourth chromaticity information based on the fused CCT and the fused D uv , and sends the fourth chromaticity information to the memory, so that the ISP can adjust the white balance of the image based on the fourth chromaticity information.
  • the electronic device 100 may display an interface 210 of a home screen, and the interface 210 includes a plurality of application icons. For example, a weather application icon, a video application icon, a memo icon, a setting application icon, and a camera application icon 211 .
  • the electronic device 100 detects an input operation (for example, click) on the camera application icon 211 , in response to the input operation, the electronic device 100 may display a shooting interface 220 as shown in FIG. 2B .
  • the shooting interface 220 includes a shooting preview interface 221 , a shooting control 222 and an echo control 223 .
  • the shooting preview interface 221 is used to display the current shooting environment.
  • the electronic device 100 detects an input operation (for example, click) on the echo control 223 , the electronic device 100 will display historical photos taken.
  • the electronic device 100 detects an input operation (for example, click) on the camera control 222 , in response to the operation, the electronic device displays a photo preview interface 230 as shown in FIG. 2C . In this way, the user can preview the captured photos in the photo preview interface 230 .
  • FIG. 3 is a flow chart of a method for determining chromaticity information provided by an embodiment of the present application. The specific process is as follows:
  • Step S301 start the camera application.
  • Step S302 the electronic device sends the first chromaticity information of the white point of the image light source calculated by the multispectral sensor to the memory.
  • the multi-spectral sensor is used to acquire the spectral data of the image light source, and calculate the color description values such as the correlated color temperature CCT and RGB value of the white point of the image light source based on these spectral data, so that the ISP can perform image processing based on these color description values ( For example, to white balance an image).
  • the embodiment of the present application uses a multi-spectral sensor including 8 visible light channels (F1 channel to F8 channel), an all-pass channel (Clear channel) and a near-infrared channel (NIR Value channel) as an example.
  • Each channel of the multispectral sensor corresponds to a different wavelength range. When the device is irradiated by light, each channel generates a channel response value per unit time, and these response values are the spectral data acquired by the multispectral sensor.
  • the response value of the all-pass channel is used to represent the brightness of the image
  • the response value of the near-infrared channel is used to represent the intensity of infrared light.
  • the multi-spectral sensor can calculate the first chromaticity information based on the spectral data acquired by it.
  • The first chromaticity information may be the R 1 G 1 B 1 value of the white point of the image light source, the R 1 /G 1 and B 1 /G 1 of the white point of the image light source, or the chromaticity value (u 1 , v 1 ) of the white point of the image light source, which is not limited in this embodiment of the present application.
  • The R 1 G 1 B 1 value is used to represent the channel response values of the white point of the image light source on the red, green and blue color channels, and R 1 G 1 B 1 can be converted to and from the chromaticity value (u 1 , v 1 ).
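  • To illustrate that conversion, the sketch below maps a DeviceRGB triple to XYZ with a calibrated CCM (the identity matrix is only a stand-in, since the real matrix is device-specific as noted in the XYZ space definition above) and then applies the standard CIE 1960 UCS relations to obtain (u, v).

```python
import numpy as np

def rgb_to_uv(rgb, ccm):
    """Convert a device RGB triple to a CIE 1960 (u, v) chromaticity value.

    The RGB -> XYZ step needs the device's calibrated CCM; the identity
    matrix used below is a placeholder. XYZ -> (u, v) uses the standard
    CIE 1960 UCS relations u = 4X/(X+15Y+3Z), v = 6Y/(X+15Y+3Z).
    """
    x, y, z = ccm @ np.asarray(rgb, dtype=float)
    denom = x + 15.0 * y + 3.0 * z
    return 4.0 * x / denom, 6.0 * y / denom

ccm_stub = np.eye(3)                     # replace with the calibrated CCM
print(rgb_to_uv([0.9, 1.0, 0.8], ccm_stub))
```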
  • When the camera is cold-started, the multispectral sensor calculates the first chromaticity information of the white point of the image light source, the spectral data, and the correlated color temperature (CCT) of the image light source. During the period from camera startup to taking a picture, the multispectral sensor calculates the first chromaticity information of the preview image at intervals of one or more frames and transfers the first chromaticity information to the memory.
  • the preview image is an image displayed on a preview interface of the electronic device after the camera application is started.
  • When the camera application is started, the electronic device will display a shooting interface 410 as shown in FIG. 4.
  • the shooting interface 410 includes a preview interface 411 for displaying real-time images of the current shooting environment.
  • the real-time images displayed on the preview interface 411 are displayed to the user in units of image frames, and the image frames are preview images.
  • The multispectral sensor calculates the spectral data of each frame of image and the correlated color temperature (CCT) of the light source, and the electronic device calculates and processes the spectral data, raw data and correlated color temperature of the image through the AWB algorithm and the AI AWB algorithm respectively to obtain the chromaticity information of each frame of image.
  • the speed at which the camera produces images is faster than the speed at which electronic devices can calculate image chromaticity information through the AWB algorithm and AI AWB algorithm, and the startup speed of the multispectral sensor is fast.
  • The electronic device uses the first chromaticity information calculated by the multispectral sensor to adjust the white balance of the first frame or the first few frames of the preview image, so that no color cast occurs in the first frame or the first few frames of the preview image. For example, as shown in Figure 5, after the camera starts up, it outputs images 1 to 8 in sequence.
  • When the camera starts up, the electronic device starts to use the AWB algorithm and the AI AWB algorithm to calculate the RGB value (chromaticity information) of the white point of the light source of the preview image. Because the camera outputs images faster than the electronic device can calculate these RGB values through the AWB algorithm and the AI AWB algorithm, when the camera outputs image 3 the electronic device has just obtained the RGB 1 value of the white point of the light source in image 1, and when the camera outputs image 4 the electronic device has calculated the RGB 2 value of the white point of the light source in image 2.
  • Since the timing difference between image 1 and image 3, and between image 2 and image 4, is extremely small, after the camera outputs image 3 the electronic device can use the RGB 1 value of image 1 to adjust the white balance of image 3, and after the camera outputs image 4 it can use the RGB 2 value to adjust the white balance of image 4.
  • When the camera outputs image 1 (the first frame of image) and image 2, however, the electronic device has not yet calculated any RGB value of the white point of the light source. Therefore, after the camera outputs image 1 and image 2, the electronic device uses the first chromaticity information (RGB value) sent to the memory by the multispectral sensor to adjust the white balance of image 1 and image 2, thereby solving the color cast problem of image 1 and image 2.
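  • A minimal sketch of this frame-lag fallback, assuming a fixed two-frame lag as in the example above (the actual lag depends on how quickly the AWB/AI AWB pipeline runs):

```python
def chromaticity_for_frame(n, awb_results, multispectral_rgb, lag=2):
    """Pick the chromaticity used to white-balance preview frame n (1-based):
    frame n reuses the AWB/AI AWB result of frame n - lag, and the first
    `lag` frames fall back to the multispectral sensor's first chromaticity
    information. Sketch only; lag and data layout are assumptions.
    """
    if n <= lag or (n - lag) not in awb_results:
        return multispectral_rgb                   # cold-start fallback
    return awb_results[n - lag]

awb_results = {1: (1.02, 1.0, 0.97), 2: (1.01, 1.0, 0.98)}   # hypothetical outputs
print(chromaticity_for_frame(1, awb_results, (1.05, 1.0, 0.95)))  # multispectral
print(chromaticity_for_frame(3, awb_results, (1.05, 1.0, 0.95)))  # result of frame 1
```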
  • When the number of white points of the image light source counted in the statistics is less than a first threshold, the electronic device judges that the current shooting scene is a pure color scene.
  • In a pure color scene, the accuracy of the chromaticity information of the light source white point is not high, whether it is calculated by the AWB algorithm or by the AI AWB algorithm. Therefore, in a pure color scene, the electronic device can adjust the white balance of the preview image based on the first chromaticity information calculated by the multispectral sensor.
  • The first threshold may be obtained based on empirical values, historical values, or experimental data, which is not limited in this embodiment of the present application.
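  • The selection logic can be sketched as follows; the threshold value and the function name are illustrative, since the first threshold is tuning data as noted above.

```python
def choose_chromaticity(white_point_count, first_threshold,
                        fused_chromaticity, multispectral_chromaticity):
    """If too few light-source white points were counted, treat the scene as a
    pure color scene and fall back to the multispectral sensor's first
    chromaticity information; otherwise use the fused AWB / AI AWB result.
    """
    if white_point_count < first_threshold:
        return multispectral_chromaticity          # pure color scene
    return fused_chromaticity

# Hypothetical numbers: 12 white points against a threshold of 50.
print(choose_chromaticity(12, 50, (1.0, 1.0, 1.0), (1.04, 1.0, 0.96)))
```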
  • Step S303 The electronic device calculates and processes the raw data output by the camera, the spectral data output by the multispectral sensor, and the CCT through the AWB algorithm to obtain the second chromaticity information of the white point of the image light source.
  • The raw data is the data obtained when the image sensor in the camera converts the captured light signal into a digital signal.
  • the image sensor includes a plurality of image sensitive units, each image sensitive unit is sampled and quantized to obtain an RGB value, and the RGB value obtained by downsampling the image sensitive unit by the electronic device is the original data.
  • For example, the raw data obtained by the electronic device is an array of size 64x48x3, where 64x48 is the number of image-sensitive units and 3 corresponds to the RGB value calculated for each image-sensitive unit.
  • the electronic device takes the original data as the input of the AWB module, and performs calculation and processing through the AWB algorithm to obtain the chromaticity information of the white point of the image light source.
  • This chromaticity information is the second chromaticity information of the white point of the image light source.
  • The second chromaticity information may be the R 2 G 2 B 2 value of the white point of the image light source, the R 2 /G 2 and B 2 /G 2 of the white point of the image light source, or the chromaticity value (u 2 , v 2 ) of the white point of the image light source, which is not limited in this embodiment of the present application.
  • the spectral data includes the channel response values of the 8 visible light channels output by the multispectral sensor, the response values of the all-pass channel and the response values of the near-infrared channel.
  • the electronic device can judge multiple shooting scenes according to the NIR value output by the NIR channel in the spectral data, and the correspondence table between the NIR value and the shooting scene and its related AWB algorithm is shown in Table 1:
  • For example, based on Table 1, the electronic device may judge that the type of the current shooting scene is blue sky and white clouds, and select Algorithm 1 to calculate the second chromaticity information.
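  • Table 1 itself is not reproduced here, so the dispatch below uses hypothetical NIR thresholds and scene labels purely to illustrate how the NIR channel value could select among the AWB module's algorithms; only the "blue sky and white clouds maps to Algorithm 1" pairing comes from the text.

```python
def select_awb_algorithm(nir_value):
    """Table-1 style dispatch from the NIR channel response to an AWB algorithm.

    The thresholds and the second/third scene labels are hypothetical
    placeholders; Table 1 is calibration data not given in the text.
    """
    if nir_value < 100:          # little infrared: e.g. blue sky and white clouds
        return "algorithm_1"
    elif nir_value < 400:        # moderate infrared: hypothetical indoor scene
        return "algorithm_2"
    else:                        # strong infrared: hypothetical incandescent scene
        return "algorithm_3"

print(select_awb_algorithm(80))   # -> "algorithm_1"
```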
  • Step S304 The electronic device converts the second chromaticity information into a first correlated color temperature and a first chromaticity distance of the white point of the image light source, where the first correlated color temperature is CCT 1 and the first chromaticity distance is D uv1 .
  • The method for the electronic device to calculate D uv1 may be as follows.
  • The electronic device can calculate the chromaticity value (u 2 , v 2 ) through formula (1).
  • the method for electronic equipment to calculate CCT 1 may be:
  • the electronic device can refer to formula (1) to calculate the chromaticity value (u 2 , v 2 ). Then, the electronic device obtains a point M on the Planckian locus with the shortest distance to (u 2 , v 2 ) on the chromaticity diagram, and the CCT corresponding to the point M is CCT 1 .
  • Step S305 The electronic device calculates and processes the raw data output by the camera, the spectral data and correlated color temperature output by the multispectral sensor through the AI AWB algorithm, and obtains the third chromaticity information and the first confidence level of the white point of the image light source.
  • the trained AI AWB model is included in the AI AWB algorithm.
  • The electronic device takes the raw data, the spectral data and the correlated color temperature (CCT) calculated by the multispectral sensor as input, and performs calculation through the AI AWB model to obtain the third chromaticity information of the white point of the image light source.
  • The third chromaticity information may be the R 3G 3B 3 value of the white point of the image light source, or R 3/G 3 and B 3/G 3 of the white point of the image light source, or the chromaticity value (u 3, v 3) of the white point of the image light source, which is not limited in this embodiment of the present application.
  • the first confidence level is used to characterize the reliability of the AI AWB model.
  • Step S306 The electronic device converts the third chromaticity information into a second correlated color temperature and a second chromaticity distance.
  • the second correlated color temperature is CCT 2
  • the second chromaticity distance is D uv2 .
  • For the method by which the electronic device converts the third chromaticity information into CCT 2 and D uv2, refer to the relevant description of converting the second chromaticity information into CCT 1 and D uv1 in step S304, which will not be repeated in this embodiment of the present application.
  • Step S307 The electronic device adjusts the first confidence level based on the CCT output by the multispectral sensor to obtain a second confidence level.
  • The electronic device calculates a deviation value Fn based on the CCT output by the multispectral sensor and CCT 2; Fn characterizes the degree of deviation between the CCT output by the multispectral sensor and CCT 2.
  • The larger Fn is, the larger the deviation; when Fn exceeds a preset range, the electronic device judges that the first confidence level output by the AI AWB model is inaccurate and adjusts the first confidence level based on Fn to obtain the second confidence level.
  • For example, if the preset range of Fn is 500K, the first confidence level Conf_1 is 70%, the CCT output by the multispectral sensor is 3500K and CCT 2 is 4500K, then the deviation value Fn is 1000K.
  • Since Fn exceeds the preset range, the electronic device judges that the first confidence level output by the AI AWB model is inaccurate and lowers the first confidence level from 70% to 60%; therefore, the second confidence level Conf_2 is 60%. A minimal sketch of this adjustment is given below.
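A minimal sketch of this confidence adjustment is given below, in Python. The 10-point reduction mirrors the example above; the exact adjustment rule used by the electronic device is not specified in this step, so treat the rule and the parameter names as assumptions.

```python
def adjust_confidence(conf_1: float, cct_sensor: float, cct_2: float,
                      preset_range: float = 500.0, penalty: float = 0.10) -> float:
    """Lower the AI AWB confidence when its CCT deviates too far from the sensor CCT."""
    fn = abs(cct_sensor - cct_2)          # deviation value Fn
    if fn > preset_range:                 # first confidence judged inaccurate
        return round(max(0.0, conf_1 - penalty), 4)
    return conf_1                         # within range: keep the first confidence

conf_2 = adjust_confidence(0.70, cct_sensor=3500.0, cct_2=4500.0)
print(conf_2)  # -> 0.6
```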
  • Step S308 The electronic device corrects the second correlated color temperature, the second chromaticity distance, and the second confidence level to obtain a third correlated color temperature, a third chromaticity distance, and a third confidence level.
  • the AI AWB model in the AI AWB algorithm is a pre-trained model.
  • Because the training samples are limited, the AI AWB model cannot be applied to all shooting environments; when the user takes pictures in a scene to which the model generalizes poorly, the accuracy of CCT 2, D uv2 and Conf_2 calculated by the AI AWB algorithm is extremely low.
  • Therefore, the electronic device needs to correct CCT 2, D uv2 and Conf_2: if any of CCT 2, D uv2 and Conf_2 deviates too much, the electronic device corrects it so that CCT 2, D uv2 and Conf_2 fall within a reasonable range of values.
  • The correction process of the electronic device for CCT 2 is as follows: a CCT shift table (CCT Shift Table) is stored in the electronic device.
  • The CCT Shift Table is a three-dimensional coordinate table with three coordinate axes: the CCT axis, the D uv axis and the LV axis.
  • In the three-dimensional space of the CCT Shift Table there are many cells, and each cell corresponds to a CCT correction value.
  • The electronic device finds the corresponding point in the CCT Shift Table three-dimensional coordinate system based on the LV, CCT 2 and D uv2 of the white point of the image light source, and determines the cells adjacent to this point.
  • The weight of each adjacent cell is calculated by trilinear interpolation, each weight is multiplied by the corresponding CCT correction value, and the products of all adjacent cells are summed to obtain CCT 3, the third correlated color temperature.
  • For example, the electronic device finds a point M in the CCT Shift Table three-dimensional coordinate system based on the LV, CCT 2 and D uv2 of the white point of the image light source, where point M is the midpoint of the line segment XY, X is the intersection of cells 1 to 4, and Y is the intersection of cells 5 to 8.
  • The electronic device determines cells 1 to 8 as the adjacent cells of point M, performs trilinear interpolation on cells 1 to 8 to obtain the weights f 1 to f 8, and then calculates CCT 3 according to formula (3):
  • CCT 3 = CCT 11*f 1 + … + CCT 13*f 3 + CCT 14*f 4 + CCT 21*f 5 + … + CCT 24*f 8   (3)
  • where CCT 11 to CCT 14 and CCT 21 to CCT 24 are the CCT correction values of cells 1 to 8, f 1 to f 8 are the weights of cells 1 to 8, and f 1 + … + f 8 = 1. A minimal trilinear-interpolation sketch is given below.
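The same eight-cell trilinear lookup is reused for the D uv Shift Table, the Confidence Shift Table and the merging and propensity tables in the later steps, so a generic sketch is given once here, in Python with numpy. The grid axes, the cell values and the indexing of cells onto grid points are assumptions for illustration; the patent only specifies that the eight neighbouring cells receive trilinear weights summing to 1 and that the weighted cell values are summed.

```python
import numpy as np

def trilinear_lookup(table, cct_axis, duv_axis, lv_axis, cct, duv, lv):
    """Interpolate a value from a 3-D (CCT, Duv, LV) table: the eight surrounding
    cells get trilinear weights f_1..f_8 that sum to 1, as in formula (3)."""
    def locate(axis, x):
        # index of the lower grid point and the fractional position inside the cell
        i = int(np.clip(np.searchsorted(axis, x) - 1, 0, len(axis) - 2))
        t = (x - axis[i]) / (axis[i + 1] - axis[i])
        return i, float(np.clip(t, 0.0, 1.0))

    i, tx = locate(cct_axis, cct)
    j, ty = locate(duv_axis, duv)
    k, tz = locate(lv_axis, lv)

    value = 0.0
    for di, wi in ((0, 1 - tx), (1, tx)):
        for dj, wj in ((0, 1 - ty), (1, ty)):
            for dk, wk in ((0, 1 - tz), (1, tz)):
                value += wi * wj * wk * table[i + di, j + dj, k + dk]  # f_n * cell value
    return value

# Tiny illustrative CCT Shift Table: each cell stores a corrected CCT; cells at
# high LV push the result 150 K higher (all values are made up).
cct_axis = np.array([2000.0, 4000.0, 6000.0, 8000.0])
duv_axis = np.array([-0.02, 0.0, 0.02])
lv_axis = np.array([0.0, 5.0, 10.0, 15.0])
cct_shift_table = cct_axis[:, None, None] + np.zeros((4, 3, 4))
cct_shift_table[..., 2:] += 150.0

cct_3 = trilinear_lookup(cct_shift_table, cct_axis, duv_axis, lv_axis,
                         cct=4500.0, duv=0.004, lv=12.0)   # CCT2, Duv2 and LV of the frame
print(round(cct_3, 1))   # -> 4650.0
```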
  • The process of correcting D uv2 by the electronic device is as follows: a D uv shift table (D uv Shift Table) is stored in the electronic device.
  • The D uv Shift Table is a three-dimensional coordinate table with three coordinate axes: the D uv axis, the CCT axis and the LV axis.
  • In the three-dimensional space of the D uv Shift Table there are many cells, and each cell corresponds to a D uv correction value.
  • The electronic device finds the corresponding point in the three-dimensional coordinate system of the D uv Shift Table based on the LV, D uv2 and CCT 2 of the white point of the image light source, determines the cells adjacent to this point, calculates the weight of each adjacent cell through trilinear interpolation, multiplies each weight by the corresponding D uv correction value, and sums the products of the relevant cells to obtain D uv3, the third chromaticity distance.
  • For example, the electronic device finds a point M′ in the D uv Shift Table three-dimensional coordinate system based on the LV, CCT 2 and D uv2 of the white point of the image light source, where point M′ is the midpoint of the line segment X′Y′, X′ is the intersection of cells 1 to 4, and Y′ is the intersection of cells 5 to 8.
  • The electronic device determines cells 1 to 8 as the adjacent cells of point M′, performs trilinear interpolation on cells 1 to 8 to obtain the weights f′ 1 to f′ 8, and then calculates the third chromaticity distance according to formula (4):
  • D uv3 = D uv11*f′ 1 + … + D uv13*f′ 3 + D uv14*f′ 4 + D uv21*f′ 5 + … + D uv24*f′ 8   (4)
  • where D uv11 to D uv14 and D uv21 to D uv24 are the D uv correction values of cells 1 to 8, f′ 1 to f′ 8 are the weights of cells 1 to 8, f′ 1 + … + f′ 8 = 1, and D uv3 is the third chromaticity distance.
  • the correction process of the electronic device to Conf_2 is as follows: a confidence shift table (Confidence Shift Table) is stored in the electronic device.
  • the Confidence Shift Table is a three-dimensional coordinate table with three coordinate axes: CCT axis, D uv axis and LV axis.
  • In the three-dimensional space of the Confidence Shift Table there are many cells, and each cell corresponds to a confidence adjustment value (Mult_Conf). The electronic device finds the corresponding point in the three-dimensional coordinate system of the Confidence Shift Table based on the LV, D uv2 and CCT 2 of the image, determines the cells adjacent to this point, calculates the weight of each adjacent cell through trilinear interpolation, multiplies each weight by the corresponding Mult_Conf, and sums the products to obtain Mult_Conf_new.
  • For example, the electronic device finds a point M″ in the Confidence Shift Table three-dimensional coordinate system based on the LV, CCT 2 and D uv2 of the white point of the image light source, where point M″ is the midpoint of the line segment X″Y″, X″ is the intersection of cells 1 to 4, and Y″ is the intersection of cells 5 to 8.
  • The electronic device determines cells 1 to 8 as the adjacent cells of point M″, calculates the weights f″ 1 to f″ 8 of cells 1 to 8 by trilinear interpolation, and then calculates Mult_Conf_new according to formula (5):
  • Mult_Conf_new = Mult_Conf 11*f″ 1 + … + Mult_Conf 14*f″ 4 + Mult_Conf 21*f″ 5 + … + Mult_Conf 24*f″ 8   (5)
  • Finally, the electronic device obtains the third confidence level according to Conf_3 = Conf_2 + Mult_Conf_new; if Conf_3 is greater than 1 it is set to 1, and if Conf_3 is less than 0 it is set to 0.
  • Step S309 the electronic device calculates a first filter vector based on the third correlated color temperature and the third chromaticity distance.
  • the electronic device corrects CCT 2 , D uv2 and Conf_2 to obtain CCT 3 and D uv3 .
  • the electronic device will convert CCT 3 and D uv3 into a first filter vector Mu 1 for timing filtering.
  • Mu 1 is [log(R 4/G 4), log(B 4/G 4)], and Mu 1 encodes the chromaticity information (the R 4 G 4 B 4 value) of the white point of the image light source.
  • the electronic device is based on CCT 3 and D uv3 , and the process of obtaining Mu 1 is shown in Figure 9.
  • The electronic device determines point D on the Planckian locus of the chromaticity diagram based on CCT 3, where the chromaticity value of point D is (u′ 4, v′ 4); it then calculates the coordinates (u″ 4, v″ 4) of a point E on the Planckian locus whose correlated color temperature is CCT 3 + ΔT, where ΔT is a small increment of CCT 3 (for example, ΔT = 0.001K).
  • The electronic device calculates the inclination angle θ of the locus at point D: according to formula (6), du = u′ 4 - u″ 4; according to formula (7), dv = v′ 4 - v″ 4; formulas (8) and (9) then give sin θ and cos θ from du and dv.
  • According to formulas (10) and (11), the electronic device offsets point D perpendicular to the locus by the chromaticity distance, u 4 = u′ 4 - D uv3*sin θ and v 4 = v′ 4 + D uv3*cos θ, and then calculates the R 4 G 4 B 4 value from (u 4, v 4) through the conversion matrix of formula (12).
  • The electronic device calculates the first filter vector Mu 1 based on the R 4 G 4 B 4 value; a minimal conversion sketch is given below.
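A rough Python illustration of this (CCT 3, D uv3) → Mu 1 conversion is given below. Several pieces are stand-ins chosen for the sketch, not taken from the patent: the Krystek approximation again represents the Planckian locus, sin θ and cos θ are derived from du and dv by normalization since formulas (8) and (9) are not reproduced in this text, and the standard XYZ-to-sRGB matrix replaces the device-specific conversion matrix of formula (12).

```python
import numpy as np

def planckian_uv(t: float) -> np.ndarray:
    """Krystek (1985) approximation of the Planckian locus in CIE 1960 (u, v)."""
    u = (0.860117757 + 1.54118254e-4 * t + 1.28641212e-7 * t * t) / \
        (1.0 + 8.42420235e-4 * t + 7.08145163e-7 * t * t)
    v = (0.317398726 + 4.22806245e-5 * t + 4.20481691e-8 * t * t) / \
        (1.0 - 2.89741816e-5 * t + 1.61456053e-7 * t * t)
    return np.array([u, v])

def cct_duv_to_mu(cct: float, duv: float, dt: float = 0.001) -> np.ndarray:
    """Convert (CCT, Duv) into the filter vector Mu = [log(R/G), log(B/G)]."""
    u0, v0 = planckian_uv(cct)            # point D on the locus
    u1, v1 = planckian_uv(cct + dt)       # point E at CCT + ΔT (ΔT = 0.001 K here)
    du, dv = u0 - u1, v0 - v1             # formulas (6) and (7)
    norm = np.hypot(du, dv)
    sin_t, cos_t = dv / norm, du / norm   # assumed forms of formulas (8) and (9)
    u = u0 - duv * sin_t                  # formula (10)
    v = v0 + duv * cos_t                  # formula (11)

    # Stand-in for formula (12): CIE 1960 (u, v) -> CIE xy -> XYZ (Y = 1) -> linear sRGB.
    x = 3.0 * u / (2.0 * u - 8.0 * v + 4.0)
    y = 2.0 * v / (2.0 * u - 8.0 * v + 4.0)
    xyz = np.array([x / y, 1.0, (1.0 - x - y) / y])
    m = np.array([[ 3.2406, -1.5372, -0.4986],
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    r, g, b = m @ xyz
    return np.array([np.log(r / g), np.log(b / g)])

mu_1 = cct_duv_to_mu(5000.0, 0.002)       # illustrative CCT3 and Duv3
print(mu_1)
```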
  • Step S310 The electronic device performs time series filtering on the first filter vector to obtain a second filter vector.
  • Time-series filtering fuses multiple frames of signals in time series into a stationary multi-frame signal.
  • The purpose of the electronic device performing time-series filtering is to fuse the chromaticity information of the current frame image with the chromaticity information of the previous frame image; the fused chromaticity information improves the accuracy of the third chromaticity information of the current frame image.
  • This embodiment takes the time-series filtering of Mu 1 by the electronic device through a Kalman filter as an example.
  • The time-series filtering process is as follows: the electronic device first obtains the first covariance matrix Sigma 1, where Sigma 1 represents the reliability of the chromaticity information of the white point of the light source of the current frame image output by the AI AWB model and can be converted to and from a confidence level; Sigma 1 may be output by the AI AWB model or calculated by the electronic device based on Conf_3.
  • The electronic device then updates Sigma 1 according to formula (13), which uses the first parameter λ 1, to obtain Sigma′ 1, the updated Sigma 1.
  • Then, the electronic device updates the second covariance matrix Sigma 2 according to formula (14), where Sigma 2 is the covariance matrix obtained when the AI AWB model calculated the chromaticity information of the white point of the light source in the previous frame image; Sigma′ 2 is the updated Sigma 2 and λ 2 is the second parameter.
  • ⁇ 1 and ⁇ 2 can be obtained from empirical values, historical data, or experimental test data, which is not limited in this embodiment of the present application.
  • The electronic device performs time-series filtering on Mu 1 according to formula (15) to obtain the second filter vector Mu: Mu = Mu 1*(Sigma′ 1)^(-1) + Mu 2*(Sigma′ 2)^(-1)   (15)
  • where Mu 2 is the filter vector of the previous frame image and Mu is the second filter vector. A minimal filtering sketch is given below.
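A minimal Python sketch of the fusion in formula (15) follows. The exact forms of formulas (13) and (14) are not reproduced in this text, so the covariance updates are assumed here to be a simple scaling by λ 1 and λ 2, and all numeric values are illustrative.

```python
import numpy as np

def temporal_filter(mu_1, sigma_1, mu_2, sigma_2, lam_1=1.0, lam_2=1.0):
    """Fuse the current-frame filter vector Mu1 with the previous-frame vector Mu2
    as in formula (15): Mu = Mu1 * inv(Sigma1') + Mu2 * inv(Sigma2')."""
    sigma_1p = lam_1 * sigma_1            # assumed form of formula (13)
    sigma_2p = lam_2 * sigma_2            # assumed form of formula (14)
    return mu_1 @ np.linalg.inv(sigma_1p) + mu_2 @ np.linalg.inv(sigma_2p)

mu_1 = np.array([0.05, -0.12])            # current frame, from step S309
mu_2 = np.array([0.04, -0.10])            # previous frame
sigma_1 = 0.2 * np.eye(2)                 # lower reliability for the current frame
sigma_2 = 0.1 * np.eye(2)                 # higher reliability for the previous frame
print(temporal_filter(mu_1, sigma_1, mu_2, sigma_2))
```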
  • Step S311 The electronic device calculates a fourth correlated color temperature and a fourth chromaticity distance based on the second filter vector.
  • After calculating Mu, the electronic device can obtain the R 5 G 5 B 5 value of the white point of the image light source from Mu, and the R 5 G 5 B 5 value can be converted into the fourth correlated color temperature CCT 4 and the fourth chromaticity distance D uv4.
  • For the process of converting the R 5 G 5 B 5 value into CCT 4 and D uv4, refer to the relevant description of the electronic device converting the second chromaticity information into CCT 1 and D uv1 in step S304 above, which will not be repeated in this embodiment of the present application.
  • the accuracy of CCT 4 and D uv4 is higher than that of CCT 3 and D uv3 .
  • Step S312 The electronic device obtains a fifth correlated color temperature based on the first correlated color temperature and the fourth correlated color temperature.
  • a correlated color temperature merging table (CCT Merging Table) is stored in the electronic device.
  • the CCT Merging Table is a three-dimensional coordinate table with three coordinate axes, namely: D uv axis, CCT axis and LV axis.
  • In the three-dimensional space of the CCT Merging Table there are many cells, and each cell corresponds to a probability value X. The electronic device finds the corresponding point in the three-dimensional coordinate system of the CCT Merging Table based on the LV, CCT 4 and D uv4 of the image, and determines the cells adjacent to this point.
  • The electronic device then calculates the weight of each adjacent cell through trilinear interpolation, multiplies each weight by the corresponding probability value X, and sums the products to obtain the first probability value X′. Finally, the electronic device calculates the fifth correlated color temperature CCT 5 according to formula (16):
  • CCT 5 = Conf_3*X′*CCT 4 + (1 - Conf_3*X′)*CCT 1   (16)
  • Step S313 The electronic device obtains a fifth chromaticity distance based on the first chromaticity distance and the fourth chromaticity distance.
  • A chromaticity distance merging table (D uv Merging Table) is stored in the electronic device.
  • the D uv Merging Table is a three-dimensional coordinate table with three coordinate axes, namely: D uv axis, CCT axis and LV axis.
  • the electronic device finds a corresponding point in the three-dimensional coordinate system of the D uv Merging Table based on the LV, CCT 4 and D uv4 of the image, and determines the cell adjacent to this point.
  • The electronic device calculates the weight of each adjacent cell through trilinear interpolation, multiplies each weight by the corresponding probability value Y, and sums the products of the relevant cells to obtain the second probability value Y′.
  • The electronic device then obtains the fifth chromaticity distance D uv5 according to formula (17): D uv5 = Conf_3*Y′*D uv4 + (1 - Conf_3*Y′)*D uv1   (17). A minimal fusion sketch covering formulas (16) and (17) is given below.
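A minimal Python sketch of the two fusion formulas follows. Conf_3, X′ and Y′ would come from the corrected confidence and the merging-table lookups described above; the numeric values here are illustrative.

```python
def merge_cct(cct_1: float, cct_4: float, conf_3: float, x_prime: float) -> float:
    w = conf_3 * x_prime                  # weight given to the AI AWB branch
    return w * cct_4 + (1.0 - w) * cct_1  # formula (16)

def merge_duv(duv_1: float, duv_4: float, conf_3: float, y_prime: float) -> float:
    w = conf_3 * y_prime
    return w * duv_4 + (1.0 - w) * duv_1  # formula (17)

cct_5 = merge_cct(cct_1=4300.0, cct_4=4700.0, conf_3=0.8, x_prime=0.9)
duv_5 = merge_duv(duv_1=0.003, duv_4=0.001, conf_3=0.8, y_prime=0.9)
print(round(cct_5, 1), round(duv_5, 5))   # results sit closer to the AI AWB branch
```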
  • Step S314 The electronic device adjusts the fifth correlated color temperature to obtain a sixth correlated color temperature.
  • For some users the overall color of a captured image should lean warmer or cooler; to bring the overall color of the image closer to what the user expects, the electronic device needs to adjust the propensity of CCT 5 and D uv5.
  • The specific process for the electronic device to adjust the CCT 5 propensity is as follows: a CCT Propensity Table (correlated color temperature propensity adjustment table) as shown in Figure 12 is stored in the electronic device; the CCT Propensity Table is a three-dimensional coordinate table with three coordinate axes: the CCT axis, the D uv axis and the LV axis.
  • In the three-dimensional space of the CCT Propensity Table there are many cells, and each cell corresponds to a CCT adjustment value (Delta_CCT).
  • The electronic device finds the corresponding point in the three-dimensional coordinate system of the CCT Propensity Table based on the LV, CCT 5 and D uv5 of the image, determines the cells adjacent to this point, calculates the weight of each adjacent cell through trilinear interpolation, multiplies each weight by the corresponding Delta_CCT, and sums the products to obtain the first correlated color temperature adjustment value Delta_CCT′.
  • The electronic device then calculates the adjusted correlated color temperature according to formula (18): CCT 6 = CCT 5 + Delta_CCT′   (18), where CCT 6 is the sixth correlated color temperature.
  • Step S315 The electronic device adjusts the fifth chromaticity distance to obtain a sixth chromaticity distance.
  • The specific process for the electronic device to adjust the D uv5 propensity is as follows: a D uv Propensity Table (chromaticity distance propensity adjustment table) as shown in Figure 13 is stored in the electronic device; the D uv Propensity Table is a three-dimensional coordinate table with three coordinate axes: the CCT axis, the D uv axis and the LV axis, and each cell in its three-dimensional space corresponds to a D uv adjustment value (Delta_D uv).
  • Based on the LV, D uv5 and CCT 5 of the image, the electronic device finds the corresponding point in the three-dimensional coordinate system of the D uv Propensity Table, determines the cells adjacent to this point, calculates the weight of each adjacent cell through trilinear interpolation, multiplies each weight by the corresponding Delta_D uv, and sums the products of the relevant cells to obtain the first chromaticity distance adjustment value Delta_D uv′.
  • The electronic device then calculates the adjusted chromaticity distance according to formula (19): D uv6 = D uv5 + Delta_D uv′   (19), where D uv6 is the sixth chromaticity distance. A minimal propensity-adjustment sketch covering formulas (18) and (19) is given below.
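A minimal Python sketch of formulas (18) and (19) follows. Delta_CCT′ and Delta_D uv′ would normally come from the propensity tables by trilinear interpolation; fixed values are used here purely for illustration.

```python
def adjust_propensity(cct_5: float, duv_5: float,
                      delta_cct: float, delta_duv: float):
    cct_6 = cct_5 + delta_cct             # formula (18)
    duv_6 = duv_5 + delta_duv             # formula (19)
    return cct_6, duv_6

# Example of a warmer rendering preference: pull the correlated color temperature down.
cct_6, duv_6 = adjust_propensity(4660.0, 0.0012, delta_cct=-120.0, delta_duv=0.0005)
print(round(cct_6, 1), round(duv_6, 4))   # -> 4540.0 0.0017
```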
  • Step S316 The electronic device calculates a third filter vector based on the sixth correlated color temperature and the sixth chromaticity distance.
  • the third filter vector is Mu 3
  • For the method by which the electronic device calculates Mu 3 based on CCT 6 and D uv6, refer to the relevant description of the electronic device calculating Mu 1 based on CCT 3 and D uv3 in step S309, which will not be repeated in this embodiment of the present application.
  • Step S317 the electronic device performs time series filtering on the third filter vector to obtain a fourth filter vector.
  • To prevent the chromaticity information of two adjacent frames from differing so much that the color changes abruptly between them, the electronic device performs time-series filtering on the filter vector of the current frame image and the filter vector of the previous frame image to obtain a fused filter vector (the fourth filter vector), and the electronic device can obtain the fourth chromaticity information based on the fourth filter vector.
  • When the electronic device uses the fourth chromaticity information to adjust the white balance of the current frame image, the overall color of the adjusted image does not differ significantly from the overall color of the previous frame image, so no sudden color change is produced between two adjacent frames.
  • First, the electronic device updates the first covariance matrix of the image according to formula (20), which uses the third parameter a 1, to obtain the third covariance matrix, where Sigma 3 is the third covariance matrix, Sigma 1 is the first covariance matrix and a 1 is the third parameter.
  • Then, the electronic device updates the second covariance matrix according to formula (21) to obtain the fourth covariance matrix, where Sigma 4 is the fourth covariance matrix, Sigma 2 is the second covariance matrix and a 1 is the third parameter; a 1 can be obtained from historical values, empirical values or experimental data, which is not limited in this embodiment of the present application.
  • Then, the electronic device performs time-series filtering on Mu 3 according to formula (22) to obtain the fourth filter vector Mu 5: Mu 5 = Mu 3*(Sigma 3)^(-1) + Mu 4*(Sigma 4)^(-1)   (22)
  • Mu 4 is the filter vector of the previous frame image, and Mu 4 is calculated by the electronic device based on CCT' 2 and D' uv2 .
  • CCT′ 2 is the correlated color temperature of the previous frame image after the CCT propensity adjustment,
  • and D′ uv2 is the chromaticity distance of the previous frame image after the D uv propensity adjustment.
  • Step S318 The electronic device calculates fourth chrominance information based on the fourth filter vector.
  • Specifically, Mu 5 = [log(R 6/G 6), log(B 6/G 6)]; the fourth chromaticity information may be the R 6 G 6 B 6 value of the white point of the image light source, or R 6/G 6 and B 6/G 6 of the white point of the image light source, which is not limited in this embodiment of the present application. A minimal sketch of converting the filter vector back into white-balance gains is given below.
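The patent only states that the fourth chromaticity information is used to adjust the white balance of the image; the sketch below shows one common way such information could be applied, recovering the light-source ratios R/G and B/G from the filter vector and dividing each pixel channel by them. The gain form, the numeric values and the array shapes are assumptions for illustration.

```python
import numpy as np

def mu_to_ratios(mu: np.ndarray):
    """Recover the light-source white point ratios R/G and B/G from Mu = [log(R/G), log(B/G)]."""
    return float(np.exp(mu[0])), float(np.exp(mu[1]))

def apply_white_balance(image_rgb: np.ndarray, r_over_g: float, b_over_g: float) -> np.ndarray:
    """Divide each channel by the light source's channel ratio so that the white
    point is mapped back to neutral grey (a common AWB correction form)."""
    gains = np.array([1.0 / r_over_g, 1.0, 1.0 / b_over_g])
    return np.clip(image_rgb * gains, 0.0, 1.0)

mu_5 = np.array([0.18, -0.25])                      # illustrative fourth filter vector
r_over_g, b_over_g = mu_to_ratios(mu_5)
# A grey card lit by this light source: its raw RGB carries the light-source cast.
frame = np.full((48, 64, 3), [0.6 * r_over_g, 0.6, 0.6 * b_over_g])
balanced = apply_white_balance(frame, r_over_g, b_over_g)
print(balanced[0, 0])                               # -> approximately [0.6 0.6 0.6]
```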
  • Step S319 the electronic device sends the fourth chromaticity information to the memory.
  • the fourth chromaticity information is used to adjust the color of the image in a non-pure color scene.
  • the electronic device selects the corresponding AWB algorithm to calculate the second chromaticity information based on the spectral data output by the multispectral sensor, thereby improving the accuracy of the first correlated color temperature and the first chromaticity distance calculated based on the second chromaticity information.
  • The electronic device adjusts the confidence level output by the AI AWB algorithm based on the CCT output by the multispectral sensor, thereby improving the accuracy of that confidence level.
  • Because the accuracy of the first correlated color temperature, the first chromaticity distance and the confidence level output by the AI AWB algorithm are all improved, the accuracy of the fifth correlated color temperature obtained when the electronic device fuses the first correlated color temperature with the fourth correlated color temperature calculated by the AI AWB algorithm is also improved; likewise, after the electronic device fuses the first chromaticity distance with the fourth chromaticity distance calculated by the AI AWB algorithm, the accuracy of the fifth chromaticity distance is improved. Therefore, the accuracy of the fourth chromaticity information delivered to the memory based on the fifth correlated color temperature and the fifth chromaticity distance is also improved.
  • Because the accuracy of the fourth chromaticity information is improved, the electronic device achieves a better effect when it uses the fourth chromaticity information to adjust the white balance of the image.
  • In addition, since the multispectral sensor keeps sending the first chromaticity information it calculates for the image to the memory during the period from the cold start of the camera until the electronic device takes a picture, and since the camera outputs frames faster than the chromaticity information can be computed during a cold start, the electronic device can use the first chromaticity information to adjust the white balance of the first frame or first few frames of the preview image, thereby avoiding a color cast in those frames.
  • Alternatively, when the electronic device detects that the current shooting scene is a pure color scene (the number of white points of the image light source is insufficient), the accuracy of the fourth chromaticity information calculated by the AWB algorithm and the AI AWB algorithm is not high, so the electronic device can instead call the first chromaticity information to adjust the white balance of the image. A minimal sketch of this source-selection logic is given below.
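A minimal Python sketch of the chromaticity-source selection described above follows: fall back to the multispectral sensor's first chromaticity information during a cold start or in a pure color scene, and otherwise use the fused fourth chromaticity information. The threshold on the number of white points, the parameter names and the tuple contents are illustrative assumptions.

```python
def choose_chromaticity(fused_ready: bool, white_point_count: int,
                        first_chroma: tuple, fourth_chroma: tuple,
                        min_white_points: int = 50) -> tuple:
    if not fused_ready:                         # first frame(s) after a cold start
        return first_chroma
    if white_point_count < min_white_points:    # pure color scene
        return first_chroma
    return fourth_chroma                        # normal case: AWB + AI AWB fusion result

print(choose_chromaticity(False, 200, (1.9, 1.4), (1.8, 1.5)))  # -> (1.9, 1.4)
print(choose_chromaticity(True, 10, (1.9, 1.4), (1.8, 1.5)))    # -> (1.9, 1.4)
print(choose_chromaticity(True, 200, (1.9, 1.4), (1.8, 1.5)))   # -> (1.8, 1.5)
```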
  • All or part of the foregoing embodiments may be implemented by software, hardware, firmware or any combination thereof.
  • When implemented using software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from a website, computer, server or data center Transmission to another website site, computer, server, or data center by wired (eg, coaxial cable, optical fiber, DSL) or wireless (eg, infrared, wireless, microwave, etc.) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, DVD), or a semiconductor medium (for example, a Solid State Disk).
  • A person of ordinary skill in the art can understand that all or part of the processes in the foregoing method embodiments can be completed by a computer program instructing related hardware.
  • The program can be stored in a computer-readable storage medium.
  • When the program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage medium includes: ROM or random access memory RAM, magnetic disk or optical disk, and other various media that can store program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Color Television Image Signal Generators (AREA)
  • Processing Of Color Television Signals (AREA)

Abstract

This application provides a method for determining chromaticity information and a related electronic device. The method includes: the electronic device starts a camera application; a multispectral sensor obtains a first channel value, where the multispectral sensor includes a first channel and the first channel value is the value obtained by the first channel; the first channel value is sent to an automatic white balance module; according to the first channel value, the automatic white balance module selects a target algorithm from multiple algorithms; and the electronic device determines target chromaticity information according to the target algorithm.

Description

一种色度信息的确定方法及相关电子设备
本申请要求于2021年8月12日提交中国专利局、申请号为202110925200.8、发明名称为“一种色度信息的确定方法及相关电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及图像处理领域,尤其涉及一种色度信息的确定方法及相关电子设备。
背景技术
白平衡是数码相机设备或相关软件能够对捕获图像进行的一种调整,用于确保图像中的白色能够适当地反映拍摄图像的真实世界场景中的实际白色。白平衡与色温相关,色温是基于图像和场景中的蓝光量与红光量的比率来衡量光的质量,以开尔文为单位。具有较高色温的图像或场景比具有较低色温的图像和场景具有更多的蓝色。因此,“较冷”的光具有较高的色温,“较热”的光具有较低的色温。
人眼和大脑可以适应不同的色温。例如,不管是在阳光下还是在各种灯光下,人眼将白色的物体视为白色,即人眼具有颜色恒常性。由于摄像机内用于将光信号转化为电信号的电荷耦合元件电路(Charge-coupled Device,CCD)或CMOS电路没有办法像人眼一样会对光源的颜色变化进行修正。因此,需要通过白平衡算法来估计捕获图像光源的色度,并通过估计的光源色度来调整图像颜色,使得调整后的图像的色彩与人眼真实观察的色彩一致。如何提高白平衡算法的准确度,是技术人员日益关注的问题。
发明内容
本申请实施例提供了一种色度信息的确定方法,解决了通过AWB和AI AWB融合算法计算出来的色度信息准确度不高的问题。
第一方面,本申请实施例提供了一种色度信息的确定方法,应用于电子设备,该电子设备包括多光谱传感器、自动白平衡模块和AI自动白平衡模块,自动白平衡模块包括多个算法,该方法包括:电子设备启动相机应用;多光谱传感器获取第一通道值,其中,多光谱传感器包括第一通道,第一通道值为第一通道获取的值;将第一通道值发送到自动白平衡模块;根据第一通道值,自动白平衡模块从多个算法中选择目标算法;电子设备根据所述目标算法确定目标色度信息。在上述实施例中,由于自动白平衡模块可以根据第一通道值选择目标算法,这个目标算法为与当前拍摄环境相匹配的AWB算法,电子设备基于该目标算法计算得到的目标色度信息的准确度更高,当电子设备使用该目标色度信息对图像进行白平衡调节时,调节的效果更好。
结合第一方面,在一种实施方式中,该方法还包括:多光谱传感器获取第一相关色温;将第一相关色温发送到AI自动白平衡模块;AI自动白平衡模块确定第一相关色温与第二相关色温的差值,第二相关色温为AI自动白平衡模块根据相机采集的图像得到的相关色温值;当该差值大于预设阈值时,将AI自动白平衡模块输出的置信度调整为第一置信度;电 子设备根据所述第一置信度确定目标色度信息。在上述实施例中,电子设备根据第一相关色温与第二相关色温的差值调整AI自动白平衡输出的置信度,得到第一置信度,提高了第一置信度的准确度,使得基于第一置信度计算得到的目标色度信息准确度更高,当电子设备使用该目标色度信息对图像进行白平衡调节时,调节的效果更好。
结合第一方面,在一种实施方式中,该方法还包括:当该启动为冷启动时,多光谱传感器获取第一色度信息;所述电子设备确定所述目标色度信息为所述第一色度信息。这样,电子设备可以使用第一色度信息对相机输出的首帧或前几帧图像进行白平衡调节,使得该首帧或前几帧图像不会出现偏色。
结合第一方面,在一种实施方式中,该方法还包括:当自动白平衡模块或所述AI自动白平衡模块确定相机采集的图像为纯色图像时,电子设备确定目标色度信息为第二色度信息,第二色度信息为多光谱传感器获取的色度信息。这样,在自动白平衡模块或所述AI自动白平衡模块不能准确计算纯色图像色度信息的情况下,电子设备也能调用准确度较高的第二色度信息对图像进行白平衡调节,从而解决图像的偏色问题。
结合第一方面,在一种实施方式中,所述第一通道值包括以下一项或多项:可见光通道值、NIR通道值、Clear通道值。
结合第一方面,在一种实施方式中,电子设备根据第一置信度确定目标色度信息,具体包括:自动白平衡模块将相机采集的图像通过目标算法进行计算,得到第三相关色温与第一色度距离;AI自动白平衡模块基于相机采集的图像,得到第四相关色温与第二色度距离;所述电子设备将所述第一置信度进行修正,得到第二置信度;电子设备根据所述第二置信度将第三相关色温与第四相关色温进行融合,得到第五相关色温;电子设备根据第二置信度将第一色度距离与第二色度距离进行融合,得到第三色度距离;电子设备基于第五相关色温和第三色度距离得到目标色度信息。在上述实施例中,由于自动白平衡模块使用目标算法计算得到第一色度距离和第三相关色温,因此,计算出的第一色度距离和第三相关色温的准确度很高。由于电子设备根据第一相关色温与第二相关色温的差值调整AI自动白平衡输出的置信度,得到第一置信度,提高了第一置信度的准确度,且第二置信度是由第一置信度修正得到,因此第二置信度的准确度也有所提高。由于第三色度距离是电子设备根据第二置信度将第一色度距离和第二色度距离进行融合得到的,第五相关色温是电子设备根据第二置信度将第三相关色温和第四相关色温进行融合得到的,因此第三色度距离和第五相关色温的准确性也有所提高,使得基于第三色度距离和第五相关色温得到的目标色度信息的准确性也有所提高。当电子设备使用该目标色度信息对图像进行白平衡调节时,能够更加有效地解决图像的偏色问题。
结合第一方面,在一种实施方式中,电子设备基于第五相关色温和第三色度距离得到目标色度信息,具体包括:将第五相关色温进行倾向度调节得到第六相关色温;将第三色度距离进行倾向度调节,得到第四色度距离;基于第六相关色温和第四色度距离,得到所述目标色度信息。这样,电子设备可以根据第六相关色温和第四色度距离得到准确度高的目标色度信息,当电子设备使用该目标色度信息对图像进行白平衡调节时,能够更加有效地解决图像的偏色问题。
结合第一方面,在一种实施方式中,AI自动白平衡模块基于相机采集的图像,得到第 四相关色温与第二色度距离,具体包括:AI自动白平衡模块根据相机采集的图像,输出第二相关色温和初始色度距离;将第二相关色温与初始色度距离进行修正,得到修正后的第二相关色温和修正后的初始色度距离;基于修正后的第二相关色温和修正后的初始色度距离,进行时序滤波,得到第四相关色温和第二色度距离。这样,通过时序滤波得到的第四相关色温和第二色度距离的准确性更高。
结合第一方面,在一种实施方式中,所述基于修正后的第二相关色温和修正后的初始色度距离,进行时序滤波,得到第四相关色温和第二色度距离,具体包括:根据公式
Figure PCTCN2022091965-appb-000001
更新第一协方差矩阵,得到更新后的第一协方差矩阵;Sigma′ 1为更新后的第一协方差矩阵,Sigma 1为第一协方差矩阵,λ 1为第一参数,第一协方差矩阵为所述AI自动白平衡模块根据相机采集的图像输出的协方差矩阵或者第一协方差矩阵为基于第二置信度计算得到的协方差矩阵;根据公式
Figure PCTCN2022091965-appb-000002
更新第二协方差矩阵,得到更新后的第二协方差矩阵;第二协方差矩阵为第二图像的协方差矩阵,第二图像为所述相机采集的上一帧图像,Sigma′ 2为所述更新后的第二协方差矩阵,Sigma 2为第二协方差矩阵,λ 2为第二参数;基于修正后的第二相关色温和修正后的初始色度距离计算,得到第一滤波向量;根据公式Mu=Mu 1*(Sigma′ 1) -1+Mu 2*(Sigma′ 2) -1,得到第二滤波向量;Mu 1为第一滤波向量,Mu 2为第二图像的滤波向量,Mu为第二滤波向量;基于第二滤波向量进行计算处理,得到第四相关色温和所述第二色度距离。这样,通过时序滤波得到的第四相关色温和第二色度距离的准确性更高。
结合第一方面,在一种实施方式中,电子设备根据第二置信度将第三相关色温与第四相关色温进行融合,得到第五相关色温,具体包括:根据相机采集的图像的亮度值、第四相关色温以及第二色度距离,确定第一概率值;根据公式CCT 5=Conf_2*X′*CCT 4+(1-Conf_2*X′)*CCT 3计算得到第五相关色温;其中,所述Conf_2为第二置信度,X′为第一概率值,CCT 4为第四相关色温,CCT 3为第三相关色温,CCT 5为第五相关色温。这样,电子设备可以根据第五相关色温和第三色度距离得到准确度高的目标色度信息,当电子设备使用该目标色度信息对图像进行白平衡调节时,能够更加有效地解决图像的偏色问题。
结合第一方面,在一种实施方式中,电子设备根据所述第二置信度将第一色度距离与第二色度距离进行融合,得到第三色度距离,具体包括:根据相机采集的图像的亮度值、第四相关色温以及第二色度距离,确定第二概率值;根据公式D uv3=Conf_2*Y′*D uv2+(1-Conf_2*Y′)*D uv1,计算得到第三色度距离;其中,Conf_2为第二置信度,Y′为第二概率值,D uv1为第一色度距离,D uv2为第二色度距离,D uv3为第三色度距离。这样,电子设备可以根据第五相关色温和第三色度距离得到准确度高的目标色度信息,当电子设备使用该目标色度信息对图像进行白平衡调节时,能够更加有效地解决图像的偏色问题。
结合第一方面,在一种实施方式中,将第五相关色温进行倾向度调节得到第六相关色温,具体包括:根据相机采集的图像的亮度值、第五相关色温以及第三色度距离确定第一相关色温调节值;根据公式CCT 6=CCT 5+Delta_CCT′,得到第六相关色温,CCT 5为第五相 关色温,CCT 6为第六相关色温,Delta_CCT′为第一相关色温调节值。这样,当电子设备使用基于第六相关色温和第四色度距离计算出的目标色度信息对图像进行白平衡调节时,在解决图像色偏的基础上使得图像颜色更符合用户的视觉习惯。
结合第一方面,在一种实施方式中,将第三色度距离进行倾向度调节,得到第四色度距离,具体包括:根据相机采集的图像的亮度值、第五相关色温以及第三色度距离确定第一色度距离调节值;根据公式D uv4=D uv3+Delta_D uv′,得到第四色度距离,D uv4为第四色度距离,D uv3为第三色度距离,Delta_D uv′为第一色度距离调节值。这样,当电子设备使用基于第六相关色温和第四色度距离计算出的目标色度信息对图像进行白平衡调节时,在解决图像色偏的基础上使得图像颜色更符合用户的视觉习惯。
第二方面,本申请实施例提供了一种电子设备,该电子设备包括:一个或多个处理器和存储器;该存储器与该一个或多个处理器耦合,该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令,该一个或多个处理器调用该计算机指令以使得该电子设备执行:电子设备启动相机应用;多光谱传感器获取第一通道值,其中,多光谱传感器包括第一通道,第一通道值为第一通道获取的值;将第一通道值发送到自动白平衡模块;根据第一通道值,自动白平衡模块从多个算法中选择目标算法;电子设备根据所述目标算法确定目标色度信息。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:多光谱传感器获取第一相关色温;将第一相关色温发送到AI自动白平衡模块;AI自动白平衡模块确定第一相关色温与第二相关色温的差值,第二相关色温为AI自动白平衡模块根据相机采集的图像得到的相关色温值;当该差值大于预设阈值时,将AI自动白平衡模块输出的置信度调整为第一置信度;电子设备根据所述第一置信度确定目标色度信息。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:当该启动为冷启动时,多光谱传感器获取第一色度信息;所述电子设备确定所述目标色度信息为所述第一色度信息。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:当自动白平衡模块或所述AI自动白平衡模块确定相机采集的图像为纯色图像时,电子设备确定目标色度信息为第二色度信息,第二色度信息为多光谱传感器获取的色度信息。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:自动白平衡模块将相机采集的图像通过目标算法进行计算,得到第三相关色温与第一色度距离;AI自动白平衡模块基于相机采集的图像,得到第四相关色温与第二色度距离;所述电子设备将所述第一置信度进行修正,得到第二置信度;电子设备根据所述第二置信度将第三相关色温与第四相关色温进行融合,得到第五相关色温;电子设备根据第二置信度将第一色度距离与第二色度距离进行融合,得到第三色度距离;电子设备基于第五相关色温和第三色度距离得到目标色度信息。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以 使得该电子设备执行:将第五相关色温进行倾向度调节得到第六相关色温;将第三色度距离进行倾向度调节,得到第四色度距离;基于第六相关色温和第四色度距离,得到所述目标色度信息。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:AI自动白平衡模块根据相机采集的图像,输出第二相关色温和初始色度距离;将第二相关色温与初始色度距离进行修正,得到修正后的第二相关色温和修正后的初始色度距离;基于修正后的第二相关色温和修正后的初始色度距离,进行时序滤波,得到第四相关色温和第二色度距离。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:根据公式
Figure PCTCN2022091965-appb-000003
更新第一协方差矩阵,得到更新后的第一协方差矩阵;Sigma′ 1为更新后的第一协方差矩阵,Sigma 1为第一协方差矩阵,λ 1为第一参数,第一协方差矩阵为所述AI自动白平衡模块根据相机采集的图像输出的协方差矩阵或者第一协方差矩阵为基于第二置信度计算得到的协方差矩阵;根据公式
Figure PCTCN2022091965-appb-000004
更新第二协方差矩阵,得到更新后的第二协方差矩阵;第二协方差矩阵为第二图像的协方差矩阵,第二图像为所述相机采集的上一帧图像,Sigma′ 2为所述更新后的第二协方差矩阵,Sigma 2为第二协方差矩阵,λ 2为第二参数;基于修正后的第二相关色温和修正后的初始色度距离计算,得到第一滤波向量;根据公式Mu=Mu 1*(Sigma′ 1) -1+Mu 2*(Sigma′ 2) -1,得到第二滤波向量;Mu 1为第一滤波向量,Mu 2为第二图像的滤波向量,Mu为第二滤波向量;基于第二滤波向量进行计算处理,得到第四相关色温和所述第二色度距离。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:根据相机采集的图像的亮度值、第四相关色温以及第二色度距离,确定第一概率值;根据公式CCT 5=Conf_2*X′*CCT 4+(1-Conf_2*X′)*CCT 3,计算得到第五相关色温;其中,所述Conf_2为第二置信度,X′为第一概率值,CCT 4为第四相关色温,CCT 3为第三相关色温,CCT 5为第五相关色温。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:根据相机采集的图像的亮度值、第四相关色温以及第二色度距离,确定第二概率值;根据公式D uv3=Conf_2*Y′*D uv2+(1-Conf_2*Y′)*D uv1,计算得到第三色度距离;其中,Conf_2为第二置信度,Y′为第二概率值,D uv1为第一色度距离,D uv2为第二色度距离,D uv3为第三色度距离。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以使得该电子设备执行:根据相机采集的图像的亮度值、第五相关色温以及第三色度距离确定第一相关色温调节值;根据公式CCT 6=CCT 5+Delta_CCT′,得到第六相关色温,CCT 5为第五相关色温,CCT 6为第六相关色温,Delta_CCT′为第一相关色温调节值。
结合第二方面,在一种实施方式中,该一个或多个处理器还用于调用该计算机指令以 使得该电子设备执行:根据相机采集的图像的亮度值、第五相关色温以及第三色度距离确定第一色度距离调节值;根据公式D uv4=D uv3+Delta_D uv′,得到第四色度距离,D uv4为第四色度距离,D uv3为第三色度距离,Delta_D uv′为第一色度距离调节值。
第三方面,本申请实施例提供了一种电子设备,包括:触控屏、摄像头、一个或多个处理器和一个或多个存储器;所述一个或多个处理器与所述触控屏、所述摄像头、所述一个或多个存储器耦合,所述一个或多个存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当所述一个或多个处理器执行所述计算机指令时,使得所述电子设备执行如第一方面或第一方面的任意一种可能实现的方式所述的方法。
第四方面,本申请实施例提供了一种芯片系统,该芯片系统应用于电子设备,该芯片系统包括一个或多个处理器,该处理器用于调用计算机指令以使得该电子设备执行如第一方面或第一方面的任意一种可能实现的方式所述的方法。
第五方面,本申请实施例提供了一种包含指令的计算机程序产品,当该计算机程序产品在电子设备上运行时,使得该电子设备执行如第一方面或第一方面的任意一种可能实现的方式所述的方法。
第六方面,本申请实施例提供了一种计算机可读存储介质,包括指令,当该指令在电子设备上运行时,使得该电子设备执行如第一方面或第一方面的任意一种可能实现的方式所述的方法。
附图说明
图1是本申请实施例提供的一种电子设备100的硬件结构示意图;
图2A-图2C是本申请实施例提供的一组拍摄界面图;
图2D是本申请实施例提供的通过自动白平衡算法调节后的图像的对比效果图;
图3是本申请实施例提供的一种确定色度信息的方法的流程图;
图4是本申请实施例提供的拍摄界面图;
图5是本申请实施例提供的预览图像与其RGB调节值的对应图;
图6A是本申请实施例提供的一种CCT转换表;
图6B是本申请实施例提供的一种CCT Shift Table三维空间坐标系;
图7A是本申请实施例提供的一种D uv转换表;
图7B是本申请实施例提供的一种D uvShift Table三维空间坐标系;
图8A是本申请实施例提供的一种Confidence转换表;
图8B是本申请实施例提供的一种Confidence Shift Table三维空间坐标系;
图9是本申请实施例提供的一种(CCT,D uv)转换成色度信息的示意图;
图10是本申请实施例提供的一种CCT融合表;
图11是本申请实施例提供的一种D uv融合表;
图12是本申请实施例提供的一种CCT倾向度调节表;
图13是本申请实施例提供的一种D uv倾向度调节表。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。在本文中提及“实施例”意味着,结合实施例描述的特定特征、结构或者特性可以包含在本实施例申请的至少一个实施例中。在说明书中的各个位置出现该短语并不一定均是相同的实施例,也不是与其它实施例互斥的独立的或是备选的实施例。本领域技术人员可以显式地和隐式地理解的是,本文所描述的实施例可以与其它实施例相结合。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本申请的说明书和权利要求书及所述附图中术语“第一”、“第二”、“第三”等是区别于不同的对象,而不是用于描述特定顺序。此外,术语“包括”和“具有”以及它们的任何变形,意图在于覆盖不排他的包含。例如,包含了一系列步骤或单元,或者可选地,还包括没有列出的步骤或单元,或者可选地还包括这些过程、方法、产品或设备固有的其它步骤或单元。
附图中仅示出了与本申请相关的部分而非全部内容。在更加详细地讨论示例性实施例之前,应当提到的是,一些示例性实施例被描述成作为流程图描绘的处理或方法。虽然流程图将各项操作(或步骤)描述成顺序的处理,但是其中的许多操作可以并行地、并发地或者同时实施。此外,各项操作的顺序可以被重新安排。当其操作完成时所述处理可以被终止,但是还可以具有未包括在附图中的附加步骤。所述处理可以对应于方法、函数、规程、子例程、子程序等等。
在本说明书中使用的术语“部件”、“模块”、“系统”、“单元”等用于表示计算机相关的实体、硬件、固件、硬件和软件的组合、软件或执行中的软件。例如,单元可以是但不限于在处理器上运行的进程、处理器、对象、可执行文件、执行线程、程序和/或分布在两个或多个计算机之间。此外,这些单元可从在上面存储有各种数据结构的各种计算机可读介质执行。单元可例如根据具有一个或多个数据分组(例如来自与本地系统、分布式系统和/或网络间的另一单元交互的第二单元数据。例如,通过信号与其它系统交互的互联网)的信号通过本地和/或远程进程来通信。
下面,对本申请实施例涉及的一些专有名词进行解释。
(1)普朗克轨迹:在辐射作用下既不反射也不完全投射,而能把落在它上面的辐射全部吸收的物体称为黑体或完全辐射体。当黑体连续加热时,它的相对光谱功率分布的最大值将向短波方向移动,相应的光色将按照红、黄、白、蓝的顺序进行变化,在不同温度下,黑体对应的光色变化在色度图上形成的弧形轨迹,叫做黑体轨迹或普朗克轨迹。
(2)相关色温(Correlated Colour Temperature,CCT):是指与具有相同亮度刺激的颜色最相近的黑体辐射体的温度,用K氏温度表示,用于描述位于普朗克轨迹附近的光的颜色 的度量。除热辐射光源以外的其它光源具有线状光谱,其辐射特性与黑体辐射特性差别较大,所以这些光源的光色在色度图上不一定准确地落在黑体轨迹上,对这样一类光源,通常用CCT来描述光源的颜色特性。
(3)色度距离(D uv):是指从测试光源的色度值(u,v)到普朗克轨迹上的最近点的距离,D uv表征了测试光源的色度值(u,v)与普朗克轨迹的颜色偏移(绿色或粉红色)和方向的信息。
(4)亮度值(Lighting Value,LV):用于估计环境亮度,其具体计算公式如下:
Figure PCTCN2022091965-appb-000005
其中,Exposure为曝光时间,Aperture为光圈大小,Iso为感光度,Luma为图像在XYZ颜色空间中,Y的平均值。
(5)XYZ空间:本申请实施例中的RGB值是DeviceRGB,DeviceRGB颜色空间是与设备相关的颜色空间,即:不同设备对RGB值的理解不同。因此,DeviceRGB不适合用于计算亮度值等参数。计算LV需要将DeviceRGB颜色空间转换为与设备无关的XYZ空间,即:将RGB值转换为XYZ值。
RGB颜色空间转换为XYZ空间的常用方法为:在不同光源环境下(典型的光源包括A、H、U30、TL84、D50、D65、D75等等)标定出一个大小为3*3的颜色校正矩阵(Color Correction Matrix,CCM),并将不同光源的CCM存储在电子设备的内存中,通过公式:
Figure PCTCN2022091965-appb-000006
得到图像在XYZ空间对应的三维向量,从而实现RGB空间到XYZ空间的转化。在拍摄过程中,往往根据图像中的白平衡基准点来匹配对应的光源,并选择该光源对应的CCM。若存在白平衡基准点的RGB值在两个光源之间(例如图像的RGB值落在D50和D65之间),CCM可由D50和D65进行双线性插值所得到。例如,D50的颜色校正矩阵为CCM 1,相关色温为CCT 1,D60的颜色校正矩阵为CCM 2,相关色温为CCT 2,图像光源的相关色温为CCT a。电子设备可以根据公式:
Figure PCTCN2022091965-appb-000007
计算出比例值g,基于比例值,根据公式:
CCM=g*CCM 1+(1-g)*CCM 2
可以计算出图像的CCM。
请参见图1,图1是本申请实施例提供的一种电子设备100的硬件结构示意图。电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(Universal Serial Bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP 还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不 同。
由于,数码相机或手机等电子设备中的CCD电路或CMOS电路不能对光源的颜色变化作出修正,为了防止拍摄出的图像出现色温等问题,需要将图像进行白平衡处理。目前,使用最多的白平衡算法包括传统的自动白平衡(Automatic White Balance,AWB)算法和人工智能自动白平衡算法(AI AWB算法)。AWB算法包括灰度世界等算法,而AI AWB算法通常是通过训练好的AI AWB模型来计算图像光源白点的颜色描述值(例如图像光源白点的RGB值、图像光源白点的R/G和B/G、图像光源白点的色度值)。其中,AI AWB模型可以是神经网络模型,也可以是快速傅里叶颜色恒常性(Fast Fourier Color Constancy,FFCC)模型,还可以是其它具有计算图像光源白点的颜色描述值功能的模型,本申请实施例对此不限制。图像光源白点的颜色为图像光源的颜色,当计算出白点的RGB值或者R/G、B/G,可以基于该白点的RGB值或R/G、B/G调节图像像素的RGB值或R/G、B/G,从而实现对图像的白平衡调节。
使用AWB算法对图像进行白平衡处理,其优点是计算量小、应用场景的范围较广。但是,由于其计算逻辑简单且计算量小,当拍摄场景的光源环境复杂时,通过AWB算法计算出的图像光源白点的RGB值的准确性不高,进而导致基于图像光源白点的RGB值对图像进行白平衡处理的效果不明显。
对于AI AWB算法,由于其使用的AI AWB模型对输入图像进行复杂的计算处理,在与训练场景类似的拍摄场景下,AI AWB模型计算出的图像光源白点的RGB值的准确度是非常高的。但是,由于在对AI AWB模型进行训练的过程中,训练样本有限,使得AI AWB模型不能适用所有的拍摄环境。当用户在泛化场景不好的拍摄场景下进行拍照时,AI AWB模型计算图像光源白点的RGB值的准确度极低,当ISP基于图像光源白点的RGB值对图像进行白平衡调节时,调节效果不明显。
为了解决AWB算法计算图像光源白点的RGB值准确度不高,AI AWB算法的应用场景有限的问题,本申请实施例提供了一种确定色度信息的方法,该方法的原理为:多光谱传感器将其计算的待处理图像的光源白点的第一色度信息下发到内存。由于多光谱传感器的启动速度快,在相机进行冷启动的过程中,电子设备可以使用第一色度信息对首帧或前几帧的图像进行白平衡调节,使得首帧或前几帧的图像不会出现偏色问题。此外,在电子设备将图像的原始数据通过AWB算法以及AI AWB算法计算图像的色度信息的过程中,若统计图像光源白点的个数小于一定的阈值,电子设备判断当前拍摄场景为纯色场景,在纯色场景下,电子设备通过AWB算法和AI AWB算法计算出的图像光源白点的色度信息不准确。因此,电子设备也可以调用多光谱传感器就计算的第一色度信息对图像进行白平衡调节。电子设备将相机输出的待处理图像的原始数据以及多光谱传感器输出的光谱数据和相关色温CCT通过AWB算法进行处理,得到图像的光源白点的RGB值,基于该RGB值可以得到该图像光源白点的R(红色)颜色通道与G(绿色)颜色通道的比值R 11/G 11以及B(蓝色)颜色通道与G(绿色)颜色通道的比值B 11/G 11。同时,电子设备将相机输出的待处理图像的原始数据以及多光谱传感器输出的光谱数据和CCT通过人工智能自动白平衡算法(AI AWB算法)进行计算处理,得到该图像光源白点的RGB值,基于该RGB值可以得到该图像光源白点的R(红色)颜色通道与G(绿色)颜色通道的比值R 21/G 21以及 B(蓝色)颜色通道与G(绿色)颜色通道的比值B 21/G 21。然后,电子设备基于R 11/G 11和B 11/G 11计算所述图像光源白点的相关色温CCT 11和色度距离D uv11,基于R 21/G 21和B 21/G 21计算所述图像光源白点的相关色温CCT 21和色度距离D uv21。然后,电子设备将CCT 11和CCT 21进行融合,得到融合后的CCT,将D uv11和D uv21进行融合,得到融合后的D uv。最后,电子设备基于融合后的CCT和融合后的D uv得到第四色度信息,并将第四色度信息下发到内存中,以便ISP基于第四色度信息对图像进行白平衡调节。
下面,结合图2A-图2D对该确定色度信息的方法的应用场景进行介绍。
如图2A所示,电子设备100可以显示有主屏幕的界面210,该界面210中包括多个应用图标。例如,天气应用图标、视频应用图标、备忘录图标、设置应用图标、相机应用图标211。当电子设备100检测到针对相机应用图标211的输入操作后(例如单击),响应该输入操作,电子设备100可以显示如图2B所示的拍摄界面220。
如图2B所示,拍摄界面220包括拍摄预览界面221、拍照控件222以及回显控件223。其中,拍摄预览界面221用于显示当前拍摄环境,当电子设备100检测到针对回显控件223的输入操作(例如单击),电子设备100会显示历史拍摄照片。当电子设备100检测到针对拍照控件222的输入操作(例如单击)后,响应该操作,电子设备显示如图2C所示的照片预览界面230。这样,用户可以在照片预览界面230中预览拍摄照片。
[根据细则91更正 20.03.2023]
当电子设备100检测到针对图2B中拍照控件222的输入操作后,电子设备100开始拍照,并对图像进行白平衡处理。如图2D所示,由于拍摄环境光源的色温的不同,导致处理前图像1出现色偏(图像1颜色整体偏灰),当电子设备100对图像1进行白平衡进行处理后,处理后的图像1的整体颜色不再偏灰,与人眼真实观察的颜色一致。
下面,结合附图介绍一种确定色度信息方法的流程图。请参见图3,图3是本申请实施例提供的一种确定色度信息方法的流程图,具体流程如下:
步骤S301:相机应用启动。
示例性的,在上述图2A实施例中,当电子设备检测到针对相机应用图标211的输入操作(例如,单击)后,相机应用启动。
步骤S302:电子设备将多光谱传感器计算出的图像光源白点的第一色度信息发送到内存中。
具体地,多光谱传感器用于获取图像光源的光谱数据,并根据这些光谱数据计算图像光源白点的相关色温CCT和RGB值的等颜色描述值,以便ISP可以基于这些颜色描述值进行图像处理(例如,对图像进行白平衡调节)。本申请实施例以包括8个可见光通道(F1通道~F8通道)、一个全通通道(Clear通道)以及近红外通道(NIR Value通道)的多光谱传感器进行举例说明,多光谱传感器每个通道对应不同的波长范围,当器件受到光线照射后,在单位时间内,每个通道会产生通道响应值,这个响应值就是多光谱传感器获取的光谱数据。其中,全通通道的响应值用于表征所述图像的亮度,近红外通道的响应值用于表征红外光的强度。多光谱传感器可以基于其获取的光谱数据计算第一色度信息,第一色度信息可以为图像光源白点的R 1G 1B 1值,也可以为所述图像光源白点的R 1/G 1和B 1/G 1,还可 以为该图像光源白点的色度值(u 1,v 1),本申请实施例对此不做限制。其中,R 1G 1B 1值用于表征图像光源白点在红、绿、蓝三颜色通道上的通道响应值,R 1G 1B 1可以与色度值(u 1,v 1)相互转换。
当相机冷启动时,多光谱传感器会计算图像光源白点的第一色度信息、光谱数据以及所述图像光源的相关色温CCT。在相机启动到拍照这段时间,多光谱传感器会每隔一帧或多帧计算预览图像的第一色度信息,并将第一色度信息下放到内存中。其中,预览图像为相机应用启动后,显示在电子设备预览界面的图像。示例性的,当相机应用启动后,电子设备会显示如图4所示的拍摄界面410,在该拍摄界面410中包括预览界面411,预览界面411用于显示当前拍摄环境的实时影像。在预览界面411中显示的实时影像是以图像帧为单位展示给用户的,该图像帧为预览图像。与此同时,多光谱传感器会计算每一帧图像的光谱数据和光源相关色温(CCT),电子设备会将图像的光谱数据、原始数据以及相关色温分别通过AWB算法和AI AWB算法进行计算处理,得到每帧图像的色度信息。但是,相机出图的速度相较于电子设备通过AWB算法和AI AWB算法计算图像色度信息的速度要快,且多光谱传感器的启动速度快。因此,在相机启动后,电子设备使用多光谱传感计算的第一色度信息来对首帧或前几帧预览图像进行白平衡调节,使得所述首帧或前几帧预览图像不会发生偏色问题。例如,如图5所示,相机启动后依次输出图像1~图像8,在相机启动的同时,电子设备开始使用AWB算法和AI AWB算法计算每一帧预览图像光源白点的RGB值(色度信息),由于相机出图的速度快于电子设备通过AWB算法和AI AWB算法计算预览图像光源白点的RGB值的速度,当相机输出图像3时,电子设备才过AWB算法和AI AWB算法计算出图像1光源白点的RGB 1值,当相机输出图像4时,电子设备才计算出图像2光源白点的RGB 2值。由于图像1与图像3、图像2与图像4之间的时序差异极小,电子设备在输出图像3后,可以使用图像1的RGB 1值对图像3进行白平衡的调节,电子设备在输出图像4后,可以使用RGB 2值对图像4进行白平衡调节,但是,相机输出图像1(首帧图像)和图像2时,电子设备未计算出其光源白点的RGB值,因此,在相机输出图像1和图像2后,电子设备使用多光谱传感器的下发到内存的第一色度信息(RGB值)对图像1和图像2进行白平衡调节,从而解决图像1和图像2的偏色问题。
在一些拍摄环境中,在相机将图像的原始数据以及多光谱传感器输出的相关色温CCT、光谱数据通过AWB算法和AI AWB算法进行计算处理的过程中,当检测到所述图像接近普朗克轨迹附近的有效图像光源白点数目小于第一阈值时,电子设备判断当前的拍摄场景为纯色场景,在纯色场景下,不管是AWB算法还是AI AWB算法计算光源白点的色度信息的准确度不高。因此,在纯色场景下,电子设备可以基于多光谱传感器计算的第一色度信息来对预览图像进行白平衡的调节。其中,第一阈值可以基于经验值得到,也可以基于历史值得到,还可以基于实验数据得到,本申请实施例对此不做限制。
步骤S303:电子设备将相机输出的原始数据和光谱传感器输出的光谱数据、CCT通过AWB算法进行计算处理,得到所述图像光源白点的第二色度信息。
具体地,原始数据是相机内的图像传感器将其捕捉到的光源信号转换为数字信号的数据。在图像传感器中包括多个像敏单元,每个像敏单元进行采样和量化后,得到RGB值,电子设备对像敏单元进行下采样所得到的RGB值为原始数据。
例如,如果传感器有6400x4800个像敏单元,电子设备对其进行100倍的下采样,得到64x48个像敏单元的RGB值,那么电子设备获取的原始数据是一个大小为64x48x3的向量。其中,64x48为像敏单元的个数,3为每个像敏单元计算得到的RGB值。
电子设备将原始数据作为AWB模块的输入,通过AWB算法进行计算处理,可以得到图像光源白点的色度信息,这个色度信息为图像光源白点的第二色度信息。其中,第二色度信息可以为图像光源白点的R 2G 2B 2值,也可以为图像光源白点的R 2/G 2、B 2/G 2,还可以为图像光源白点的色度值(u 2,v 2),本申请实施例对此不做限制。
光谱数据包括多光谱传感器输出的8个可见光通道输出的通道响应值,全通通道的响应值以及近红外通道的响应值。在电子设备内存储有多套AWB算法,每套AWB算法对应不同的拍摄场景,电子设备基于多光谱传感器输出的光谱数据可以判断当前的拍摄出场景的类型,并根据当前拍摄场景的类型选择对应的AWB算法来计算第二色度信息。这样,计算出的第二色度信息的准确度更高。示例性的,电子设备可以根据光谱数据中NIR通道输出的NIR值来判断多个拍摄场景,NIR值与拍摄场景及其相关的AWB算法的对应表如表1所示:
表1
NIR值 拍摄场景 AWB算法
100-200 蓝天白云 算法1
201-401 室外绿植 算法2
402-402 城市夜景 算法3
…… …… ……
若NIR值为150,电子设备可以基于表1判断当前拍摄场景的类型为蓝天白云,选择算法1计算第二色度信息。
步骤S304:电子设备将所述第二色度信息转换为所述图像光源白点的第一相关色温和所述图像光源白点的第一色度距离。
具体地,所述第一相关色温为CCT 1,所述第一色度距离为D uv1
示例性的,电子设备计算D uv1的方法可以为:
当第二色度信息为R 2/G 2和B 2/G 2或R 2G 2B 2值时,电子设备可以根据公式(1)计算该帧图像的色度值(u 2,v 2),公式(1)如下:
Figure PCTCN2022091965-appb-000008
其中,
Figure PCTCN2022091965-appb-000009
为转换矩阵,本申请实施例对该转换矩阵仅做示例性说明,对转换矩阵的具体内容不做限制。
然后,电子设备在色度图上获取在普朗克轨迹上与(u 2,v 2)最短距离点的坐标(u′ 2,v′ 2)。最后,根据公式(2)计算D uv1,公式(2)如下所示:
D uv1=sgn(v 2-v′ 2)*[(u 2-u′ 2) 2+(v 2-v′ 2) 2] 1/2   (2)
其中,当v 2-v′ 2≥0时,sgn(v 2-v′ 2)=1;当v 2-v′ 2<0时,sgn(v 2-v′ 2)=-1。
示例性的,电子设备计算CCT 1的方法可以为:
当第二色度信息为R 2/G 2和B 2/G 2或R 2G 2B 2值时,电子设备可以参考公式(1)计算该帧图像的色度值(u 2,v 2)。然后,电子设备在色度图上获取在普朗克轨迹上与(u 2,v 2)最短距离点M,点M对应的CCT为CCT 1
步骤S305:电子设备将相机输出的原始数据和多光谱传感器输出的光谱数据和相关色温通过AI AWB算法进行计算处理,得到所述图像光源白点的第三色度信息和第一置信度。
具体地,在AI AWB算法中包括已训练好的AI AWB模型,电子设备将原始数据和光谱数据以及多光谱传感器计算的相关色温CCT作为输入,通过AI AWB模型进行计算处理后,得到图像光源白点的第三色度信息。其中,第三色度信息可以为图像光源白点的R 3G 3B 3值,也可以为所述图像光源白点的R 3/G 3和B 3/G 3,还可以为所述图像光源白点的色度值(u 3,v 3),本申请实施例对此不做限制。第一置信度用于表征AI AWB模型的可靠程度。
步骤S306:电子设备将所述第三色度信息转换为第二相关色温和第二色度距离。
具体地,所述第二相关色温为CCT 2,所述第二色度距离为D uv2。电子设备计算CCT 2和D uv2的方法和过程请参见步骤S304中电子设备将第二色度信息转换成CCT 1和D uv1的相关叙述,本申请实施例在此不做赘述。
步骤S307:电子设备基于多光谱传感器输出的CCT,调整所述第一置信度,得到第二置信度。
具体地,电子设备基于多光谱传感器输出的CCT和CCT 2计算偏差值Fn,所述Fn用于表征多光谱传感器输出的CCT与CCT 2的偏差程度,Fn越大,多光谱传感器输出的CCT和CCT 2的偏差程度就越大,当偏差值超过了预设的范围时,电子设备判断AI AWB计算模型(例如,AI AWB模型)输出的第一置信度不准确,电子设备会基于Fn调整第一置信度,得到第二置信度。
示例性的,若Fn预设的范围为500K,第一置信度Conf_1为70%,多光谱传感器输出的CCT为3500K,CCT 2为4500K,则偏差值Fn为1000K。此时,Fn超过了预设范围,电子设备判断AI AWB模型输出的第一置信度不准确,电子设备将第一置信度从70%下调至60%。因此,第二置信度Conf_2为60%。
步骤S308:电子设备将第二相关色温、第二色度距离以及第二置信度进行修正,得到第三相关色温、第三色度距离以及第三置信度。
具体地,AI AWB算法中的AI AWB模型是预先训练好的模型,在训练过程中,由于训练样本有限,使得AI AWB模型不能适用所有的拍摄环境。当用户在泛化场景不好的场景下进行拍照时,通过AI AWB算法计算得到的CCT 2、D uv2以及Conf_2准确度极低。为了有效解决上述问题,电子设备需要将CCT 2、D uv2以及Conf_2进行修正,在CCT 2、D uv2以及Conf_2中任意一项与其修正值偏离程度过大的情况下,电子设备能够对其进行修正,使得CCT 2、D uv2以及Conf_2在合理的数值范围之内。
电子设备对CCT 2的修正过程为:在电子设备内存储有CCT转换表(CCT Shift Table)。如图6A所示,CCT Shift Table是一个三维坐标表,有三个坐标轴,分别是:CCT轴、D uv轴以及LV轴。在CCT Shift Table的三维空间中,存在许多单元格,每个单元格对应一个CCT修正值。电子设备基于所述图像光源白点的LV、CCT 2以及D uv2在CCT Shift Table三维坐标 系中找到对应的点,确定与这个点相邻的单元格。然后,通过Trilinear插值(三线性插值)计算每个相邻单元格的权值,并将单元格的权值与其对应的CCT修正值相乘,得到每个相邻的单元格的乘积,将每个相邻的单元格的乘积进行求和,得到CCT 3,CCT 3为第三相关色温。
示例性的,如图6B所示,若电子设备基于图像光源白点的LV、CCT 2以及D uv2在CCT Shift Table三维坐标系中找到点M,点M为线段XY的中点。其中,X为单元格1~单元4的交点,Y为单元格5-单元格8的交点。电子设备基于点M在CCT Shift Table的坐标位置,确定单元格1~单元格8为点M的相邻单元格,电子设备对单元格1~单元格8进行Trilinear插值,得到单元格1-单元格8的权值f 1~权值f 8,然后根据公式(3)计算CCT 3,公式(3)如下所示:
CCT 3=CCT 11*f 1+…+CCT 13*f 3+CCT 14*f 4+CCT 21*f 5+…+CCT 24*f 8   (3)
其中,CCT 11~CCT 14、CCT 21~CCT 24分别为单元格1~单元格8的CCT修正值,f 1~f 8分别为单元格1~单元格8的权值,且f 1+…+f 8=1。
电子设备对D uv2的修正过程为:在电子设备内存储有D uv转换表(D uvShift Table)。如图7A所示,D uvShift Table是一个三维坐标表,有三个坐标轴,分别是:D uv轴、CCT轴以及LV轴。在D uvShift Table的三维空间中,存在许多单元格,每个单元格对应一个D uv修正值。电子设备基于所述图像光源白点的LV、D uv2以及CCT 2在D uvShift Table三维坐标系中找到对应的点,确定与这个点相邻的单元格。其中,每个相邻的单元格对应一个D uv修正值。然后,电子设备通过Trilinear插值(三线性插值)计算每个相邻的单元格的权值,并将单元格的权值与其对应的D uv修正值相乘,得到每个相关单元格的乘积,将每个相关单元格的乘积进行求和,得到D uv3,D uv3为第三色度距离。
示例性的,如图7B所示,若电子设备基于图像光源白点的LV、CCT 2以及D uv2在D uv Shift Table三维坐标系中找到点M′,点M′为线段X′Y′的中点。其中,X′为单元格1~单元格4的交点,Y′为单元格5-单元格8的交点。电子设备基于点M′在CCT Shift Table的坐标位置,确定单元格1~单元格8为点M′的相邻单元格,电子设备对单元格1-单元格8进行Trilinear插值,得到单元格1-单元格8的权值f′ 1~权值f′ 8,然后根据公式(4)计算第三色度距离,公式(4)如下所示:
D uv3=D uv11*f′ 1+…+D uv13*f′ 3+D uv14*f′ 4+D uv21*f′ 5+…+D uv24*f′ 8    (4)
其中,D uv11~D uv14、D uv21~D uv21分别为单元格1~单元格8的D uv修正值,f′ 1~f′ 8分别为单元格1~单元格8的权值,且f′ 1+…+f′ 8=1,D uv3为第三色度距离。
电子设备对Conf_2的修正过程为:在电子设备内存储有置信度转换表(Confidence Shift Table)。如图8A所示,Confidence Shift Table是一个三维坐标表,有三个坐标轴,分别是:CCT轴、D uv轴以及LV轴。在Confidence Shift Table的三维空间中,存在许多单元格,每个单元格对应一个置信度调节值(Mult_Conf)。电子设备基于所述图像的LV、D uv2以及CCT 2在Confidence Shift Table这个三维坐标系中找到对应的点,确定与这个点相邻的单元格。然后,电子设备通过Trilinear插值(三线性插值)计算每个相邻的单元格的权值,并将单元格的权值与其对应的Mult_Conf相乘,得到每个相邻的单元格的乘积,将每个相邻的单元格的乘积进行求和,得到Mult_Conf_new。然后,电子设备根据公式 Conf_3=Conf_2+Mult_Conf_new计算得到第三置信度。其中,Conf_3为第三置信度。需要说明的是,Conf_3可能存在大于1的情况,当Conf_3大于1时,电子设备将Conf_3置1,当Conf_3小于0时,电子设备将Conf_3置0。
示例性的,如图8B所示,若电子设备基于图像光源白点的LV、CCT 2以及D uv2在ConfidenceShift Table三维坐标系中找到点M″,点M″为线段X″Y″的中点。其中,X″为单元格1~单元格4的交点,Y″为单元格5-单元格8的交点。电子设备基于点M″在Confidence Shift Table的坐标位置,确定单元格1~单元格8为点M″的相邻单元格,电子设备根据Trilinear插值计算单元格1~单元格8的权值f″ 1~权值f″ 8,然后根据公式(5)计算Mult_Conf_new,公式(5)如下所示:
Mult_Conf_new=Mult_Conf 11*f″ 1+…+Mult_Conf 14*f″ 4+Mult_Conf 21*f″ 5+…+Mult_Conf 24*f″ 8    (5)
其中,Mult_Conf 11~Mult_Conf 14、Mult_Conf 21~Mult_Conf 24分别为单元格1~单元格8的置信度调节值,f″ 1~f″ 8分别为单元格1~单元格8的权值,且f″ 1+…+f″ 8=1。
最后,电子设备根据公式Conf_3=Mult_Conf_new+Conf_2得到第三置信度。
步骤S309:电子设备基于第三相关色温和第三色度距离计算第一滤波向量。
具体地,电子设备对CCT 2、D uv2以及Conf_2进行修正后,得到CCT 3和D uv3。电子设备会将CCT 3和D uv3转换为第一滤波向量Mu 1,以便进行时序滤波。
其中,所述Mu 1为[log(R 4/G 4),log(B 4/G 4)],在Mu 1中包括了图像光源白点的色度信息(R 4B 4G 4值)。
示例性的,电子设备基于CCT 3和D uv3,得到Mu 1的过程如图9所示,电子设备基于CCT 3在色度图的普朗克轨迹上确定点D,点D的色度值为(u′ 4,v′ 4),然后,计算普朗克轨迹上相关色温为CCT 3+ΔT的点E的坐标(u″ 4,v″ 4)。其中,ΔT为CCT 3的微小增量(例如,ΔT=0.001K)。然后,电子设备计算倾角θ,并根据公式(6)~公式(9)得到du、dv、sin θ和cos θ,公式(6)~公式(9)如下所示:
du=u′ 4-u″ 4(6);dv=v′ 4-v″ 4   (7);
Figure PCTCN2022091965-appb-000010
然后,电子设备根据公式(10)和公式(11)计算色度值(u 4,v 4),公式(10)和公式(11)如下所示:
u 3=u′ 4-D uv3*sin θ   (10);
v 3=v′ 4+D uv3*cos θ   (11);
然后,电子设备根据公式(12),计算R 4G 4B 4,公式(12)如下所示:
Figure PCTCN2022091965-appb-000011
其中,
Figure PCTCN2022091965-appb-000012
为转换矩阵,本申请实施例对该转换矩阵仅做示例性说明,对转换矩阵的具体内容不做任何限制。
最后,电子设备基于R 4G 4B 4值计算第一滤波向量Mu 1
步骤S310:电子设备将第一滤波向量进行时序滤波,得到第二滤波向量。
具体地,时序滤波就是将时序上的多帧信号融合为多帧平稳信号,电子设备将进行时序滤波的目的是将该帧图像的色度信息与其上一帧图像的色度信息进行融合,得到融合后的色度信息,以此来提高该帧图像的第三色度信息的准确度。
本申请实施例以电子设备通过卡尔曼滤波器对Mu 1进行时序滤波为例,进行说明,时序滤波的过程为:电子设备首先获取第一协方差矩阵Sigma 1,其中,Sigma 1用于表征AI AWB模型输出该帧图像光源白点的色度信息的可靠程度,可以与置信度相互转化。Sigma 1可以由AI AWB模型输出,也可以由电子设备基于Conf_3计算得到Sigma 1。然后,电子设备根据公式(13)更新Sigma 1,得到更新后的Sigma 1,公式(13)如下所示:
Figure PCTCN2022091965-appb-000013
其中,Sigma′ 1为更新后的Sigma 1,λ 1为第一参数。
然后,电子设备基于公式(14)更新第二协方差矩阵Sigma 2,Sigma 2为AI AWB模型计算上一帧图像光源白点的色度信息所得到的协方差矩阵,公式(14)如下所示:
Figure PCTCN2022091965-appb-000014
其中,Sigma′ 2为更新后的Sigma 2,λ 2为第二参数。其中,λ 1和λ 2可以由经验值得到,也可以由历史数据得到,还可以由实验测试数据得到,本申请实施例对此不作限制。
最后,电子设备根据公式(15)对Mu 1进行时序滤波,得到第二滤波向量Mu,公式(15)如下所示:
Mu=Mu 1*(Sigma′ 1) -1+Mu 2*(Sigma′ 2) -1   (15)
其中,Mu 2为上一帧图像的滤波向量,Mu为第二滤波向量。
步骤S311:电子设备基于第二滤波向量,计算出第四相关色温和第四色度距离。
具体地,电子设备计算出Mu后,可以Mu根据计算得到所述图像光源白点的R 5G 5B 5值,并将R 5G 5B 5值转换为第四相关色温CCT 4和第四色度距离D uv4。电子设备将R 5G 5B 5值转换为CCT 4和D uv4的过程请参见上述步骤S303中,电子设备将所述第二色度信息转换为CCT 1和D uv1的相关叙述,本申请实施例对此不再赘述。CCT 4和D uv4的准确度相较于CCT 3和D uv3更高。
步骤S312:电子设备基于第一相关色温和第四相关色温,得到第五相关色温。
具体地,在电子设备内存储有相关色温融合表(CCT Merging Table)。如图10所示,CCT Merging Table是一个三维坐标表,有三个坐标轴,分别是:D uv轴、CCT轴以及LV轴。在CCT Merging Table的三维空间中,存在许多单元格,每个单元格对应一个概率值X。电子设备基于所述图像的LV、CCT 4以及D uv4在CCT Merging Table三维坐标系中找到对应的点,确定与这个点相邻的单元格。然后,电子设备通过Trilinear插值(三线性插值)计算每个相邻的单元格的权值,并将单元格的权值与其对应的概率值X相乘,得到每个相关单元格的乘积,将每个相关单元格的乘积进行求和,得到第一概率值X′。最后,电子设备 通过公式(16)计算第五相关色温CCT 5。公式(16)如下所示:
CCT 5=Conf_3*X′*CCT 4+(1-Conf_3*X′)*CCT 1   (16)
步骤S313:电子设备基于第一色度距离和第四色度距离,得到第五色度距离。
具体地,在电子设备内存储有色度距离融合表(D uv Merging Table)。如图11所示,D uvMerging Table是一个三维坐标表,有三个坐标轴,分别是:D uv轴、CCT轴以及LV轴。在D uvMerging Table的三维空间中,存在许多单元格,每个单元格对应一个概率值Y。电子设备基于所述图像的LV、CCT 4以及D uv4在D uvMerging Table三维坐标系中找到对应的点,确定与这个点相邻的单元格。然后,电子设备通过Trilinear插值(三线性插值)计算每个相邻的单元格的权值,并将单元格的权值与其对应的概率值Y相乘,得到每个相关单元格的乘积,将每个相关单元格的乘积进行求和,得到第二概率值Y′。最后,电子设备通过公式(17)计算,得到第五色度距离D uv5,公式(17)如下所示:
D uv5=Conf_3*Y′*D uv4+(1-Conf_3*Y′)*D uv1   (17)
步骤S314:电子设备将第五相关色温进行倾向度调节,得到第六相关色温。
具体地,对于部分用户而言,在使用电子设备拍照后,对图像的颜色有其他的要求。例如,有些用户希望图像的颜色整体呈现暖色系,有些用户希望图像的颜色整体呈现冷色系等等。为了令图像的整体颜色更倾向于用户希望的颜色,电子设备需要对CCT 5和D uv5进行倾向度的调节。电子设备对CCT 5进行倾向度调节的具体过程为:在电子设备中存储有如图12所示的CCT Propensity Table(相关色温倾向度调节表),CCT Propensity Table是一个三维坐标表,有三个坐标轴,分别是:CCT轴、D uv轴以及LV轴。在CCT Propensity Table的三维空间中,存在许多单元格,每个单元格对应一个CCT调节值(Delta_CCT)。电子设备基于所述图像的LV、CCT 5以及D uv5在CCT Propensity Table这个三维坐标系中找到对应的点,确定与这个点相邻的单元格。然后通过Trilinear插值(三线性插值)计算每个相邻的单元格的权值,并将单元格的权值与其对应的Delta_CCT相乘,得到每个相关单元格的乘积,将每个相关单元格的乘积进行求和,得到第一相关色温调节值Delta_CCT′。然后,电子设备根据公式(18)计算第六相关色温,公式(18)如下所示:
CCT 6=CCT 5+Delta_CCT′   (18)
在公式(18)中,CCT 6为第六相关色温。
步骤S315:电子设备将第五色度距离进行倾向度调节,得到第六色度距离。
具体地,电子设备对D uv5进行倾向度调节的具体过程为:在电子设备中存储有如图13所示的D uvPropensity Table(色度距离倾向度调节表),D uvPropensity Table是一个三维坐标表,有三个坐标轴,分别是:CCT轴、D uv轴以及LV轴。在D uvPropensity Table的三维空间中,存在许多单元格,每个单元格对应一个D uv调节值(Delta_D uv)。电子设备基于所述图像的LV、D uv5以及CCT 5在D uvPropensity Table这个三维坐标系中找到对应的点,确定与这个点相邻的单元格。然后通过Trilinear插值(三线性插值)计算每个相邻的单元格的权值,并将单元格的权值与其对应的Delta_D uv相乘,得到每个相关单元格的乘积,将每个相关单元格的乘积进行求和,得到第一色度距离调节值Delta_D uv′。然后,电子设备根据公式(19)计算调节后的色度距离,公式(19)如下所示:
D uv6=D uv5+Delta_D uv′   (19)
在公式(19)中,D uv6为第六色度距离。
步骤S316:电子设备基于第六相关色温和第六色度距离计算第三滤波向量。
具体地,第三滤波向量为Mu 3,电子设备基于CCT 6和D uv6计算Mu 3的方法请参见步骤S309中电子设备基于CCT 3和D uv3计算Mu 1的相关叙述,本申请实施例在此不再赘述。
步骤S317:电子设备将第三滤波向量进行时序滤波,得到第四滤波向量。
具体地,为了防止相邻两帧图像的色度信息差异过大,导致相邻两帧图像的颜色发生骤变,电子设备将本帧图像的滤波向量与上一帧图像的滤波向量进行时序滤波,从而得到融合后的滤波向量(第四滤波向量),电子设备能够基于第四滤波向量得到第四色度信息。这样,电子设备使用第四色度信息对本帧图像进行白平衡调节后,调节后的图像的整体颜色与上一帧图像的整体颜色不会发生较大的差异,不会产生相邻两帧图像颜色骤变的问题。首先,电子设备根据公式(20)更新所述图像的第一协方差矩阵,得到第三协方差矩阵,公式(20)如下所示:
Figure PCTCN2022091965-appb-000015
其中,Sigma 3为第三协方差矩阵,Sigma 1为第一协方差矩阵,a 1为第三参数。
然后,电子设备根据公式(21)更新第二协方差矩阵,得到第四协方差矩阵,公式(21)如下所示:
Figure PCTCN2022091965-appb-000016
其中,Sigma 4为第四协方差矩阵,Sigma 2为第二协方差矩阵,a 1为第三参数,a 1可以由历史值得到,也可以由经验值得到,还可以由实验数据得到,本申请实施例不做限制。
[根据细则91更正 20.03.2023]
然后,电子设备根据公式(22)对Mu 3进行时序滤波,得到第四滤波向量Mu 5,公式(22)如下所示:
Mu 5=Mu 3**(Sigma 3) -1+Mu 4*(Sigma 4) -1   (22)
其中,Mu 4为上一帧图像的滤波向量,且Mu 4是电子设备基于CCT′ 2和D′ uv2计算得到的。其中,CCT′ 2为上一帧图像通过CCT倾向度调节后的相关色温,D′ uv2为上一帧图像通过D uv倾向度调节后的色度距离。
步骤S318:电子设备基于第四滤波向量,计算出第四色度信息。
具体地,Mu 5=[log(R 6/G 6),log(B 6/G 6)]。其中,第四色度信息可以为图像光源白点的R 6G 6B 6值,也可以为图像光源白点的R 6/G 6和B 6/G 6,本申请实施例对此不做限制。
步骤S319:电子设备将第四色度信息发送到内存中。
具体地,所述第四色度信息用于在非纯色场景下,调节图像的颜色。
本申请实施例,电子设备基于多光谱传感器输出的光谱数据选择对应的AWB算法计算第二色度信息,从而提高基于第二色度信息计算得到的第一相关色温以及第一色度距离的准确度。电子设备基于多光谱传感器输出的CCT调节AI AWB算法输出的置信度,从而提高了该置信度的准确度。由于第一相关色温、第一色度距离以及AI AWB算法输出的置信度的准确度都有提升,当电子设备将第一相关色温与基于AI AWB算法计算得到的第四 相关色温进行融合后,得到的第五相关色温的准确度也提升了,电子设备将第一色度距离与基于AI AWB算法计算得到的第四色度距离进行融合后,得到的第五色度距离的准确度也提升了,从而使得下放到内存中的、基于第五相关色温和第五色度距离的第四色度信息的准确度也有所提升。由于,第四色度信息的准确度有所提升,当电子设备使用第四色度信息对图像进行白平衡调节的效果更好。此外,由于多光谱传感器在相机冷启动到电子设备拍照的这段时间内,一直向内存下发其计算的图像的第一色度值,当相机在冷启动的过程中,由于相机的出图速度较快,电子设备可以使用第一色度信息调节首帧或前几帧预览图像的白平衡,从而避免首帧或前几帧预览图像出现偏色问题。或者,在电子设备检测当前拍摄场景为纯色场景(图像光源白点数量不足)的情况下,由于AWB算法和AI AWB算法计算出的第四色度信息准确度不高,电子设备也可以调用第一色度信息调节图像的白平衡。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘Solid State Disk)等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。
总之,以上所述仅为本发明技术方案的实施例而已,并非用于限定本发明的保护范围。凡根据本发明的揭露,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (17)

  1. 一种色度信息的确定方法,其特征在于,应用于电子设备,所述电子设备包括多光谱传感器、自动白平衡模块和AI自动白平衡模块,所述自动白平衡模块包括多个算法,所述方法包括:
    所述电子设备启动相机应用;
    所述多光谱传感器获取第一通道值,其中,所述多光谱传感器包括第一通道,所述第一通道值为所述第一通道获取的值;
    将所述第一通道值发送到自动白平衡模块;
    根据所述第一通道值,所述自动白平衡模块从所述多个算法中选择目标算法;
    所述电子设备根据所述目标算法确定目标色度信息。
  2. 如权利要求1所述的方法,其特征在于,所述方法还包括:
    所述多光谱传感器获取第一相关色温;
    将所述第一相关色温发送到AI自动白平衡模块;
    所述AI自动白平衡模块确定第一相关色温与第二相关色温的差值,所述第二相关色温为所述AI自动白平衡模块根据所述相机采集的图像得到的相关色温值;
    当所述差值大于预设阈值时,将AI自动白平衡模块输出的置信度调整为第一置信度;
    所述电子设备根据所述第一置信度确定目标色度信息。
  3. 如权利要求1或2所述的方法,其特征在于,所述方法还包括:
    当所述启动为冷启动时,所述多光谱传感器获取第一色度信息;
    所述电子设备确定所述目标色度信息为所述第一色度信息。
  4. 如权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    当所述自动白平衡模块或所述AI自动白平衡模块确定所述相机采集的图像为纯色图像时,所述电子设备确定所述目标色度信息为第二色度信息,所述第二色度信息为所述多光谱传感器获取的色度信息。
  5. 如权利要求1-4任一项所述的方法,其特征在于,所述第一通道值包括以下一项或多项:可见光通道值、NIR通道值、Clear通道值。
  6. 如权利要求2-5任一项所述的方法,其特征在于,所述电子设备根据所述第一置信度确定目标色度信息,具体包括:
    所述自动白平衡模块将所述相机采集的图像通过所述目标算法进行计算,得到第三相关色温与第一色度距离;
    所述AI自动白平衡模块基于所述相机采集的图像,得到第四相关色温与第二色度距离;
    所述电子设备将所述第一置信度进行修正,得到第二置信度;
    所述电子设备根据所述第二置信度将所述第三相关色温与所述第四相关色温进行融合,得到第五相关色温;
    所述电子设备根据所述第二置信度将所述第一色度距离与所述第二色度距离进行融合,得到第三色度距离;
    所述电子设备基于所述第五相关色温和所述第三色度距离得到所述目标色度信息。
  7. 如权利要求6所述的方法,其特征在于,所述电子设备基于所述第五相关色温和所述第三色度距离得到所述目标色度信息,具体包括:
    将所述第五相关色温进行倾向度调节,得到第六相关色温;
    将所述第三色度距离进行倾向度调节,得到第四色度距离;
    基于所述第六相关色温和所述第四色度距离,得到所述目标色度信息。
  8. 如权利要求6-7任一项所述的方法,其特征在于,所述AI自动白平衡模块基于所述相机采集的图像,得到第四相关色温与第二色度距离,具体包括:
    所述AI自动白平衡模块根据所述相机采集的图像,输出所述第二相关色温和初始色度距离;
    将所述第二相关色温与所述初始色度距离进行修正,得到修正后的第二相关色温和修正后的初始色度距离;
    基于所述修正后的第二相关色温和所述修正后的初始色度距离,进行时序滤波,得到所述第四相关色温和所述第二色度距离。
  9. 如权利要求8所述的方法,其特征在于,所述基于所述修正后的第二相关色温和所述修正后的初始色度距离,进行时序滤波,得到所述第四相关色温和所述第二色度距离,具体包括:
    根据公式
    Figure PCTCN2022091965-appb-100001
    更新第一协方差矩阵,得到更新后的第一协方差矩阵;所述Sigma′ 1为所述更新后的第一协方差矩阵,所述Sigma 1为第一协方差矩阵,所述λ 1为第一参数,所述第一协方差矩阵为所述AI自动白平衡模块根据所述相机采集的图像输出的协方差矩阵或者所述第一协方差矩阵为基于所述第二置信度计算得到的协方差矩阵;
    根据公式
    Figure PCTCN2022091965-appb-100002
    更新第二协方差矩阵,得到更新后的第二协方差矩阵;所述第二协方差矩阵为第二图像的协方差矩阵,所述第二图像为所述相机采集的上一帧图像,所述Sigma′ 2为所述更新后的第二协方差矩阵,所述Sigma 2为第二协方差矩阵,所述λ 2为第二参数;
    基于所述修正后的第二相关色温和所述修正后的初始色度距离计算,得到第一滤波向量;
    根据公式Mu=Mu 1*(Sigma′ 1) -1+Mu 2*(Sigma′ 2) -1,得到第二滤波向量;所述Mu 1为第一滤波向量,所述Mu 2为所述第二图像的滤波向量,所述Mu为第二滤波向量;
    基于所述第二滤波向量进行计算处理,得到所述第四相关色温和所述第二色度距离。
  10. 如权利要求6-9任一项所述的方法,其特征在于,所述电子设备根据所述第二置信度将所述第三相关色温与所述第四相关色温进行融合,得到第五相关色温,具体包括:
    根据所述相机采集的图像的亮度值、所述第四相关色温以及所述第二色度距离,确定第一概率值;
    根据公式CCT 5=Conf_2*X′*CCT 4+(1-Conf_2*X′)*CCT 3,计算得到所述第五相关色温;其中,所述Conf_2为第二置信度,所述X′为第一概率值,所述CCT 4为第四相关色温,所述CCT 3为第三相关色温,所述CCT 5为第五相关色温。
  11. 根据权利要求6-10任一项所述的方法,其特征在于,所述电子设备根据所述第二置信度将所述第一色度距离与所述第二色度距离进行融合,得到第三色度距离,具体包括:
    根据所述相机采集的图像的亮度值、所述第四相关色温以及所述第二色度距离,确定第二概率值;
    根据公式D uv3=Conf_2*Y′*D uv2+(1-Conf_2*Y′)*D uv1,计算得到所述第三色度距离;其中,所述Conf_2为第二置信度,所述Y′为第二概率值,所述D uv1为第一色度距离,所述D uv2为第二色度距离,所述D uv3为第三色度距离。
  12. 如权利要求7-11任一项所述的方法,其特征在于,所述将所述第五相关色温进行倾向度调节得到第六相关色温,具体包括:
    根据所述相机采集的图像的亮度值、所述第五相关色温以及第三色度距离确定第一相关色温调节值;
    根据公式CCT 6=CCT 5+Delta_CCT′,得到第六相关色温,所述CCT 5为第五相关色温,所述CCT 6为第六相关色温,所述Delta_CCT′为第一相关色温调节值。
  13. 如权利要求7-12任一项所述的方法,其特征在于,所述将所述第三色度距离进行倾向度调节,得到第四色度距离,具体包括:
    根据所述相机采集的图像的亮度值、所述第五相关色温以及第三色度距离确定第一色度距离调节值;
    根据公式D uv4=D uv3+Delta_D uv′,得到第四色度距离,所述D uv4为第四色度距离,所述D uv3为第三色度距离,所述Delta_D uv′为第一色度距离调节值。
  14. 一种电子设备,其特征在于,包括:触控屏、摄像头、一个或多个处理器和一个或多个存储器;所述一个或多个处理器与所述触控屏、所述摄像头、所述一个或多个存储器耦合,所述一个或多个存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当所述一个或多个处理器执行所述计算机指令时,使得所述电子设备执行如权利要求1-13中的任一项所述的方法。
  15. 一种计算机可读存储介质,包括指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-13中任一项所述的方法。
  16. 一种芯片系统,所述芯片系统应用于电子设备,所述芯片系统包括一个或多个处理器,所述处理器用于调用计算机指令以使得所述电子设备执行如权利要求1-13中任一项所述的方法。
  17. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如权利要求1-13中任一项所述的方法。
PCT/CN2022/091965 2021-08-12 2022-05-10 一种色度信息的确定方法及相关电子设备 WO2023015993A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/042,274 US20230342977A1 (en) 2021-08-12 2022-05-10 Method for Determining Chromaticity Information and Related Electronic Device
EP22850697.8A EP4181510A4 (en) 2021-08-12 2022-05-10 METHOD FOR DETERMINING CHROMATICITY INFORMATION AND ASSOCIATED ELECTRONIC DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110925200.8 2021-08-12
CN202110925200.8A CN115633260B (zh) Method for determining chromaticity information and related electronic device

Publications (3)

Publication Number Publication Date
WO2023015993A1 WO2023015993A1 (zh) 2023-02-16
WO2023015993A9 true WO2023015993A9 (zh) 2023-08-03
WO2023015993A8 WO2023015993A8 (zh) 2023-11-02

Family

ID=84902717

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/091965 WO2023015993A1 (zh) 2021-08-12 2022-05-10 一种色度信息的确定方法及相关电子设备

Country Status (4)

Country Link
US (1) US20230342977A1 (zh)
EP (1) EP4181510A4 (zh)
CN (1) CN115633260B (zh)
WO (1) WO2023015993A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117316122B * 2023-11-21 2024-04-09 Honor Device Co., Ltd. Color temperature calibration method, electronic device, and medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9106879B2 (en) * 2011-10-04 2015-08-11 Samsung Electronics Co., Ltd. Apparatus and method for automatic white balance with supplementary sensors
JP6462990B2 * 2014-03-07 2019-01-30 Canon Inc. Image processing apparatus and method therefor
KR102281256B1 * 2014-12-04 2021-07-23 Samsung Electronics Co., Ltd. White balance control method and electronic device thereof
KR20170014357A * 2015-07-29 2017-02-08 LG Electronics Inc. Mobile terminal having a camera and control method therefor
US9762878B2 (en) * 2015-10-16 2017-09-12 Google Inc. Auto white balance using infrared and/or ultraviolet signals
US10630954B2 (en) * 2017-12-28 2020-04-21 Intel Corporation Estimation of illumination chromaticity in automatic white balancing
US10791310B2 (en) * 2018-10-02 2020-09-29 Intel Corporation Method and system of deep learning-based automatic white balancing
US20200228769A1 (en) * 2019-01-11 2020-07-16 Qualcomm Incorporated Lens rolloff assisted auto white balance
GB202011144D0 (en) * 2019-08-28 2020-09-02 ams Sensors Germany GmbH Systems for characterizing ambient illumination
CN112866656B * 2019-11-26 2022-05-31 Guangdong Oppo Mobile Telecommunications Corp., Ltd. White balance correction method and apparatus, storage medium, and terminal device
US20220412798A1 (en) * 2019-11-27 2022-12-29 ams Sensors Germany GmbH Ambient light source classification
CN111551266B * 2020-05-25 2021-01-22 Jilin Qiushi Spectral Data Technology Co., Ltd. Ambient color temperature measurement method and system based on multispectral image detection technology

Also Published As

Publication number Publication date
CN115633260A (zh) 2023-01-20
EP4181510A1 (en) 2023-05-17
EP4181510A4 (en) 2024-06-12
CN115633260B (zh) 2024-06-07
US20230342977A1 (en) 2023-10-26
WO2023015993A8 (zh) 2023-11-02
WO2023015993A1 (zh) 2023-02-16

Similar Documents

Publication Publication Date Title
US10791310B2 (en) Method and system of deep learning-based automatic white balancing
US10074165B2 (en) Image composition device, image composition method, and recording medium
WO2020172888A1 (zh) Image processing method and apparatus
US20210006760A1 (en) Meta-learning for camera adaptive color constancy
WO2023016320A1 (zh) Image processing method and apparatus, device, and medium
WO2023040725A1 (zh) White balance processing method and electronic device
US11457189B2 (en) Device for and method of correcting white balance of image
CN112055190A (zh) Image processing method and apparatus, and storage medium
WO2023015993A9 (zh) Method for determining chromaticity information and related electronic device
CN113727085B (zh) White balance processing method, electronic device, chip system, and storage medium
WO2022257574A1 (zh) Fusion algorithm of AI automatic white balance and automatic white balance, and electronic device
CN116668656A (zh) Image processing method and electronic device
CN116055699B (zh) Image processing method and related electronic device
TW201830335A (zh) Method and device for determining a light source of an image and performing chromatic adaptation on the image
WO2021016846A1 (zh) Image processing method and system, movable platform, and storage medium
US20230045128A1 (en) Method and system for establishing light source information prediction model
CN115514947B (zh) AI automatic white balance algorithm and electronic device
CN113271450B (zh) White balance adjustment method, image processing apparatus, and image processing system
CN116051434B (zh) Image processing method and related electronic device
WO2023124165A1 (zh) Image processing method and related electronic device
CN113766206A (zh) White balance adjustment method and apparatus, and storage medium
CN116437060B (zh) Image processing method and related electronic device
WO2022267612A1 (zh) Method for dynamically adjusting exposure parameters of a spectral sensor, and electronic device
CN113066020B (zh) Image processing method and apparatus, computer-readable medium, and electronic device
CN116029914B (zh) Image processing method and electronic device

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2022850697

Country of ref document: EP

Effective date: 20230208

121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22850697

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE