WO2023040725A1 - White balance processing method and electronic device - Google Patents

White balance processing method and electronic device

Info

Publication number
WO2023040725A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
color
white balance
pixel
sample
Prior art date
Application number
PCT/CN2022/117586
Other languages
English (en)
French (fr)
Inventor
Qian Yanlin (钱彦霖)
Original Assignee
Honor Device Co., Ltd. (荣耀终端有限公司)
Priority date
Filing date
Publication date
Application filed by Honor Device Co., Ltd. (荣耀终端有限公司)
Priority to US18/006,180 priority Critical patent/US20240129446A1/en
Priority to EP22839975.4A priority patent/EP4175275A4/en
Publication of WO2023040725A1 publication Critical patent/WO2023040725A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4015Image demosaicing, e.g. colour filter arrays [CFA] or Bayer patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • H04N25/136Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements using complementary colours
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control

Definitions

  • the present application relates to the field of image processing, and in particular, relates to a white balance processing method and electronic equipment.
  • white balance processing refers to the operation of reversely eliminating the influence of the light source color in an image through an accurate estimation of that color, so as to achieve the effect of shooting under white light.
  • existing image white balance processing relies on a single-frame image, for example, 3-channel image information; for some shooting scenes, such as shooting a solid-color object at close range, the accuracy of existing white balance processing needs to be improved; therefore, how to perform white balance processing on an image and improve the color accuracy of the image has become an urgent problem to be solved.
  • the present application provides a white balance processing method and electronic equipment, which can improve the accuracy of colors in an image.
  • a white balance processing method which is applied to electronic equipment, including:
  • the first interface includes a first control
  • the first image refers to an image in a first color space collected by a multispectral color filter array sensor
  • the white balance model is used to calculate the parameters of white balance processing; the white balance model is obtained by training with the sample color correction matrix as input data and the first pixel information as target data; the sample color correction matrix is obtained according to the third sample image and the fourth sample image and is used to represent the pixel change amount for converting the third sample image into the fourth sample image; the third sample image and the fourth sample image are obtained by performing the decomposition process and the demosaicing process on the first sample image; the first pixel information refers to the pixel value corresponding to the neutral color block included in the color card in the fifth sample image; the fifth sample image is obtained by performing the decomposition processing and the demosaic processing on the second sample image; the first sample image and the second sample image refer to images in the first color space collected by the multispectral color filter array sensor under the same light source scene, and the second sample image includes the color card.
  • the first color space may be a Raw color space;
  • the color card may refer to a color board including different colors, and the color board includes a neutral color;
  • the neutral color may refer to black, white, and a series of different shades of gray between black and white.
  • a color filter array (color filter array, CFA) sensor refers to a sensor covered with a mosaic color filter array above the pixel sensor; a general photoelectric sensor can only sense the intensity of light and cannot distinguish its wavelength (color); the color filter array sensor can obtain the color information of pixels through color filtering (color filter).
  • the Raw image that can be collected by the color filter array sensor can include RGB color mode and other color modes; for example, the collected Raw image can refer to an RGBCYM image, or an RGBCYGM image, or an image in other color modes.
  • since the material of the neutral color block reflects light of all wavelengths uniformly, light of any color that shines on the color card is reflected as light of that same color; for example, when red light is irradiated on the color card, the neutral color block reflects red light; when green light is irradiated, the neutral color block reflects green light; therefore, the neutral color block of the color card can be used to mark the color of the light source.
  • in the embodiment of the present application, the first image is collected by a multispectral color filter array sensor, and the second image and the third image in different color modes can be obtained by decomposing and demosaicing the first image; the color correction matrix can be obtained from the second image and the third image, and the white balance parameters can be obtained by inputting the color correction matrix into the pre-trained white balance model; the first image is processed with the white balance parameters to obtain the fourth image. Since the color information in the image collected by the multispectral color filter array sensor (for example, a 6-channel Raw image) is richer than that of a single-frame image (for example, a 3-channel image), the white balance parameters obtained from the first image can improve the accuracy of white balance processing and the color accuracy of the image.
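The per-channel correction that the white balance parameters ultimately perform can be sketched as follows. This is a minimal illustration of the general "reverse elimination" idea described above, assuming the parameters take the form of an estimated light source RGB color; it is not the patented pipeline itself.

```python
import numpy as np

def apply_white_balance(raw_rgb, light_source_rgb):
    """Apply white balance gains derived from an estimated light source color.

    raw_rgb: H x W x 3 float array; light_source_rgb: length-3 estimate of the
    light source color (e.g. a model's predicted RGB). Minimal sketch of the
    'reverse elimination' step, not the patented pipeline.
    """
    light = np.asarray(light_source_rgb, dtype=np.float64)
    gains = light[1] / light          # normalize so the green gain is 1.0
    balanced = raw_rgb * gains        # per-channel von Kries-style scaling
    return np.clip(balanced, 0.0, 1.0)

# A gray patch lit by a warm (reddish) source ...
patch = np.full((2, 2, 3), [0.6, 0.5, 0.4])
# ... becomes neutral once the estimated source color is divided out.
out = apply_white_balance(patch, [0.6, 0.5, 0.4])
print(out[0, 0])  # each channel equals 0.5 after balancing
```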
  • the decomposing and demosaicing the first image to obtain the second image and the third image include:
  • the first Bayer array image is a Bayer array image in the first color mode
  • the second Bayer array image is a Bayer array image of the second color mode
  • the first color mode may refer to the RGB color mode
  • the first Bayer array image may refer to the RGGB image
  • the second color mode may refer to the CYM color mode
  • the second Bayer array image may refer to a CYYM image.
  • the first color mode may refer to the RGB color mode
  • the first Bayer array image may refer to the RGGB image
  • the second color mode may refer to the CYGM color mode
  • the second Bayer array image may refer to a CYGM image.
  • the first image collected by the multispectral color filter array sensor may include RGB color mode and other color modes, and other color modes may refer to CYM color mode, CYGM color mode or other color modes.
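As an illustration of the decomposition step, the sketch below splits a multispectral mosaic into two half-width Bayer planes. The 4×4 layout, with RGGB and CYYM 2×2 tiles alternating along the rows, is purely a hypothetical assumption for the example; the actual sensor layout is not specified in this summary.

```python
import numpy as np

def decompose(mosaic):
    """Split a hypothetical RGB+CMY mosaic into two Bayer planes.

    Assumes (hypothetically) that RGGB and CYYM 2x2 tiles alternate along
    the row direction, so even tile columns are RGB samples and odd tile
    columns are CMY samples.
    """
    h, w = mosaic.shape
    # View the frame as a grid of 2x2 tiles: (tile_row, tile_col, 2, 2).
    tiles = mosaic.reshape(h // 2, 2, w // 2, 2).swapaxes(1, 2)
    rggb = tiles[:, 0::2].swapaxes(1, 2).reshape(h, w // 2)  # even tile cols
    cyym = tiles[:, 1::2].swapaxes(1, 2).reshape(h, w // 2)  # odd tile cols
    return rggb, cyym

mosaic = np.arange(16.0).reshape(4, 4)   # 4x4 toy frame, two tile columns
rggb, cyym = decompose(mosaic)
print(rggb.shape, cyym.shape)            # (4, 2) (4, 2)
```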
  • the second image includes a first pixel, a second pixel, and a third pixel
  • the third image includes a fourth pixel, a fifth pixel, and a sixth pixel
  • the color correction matrix obtained according to the second image and the third image includes:
  • the color correction matrix is composed of the first vector, the second vector and the third vector.
  • the first Bayer array image is an RGGB image
  • the second Bayer array image is a CMMY image
  • for example, each pixel in the RGB image may correspond to a set of RGB values, and each pixel in the CMY image may correspond to a set of CMY values; converting the RGB value corresponding to an R pixel into the CMY value corresponding to a C pixel yields a 3×1 matrix; similarly, converting the RGB value corresponding to a G pixel into the CMY value corresponding to an M pixel yields a 3×1 matrix; converting the RGB value corresponding to a B pixel into the CMY value corresponding to a Y pixel yields a 3×1 matrix; a 3×3 color correction matrix is obtained from the three one-dimensional matrices.
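One common way to realize such a 3×3 color correction matrix in practice is a least-squares fit over corresponding pixel values; the snippet below is a sketch under that assumption (the patent's exact per-pixel construction may differ).

```python
import numpy as np

def color_correction_matrix(rgb_pixels, cmy_pixels):
    """Fit a 3x3 matrix M so that cmy_pixel ≈ M @ rgb_pixel.

    rgb_pixels, cmy_pixels: N x 3 arrays of corresponding pixel values.
    A least-squares fit is one common realization of the 'pixel change
    amount' described above; it is a sketch, not the patented construction.
    """
    X, *_ = np.linalg.lstsq(rgb_pixels, cmy_pixels, rcond=None)
    return X.T  # transposed so that cmy_pixel ≈ M @ rgb_pixel

# Toy data: CMY taken as an exact linear transform of RGB, so the fit is exact.
rng = np.random.default_rng(0)
rgb = rng.random((100, 3))
true_M = np.array([[0.1, 0.8, 0.9],
                   [0.9, 0.2, 0.7],
                   [0.8, 0.9, 0.3]])
cmy = rgb @ true_M.T
M = color_correction_matrix(rgb, cmy)
print(np.round(M, 3))  # recovers true_M
```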
  • the parameters of the white balance model are obtained by iteratively performing backpropagation calculations according to the difference between the predicted pixel information and the first pixel information, where the predicted pixel information refers to the output information obtained by inputting the sample color correction matrix into the white balance model.
  • the multispectral color filter array sensor refers to a sensor that covers a mosaic color filter array above the pixel sensor.
  • the white balance model is a fully connected neural network.
  • the first interface refers to the home screen interface of the electronic device
  • the home screen interface includes a camera application program
  • the first control refers to the control corresponding to the camera application.
  • the first operation refers to an operation of clicking a camera application.
  • the first interface refers to a photographing interface
  • the first control refers to a control for instructing photographing
  • the first operation refers to an operation of clicking a control for instructing to take a photo.
  • the first interface refers to a video shooting interface
  • the first control refers to a control for instructing video shooting.
  • the first operation refers to an operation of clicking a control indicating to shoot a video.
  • the above takes a click operation as an example of the first operation; the first operation may also include a voice instruction operation, or other operations instructing the electronic device to take a photo or video; the above is an example and does not limit this application in any way.
  • in a second aspect, a training method of the white balance model is provided, including:
  • acquiring training data, where the training data includes a first sample image and a second sample image; the first sample image and the second sample image refer to images in the first color space collected by the multispectral color filter array sensor under the same light source scene, and the second sample image includes a color card;
  • Decomposing and demosaicing the first sample image to obtain a third sample image and a fourth sample image, where the third sample image is an image in the first color mode, and the fourth sample image is an image in the second color mode;
  • Obtaining a sample color correction matrix according to the third sample image and the fourth sample image, where the sample color correction matrix is used to represent a pixel change amount for converting the third sample image into the fourth sample image;
  • the first pixel information refers to the pixel value corresponding to the neutral color block included in the color card in the fifth sample image.
  • the first color space may be a Raw color space;
  • the color card may refer to a color board including different colors, and the color board includes a neutral color;
  • the neutral color may refer to black, white, and a series of different shades of gray between black and white.
  • a color filter array (color filter array, CFA) sensor refers to a sensor covered with a mosaic color filter array above the pixel sensor; a general photoelectric sensor can only sense the intensity of light and cannot distinguish its wavelength (color); the color filter array sensor can obtain the color information of pixels through color filtering (color filter).
  • the Raw image that can be collected by the color filter array sensor can include RGB color mode and other color modes; for example, the collected Raw image can refer to an RGBCYM image, or an RGBCYGM image, or an image in other color modes.
  • for example, the Raw image acquired by the multispectral color filter array sensor may be an RGBCMY image, where R represents red, G represents green, B represents blue, C represents cyan, M represents magenta, and Y represents yellow.
  • for example, the Raw image acquired by the multispectral color filter array sensor may be an RGBCYGM image, where R represents red, G represents green, B represents blue, C represents cyan, Y represents yellow, and M represents magenta.
  • since the material of the neutral color block reflects light of all wavelengths uniformly, light of any color that shines on the color card is reflected as light of that same color; for example, when red light is irradiated on the color card, the neutral color block reflects red light; when green light is irradiated, the neutral color block reflects green light; therefore, the neutral color block of the color card can be used to mark the color of the light source.
  • when acquiring the target data for training the white balance model, the second sample image may also be processed by a demosaic algorithm to obtain a processed image; the RGB pixel value corresponding to the neutral color block of the color card is then obtained from the processed image.
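Extracting the training target from the demosaiced second sample image can be as simple as averaging the pixels inside the neutral patch; the patch location below is a hand-labeled assumption for illustration.

```python
import numpy as np

def neutral_patch_rgb(demosaiced, patch_box):
    """Average the RGB values inside the neutral color block of the chart.

    demosaiced: H x W x 3 image obtained from the second sample image;
    patch_box = (top, bottom, left, right) is the (assumed, hand-labeled)
    location of the neutral patch. The mean RGB serves as the first pixel
    information, i.e. the training target marking the light source color.
    """
    t, b, l, r = patch_box
    return demosaiced[t:b, l:r].reshape(-1, 3).mean(axis=0)

img = np.zeros((8, 8, 3))
img[2:4, 2:4] = [0.7, 0.6, 0.5]          # neutral patch under a warm source
target = neutral_patch_rgb(img, (2, 4, 2, 4))
print(target)  # [0.7 0.6 0.5]
```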
  • in the embodiment of the present application, the first sample image not including the color card and the second sample image including the color card are collected by the multispectral color filter array sensor under the same light source and in the same scene; the input data for training the white balance model can be obtained by processing the first sample image, and the target data can be obtained from the second sample image. The trained white balance model obtained through the training method in the embodiment of the present application can be used to calculate the parameters of white balance processing, and according to those parameters, white balance processing can be performed on the Raw image collected by the multispectral color filter array sensor. Since the training data is collected by the multispectral color filter array sensor, and the color information in such an image (for example, a 6-channel Raw image) is richer than that of a single-frame image (for example, a 3-channel image), the parameters output by the trained white balance model can improve the accuracy of white balance processing and the color accuracy of the image.
  • the training of the white balance model, which takes the sample color correction matrix as input data and the first pixel information in the fifth sample image as target data, includes:
  • the white balance model is trained according to the predicted pixel information and the first pixel information to obtain the trained white balance model.
  • the predicted pixel information refers to the predicted RGB value of the light source.
  • the parameters of the trained white balance model are obtained by iterative backpropagation calculation according to the difference between the predicted pixel information and the first pixel information.
  • for example, the white balance model may be iteratively trained by backpropagating the angular error loss (Angular Error Loss) between the predicted pixel information and the first pixel information.
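The angular error loss mentioned here compares only the directions of the predicted and target RGB vectors, which is standard practice in illuminant estimation; a minimal sketch:

```python
import numpy as np

def angular_error(pred_rgb, target_rgb):
    """Angular error (in degrees) between predicted and target light colors.

    The loss depends only on the direction of each RGB vector, so the
    overall brightness of the estimate does not matter - only its
    chromaticity relative to the neutral-patch target.
    """
    p = pred_rgb / np.linalg.norm(pred_rgb)
    t = target_rgb / np.linalg.norm(target_rgb)
    cos = np.clip(np.dot(p, t), -1.0, 1.0)  # guard against rounding overshoot
    return np.degrees(np.arccos(cos))

# Same chromaticity at different brightness -> zero error.
print(angular_error(np.array([2.0, 2.0, 2.0]),
                    np.array([1.0, 1.0, 1.0])))  # 0.0
```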
  • the multispectral color filter array sensor refers to a sensor that covers a mosaic color filter array above the pixel sensor.
  • the white balance model is a fully connected neural network.
  • an electronic device is provided, including a module/unit for performing any method in the first aspect.
  • an electronic device is provided, including a module/unit for executing any training method in the second aspect.
  • in a fifth aspect, an electronic device is provided, including one or more processors and a memory; the memory is coupled to the one or more processors and is used to store computer program code; the computer program code includes computer instructions, which are invoked by the one or more processors to cause the electronic device to perform:
  • the first interface includes a first control
  • the first image refers to an image in a first color space collected by a multispectral color filter array sensor
  • the white balance model is used to calculate the parameters of white balance processing; the white balance model is obtained by training with the sample color correction matrix as input data and the first pixel information as target data; the sample color correction matrix is obtained according to the third sample image and the fourth sample image and is used to represent the pixel change amount for converting the third sample image into the fourth sample image; the third sample image and the fourth sample image are obtained by performing the decomposition process and the demosaicing process on the first sample image; the first pixel information refers to the pixel value corresponding to the neutral color block included in the color card in the fifth sample image; the fifth sample image is obtained by performing the decomposition processing and the demosaic processing on the second sample image; the first sample image and the second sample image refer to images in the first color space collected by the multispectral color filter array sensor under the same light source scene, and the second sample image includes the color card.
  • the one or more processors call the computer instructions so that the electronic device executes:
  • the first Bayer array image is a Bayer array image in the first color mode
  • the second Bayer array image is a Bayer array image of the second color mode
  • the second image includes a first pixel, a second pixel, and a third pixel
  • the third image includes a fourth pixel, a fifth pixel, and a sixth pixel
  • the one or more processors invoke the computer instructions to cause the electronic device to perform:
  • the color correction matrix is composed of the first vector, the second vector and the third vector.
  • the parameters of the white balance model are obtained by iterative backpropagation according to the difference between the predicted pixel information and the first pixel information, where the predicted pixel information refers to the output information obtained by inputting the sample color correction matrix into the white balance model.
  • the multi-spectral color filter array sensor refers to a sensor that covers a mosaic color filter array above the pixel sensor.
  • the white balance model is a fully connected neural network.
  • the first interface refers to the home screen interface of the electronic device
  • the home screen interface includes a camera application program
  • the first control refers to the control corresponding to the camera application.
  • the first interface refers to a photographing interface
  • the first control refers to a control for instructing photographing.
  • the first interface refers to a video shooting interface
  • the first control refers to a control for instructing video shooting.
  • in a sixth aspect, an electronic device is provided, including one or more processors and a memory; the memory is coupled to the one or more processors and is used to store computer program code; the computer program code includes computer instructions, which are invoked by the one or more processors to cause the electronic device to perform:
  • acquiring training data, where the training data includes a first sample image and a second sample image; the first sample image and the second sample image refer to images in the first color space collected by the multispectral color filter array sensor under the same light source scene, and the second sample image includes a color card;
  • Decomposing and demosaicing the first sample image to obtain a third sample image and a fourth sample image, where the third sample image is an image in the first color mode, and the fourth sample image is an image in the second color mode;
  • Obtaining a sample color correction matrix according to the third sample image and the fourth sample image, where the sample color correction matrix is used to represent a pixel change amount for converting the third sample image into the fourth sample image;
  • the first pixel information refers to the pixel value corresponding to the neutral color block included in the color card in the fifth sample image.
  • the one or more processors call the computer instructions so that the electronic device executes:
  • the white balance model is trained according to the predicted pixel information and the first pixel information to obtain the trained white balance model.
  • the parameters of the trained white balance model are obtained by iterative backpropagation calculation according to the difference between the predicted pixel information and the first pixel information.
  • the multispectral color filter array sensor refers to a sensor that covers a mosaic color filter array above the pixel sensor.
  • the white balance model is a fully connected neural network.
  • an electronic device is provided, the electronic device being used for white balance processing and including: one or more processors and a memory; the memory is coupled to the one or more processors and is used to store computer program code; the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform any method in the first aspect.
  • an electronic device is provided, the electronic device being used for training a white balance model and including: one or more processors and a memory; the memory is coupled to the one or more processors and is used to store computer program code; the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform any training method in the second aspect.
  • a chip system is provided, which is applied to an electronic device; the chip system includes one or more processors, and the processors are used to invoke computer instructions so that the electronic device executes any method in the first aspect or the second aspect.
  • a computer-readable storage medium is provided, which stores computer program code; when the computer program code is run by an electronic device, the electronic device executes any method in the first aspect or the second aspect.
  • a computer program product is provided, comprising computer program code; when the computer program code is run by an electronic device, the electronic device executes any method in the first aspect or the second aspect.
  • Fig. 1 is a schematic diagram of a fully connected neural network provided by the present application
  • FIG. 2 is a schematic diagram of a hardware system applicable to the electronic device of the present application
  • FIG. 3 is a schematic diagram of an application scenario applicable to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of an application scenario applicable to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a system architecture applicable to the white balance processing method of the present application.
  • Fig. 6 is a schematic diagram of a spectral response curve of a multispectral image provided by the present application.
  • FIG. 7 is a schematic diagram of a white balance processing method applicable to the present application.
  • FIG. 8 is a schematic diagram of a white balance model training method provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the effect of the white balance processing method provided by the embodiment of the present application.
  • FIG. 10 is a schematic diagram of a display interface of an electronic device provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of a display interface of an electronic device provided in an embodiment of the present application.
  • Fig. 12 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • White balance is an index that describes the accuracy of white after the three primary colors of red, green and blue are mixed in the display.
  • the white balance setting of the camera equipment is an important guarantee to ensure the ideal color of the picture;
  • white balance processing refers to the operation of reversely eliminating the influence of the color of the light source in the image through accurate estimation of the color of the light source, so as to achieve a white color The effect of light shooting.
  • the Bayer array is one of the main technologies enabling charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensors to capture color images; it can be, for example, a 4×4 array consisting of 8 green, 4 blue and 4 red pixels; when converting a grayscale image to a color image, it performs 9 operations in a 2×2 matrix to finally generate a color image.
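The 2×2-tile interpolation described above can be sketched in a few lines. The following is a minimal illustration (not taken from the patent) in which each RGGB tile of a Bayer mosaic is collapsed into a single full-color pixel by averaging the two green samples:

```python
import numpy as np

def demosaic_rggb(mosaic):
    """Collapse each 2x2 RGGB tile of a Bayer mosaic into one RGB pixel."""
    h, w = mosaic.shape
    assert h % 2 == 0 and w % 2 == 0
    r = mosaic[0::2, 0::2]                               # R sample of each tile
    g = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2.0  # average the two G samples
    b = mosaic[1::2, 1::2]                               # B sample of each tile
    return np.stack([r, g, b], axis=-1)

# One 2x2 tile: [[R, G], [G, B]]
tile = np.array([[100.0, 60.0],
                 [40.0, 20.0]])
rgb = demosaic_rggb(tile)  # -> one pixel with R=100, G=50, B=20
```

Real demosaicing interpolates neighboring tiles rather than collapsing them, but the tile structure is the same.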
  • the color filter array sensor refers to a sensor in which a mosaic color filter array covers the pixel sensor.
  • the color filter array sensor is used to collect the color information of the image; an ordinary photoelectric sensor can only sense the intensity of light and cannot distinguish its wavelength (color);
  • the color filter array sensor can therefore obtain the color information of pixels through color filtering (color filter).
  • a color model is a model that represents a color in numerical form, or a way of recording the color of an image.
  • Demosaicing refers to the image processing process of converting a Raw image into an RGB image.
  • Neutral colors can refer to the various shades of the gray series composed of black, white, and mixtures of black and white.
  • a neural network refers to a network formed by connecting multiple single neural units together, that is, the output of one neural unit can be the input of another neural unit; the input of each neural unit can be connected to the local receptive field of the previous layer to extract the features of that receptive field, where the local receptive field can be an area composed of several neural units.
  • a fully connected neural network can also be called a deep neural network (DNN) or a multi-layer neural network, which can be understood as a neural network with multiple hidden layers.
  • the layers of a fully connected neural network are divided into three categories according to their positions: input layer, hidden layer, and output layer; usually, the first layer is the input layer, the last layer is the output layer, and the middle layers are all hidden layers; the layers are fully connected, that is to say, any neuron in the i-th layer must be connected to any neuron in the (i+1)-th layer.
  • each layer performs the operation y = a(Wx + b), where x represents the input vector, y represents the output vector, b represents the offset vector, W represents the weight matrix (also known as the coefficients), and a() represents the activation function.
  • Each layer operates on the input vector x through this expression to obtain the output vector y.
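The per-layer operation y = a(Wx + b) described above can be illustrated as follows; the weight matrix, offset vector and the choice of ReLU as the activation function a() are assumptions made only for this example:

```python
import numpy as np

def layer_forward(W, x, b):
    """One fully connected layer: y = a(Wx + b), with ReLU as a()."""
    return np.maximum(0.0, W @ x + b)

W = np.array([[1.0, -1.0],
              [0.5,  0.5]])   # weight matrix (the coefficients)
b = np.array([0.0, -1.5])     # offset vector
x = np.array([2.0, 1.0])      # input vector
y = layer_forward(W, x, b)    # output vector: ReLU([1.0, 0.0]) = [1.0, 0.0]
```

Stacking several such layers, each feeding its output vector into the next, yields the multi-layer DNN described in the text.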
  • Due to the large number of DNN layers, the number of coefficient matrices W and offset vectors b is also large; these parameters are defined in the DNN as follows:
  • a three-layer DNN includes an input layer (first layer), a hidden layer (second layer) and an output layer (third layer); for example, the linear coefficient from a neuron of the second layer to a neuron of the third layer can be expressed as w₇₈, where the subscript 78 indicates that it corresponds to input index 7 of the second layer and output index 8 of the third layer.
  • during training, the neural network can use the error back propagation (BP) algorithm to correct the parameters in the initial neural network model, so that the reconstruction error loss of the model becomes smaller and smaller. Specifically, passing the input signal forward to the output produces an error loss, and the parameters in the initial neural network model are updated by back-propagating the error loss information, so that the error loss converges.
  • the backpropagation algorithm is a backward pass dominated by the error loss, aiming to obtain the optimal parameters of the neural network model, such as the weight matrices.
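As a minimal sketch of this idea (not the patent's training procedure), the following trains a single scalar weight by gradient descent: the forward pass produces an error loss, and back-propagating its gradient updates the weight until the loss converges:

```python
# One scalar "neuron" y = w * x trained by gradient descent; the loss
# converges toward w = 3, since 3 * 2 = 6 matches the target.
x, target = 2.0, 6.0   # a single training pair (assumed data)
w = 0.0                # initial weight
lr = 0.1               # learning rate
losses = []
for _ in range(50):
    y = w * x                      # forward pass
    loss = (y - target) ** 2       # squared error loss
    grad = 2.0 * (y - target) * x  # d(loss)/dw via the chain rule
    w -= lr * grad                 # parameter update
    losses.append(loss)
```

In a real DNN the same chain-rule update is applied layer by layer, from the output back to the input, to all weight matrices and offset vectors.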
  • Fig. 2 shows a hardware system applicable to the electronic equipment of this application.
  • the electronic device 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, a vehicle-mounted electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, etc.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, and an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone jack 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194, and A subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure shown in FIG. 2 does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than those shown in FIG. 2 , or the electronic device 100 may include a combination of some of the components shown in FIG. 2 , or , the electronic device 100 may include subcomponents of some of the components shown in FIG. 2 .
  • the components shown in FIG. 2 can be realized in hardware, software, or a combination of software and hardware.
  • Processor 110 may include one or more processing units.
  • the processor 110 may include at least one of the following processing units: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU).
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • the processor 110 may be configured to execute the white balance processing method of the embodiment of the present application; for example: display a first interface, the first interface including a first control; detect a first operation on the first control; in response to the first operation, acquire a first image, where the first image refers to the image in the first color space collected by the multi-spectral color filter array sensor; perform decomposition processing and demosaic processing on the first image to obtain a second image and a third image, where the second image is an image in the first color mode and the third image is an image in the second color mode; obtain a color correction matrix according to the second image and the third image, the color correction matrix being used to represent the pixel change amount for converting the second image into the third image; input the color correction matrix into the white balance model to obtain the white balance parameters; and perform image processing on the first image according to the white balance parameters to obtain a fourth image.
  • the processor 110 may be used to execute the white balance model training method of the embodiment of the present application; for example: acquire training data, where the training data includes a first sample image and a second sample image, the first sample image and the second sample image referring to images of the first color space collected by the multi-spectral color filter array sensor under the same light source scene, and the second sample image including a color card; perform decomposition and demosaic processing on the first sample image to obtain a third sample image and a fourth sample image, the third sample image being an image of the first color mode and the fourth sample image being an image of the second color mode; obtain a sample color correction matrix according to the third sample image and the fourth sample image, the sample color correction matrix being used to represent the pixel change amount for converting the third sample image into the fourth sample image; perform decomposition and demosaic processing on the second sample image to obtain a fifth sample image, the fifth sample image being an image in the first color mode; the sample color correction matrix is used as the input for training the white balance model.
  • connection relationship between the modules shown in FIG. 2 is only a schematic illustration, and does not constitute a limitation on the connection relationship between the modules of the electronic device 100 .
  • each module of the electronic device 100 may also adopt a combination of various connection modes in the foregoing embodiments.
  • the wireless communication function of the electronic device 100 may be realized by components such as the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, and a baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the electronic device 100 can realize the display function through the GPU, the display screen 194 and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • Display 194 may be used to display images or video.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 , and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted through the lens to the photosensitive element of the camera, where the light signal is converted into an electrical signal; the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • the ISP can algorithmically optimize the noise, brightness and color of the image, and can also optimize parameters such as the exposure and color temperature of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as red green blue (RGB) or YUV.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3 and MPEG4.
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. For example, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used in scenarios such as navigation and somatosensory games.
  • the gyro sensor 180B may be used to collect shake information, and the shake information may be used to indicate the change of the pose of the electronic device during the shooting process.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally x-axis, y-axis and z-axis). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. The acceleration sensor 180E can also be used to identify the posture of the electronic device 100 as an input parameter for application programs such as horizontal and vertical screen switching and pedometer.
  • the distance sensor 180F is used to measure distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, for example, in a shooting scene, the electronic device 100 can use the distance sensor 180F for distance measurement to achieve fast focusing.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement functions such as unlocking, accessing the application lock, taking pictures, and answering incoming calls.
  • the touch sensor 180K is also referred to as a touch device.
  • the touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touch-sensitive screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor 180K may transmit the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 and disposed at a different position from the display screen 194 .
  • white balance processing refers to the operation of reversely eliminating the influence of the color of the light source in the image through accurate estimation of the color of the light source, so as to achieve the effect of shooting under white light.
  • the existing image white balance processing relies on a single frame image, for example, 3-channel image information; for some shooting scenes, such as the scene of shooting a solid-color object at close range, the accuracy of the existing white balance processing needs to be improved.
  • an embodiment of the present application provides a white balance processing method and an electronic device.
  • the first image is collected by a multi-spectral color filter array sensor, and decomposition processing and demosaic processing of the first image yield the second image and the third image in different color modes;
  • the color correction matrix can be obtained from the second image and the third image, and the white balance parameters can be obtained by inputting the color correction matrix into the pre-trained white balance model;
  • image processing is performed on the first image with the white balance parameters to obtain the fourth image; since the image collected by the multi-spectral color filter array sensor (for example, a 6-channel Raw image) contains more color information than a single-frame image (for example, a 3-channel image), the white balance parameters obtained from the first image can improve the accuracy of white balance processing and the color accuracy of the image.
  • the white balance processing method in the embodiment of the present application can be applied to photographing, video recording, video calling or other image processing fields; performing white balance processing on an image through the white balance processing method of the embodiment of the present application can improve the color accuracy of the image.
  • (a) in Figure 4 shows a preview image of the subject 220 obtained using an existing white balance processing method; (b) in Figure 4 shows a preview image of the subject 220 of the video call obtained by the white balance processing method provided by the embodiment of the present application; comparing the two, the preview image of the subject 220 shown in (b) of Figure 4 has higher color reproducibility; therefore, performing white balance processing on the image through the white balance processing method of the embodiment of the present application can improve the color accuracy of the image.
  • FIG. 5 is a schematic diagram of a system architecture applicable to the white balance processing method of the present application.
  • the system architecture 300 may include a multi-spectral color filter array sensor 310, a decomposition module 320, a white balance parameter calculation module 330, and an image signal processor 340; wherein, the image signal processor 340 may also include a white balance module 341 .
  • the multispectral color filter array sensor 310 may be used to acquire Raw images; for example, the Raw images acquired by the multispectral color filter array sensor 310 may include RGB color modes and other color modes.
  • the Raw image collected by the multispectral color filter array sensor 310 may refer to an RGBCMY image, an RGBCYGM image, or an image in another color mode.
  • the Raw image acquired by the multispectral color filter array sensor 310 may be an RGBCMY image; wherein R represents red, G represents green, B represents blue, C represents cyan, M represents magenta, and Y represents yellow.
  • the Raw image acquired by the multi-spectral color filter array sensor 310 may be an RGBCYGM image; wherein R represents red, G represents green, B represents blue, C represents cyan, Y represents yellow, and M represents magenta.
  • RGBCMY image and the RGBCYGM image are used as examples for illustration above, and this application does not make any limitation thereto.
  • the Raw image acquired by the multispectral color filter array sensor 310 may be a multi-channel image; for example, a 6-channel Raw image, an 8-channel Raw image, or a Raw image with other numbers of channels.
  • the Raw image collected by the multispectral color filter array sensor can be decomposed into an image in RGB color mode and an image in other color modes, and the spectral response curves of the two images satisfy that any two curves are different.
  • Fig. 6 shows the spectral response curves of a 6-channel RGBCMY image, where curve 1 represents the spectral response curve corresponding to blue (B); curve 2 represents the spectral response curve corresponding to cyan (C); curve 3 represents the spectral response curve corresponding to magenta (M); curve 4 represents the spectral response curve corresponding to yellow (Y); curve 5 represents the spectral response curve corresponding to green (G); and curve 6 represents the spectral response curve corresponding to red (R). In the spectral response curves of the RGBCMY image shown in Figure 6, curve 6 and curve 3, curve 5 and curve 4, and curve 1 and curve 2 correspond to each other. It can be seen from Figure 6 that the spectral range of curve 3 is wider than that of curve 6, and the light input of curve 3 is better than that of curve 6; the spectral range of curve 4 is wider than that of curve 5, and the light input of curve 4 is better than that of curve 5; likewise, the spectral range of curve 2 is wider than that of curve 1.
  • the decomposing module 320 is used to decompose the Raw image collected by the multi-spectral color filter array sensor 310 into a first Bayer array image and a second Bayer array image, wherein the first Bayer array image may refer to a first color mode (eg, RGB color mode) Bayer array image, the second Bayer array image may refer to a second color mode (eg, CYM color mode, CYGM color mode or other color mode) Bayer array image.
  • the spectral response curve of the first Bayer array image is different from any two curves in the spectral response curve of the second Bayer array image.
  • the decomposing module 320 may first perform demosaic processing (demosaic) on the Raw image collected by the multi-spectral color filter array sensor, and then perform scaling processing (resize).
  • the size of the first Bayer array image and the second Bayer array image may be 68×48×3.
  • the first Bayer array image can refer to an image in RGB color mode
  • the second Bayer array image can refer to an image in another color mode; for example, the second Bayer array image can refer to an image in the CMY color mode, or the second Bayer array image can refer to an image in the CYGM color mode.
  • the Raw image collected by the multi-spectral color filter array sensor 310 can be a multi-channel image; the multi-channel image can be divided into a first Bayer array image of 3 channels (for example, RGB color mode) and other channel numbers The second Bayer array image.
  • the Raw image collected by the multi-spectral color filter array sensor 310 may be a 6-channel RGBCMY image, which may be decomposed into a 3-channel first Bayer array image (RGB image) and a 3-channel second Bayer array image (CMY image).
  • the Raw image collected by the multispectral color filter array sensor 310 may be a 7-channel RGBCYGM image, which may be decomposed into a first Bayer array image (RGB image) of 3 channels and a second Bayer array image of 4 channels ( CYGM image).
  • the Raw image collected by the multispectral color filter array sensor 310 may be an 8-channel Raw image, which may be decomposed into a 3-channel first Bayer array image (RGB image) and a 5-channel second Bayer array image.
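The mask-based decomposition described above can be sketched as follows. The 4×4 repeating RGBCMY layout used here is a hypothetical example only, since the actual filter layout of the sensor is not specified in this text:

```python
import numpy as np

# Assumed 4x4 repeating layout: each pixel samples one of R, G, B, C, M, Y.
pattern = np.array([['R', 'G', 'C', 'Y'],
                    ['G', 'B', 'Y', 'M'],
                    ['C', 'Y', 'R', 'G'],
                    ['Y', 'M', 'G', 'B']])

mosaic = np.arange(16, dtype=float).reshape(4, 4)  # fake sensor readings

rgb_mask = np.isin(pattern, ['R', 'G', 'B'])  # mask of the RGB-family pixels
cmy_mask = np.isin(pattern, ['C', 'M', 'Y'])  # mask of the CMY-family pixels

# Keep each family's pixels in place and zero the rest; the two planes
# together account for every pixel of the mosaic.
rgb_plane = np.where(rgb_mask, mosaic, 0.0)
cmy_plane = np.where(cmy_mask, mosaic, 0.0)
```

Demosaicing each plane separately then yields the first Bayer array image (RGB family) and the second Bayer array image (CMY family).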
  • the white balance parameter calculation module 330 is used to calculate the white balance parameter according to the first Bayer array image and the second Bayer array image, and the specific process can be referred to from step S430 to step S450 shown in FIG. 7 .
  • the white balance parameter calculation module 330 may also transmit the obtained white balance parameters to the image signal processor 340 .
  • the white balance module 341 may perform white balance processing on the image collected by the multi-spectral color filter array sensor 310 according to the white balance parameter to obtain a processed image.
  • the white balance module 341 may perform automatic white balance processing (automatic white balance, AWB) on the Raw image collected by the multi-spectral color filter array sensor 310 according to the white balance parameter.
  • the white balance parameter calculation module 330 can be a module in a CPU, GPU or other computing hardware; alternatively, the white balance parameter calculation module 330 can be a module in the image signal processor 340; or the functions of the white balance parameter calculation module 330 may also be performed in the white balance module 341.
  • the system architecture shown in FIG. 5 can be the system architecture in the electronic device shown in FIG. 2; the system architecture shown in FIG. 5 can execute the white balance processing method shown in FIG. 7; the white balance processing method shown in FIG. 7 is described in detail below.
  • FIG. 7 is a schematic diagram of a white balance processing method provided by an embodiment of the present application; the method 400 includes steps S410 to S470, and the steps S410 to S470 are described in detail below.
  • Step S410 acquiring a Raw image (an example of the first image) collected by the multi-spectral color filter array sensor; wherein, the Raw image refers to a Raw image collected by the multi-spectral color filter array sensor 310 as shown in FIG. 5 .
  • the multi-spectral color filter array sensor 310 can capture 6-channel Raw images, and the Raw images can include RGB color modes and other color modes.
  • the color filter array sensor refers to a sensor in which a mosaic color filter array covers the pixel sensor, and is used to collect the color information of the image; a Raw image with more channels can be obtained through the color filter array sensor; the spectral range of the spectral response curves of the Raw image collected by the color filter array sensor is wider, that is, more light enters the color filter array sensor, making the brightness value of the image better.
  • Step S420 performing decomposition processing on the Raw image.
  • decomposition processing is performed on the Raw image collected by the multi-spectral color filter array sensor.
  • two Bayer array images are extracted at a preset position of the Raw image, namely a first Bayer array image and a second Bayer array image.
  • Step S430 obtaining the first Bayer array image.
  • the first Bayer array image is obtained according to the decomposition process.
  • Step S440 obtaining a second Bayer array image.
  • the second Bayer array image is obtained according to the decomposition process.
  • the corresponding pixels can be extracted according to the RGB mask of the Raw image to form an RGGB Bayer image, obtaining the first Bayer array image; the corresponding pixels can be extracted according to the CYM mask of the Raw image to form a CYYM Bayer image, obtaining the second Bayer array image.
  • alternatively, the corresponding pixels can be extracted according to the RGB mask of the Raw image to form an RGGB Bayer image, obtaining the first Bayer array image; the corresponding pixels can be extracted according to the CYGM mask of the Raw image to form a CYGM Bayer image, obtaining the second Bayer array image.
  • the first Bayer array image refers to an RGGB image; the second Bayer array image can refer to a CYYM image, a CYGM image, or an image in another color mode, provided that the spectral response curves of the second Bayer array image and the first Bayer array image satisfy that any two curves are different; the present application does not limit the color mode of the second Bayer array image.
  • Step S450 calculating a color correction matrix.
  • a color correction matrix is calculated according to the first Bayer array image and the second Bayer array image.
  • demosaic processing may be performed on the first Bayer array image and the second Bayer array image to obtain an image in the first color mode (an example of the second image) and an image in the second color mode (an example of the third image).
  • according to the pixel difference between the image in the first color mode and the image in the second color mode, a color correction matrix can be obtained, where the color correction matrix can be used to represent the pixel change amount for converting the image in the first color mode into the image in the second color mode.
  • the first Bayer array image is an RGGB image
  • the second Bayer array image is a CMMY image
  • the image in the first color mode can be an RGB image
  • the image in the second color mode can be a CMY image
  • each pixel in the RGB image can correspond to a group of RGB values (an example of the first pixel, the second pixel and the third pixel),
  • and each pixel in the CMY image can correspond to a group of CMY values (an example of the fourth pixel, the fifth pixel and the sixth pixel);
  • a 3×1 matrix (an example of the third vector) can be obtained by converting the RGB values corresponding to a B pixel into the CMY values corresponding to a Y pixel;
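A sketch of how such a color correction matrix could be estimated in practice, under assumed pixel data rather than the patent's exact procedure: stacking corresponding RGB and CMY pixel values as rows and solving a least-squares problem recovers the 3×3 matrix that maps one color mode to the other:

```python
import numpy as np

rng = np.random.default_rng(0)
rgb = rng.uniform(0.0, 1.0, size=(100, 3))  # hypothetical RGB pixel values

M_true = np.array([[0.2, 0.7, 0.1],         # hidden "true" correction matrix,
                   [0.6, 0.1, 0.3],         # used here only to synthesize
                   [0.1, 0.2, 0.7]])        # the matching CMY data
cmy = rgb @ M_true                          # corresponding CMY pixel values

# Least-squares estimate of the matrix mapping RGB pixels to CMY pixels.
M_est, *_ = np.linalg.lstsq(rgb, cmy, rcond=None)
```

Each row of the estimated matrix describes how one output channel is composed from the three input channels, i.e. the pixel change amount between the two color modes.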
  • Step S460 inputting the color correction matrix into the white balance model to obtain white balance parameters.
  • the white balance model can obtain corresponding white balance parameters according to the input color correction matrix;
  • the white balance model can be a pre-trained fully connected neural network model, and the training method of the white balance model can be referred to in the following figure 8.
  • the white balance parameter may refer to the color information of the light source, for example the RGB value of the light source; applying the color information of the light source to the image can eliminate the influence of the light source on the color of the object being photographed, thereby avoiding color distortion of the object; according to the color information of the light source, the color information of the object in the image can be corrected, improving the color accuracy of the object in the image.
  • Step S470 performing white balance processing on the Raw image according to the white balance parameters.
  • white balance processing (an example of image processing) is performed on the Raw image according to the white balance parameter to obtain a processed image (an example of a fourth image).
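A minimal sketch of step S470, assuming the white balance parameters are the light source's RGB values applied as per-channel gains (von Kries-style); normalizing the gains to the green channel is a common convention rather than something the patent prescribes:

```python
import numpy as np

def apply_white_balance(image, light_rgb):
    """Remove the light source's color cast with per-channel gains.

    image:     (H, W, 3) linear RGB image
    light_rgb: length-3 light source color estimated by the model
    """
    light = np.asarray(light_rgb, dtype=float)
    gains = light[1] / light           # gain_G = 1; gray stays gray after WB
    return np.clip(image * gains, 0.0, 1.0)
```

After this correction, a surface that reflected exactly the light source color becomes neutral (equal R, G, and B).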
  • the first image is collected by a multi-spectral color filter array sensor, and the second image and the third image in different color modes can be obtained by decomposing and demosaicing the first image;
  • the color correction matrix can be obtained according to the second image and the third image, and the white balance parameters can be obtained by inputting the color correction matrix into the pre-trained white balance model; the first image is processed using the white balance parameters to obtain the fourth image;
  • since the color information in the image collected by the multi-spectral color filter array sensor (for example, a 6-channel Raw image) exceeds the color information of a single-frame image (for example, a 3-channel image), the white balance parameters obtained from the first image can improve the accuracy of white balance processing and the color accuracy of the image.
  • Fig. 8 is a schematic diagram of the training method of the white balance model provided by the embodiment of the present application; the training method can be executed by the electronic device shown in Fig. 2; the training method 500 includes step S510 to step S560, which are described in detail below.
  • Step S510 acquiring training data of different light source scenes.
  • the training data may include a first sample image not including a color card and a second sample image including the color card, both captured in the same light source scene; the second sample image is used to calibrate the ground truth.
  • when acquiring the sample images, the color card can be placed under the illumination of the main light source for shooting.
  • Step S520 decomposing the first sample image to obtain the first sample Bayer array image and the second sample Bayer array image.
  • the corresponding pixels can be extracted according to the RGB mask of the first sample image to form an RGGB Bayer image, obtaining the first sample Bayer array image; the corresponding pixels can be extracted according to the CYM mask of the first sample image to form a CYYM Bayer image, obtaining the second sample Bayer array image.
  • alternatively, the corresponding pixels can be extracted according to the RGB mask of the first sample image to form an RGGB Bayer image, obtaining the first sample Bayer array image; the corresponding pixels can be extracted according to the CYGM mask of the first sample image to form a CYGM Bayer image, obtaining the second sample Bayer array image.
  • the first sample Bayer array image may refer to an RGGB image
  • the second sample Bayer array image may refer to a CYYM image, a CYGM image, or an image in other color modes
  • the second sample Bayer array image and the first sample Bayer array image satisfy the condition that any two of their spectral response curves are different; this application does not place any limitation on the color mode of the second sample Bayer array image.
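The mask-based decomposition described above can be sketched as follows; the complementary layout of the RGB and CYM photosites in the test below is an assumption for illustration, since the embodiment does not fix the exact mosaic pattern of the multi-spectral color filter array:

```python
import numpy as np

def decompose_cfa(raw, rgb_mask, cym_mask):
    """Split a multispectral CFA mosaic into two Bayer-array images.

    raw:      (H, W) single-plane mosaic from the sensor
    rgb_mask: (H, W) boolean mask, True at R/G/B photosites
    cym_mask: (H, W) boolean mask, True at C/Y/M photosites
    Returns the first (RGGB) and second (CYYM) sample Bayer array images,
    with zeros at photosites that belong to the other mask.
    """
    rggb = np.where(rgb_mask, raw, 0.0)
    cyym = np.where(cym_mask, raw, 0.0)
    return rggb, cyym
```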
  • Step S530 performing demosaic processing on the first sample Bayer array image and the second sample Bayer array image to obtain a third sample image and a fourth sample image.
  • the third sample image may refer to an image in the first color mode
  • the fourth sample image may refer to an image in the second color mode
  • the first color mode may refer to the RGB color mode
  • the second color mode may refer to the CMY color mode; or, the second color mode may also refer to other color modes.
  • the first sample Bayer array image can be an RGGB image, and the RGGB image is demosaiced to obtain an RGB image;
  • the second sample Bayer array image can be a CYYM image, and the CYYM image is demosaiced to obtain a CYM image.
  • the first sample Bayer array image can be an RGGB image, and the RGGB image is demosaiced to obtain an RGB image;
  • the second sample Bayer array image can be a CYGM image, and the CYGM image is demosaiced to obtain a CYGM image.
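The demosaic step itself is not specified in detail by the embodiment; a simple normalized-convolution bilinear interpolation, shown below purely as an illustrative sketch, fills each color plane from its sparse samples (the mask dictionary passed in is hypothetical):

```python
import numpy as np

def _conv3(img, kernel):
    """3x3 convolution with reflected borders, implemented by slicing."""
    h, w = img.shape
    pad = np.pad(img, 1, mode="reflect")
    out = np.zeros_like(img)
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * pad[i:i + h, j:j + w]
    return out

def demosaic_bilinear(bayer, channel_masks):
    """Bilinear demosaic: interpolate each plane from its sampled pixels.

    bayer:         (H, W) Bayer-array image (e.g. RGGB or CYYM)
    channel_masks: dict mapping channel name -> (H, W) boolean sampling mask
    Returns an (H, W, C) image with channels in dict order.
    """
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    planes = []
    for mask in channel_masks.values():
        num = _conv3(np.where(mask, bayer, 0.0), kernel)  # weighted samples
        den = _conv3(mask.astype(float), kernel)          # weight of samples
        planes.append(num / np.maximum(den, 1e-12))
    return np.stack(planes, axis=-1)
```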
  • Step S540 obtaining a sample color correction matrix according to the third sample image and the fourth sample image.
  • the third sample image can be an RGB image
  • the fourth sample image can be a CMY image
  • each pixel in the RGB image can correspond to a set of RGB values
  • each pixel in the CMY image can correspond to a set of CMY values
  • converting the RGB values corresponding to a B pixel into the CMY values corresponding to a Y pixel yields a 3×1 matrix
  • the 3×3 sample color correction matrix may be reorganized to obtain a one-dimensional vector, that is, a 9×1 sample color correction matrix.
  • Step S550 calibrating the color value corresponding to the neutral color block included in the color card in the second sample image, and marking that color value as the color value of the light source.
  • since the neutral material of the neutral color block reflects light of all wavelengths uniformly, light of any color shining on the neutral color block is reflected as light of that color; for example, red light shining on the neutral color block of the color card is reflected as red light, and green light is reflected as green light; therefore, the neutral color block of the color card can be used to mark the color of the light source.
  • the neutral color block may refer to the gray color block in the color card; for example, the RGB pixel value of the gray color block in the target image is marked as the light source color of the light source, that is, the RGB pixel value of the light source.
  • the calibration process of the color value of the light source includes the following steps:
  • Step 1 Decompose the second sample image to obtain the third sample Bayer array image (RGGB Bayer array image) and the fourth sample Bayer array image;
  • Step 2 Demosaicing the third sample Bayer array image and the fourth sample Bayer array image to obtain an RGB image (an example of the fifth sample image) and images in other color modes;
  • Step 3 Mark the RGB pixel value corresponding to the neutral color block of the color card included in the RGGB image as the color value of the light source (an example of the first pixel information).
  • the second sample image may be processed by a demosaic algorithm to obtain a processed image; the RGB pixel value corresponding to the neutral color block of the color card is then read from the processed image.
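Step S550 amounts to reading the light source color off the gray patch. A minimal sketch, assuming the patch location in the demosaiced image is already known (the bounding-box argument is hypothetical; the embodiment does not describe how the patch is located):

```python
import numpy as np

def calibrate_light_source(rgb_image, patch_box):
    """Mean RGB of the color card's neutral (gray) patch = light source color.

    rgb_image: (H, W, 3) demosaiced image of the second sample image
    patch_box: (top, bottom, left, right) bounds of the gray patch
    """
    top, bottom, left, right = patch_box
    patch = rgb_image[top:bottom, left:right]
    return patch.reshape(-1, 3).mean(axis=0)  # Ground-truth RGB of the light
```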
  • Step S560 training the white balance model by using the sample color correction matrix as input data and the color value of the light source as the target value, to obtain a trained white balance model.
  • the sample color correction matrix is input into the white balance model to obtain a predicted white balance parameter (an example of predicted pixel information); the predicted white balance parameter is compared with the color value of the light source; the parameters of the white balance model are iterated until the model converges, yielding the trained white balance model.
  • the white balance model can be trained iteratively through backpropagation based on the angular error loss (Angular Error Loss) between the predicted white balance parameters and the color value of the light source.
  • the white balance model may be a fully connected neural network.
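The angular error loss mentioned above compares only the directions of the predicted and ground-truth light source colors, so it is invariant to overall brightness. A NumPy sketch for evaluation (during actual training of the fully connected network, a differentiable implementation in a deep learning framework would be used instead):

```python
import numpy as np

def angular_error_loss(pred_rgb, true_rgb):
    """Angle in degrees between two light-source color vectors."""
    pred = np.asarray(pred_rgb, dtype=float)
    true = np.asarray(true_rgb, dtype=float)
    cos = np.dot(pred, true) / (np.linalg.norm(pred) * np.linalg.norm(true))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))  # scale-invariant
```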
  • under the same light source scene, the multi-spectral color filter array sensor collects a first sample image not including the color card and a second sample image including the color card;
  • the input data for training the white balance model can be obtained by processing the first sample image;
  • the target data for training the white balance model can be obtained from the second sample image;
  • the trained white balance model obtained through the training method of the embodiment of the present application can be used for white balance processing: according to the white balance parameters output by the white balance model, the Raw image collected by the multi-spectral color filter array sensor can be white-balance processed;
  • since the color information in the image collected by the multi-spectral color filter array sensor (for example, a 6-channel Raw image) exceeds the color information of a single-frame image (for example, a 3-channel image), performing white balance processing with the parameters output by the trained white balance model can improve the accuracy of white balance processing and the color accuracy of the image.
  • FIG. 9 is a schematic diagram showing the effect of the white balance processing method provided by the embodiment of the present application.
  • the color restoration mode can be enabled in the camera application of the electronic device; the electronic device can then obtain the white balance parameters through the white balance model provided by the embodiment of the present application, perform white balance processing on the Raw image collected by the multi-spectral color filter array sensor according to the white balance parameters, and output the processed image or video.
  • FIG. 10 shows a graphical user interface (graphical user interface, GUI) of an electronic device.
  • the GUI shown in (a) in Figure 10 is the desktop 610 of the electronic device; when the electronic device detects that the user clicks the icon 620 of the camera application (APP) on the desktop 610, the camera application can be started, and another GUI, shown in (b) in Figure 10, is displayed; the GUI shown in (b) in Figure 10 can be the display interface of the camera APP in camera mode, and the GUI can include a shooting interface 630; the shooting interface 630 can include a viewfinder frame 631 and controls; for example, the shooting interface 630 may include a control 632 for instructing shooting and a control 633 for instructing settings; in the preview state, a preview image may be displayed in real time in the viewfinder 631; the preview state may mean that after the user turns on the camera and before the photo/record button is pressed, the preview image is displayed in the viewfinder in real time.
  • when the user clicks the control 633 for instructing settings, the setting interface shown in (c) in Figure 10 is displayed; the setting interface may include a color restoration mode control 634; when it is detected that the user clicks the color restoration mode control 634, the electronic device can turn on the color restoration mode; after the color restoration mode is turned on, the white balance parameters can be obtained through the white balance model provided by the embodiment of the present application, and the Raw image collected by the multi-spectral color filter array sensor is white-balance processed according to the white balance parameters, so as to output the processed image.
  • the photographing interface 630 may further include a control 635 for indicating turning on/off the color reproduction mode; after the electronic device detects that the user clicks the control 635, the electronic device can turn on the color restoration mode, obtain the white balance parameters through the white balance model provided by the embodiment of the present application, and perform white balance processing on the Raw images collected by the multi-spectral color filter array sensor according to the white balance parameters, so as to output the processed image or video.
  • the training method and white balance processing method of the white balance model provided by the embodiment of the present application are described in detail above with reference to FIG. 1 to FIG. 11 ; the device embodiment of the present application will be described in detail below with reference to FIG. 12 to FIG. 14 . It should be understood that the devices in the embodiments of the present application can execute the various methods in the foregoing embodiments of the present application, that is, the specific working processes of the following various products can refer to the corresponding processes in the foregoing method embodiments.
  • FIG. 12 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device 700 can execute the training method shown in FIG. 8 ; the electronic device 700 includes an acquisition module 710 and a processing module 720 .
  • the acquisition module 710 is used to acquire training data, wherein the training data includes a first sample image and a second sample image, and the first sample image and the second sample image refer to images in the first color space collected by the multi-spectral color filter array sensor under the same light source scene, the second sample image including a color card;
  • the processing module 720 is used to perform decomposition processing and demosaic processing on the first sample image to obtain a third sample image and a fourth sample image, the third sample image being an image in the first color mode and the fourth sample image being an image in the second color mode; obtain a sample color correction matrix according to the third sample image and the fourth sample image, the sample color correction matrix being used to represent the pixel change amount of converting the third sample image into the fourth sample image; perform the decomposition processing and the demosaic processing on the second sample image to obtain a fifth sample image, the fifth sample image being an image in the first color mode; and train the white balance model by using the sample color correction matrix as input data and using the first pixel information in the fifth sample image as target data, the first pixel information referring to the pixel value corresponding to the neutral color block of the color card included in the fifth sample image, to obtain the trained white balance model.
  • the processing module 720 is specifically configured to:
  • input the sample color correction matrix into the white balance model to obtain predicted pixel information; the white balance model is trained according to the predicted pixel information and the first pixel information to obtain the trained white balance model.
  • the parameters of the trained white balance model are obtained by iteratively performing a backpropagation algorithm according to a difference between the predicted pixel information and the first pixel information.
  • the multispectral color filter array sensor refers to a sensor that covers a mosaic color filter array above the pixel sensor.
  • the white balance model is a fully connected neural network.
  • module here may be implemented in the form of software and/or hardware, which is not specifically limited.
  • a “module” may be a software program, a hardware circuit or a combination of both to realize the above functions.
  • the hardware circuitry may include application-specific integrated circuits (ASICs), electronic circuits, processors for executing one or more software or firmware programs (such as shared processors, dedicated processors, or group processors), memory, combinational logic circuits, and/or other suitable components that support the described functionality.
  • FIG. 13 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device 800 can execute the white balance processing method shown in FIG. 7 ; the electronic device 800 includes a display module 810 and a processing module 820 .
  • the display module 810 is used to display a first interface, and the first interface includes a first control; the processing module 820 is used to: detect a first operation on the first control; in response to the first operation, acquire a first image, the first image being an image in a first color space collected by a multi-spectral color filter array sensor; perform decomposition processing and demosaic processing on the first image to obtain a second image and a third image, the second image being an image in the first color mode and the third image being an image in the second color mode; obtain a color correction matrix according to the second image and the third image, the color correction matrix being used to represent the pixel change amount of converting the second image into the third image; input the color correction matrix into the white balance model to obtain white balance parameters; and perform image processing on the first image according to the white balance parameters to obtain a fourth image; wherein the white balance model is used to calculate parameters for white balance processing, and the white balance model is obtained by training with the sample color correction matrix as input data and the first pixel information as target data.
  • processing module 820 is specifically configured to:
  • the first Bayer array image is a Bayer array image in the first color mode
  • the second Bayer array image is a Bayer array image of the second color mode
  • the second image includes a first pixel, a second pixel, and a third pixel
  • the third image includes a fourth pixel, a fifth pixel, and a sixth pixel
  • the processing module 820 is specifically configured to:
  • the color correction matrix is composed of the first vector, the second vector and the third vector.
  • the parameters of the white balance model are obtained by iterating through backpropagation according to the difference between the predicted pixel information and the first pixel information, and the predicted pixel information refers to the The output information obtained by inputting the sample color correction matrix into the white balance model.
  • the multispectral color filter array sensor refers to a sensor that covers a mosaic color filter array above the pixel sensor.
  • the white balance model is a fully connected neural network.
  • the first interface refers to a home screen interface of the electronic device
  • the home screen interface includes a camera application
  • the first control refers to a control corresponding to the camera application.
  • the first interface refers to a photographing interface
  • the first control refers to a control for instructing photographing
  • the first interface refers to a video shooting interface
  • the first control refers to a control for instructing video shooting.
  • module here may be implemented in the form of software and/or hardware, which is not specifically limited.
  • a “module” may be a software program, a hardware circuit or a combination of both to realize the above functions.
  • the hardware circuitry may include application-specific integrated circuits (ASICs), electronic circuits, processors for executing one or more software or firmware programs (such as shared processors, dedicated processors, or group processors), memory, combinational logic circuits, and/or other suitable components that support the described functionality.
  • the units of each example described in the embodiments of the present application can be realized by electronic hardware, or a combination of computer software and electronic hardware. Whether these functions are executed by hardware or software depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the present application.
  • FIG. 14 shows a schematic structural diagram of an electronic device provided by the present application.
  • the dotted line in FIG. 14 indicates that this unit or this module is optional; the electronic device 900 can be used to implement the methods described in the foregoing method embodiments.
  • the electronic device 900 includes one or more processors 901, and the one or more processors 901 can support the electronic device 900 to implement the white balance model training method or the white balance processing method in the method embodiment.
  • the processor 901 may be a general purpose processor or a special purpose processor.
  • the processor 901 may be a central processing unit (central processing unit, CPU), a digital signal processor (digital signal processor, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic devices such as discrete gates, transistor logic devices, or discrete hardware components.
  • the processor 901 can be used to control the electronic device 900, execute software programs, and process data of the software programs.
  • the electronic device 900 may also include a communication unit 905, configured to implement signal input (reception) and output (transmission).
  • the electronic device 900 can be a chip, and the communication unit 905 can be an input and/or output circuit of the chip, or the communication unit 905 can be a communication interface of the chip, and the chip can be used as a component of a terminal device or other electronic devices .
  • the electronic device 900 may be a terminal device, and the communication unit 905 may be a transceiver of the terminal device, or the communication unit 905 may be a transceiver circuit of the terminal device.
  • the electronic device 900 may include one or more memories 902, on which there is a program 904, and the program 904 may be run by the processor 901 to generate an instruction 903, so that the processor 901 executes the training method described in the above method embodiment according to the instruction 903 , or the white balance processing method.
  • data may also be stored in the memory 902 .
  • the processor 901 may also read data stored in the memory 902, the data may be stored in the same storage address as the program 904, and the data may also be stored in a different storage address from the program 904.
  • the processor 901 and the memory 902 may be set separately, or may be integrated together, for example, integrated on a system-on-chip (system on chip, SOC) of the terminal device.
  • the memory 902 can be used to store the related program 904 of the white balance model training method provided in the embodiment of the present application; when training the white balance model, the processor 901 can call the related program 904 stored in the memory 902 to execute the training method of the embodiment of the present application, for example: acquiring training data, wherein the training data includes a first sample image and a second sample image, the first sample image and the second sample image refer to images in the first color space collected by the multi-spectral color filter array sensor under the same light source scene, and the second sample image includes a color card; performing decomposition processing and demosaic processing on the first sample image to obtain a third sample image and a fourth sample image, the third sample image being an image in the first color mode and the fourth sample image being an image in the second color mode; and obtaining a sample color correction matrix according to the third sample image and the fourth sample image, the sample color correction matrix being used to represent the pixel change amount of converting the third sample image into the fourth sample image.
  • the memory 902 can be used to store the relevant program 904 of the white balance processing method provided in the embodiment of the present application; when performing white balance processing, the processor 901 can call the relevant program 904 stored in the memory 902 to execute the white balance processing method of the embodiment of the present application, for example: displaying a first interface, the first interface including a first control; detecting a first operation on the first control; in response to the first operation, acquiring a first image, the first image being an image in a first color space collected by a multi-spectral color filter array sensor; performing decomposition processing and demosaic processing on the first image to obtain a second image and a third image, the second image being an image in a first color mode and the third image being an image in a second color mode; obtaining a color correction matrix according to the second image and the third image, the color correction matrix being used to represent the pixel change amount of converting the second image into the third image; inputting the color correction matrix into the white balance model to obtain white balance parameters; and performing image processing on the first image according to the white balance parameters to obtain a fourth image; wherein the white balance model is used to calculate parameters for white balance processing, and the white balance model is obtained by training with the sample color correction matrix as input data and the first pixel information as target data.
  • the present application also provides a computer program product.
  • when the computer program product is executed by the processor 901, the training method or the white balance processing method described in any method embodiment in the present application is implemented.
  • the computer program product can be stored in the memory 902 , such as a program 904 , and the program 904 is finally converted into an executable object file executable by the processor 901 through processes such as preprocessing, compiling, assembling and linking.
  • the present application also provides a computer-readable storage medium, on which a computer program is stored.
  • when the computer program is executed by a computer, the white balance processing method described in any method embodiment in the present application is implemented.
  • the computer program may be a high-level language program or an executable object program.
  • the computer readable storage medium is, for example, the memory 902 .
  • the memory 902 may be a volatile memory or a nonvolatile memory, or, the memory 902 may include both a volatile memory and a nonvolatile memory.
  • the non-volatile memory can be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory.
  • volatile memory can be random access memory (RAM), which acts as an external cache; by way of example and not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM), and direct rambus random access memory (DR RAM).
  • the disclosed systems, devices and methods may be implemented in other ways. For example, some features of the method embodiments described above may be omitted, or not implemented.
  • the device embodiments described above are only illustrative, and the division of units is only a logical function division. In actual implementation, there may be other division methods, and multiple units or components may be combined or integrated into another system.
  • the coupling between the various units or the coupling between the various components may be direct coupling or indirect coupling, and the above coupling includes electrical, mechanical or other forms of connection.
  • the sequence numbers of the processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • "system" and "network" are often used interchangeably herein.
  • the term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone.
  • the character "/" herein generally indicates an "or" relationship between the contextual objects.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

A white balance processing method and an electronic device, relating to the field of image processing. The white balance processing method includes: displaying a first interface, the first interface including a first control; detecting a first operation on the first control; in response to the first operation, acquiring a first image, the first image being an image in a first color space collected by a multi-spectral color filter array sensor; performing decomposition processing and demosaic processing on the first image to obtain a second image and a third image, the second image being an image in a first color mode and the third image being an image in a second color mode; obtaining a color correction matrix according to the second image and the third image, the color correction matrix being used to represent the pixel change amount for converting the second image into the third image; inputting the color correction matrix into a white balance model to obtain white balance parameters; and performing image processing on the first image according to the white balance parameters to obtain a fourth image. The technical solution of this application can improve the accuracy of colors in images.

Description

White balance processing method and electronic device
This application claims priority to the Chinese patent application No. 202111081629.X, entitled "Solution for computing white balance using a multispectral image sensor", filed with the China National Intellectual Property Administration on September 15, 2021, and to the Chinese patent application No. 202111560454.0, entitled "White balance processing method and electronic device", filed with the China National Intellectual Property Administration on December 20, 2021, both of which are incorporated herein by reference in their entirety.
Technical field
This application relates to the field of image processing, and in particular to a white balance processing method and an electronic device.
Background
When an electronic device captures an image, the colors of objects in the image are affected by the color of the light source; an image sensor cannot judge the colors of objects accurately under light sources of every color; therefore, the image colors need to be adjusted through a series of operations such as white balance processing, a color conversion matrix, and a three-dimensional lookup table; white balance processing refers to accurately estimating the color of the light source and reversely removing its influence from the image, so as to achieve the effect of shooting under white light.
At present, existing image white balance processing relies on a single-frame image, for example, 3-channel image information; for some shooting scenes, such as close-up shots of solid-color objects, the accuracy of existing white balance processing still needs to be improved; therefore, how to perform white balance processing on an image and improve its color accuracy has become a problem that urgently needs to be solved.
Summary
This application provides a white balance processing method and an electronic device that can improve the accuracy of colors in images.
第一方面,提供了一种白平衡处理方法,应用于电子设备,包括:
显示第一界面,所述第一界面包括第一控件;
检测到对所述第一控件的第一操作;
响应于所述第一操作,获取第一图像,所述第一图像是指多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像;
对所述第一图像进行分解处理与去马赛克处理,得到第二图像与第三图像,所述第二图像是第一颜色模式的图像,所述第三图像是第二颜色模式的图像;
根据所述第二图像与所述第三图像得到颜色校正矩阵,所述颜色校正矩阵用于表示将所述第二图像转换为所述第三图像的像素变化量;
将颜色校正矩阵输入至白平衡模型,得到白平衡参数;
根据所述白平衡参数对所述第一图像进行图像处理,得到第四图像;
其中,所述白平衡模型用于计算白平衡处理的参数,所述白平衡模型是通过以样本颜色校正矩阵为输入数据,以第一像素信息为目标数据训练得到的,所述样本颜色 校正矩阵是根据第三样本图像与第四样本图像得到的,所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量,所述第三样本图像与所述第四样本图像是对第一样本图像进行所述分解处理与所述去马赛克处理得到的,所述第一像素信息是指第五样本图像中色卡包括的中性色块对应的像素值,所述第五样本图像是对第二样本图像进行所述分解处理与所述去马赛克处理得到的,所述第一样本图像与所述第二样本图像是指在同一光源场景下所述多光谱色彩过滤器阵列传感器采集的所述第一颜色空间的图像,所述第二样本图像中包括所述色卡。
示例性地,第一颜色空间可以是Raw颜色空间;色卡可以是指包括不同颜色的色板,色板中包括中性色;中性色可以是指由黑色、白色及由黑白调和的各种深浅不同的灰色系列。
应理解,色彩过滤器阵列(color filter array,CFA)传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器,用于采集图像的色彩信息;一般的光电传感器只能感应光的强度,不能区分光的波长(色彩);过滤器阵列传感器可以通过色彩过滤(color filter)以获取像素点的色彩信息。
可选地，色彩过滤器阵列传感器采集的Raw图像可以包括RGB颜色模式与其他颜色模式；例如，采集的Raw图像可以是指RGBCYM图像，或者，RGBCYGM图像，或者其他颜色模式的图像。
应理解，由于中性色块的中性材料对各种波长的光反射是均匀的，因此任何颜色的光照射在中性色块上就反射该颜色的光；比如，红光照射在色卡的中性色块上就反射红光；绿光照射在色卡的中性色块上就反射绿光；因此可以采用色卡的中性色块来标记光源的颜色。
在本申请的实施例中,通过多光谱色彩过滤器阵列传感器采集第一图像,通过对第一图像进行分解处理与去马赛克处理可以得到不同颜色模式的第二图像与第三图像;根据第二图像与第三图像可以得到颜色校正矩阵,将颜色校正矩阵输入至预先训练的白平衡模型可以得到白平衡参数;通过白平衡参数对第一图像进行图像处理,得到第四图像;由于多光谱色彩过滤器阵列传感器采集的图像(例如,6通道Raw图)中的颜色信息多于单帧图像(例如,3通道图像)的颜色信息,因此通过第一图像得到的白平衡参数能够提高白平衡处理的准确性,提高图像的颜色准确性。
结合第一方面,在第一方面的某些实现方式中,所述对所述第一图像进行分解处理与去马赛克处理,得到第二图像与第三图像,包括:
对所述第一图像进行所述分解处理,得到第一拜耳阵列图像与第二拜耳阵列图像,所述第一拜耳阵列图像为所述第一颜色模式的拜耳阵列图像,所述第二拜耳阵列图像为所述第二颜色模式的拜耳阵列图像;
对所述第一拜耳阵列图像进行所述去马赛克处理,得到所述第二图像;
对所述第二拜耳阵列图像进行所述去马赛克处理,得到所述第三图像。
可选地,在一种可能的实现方式中,第一颜色模式可以是指RGB颜色模式,第一拜耳阵列图像可以是指RGGB图像;第二颜色模式可以是指CYM颜色模式,第二拜耳阵列图像可以是指CYYM图像。
可选地,在一种可能的实现方式中,第一颜色模式可以是指RGB颜色模式,第一拜耳阵列图像可以是指RGGB图像;第二颜色模式可以是指CYGM颜色模式,第二 拜耳阵列图像可以是指CYGM图像。
需要说明的是,多光谱色彩过滤器阵列传感器采集第一图像可以包括RGB颜色模式与其他颜色模式,其他颜色模式可以是指CYM颜色模式、CYGM颜色模式或者其他颜色模式。
结合第一方面,在第一方面的某些实现方式中,所述第二图像包括第一像素、第二像素与第三像素,所述第三图像包括第四像素、第五像素与第六像素,所述根据所述第二图像与所述第三图像得到颜色校正矩阵,包括:
根据所述第一像素与所述第四像素之间的差值得到第一向量;
根据所述第二像素与所述第五像素之间的差值得到第二向量;
根据所述第三像素与所述第六像素之间的差值得到第三向量；
由所述第一向量、所述第二向量与所述第三向量组成所述颜色校正矩阵。
可选地，在一种可能的实现方式中，第一拜耳阵列图像为RGGB图像，第二拜耳阵列图像为CYYM图像；对于RGB图像中的每个像素可以对应一组RGB值，CMY图像中的每个像素可以对应一组CMY值；将R像素对应的RGB值转换为C像素对应的CMY值可以得到一个3×1的矩阵；同理，将G像素对应的RGB值转换为M像素对应的CMY值可以得到一个3×1的矩阵；将B像素对应的RGB值转换为Y像素对应的CMY值可以得到一个3×1的矩阵；通过3个一维矩阵得到一个3×3的颜色校正矩阵。
结合第一方面，在第一方面的某些实现方式中，所述白平衡模型的参数是根据预测像素信息与所述第一像素信息之间的差异通过反向传播算法进行迭代得到的，所述预测像素信息是指将所述样本颜色校正矩阵输入所述白平衡模型得到的输出信息。
结合第一方面,在第一方面的某些实现方式中,所述多光谱色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器。
结合第一方面,在第一方面的某些实现方式中,所述白平衡模型为全连接神经网络。
结合第一方面,在第一方面的某些实现方式中,所述第一界面是指所述电子设备的主屏界面,所述主屏界面包括相机应用程序,所述第一控件是指所述相机应用程序对应的控件。
可选地,在一种可能的实现方式中,第一操作是指点击相机应用程序的操作。
结合第一方面,在第一方面的某些实现方式中,所述第一界面是指拍照界面,所述第一控件是指用于指示拍照的控件。
可选地,在一种可能的实现方式中,第一操作是指点击用于指示拍照的控件的操作。
结合第一方面,在第一方面的某些实现方式中,所述第一界面是指拍摄视频界面,所述第一控件是指用于指示拍摄视频的控件。
可选地,在一种可能的实现方式中,第一操作是指点击指示拍摄视频的控件的操作。
上述以第一操作为点击操作为例进行举例说明;第一操作还可以包括语音指示操作,或者其它的指示电子设备进行拍照或者拍摄视频的操作;上述为举例说明,并不对本申请作任何限定。
第二方面,提供了二种白平衡模型的训练方法,包括:
获取训练数据,其中,所述训练数据包括第一样本图像与第二样本图像,所述第一样本图像与所述第二样本图像是指在同一光源场景下多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像,所述第二样本图像中包括色卡;
对所述第一样本图像进行分解处理与去马赛克处理,得到第三样本图像与第四样本图像,所述第三样本图像为第一颜色模式的图像,所述第四样本图像为第二颜色模式的图像;
根据所述第三样本图像与所述第四样本图像得到样本颜色校正矩阵,所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量;
对所述第二样本图像进行分解处理与去马赛克处理,得到第五样本图像,所述第五样本图像为所述第一颜色模式的图像;
以所述样本颜色校正矩阵为输入数据,以所述第五样本图像中的第一像素信息为目标数据训练白平衡模型,得到训练后的白平衡模型,其中,所述白平衡模型用于计算白平衡处理的参数,所述第一像素信息是指所述第五样本图像中所述色卡包括的中性色块对应的像素值。
示例性地,第一颜色空间可以是Raw颜色空间;色卡可以是指包括不同颜色的色板,色板中包括中性色;中性色可以是指由黑色、白色及由黑白调和的各种深浅不同的灰色系列。
应理解,色彩过滤器阵列(color filter array,CFA)传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器,用于采集图像的色彩信息;一般的光电传感器只能感应光的强度,不能区分光的波长(色彩);过滤器阵列传感器可以通过色彩过滤(color filter)以获取像素点的色彩信息。
可选地，色彩过滤器阵列传感器采集的Raw图像可以包括RGB颜色模式与其他颜色模式；例如，采集的Raw图像可以是指RGBCYM图像，或者，RGBCYGM图像，或者其他颜色模式的图像。
可选地，多光谱色彩过滤器阵列传感器获取的Raw图像可以是RGBCMY图像；其中，R表示红色（red）、G表示绿色（green）、B表示蓝色（blue）、C表示青色（cyan）、M表示深红色（magenta）、Y表示黄色（yellow）。
可选地，多光谱色彩过滤器阵列传感器获取的Raw图像可以是RGBCYGM图像；其中，R表示红色（red）、G表示绿色（green）、B表示蓝色（blue）、C表示青色（cyan）、Y表示黄色（yellow）、M表示深红色（magenta）。
应理解，由于中性色块的中性材料对各种波长的光反射是均匀的，因此任何颜色的光照射在中性色块上就反射该颜色的光；比如，红光照射在色卡的中性色块上就反射红光；绿光照射在色卡的中性色块上就反射绿光；因此可以采用色卡的中性色块来标记光源的颜色。
可选地,在一种可能的实现方式中,在获取训练白平衡模型的目标数据时,还可以是对第二样本图像进行去马赛克算法处理,得到处理后的图像;在处理后的图像中获取色卡的中性色块对应的像素值中RGB的像素值。
在本申请的实施例中，通过多光谱色彩过滤器阵列传感器采集同一光源，相同场景下不包括色卡的第一样本图像与包括色卡的第二样本图像；通过对第一样本图像处理可以得到训练白平衡模型的输入数据；通过第二样本图像可以得到训练白平衡模型的目标数据；通过本申请实施例的白平衡模型的训练方法得到的训练后的白平衡模型可以用于计算白平衡处理的参数，根据参数可以对多光谱色彩过滤器阵列传感器采集的Raw图进行白平衡处理；在本申请实施例中，由于训练白平衡模型的训练数据是通过多光谱色彩过滤器阵列传感器采集的图像，多光谱色彩过滤器阵列传感器采集的图像（例如，6通道Raw图）中的颜色信息多于单帧图像（例如，3通道图像）的颜色信息，因此训练得到的白平衡模型输出的用于白平衡处理的参数能够提高白平衡处理的准确性，提高图像的颜色准确性。
结合第二方面,在第二方面的某些实现方式中,所述以所述样本颜色校正矩阵为输入数据,以所述第五样本图像中的第一像素信息为目标数据训练白平衡模型,包括:
将所述样本颜色校正矩阵输入所述白平衡模型,得到预测像素信息;
根据所述预测像素信息与所述第一像素信息训练所述白平衡模型,得到所述训练后的白平衡模型。
应理解,预测像素信息是指预测的光源RGB值。
结合第二方面，在第二方面的某些实现方式中，所述训练后的白平衡模型的参数是根据所述预测像素信息与所述第一像素信息之间的差异通过反向传播算法进行迭代得到的。
可选地，可以根据预测像素信息与所述第一像素信息之间的角误差损失（Angular Error Loss）反向迭代训练白平衡模型。
结合第二方面,在第二方面的某些实现方式中,所述多光谱色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器。
结合第二方面,在第二方面的某些实现方式中,所述白平衡模型为全连接神经网络。
第三方面,提供一种电子设备,包括用于执行第一方面或第一方面中任一种方法的模块/单元。
第四方面,提供一种电子设备,包括用于执行第二方面或第二方面中任一种训练方法的模块/单元。
第五方面,提供了一种电子设备,所述电子设备包括一个或多个处理器和存储器;所述存储器与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行:
显示第一界面,所述第一界面包括第一控件;
检测到对所述第一控件的第一操作;
响应于所述第一操作,获取第一图像,所述第一图像是指多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像;
对所述第一图像进行分解处理与去马赛克处理,得到第二图像与第三图像,所述第二图像是第一颜色模式的图像,所述第三图像是第二颜色模式的图像;
根据所述第二图像与所述第三图像得到颜色校正矩阵,所述颜色校正矩阵用于表示将所述第二图像转换为所述第三图像的像素变化量;
将颜色校正矩阵输入至白平衡模型,得到白平衡参数;
根据所述白平衡参数对所述第一图像进行图像处理,得到第四图像;
其中,所述白平衡模型用于计算白平衡处理的参数,所述白平衡模型是通过以样本颜色校正矩阵为输入数据,以第一像素信息为目标数据训练得到的,所述样本颜色校正矩阵是根据第三样本图像与第四样本图像得到的,所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量,所述第三样本图像与所述第四样本图像是对第一样本图像进行所述分解处理与所述去马赛克处理得到的,所述第一像素信息是指第五样本图像中色卡包括的中性色块对应的像素值,所述第五样本图像是对第二样本图像进行所述分解处理与所述去马赛克处理得到的,所述第一样本图像与所述第二样本图像是指在同一光源场景下所述多光谱色彩过滤器阵列传感器采集的所述第一颜色空间的图像,所述第二样本图像中包括所述色卡。
结合第五方面,在第五方面的某些实现方式中,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行:
对所述第一图像进行所述分解处理,得到第一拜耳阵列图像与第二拜耳阵列图像,所述第一拜耳阵列图像为所述第一颜色模式的拜耳阵列图像,所述第二拜耳阵列图像为所述第二颜色模式的拜耳阵列图像;
对所述第一拜耳阵列图像进行所述去马赛克处理,得到所述第二图像;
对所述第二拜耳阵列图像进行所述去马赛克处理,得到所述第三图像。
结合第五方面,在第五方面的某些实现方式中,所述第二图像包括第一像素、第二像素与第三像素,所述第三图像包括第四像素、第五像素与第六像素,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行:
根据所述第一像素与所述第四像素之间的差值得到第一向量;
根据所述第二像素与所述第五像素之间的差值得到第二向量;
根据所述第三像素与所述第六像素之间的差值得到第三向量；
由所述第一向量、所述第二向量与所述第三向量组成所述颜色校正矩阵。
结合第五方面，在第五方面的某些实现方式中，所述白平衡模型的参数是根据预测像素信息与所述第一像素信息之间的差异通过反向传播算法进行迭代得到的，所述预测像素信息是指将所述样本颜色校正矩阵输入所述白平衡模型得到的输出信息。
结合第五方面,在第五方面的某些实现方式中,所述多光谱色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器。
结合第五方面,在第五方面的某些实现方式中,所述白平衡模型为全连接神经网络。
结合第五方面,在第五方面的某些实现方式中,所述第一界面是指所述电子设备的主屏界面,所述主屏界面包括相机应用程序,所述第一控件是指所述相机应用程序对应的控件。
结合第五方面,在第五方面的某些实现方式中,所述第一界面是指拍照界面,所述第一控件是指用于指示拍照的控件。
结合第五方面,在第五方面的某些实现方式中,所述第一界面是指拍摄视频界面,所述第一控件是指用于指示拍摄视频的控件。
第六方面,提供了一种电子设备,所述电子设备包括一个或多个处理器和存储器;所述存储器与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行:
获取训练数据,其中,所述训练数据包括第一样本图像与第二样本图像,所述第一样本图像与所述第二样本图像是指在同一光源场景下多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像,所述第二样本图像中包括色卡;
对所述第一样本图像进行分解处理与去马赛克处理,得到第三样本图像与第四样本图像,所述第三样本图像为第一颜色模式的图像,所述第四样本图像为第二颜色模式的图像;
根据所述第三样本图像与所述第四样本图像得到样本颜色校正矩阵,所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量;
对所述第二样本图像进行分解处理与去马赛克处理,得到第五样本图像,所述第五样本图像为所述第一颜色模式的图像;
以所述样本颜色校正矩阵为输入数据,以所述第五样本图像中的第一像素信息为目标数据训练白平衡模型,得到训练后的白平衡模型,其中,所述白平衡模型用于计算白平衡处理的参数,所述第一像素信息是指所述第五样本图像中所述色卡包括的中性色块对应的像素值。
结合第六方面,在第六方面的某些实现方式中,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行:
将所述样本颜色校正矩阵输入所述白平衡模型,得到预测像素信息;
根据所述预测像素信息与所述第一像素信息训练所述白平衡模型,得到所述训练后的白平衡模型。
结合第六方面，在第六方面的某些实现方式中，所述训练后的白平衡模型的参数是根据所述预测像素信息与所述第一像素信息之间的差异通过反向传播算法进行迭代得到的。
结合第六方面,在第六方面的某些实现方式中,所述多光谱色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器。
结合第六方面,在第六方面的某些实现方式中,所述白平衡模型为全连接神经网络。
第七方面,提供了一种电子设备,该电子设备用于白平衡处理,所述电子设备包括:一个或多个处理器和存储器;所述存储器与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行第一方面中或第一方面中的任一种方法。
第八方面，提供了一种电子设备，该电子设备用于白平衡模型的训练，所述电子设备包括：一个或多个处理器和存储器；所述存储器与所述一个或多个处理器耦合，所述存储器用于存储计算机程序代码，所述计算机程序代码包括计算机指令，所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行第二方面或者第二方面中的任一种训练方法。
第九方面,提供了一种芯片系统,所述芯片系统应用于电子设备,所述芯片系统包括一个或多个处理器,所述处理器用于调用计算机指令以使得所述电子设备执行第一方面或第二方面中的任一种方法。
第十方面,提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序代码,当所述计算机程序代码被电子设备运行时,使得该电子设备执行第一方面或第二方面中的任一种方法。
第十一方面,提供了一种计算机程序产品,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码被电子设备运行时,使得该电子设备执行第一方面或第二方面中的任一种方法。
在本申请的实施例中，通过多光谱色彩过滤器阵列传感器采集同一光源，相同场景下不包括色卡的第一样本图像与包括色卡的第二样本图像；通过对第一样本图像处理可以得到训练白平衡模型的输入数据；通过第二样本图像可以得到训练白平衡模型的目标数据；通过本申请实施例的白平衡模型的训练方法得到的训练后的白平衡模型可以用于计算白平衡处理的参数，根据参数可以对多光谱色彩过滤器阵列传感器采集的Raw图进行白平衡处理；在本申请实施例中，由于训练白平衡模型的训练数据是通过多光谱色彩过滤器阵列传感器采集的图像，多光谱色彩过滤器阵列传感器采集的图像（例如，6通道Raw图）中的颜色信息多于单帧图像（例如，3通道图像）的颜色信息，因此训练得到的白平衡模型输出的用于白平衡处理的参数能够提高白平衡处理的准确性，提高图像的颜色准确性。
附图说明
图1是本申请提供一种全连接神经网络的示意图;
图2是一种适用于本申请的电子设备的硬件系统的示意图;
图3是一种适用于本申请实施例的应用场景的示意图;
图4是一种适用于本申请实施例的应用场景的示意图;
图5是一种适用于本申请的白平衡处理方法的系统架构的示意图;
图6是本申请提供的一种多光谱图像的光谱响应曲线的示意图;
图7是一种适用于本申请的白平衡处理方法的示意图;
图8是本申请实施例提供的一种白平衡模型的训练方法的示意图;
图9是本申请实施例提供的白平衡处理方法的效果示意图;
图10是本申请实施例提供的一种电子设备的显示界面的示意图;
图11是本申请实施例提供的一种电子设备的显示界面的示意图;
图12是本申请实施例提供的一种电子设备的结构示意图;
图13是本申请实施例提供的一种电子设备的结构示意图;
图14是本申请实施例提供的一种电子设备的结构示意图。
具体实施方式
在本申请的实施例中,以下术语“第一”、“第二”、“第三”、“第四”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。
为了便于对本申请实施例的理解,首先对本申请实施例中涉及的相关概念进行简要说明。
1、白平衡
白平衡是描述显示器中红、绿、蓝三基色混合生成后白色精确度的一项指标。相机设备的白平衡设置是确保获得理想的画面色彩的重要保证;白平衡处理是指通过对光源颜色的准确估计,将图像中光源颜色产生的影响进行反向消除的操作,从而达到一种白色光拍摄的效果。
2、拜耳阵列
拜耳阵列是实现电荷耦合器件（charge-coupled device，CCD）或者互补金属氧化物半导体（complementary metal oxide semiconductor，CMOS）传感器拍摄彩色图像的主要技术之一；它可以是一个4×4阵列，比如，由8个绿色、4个蓝色和4个红色像素组成，在将灰度图像转换为彩色图像时会以2×2矩阵进行9次运算，最后生成一幅彩色图像。
3、色彩过滤器阵列(color filter array,CFA)传感器
色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器,色彩过滤器阵列传感器用于采集图像的色彩信息;一般的光电传感器只能感应光的强度,不能区分光的波长(色彩);过滤器阵列传感器可以通过色彩过滤(color filter)以获取像素点的色彩信息。
4、颜色模式
颜色模式是将某种颜色表现为数字形式的模型,或者是一种记录图像颜色的方式。
5、去马赛克(demosaic)
去马赛克是指将Raw图像转换为RGB图像的图像处理过程。
6、中性色
中性色可以是指由黑色、白色及由黑白调和的各种深浅不同的灰色系列。
7、神经网络
神经网络是指将多个单一的神经单元联结在一起形成的网络,即一个神经单元的输出可以是另一个神经单元的输入;每个神经单元的输入可以与前一层的局部接受域相连,来提取局部接受域的特征,局部接受域可以是由若干个神经单元组成的区域。
8、全连接神经网络
全连接神经网络又可以称为深度神经网络(deep neural network,DNN)或者多层神经网络,可以理解为是具有多层隐含层的神经网络。按照不同层的位置对全连接神经网络进行划分,全连接神经网络内部的神经网络可以分为三类:输入层,隐含层,输出层;通常,第一层是输入层,最后一层是输出层,中间的层数都是隐含层;层与层之间是全连接的,也就是说,第i层的任意一个神经元一定与第i+1层的任意一个神经元相连。
全连接神经网络的工作原理可以通过如下线性关系表达式表示：y=a(w·x+b)；其中，x表示输入向量，y表示输出向量，b表示偏移向量，w表示权重矩阵（也称为系数），a()表示激活函数。每一层是对输入向量x经过线性表达式的操作得到输出向量y。
由于DNN层数多,系数w和偏移向量b的数量也比较多;这些参数在DNN中的定义如下所述:
如图1所示，假设在一个三层的DNN中；该三层的DNN包括输入层（第一层）、隐藏层（第二层）与输出层（第三层）；比如，第二层的第4个神经元到第三层的第1个神经元的线性系数可以表示为w₁₄，下标14表示对应的是输出的第三层索引1和输入的第二层索引4。
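上述全连接神经网络逐层执行y=a(w·x+b)的过程，可以用如下示意性的Python代码草图表示（其中层数、各层尺寸与激活函数ReLU均为假设值，仅用于说明原理）：

```python
import numpy as np

def relu(x):
    # 激活函数a()的一个常见选择（此处为假设）
    return np.maximum(0.0, x)

def fc_forward(x, layers):
    """对输入向量x逐层执行 y = a(w·x + b)。

    layers为[(w, b), ...]列表，w为权重矩阵，b为偏移向量。
    """
    y = x
    for w, b in layers:
        y = relu(w @ y + b)
    return y

# 一个三层（输入层-隐藏层-输出层）网络的示例，各层尺寸为假设值
rng = np.random.default_rng(0)
layers = [
    (rng.standard_normal((8, 9)), np.zeros(8)),  # 输入层 -> 隐藏层
    (rng.standard_normal((3, 8)), np.zeros(3)),  # 隐藏层 -> 输出层
]
x = rng.standard_normal(9)  # 例如重组为9×1向量的颜色校正矩阵
y = fc_forward(x, layers)
print(y.shape)  # (3,)
```

层与层之间全连接体现在：每一层的权重矩阵w对上一层输出向量的每个分量都参与运算。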
9、反向传播算法
神经网络可以采用误差反向传播(back propagation,BP)算法在训练过程中修正初始的神经网络模型中参数的大小,使得神经网络模型的重建误差损失越来越小。具体地,前向传递输入信号直至输出会产生误差损失,通过反向传播误差损失信息来更新初始的神经网络模型中参数,从而使误差损失收敛。反向传播算法是以误差损失为主导的反向传播运动,旨在得到最优的神经网络模型的参数,例如权重矩阵。
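反向传播的基本过程可以用如下示意性的Python代码草图表示（以单层线性模型与均方误差损失为例，学习率与迭代次数为假设值，仅用于说明“根据误差损失的梯度反向更新参数直至收敛”这一原理）：

```python
import numpy as np

# 构造无噪声的线性数据 y = x·true_w + 0.3，用梯度下降拟合参数
rng = np.random.default_rng(0)
x = rng.random((100, 3))
true_w = np.array([0.5, -1.0, 2.0])
y = x @ true_w + 0.3

w = np.zeros(3)
b = 0.0
lr = 0.3  # 学习率（假设值）
for _ in range(2000):
    pred = x @ w + b                 # 前向传递：得到预测值
    err = pred - y                   # 误差
    grad_w = 2 * x.T @ err / len(x)  # 误差损失对w的梯度（反向传播）
    grad_b = 2 * np.mean(err)        # 误差损失对b的梯度
    w -= lr * grad_w                 # 沿梯度反方向更新参数
    b -= lr * grad_b

print(np.round(w, 3), round(b, 3))  # 约为 [0.5, -1.0, 2.0] 与 0.3
```

多层网络中各层梯度按链式法则逐层回传，原理与上述单层情形一致。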
下面将结合附图,对本申请实施例中的白平衡处理方法与电子设备进行描述。
图2示出了一种适用于本申请的电子设备的硬件系统。
电子设备100可以是手机、智慧屏、平板电脑、可穿戴电子设备、车载电子设备、增强现实(augmented reality,AR)设备、虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)、投影仪等等,本申请实施例对电子设备100的具体类型不作任何限制。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
需要说明的是，图2所示的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中，电子设备100可以包括比图2所示的部件更多或更少的部件，或者，电子设备100可以包括图2所示的部件中某些部件的组合，或者，电子设备100可以包括图2所示的部件中某些部件的子部件。图2所示的部件可以以硬件、软件、或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元。例如,处理器110可以包括以下处理单元中的至少一个:应用处理器(application processor,AP)、调制解调处理器、图形处理器(graphics processing unit,GPU)、图像信号处理器(image signal processor, ISP)、控制器、视频编解码器、数字信号处理器(digital signal processor,DSP)、基带处理器、神经网络处理器(neural-network processing unit,NPU)。其中,不同的处理单元可以是独立的器件,也可以是集成的器件。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
示例性地,处理器110可以用于执行本申请实施例的白平衡处理方法;例如,显示第一界面,第一界面包括第一控件;检测到对第一控件的第一操作;响应于第一操作,获取第一图像,第一图像是指多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像;对第一图像进行分解处理与去马赛克处理,得到第二图像与第三图像,第二图像是第一颜色模式的图像,第三图像是第二颜色模式的图像;根据第二图像与第三图像得到颜色校正矩阵,颜色校正矩阵用于表示将第二图像转换为第三图像的像素变化量;将颜色校正矩阵输入至白平衡模型,得到白平衡参数;根据白平衡参数对第一图像进行图像处理,得到第四图像。
示例性地,处理器110可以用于执行本申请实施例的白平衡模型的训练方法;例如,获取训练数据,其中,训练数据包括第一样本图像与第二样本图像,第一样本图像与第二样本图像是指在同一光源场景下多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像,第二样本图像中包括色卡;对第一样本图像进行分解处理与去马赛克处理,得到第三样本图像与第四样本图像,第三样本图像为第一颜色模式的图像,第四样本图像为第二颜色模式的图像;根据第三样本图像与第四样本图像得到样本颜色校正矩阵,样本颜色校正矩阵用于表示将第三样本图像转换为第四样本图像的像素变化量;对第二样本图像进行分解处理与去马赛克处理,得到第五样本图像,第五样本图像为第一颜色模式的图像;以样本颜色校正矩阵为输入数据,以第五样本图像中的第一像素信息为目标数据训练白平衡模型,得到训练后的白平衡模型,其中,白平衡模型用于计算白平衡处理的参数,第一像素信息是指第五样本图像中色卡包括的中性色块对应的像素值。
图2所示的各模块间的连接关系只是示意性说明,并不构成对电子设备100的各模块间的连接关系的限定。可选地,电子设备100的各模块也可以采用上述实施例中多种连接方式的组合。
电子设备100的无线通信功能可以通过天线1、天线2、移动通信模块150、无线通信模块160、调制解调处理器以及基带处理器等器件实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
电子设备100可以通过GPU、显示屏194以及应用处理器实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算, 用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194可以用于显示图像或视频。
电子设备100可以通过ISP、摄像头193、视频编解码器、GPU、显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP可以对图像的噪点、亮度和色彩进行算法优化,ISP还可以优化拍摄场景的曝光和色温等参数。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的红绿蓝(red green blue,RGB),YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1、MPEG2、MPEG3和MPEG4。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x轴、y轴和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。例如,当快门被按下时,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航和体感游戏等场景。
示例性地，在本申请的实施例中陀螺仪传感器180B可以用于采集抖动信息，抖动信息可以用于表示电子设备在拍摄过程中的位姿变化。
加速度传感器180E可检测电子设备100在各个方向上(一般为x轴、y轴和z轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。加速度传感器180E还可以用于识别电子设备100的姿态,作为横竖屏切换和计步器等应用程序的输入参数。
距离传感器180F用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,例如在拍摄场景中,电子设备100可以利用距离传感器180F测距以实现快速对焦。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮 度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现解锁、访问应用锁、拍照和接听来电等功能。
触摸传感器180K,也称为触控器件。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,触摸屏也称为触控屏。触摸传感器180K用于检测作用于其上或其附近的触摸操作。触摸传感器180K可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,并且与显示屏194设置于不同的位置。
电子设备在获取图像时，图像中物体的颜色会受到光源颜色的影响；对于图像传感器而言，无法在任何颜色的光源下，均能够对物体的颜色做出准确的判断；因此，需要通过白平衡处理、颜色转换矩阵以及三维查找表等一系列的操作对图像颜色进行调整；白平衡处理是指通过对光源颜色的准确估计，将图像中光源颜色产生的影响进行反向消除的操作，从而达到一种白色光拍摄的效果。目前，现有的图像白平衡处理依赖于单帧图像，比如，3通道的图像信息；对于一些拍摄场景，比如近距离拍摄纯色物体的场景，现有的白平衡处理的准确性还有待提高。
有鉴于此,本申请实施例提供了一种白平衡处理方法与电子设备,在本申请的实施例中,通过多光谱色彩过滤器阵列传感器采集第一图像,通过对第一图像进行分解处理与去马赛克处理可以得到不同颜色模式的第二图像与第三图像;根据第二图像与第三图像可以得到颜色校正矩阵,将颜色校正矩阵输入至预先训练的白平衡模型可以得到白平衡参数;通过白平衡参数对第一图像进行图像处理,得到第四图像;由于多光谱色彩过滤器阵列传感器采集的图像(例如,6通道Raw图)中的颜色信息多于单帧图像(例如,3通道图像)的颜色信息,因此通过第一图像得到的白平衡参数能够提高白平衡处理的准确性,提高图像的颜色准确性。
下面结合图3与图4对本申请实施例提供的白平衡处理方法的应用场景进行举例说明。
示例性地,本申请实施例中的白平衡处理方法可以应用于拍照领域、录制视频领域、视频通话领域或者其他图像处理领域;通过本申请实施例中的白平衡处理方法对图像进行白平衡处理,能够提高图像的颜色准确性。
应用场景一:拍照领域
如图3所示，在暗光场景下（例如，夜景环境）拍照时，电子设备的进光量少导致图像的信噪比较低；现有的白平衡处理即基于单帧图像的白平衡算法容易出错，从而导致获取的图像存在较明显的颜色失真；图3中的(a)所示的是采用现有的白平衡处理方法得到拍摄对象210的预览图像；图3中的(b)所示的是通过本申请实施例提供的白平衡处理方法得到拍摄对象210的预览图像；图3中的(b)所示的预览图像与图3中的(a)所示的预览图像相比，图3中的(b)所示的预览图像中拍摄对象210的颜色还原度更高；因此，通过本申请实施例的白平衡处理方法对图像进行白平衡处理，能够提高图像的颜色准确性。
应用场景二:视频通话
如图4所示，图4中的(a)所示的是采用现有的白平衡处理方法得到视频通话的拍摄对象220的预览图像；图4中的(b)是通过本申请实施例提供的白平衡处理方法得到视频通话的拍摄对象220的预览图像；图4中的(b)所示的预览图像与图4中的(a)所示的预览图像相比，图4中的(b)所示的预览图像中拍摄对象220的颜色还原性更高；因此，通过本申请实施例的白平衡处理方法对图像进行白平衡处理，能够提高图像的颜色准确性。
应理解,上述为对应用场景的举例说明,并不对本申请的应用场景作任何限定。
下面结合图5与图11对本申请实施例提供的白平衡处理方法进行详细描述。
图5是一种适用于本申请的白平衡处理方法的系统架构的示意图。
如图5所示,系统架构300中可以包括多光谱色彩过滤器阵列传感器310、分解模块320、白平衡参数计算模块330与图像信号处理器340;其中,图像信号处理器340中还可以包括白平衡模块341。
示例性地,多光谱色彩过滤器阵列传感器310可以用于获取Raw图像;比如,多光谱色彩过滤器阵列传感器310采集的Raw图像可以包括RGB颜色模式与其他颜色模式。
示例性地,多光谱色彩过滤器阵列传感器310采集的Raw图像可以是指RGBCYM图像,或者,RGBCYGM图像,或者其他颜色模式的图像。
在一个示例中，多光谱色彩过滤器阵列传感器310获取的Raw图像可以是RGBCMY图像；其中，R表示红色（red）、G表示绿色（green）、B表示蓝色（blue）、C表示青色（cyan）、M表示深红色（magenta）、Y表示黄色（yellow）。
在一个示例中，多光谱色彩过滤器阵列传感器310获取的Raw图像可以是RGBCYGM图像；其中，R表示红色（red）、G表示绿色（green）、B表示蓝色（blue）、C表示青色（cyan）、Y表示黄色（yellow）、M表示深红色（magenta）。
应理解,上述以RGBCMY图像与RGBCYGM图像进行举例说明,本申请对此不作任何限定。
示例性地,多光谱色彩过滤器阵列传感器310获取的Raw图像可以是多通道的图像;比如,6通道的Raw图像、8通道的Raw图像,或者其他通道数量的Raw图像。
应理解,多光谱色彩过滤器阵列传感器采集的Raw图像可以分解为RGB颜色模式的图像与其他颜色模式的图像,两个图像的光谱响应曲线满足任意两个曲线不相同。
例如，图6所示的为6通道的RGBCMY图像的光谱响应曲线，其中，曲线1表示蓝色（blue，B）对应的光谱响应曲线；曲线2表示青色（cyan，C）对应的光谱响应曲线；曲线3表示深红色（magenta，M）对应的光谱响应曲线；曲线4表示黄色（yellow，Y）对应的光谱响应曲线；曲线5表示绿色（green，G）对应的光谱响应曲线；曲线6表示红色（red，R）对应的光谱响应曲线；图6所示的RGBCMY图像的光谱响应曲线中，曲线6与曲线3、曲线5与曲线4、曲线1与曲线2分别两两对应；从图6中可以看出，曲线3比曲线6的光谱范围更宽，则曲线3比曲线6的进光量更多；曲线4比曲线5的光谱范围更宽，则曲线4比曲线5的进光量更多；曲线2比曲线1的光谱范围更宽，则曲线2比曲线1的进光量更多。
示例性地,分解模块320用于将多光谱色彩过滤器阵列传感器310采集的Raw图像分 解为第一拜耳阵列图像与第二拜耳阵列图像,其中,第一拜耳阵列图像可以是指第一颜色模式(例如,RGB颜色模式)的拜耳阵列图像,第二拜耳阵列图像可以是指第二颜色模式(例如,CYM颜色模式,CYGM颜色模式或者其他颜色模式)的拜耳阵列图像。
应理解，第一拜耳阵列图像的光谱响应曲线与第二拜耳阵列图像的光谱响应曲线中，任意两个曲线互不相同。
示例性地,分解模块320可以对多光谱色彩过滤器阵列传感器采集的Raw图像先进行去马赛克处理(demosaic),再进行缩放处理(resize)。例如,第一拜耳阵列图像与第二拜耳阵列图像的尺寸可以是68*48*3。
可选地,第一拜耳阵列图像可以是指RGB颜色模式的图像;第二拜耳阵列图像可以是指其他颜色模式的图像;比如,第二拜耳阵列图像可以是指CMY颜色模式的图像;或者第二拜耳阵列图像可以是指CYGM颜色模式的图像。
示例性地,多光谱色彩过滤器阵列传感器310采集的Raw图像可以是多通道的图像;将多通道的图像可以分为3通道的第一拜耳阵列图像(例如,RGB颜色模式)与其他通道数的第二拜耳阵列图像。
可选地,多光谱色彩过滤器阵列传感器310采集的Raw图像可以是6通道的RGBCMY图像,则可以分解为3通道的第一拜耳阵列图像(RGB图像)与3通道的第二拜耳阵列图像(CMY图像)。
可选地,多光谱色彩过滤器阵列传感器310采集的Raw图像可以是7通道的RGBCYGM图像,则可以分解为3通道的第一拜耳阵列图像(RGB图像)与4通道的第二拜耳阵列图像(CYGM图像)。
可选地,多光谱色彩过滤器阵列传感器310采集的Raw图像可以是8通道的Raw图像,则可以分解为3通道的第一拜耳阵列图像(RGB图像)与5通道的第二拜耳阵列图像。
应理解,上述为对多光谱色彩过滤器阵列传感器310采集的Raw图像进行举例描述,本申请对此不作任何限定。
示例性地,白平衡参数计算模块330用于根据第一拜耳阵列图像与第二拜耳阵列图像计算白平衡参数,具体过程可以参见后续图7所示的步骤S430至步骤S450。
可选地,白平衡参数计算模块330还可以将得到的白平衡参数传输至图像信号处理器340。
示例性地,图像信号处理器340获取白平衡参数后白平衡模块341可以根据白平衡参数对多光谱色彩过滤器阵列传感器310采集的图像进行白平衡处理,得到处理后的图像。
例如，白平衡模块341可以根据白平衡参数对多光谱色彩过滤器阵列传感器310采集的Raw图像进行自动白平衡处理（automatic white balance，AWB）。
可选地，白平衡参数计算模块330可以是CPU、GPU或者其他算力硬件中的模块；或者，白平衡参数计算模块330可以是图像信号处理器340中的模块；或者，白平衡参数计算模块330的功能也可以在白平衡模块341中执行。
需要说明的是,图5所示的系统架构可以是如图2所示的电子设备中的系统架构;图5所示的系统架构可以执行图7所示的白平衡处理方法,下面对图7所示的白平衡处理方法进行详细描述。
图7是本申请实施例提供的一种白平衡处理方法的示意图；该方法400包括步骤S410至步骤S470，下面分别对步骤S410至步骤S470进行详细的描述。
步骤S410、获取多光谱色彩过滤器阵列传感器采集的Raw图像(第一图像的一个示例);其中,Raw图像是指如图5所示的多光谱色彩过滤器阵列传感器310采集的Raw图像。
例如,多光谱色彩过滤器阵列传感器310可以采集6通道的Raw图像,该Raw图像可以包括RGB颜色模式与其他颜色模式。
应理解,色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器,用于采集图像的色彩信息;通过色彩过滤器阵列传感器可以获取更多通道的Raw图像;色彩过滤器阵列传感器采集的Raw图像的光谱响应曲线的光谱范围更宽,即色彩过滤器阵列传感器的进光量更多,使得图像的亮度值更好。
步骤S420、对Raw图像进行分解处理。
可选地,对多光谱色彩过滤器阵列传感器采集的Raw图像进行分解处理。
示例性地,在Raw图像的预设位置抽取出两个拜耳阵列图像,分别为第一拜耳阵列图像与第二拜耳阵列图像。
步骤S430、得到第一拜耳阵列图像。
可选地,根据分解处理得到第一拜耳阵列图像。
步骤S440、得到第二拜耳阵列图像。
可选地,根据分解处理得到第二拜耳阵列图像。
示例性地，可以根据Raw图像的RGB掩膜（mask）抽取出对应的像素（pixel）组成RGGB拜耳图像，得到第一拜耳阵列图像；可以根据Raw图像的CYM掩膜（mask）抽取出对应的像素（pixel）组成CYYM拜耳图像，得到第二拜耳阵列图像。
示例性地，可以根据Raw图像的RGB掩膜（mask）抽取出对应的像素（pixel）组成RGGB拜耳图像，得到第一拜耳阵列图像；可以根据Raw图像的CYGM掩膜（mask）抽取出对应的像素（pixel）组成CYGM拜耳图像，得到第二拜耳阵列图像。
应理解，第一拜耳阵列图像是指RGGB图像，第二拜耳阵列图像可以是指CYYM图像、CYGM图像或者其他颜色模式的图像；第二拜耳阵列图像与第一拜耳阵列图像满足光谱响应曲线中任意两两曲线不同，本申请对第二拜耳阵列图像的颜色模式不作任何限定。
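上述按掩膜抽取像素得到两个拜耳阵列图像的分解处理，可以用如下示意性的Python代码草图表示（其中4×4的CFA重复单元排布为假设值，真实传感器的排布由器件决定）：

```python
import numpy as np

# 假设的4×4 CFA重复单元：RGGB与CYYM两个2×2拜耳块交错排布
UNIT = np.array([
    ["R", "G", "C", "Y"],
    ["G", "B", "Y", "M"],
    ["R", "G", "C", "Y"],
    ["G", "B", "Y", "M"],
])

def split_bayer(raw):
    """按掩膜把多光谱马赛克图分解为RGGB与CYYM两个拜耳阵列图像。

    raw为(H, W)的单层马赛克图，H、W需为4的倍数。
    """
    h, w = raw.shape
    pattern = np.tile(UNIT, (h // 4, w // 4))
    rgb_mask = np.isin(pattern, ["R", "G", "B"])  # RGB掩膜
    cym_mask = ~rgb_mask                          # CYM掩膜
    # 在该假设排布下，每行恰有一半像素属于RGGB、一半属于CYYM，抽取后宽度减半
    rggb = raw[rgb_mask].reshape(h, w // 2)
    cyym = raw[cym_mask].reshape(h, w // 2)
    return rggb, cyym

raw = np.arange(8 * 8, dtype=np.float32).reshape(8, 8)
rggb, cyym = split_bayer(raw)
print(rggb.shape, cyym.shape)  # (8, 4) (8, 4)
```

抽取得到的两个图像分别保持RGGB与CYYM的2×2拜耳排布，可各自进行后续的去马赛克处理。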
步骤S450、计算颜色校正矩阵。
可选地,根据第一拜耳阵列图像与第二拜耳阵列图像计算颜色校正矩阵。
示例性地,可以对第一拜耳阵列图像与第二拜耳阵列图像进行去马赛克处理,得到第一颜色模式的图像(第二图像的一个示例)与第二颜色模式的图像(第三图像的一个示例);根据第一颜色模式的图像与第二颜色模式的图像之间的像素差值可以得到颜色校正矩阵,其中,颜色校正矩阵可以用于表示将第一颜色模式的图像转换为第二颜色模式的图像的像素变化量。
例如，第一拜耳阵列图像为RGGB图像，第二拜耳阵列图像为CYYM图像；第一颜色模式的图像可以为RGB图像，第二颜色模式的图像可以为CMY图像；对于RGB图像中的每个像素可以对应一组RGB值（第一像素、第二像素与第三像素的一个示例），CMY图像中的每个像素可以对应一组CMY值（第四像素、第五像素与第六像素的一个示例）；将R像素对应的RGB值转换为C像素对应的CMY值可以得到一个3×1的矩阵（第一向量的一个示例）；同理，将G像素对应的RGB值转换为M像素对应的CMY值可以得到一个3×1的矩阵（第二向量的一个示例）；将B像素对应的RGB值转换为Y像素对应的CMY值可以得到一个3×1的矩阵（第三向量的一个示例）；通过3个一维矩阵得到一个3×3的颜色校正矩阵。
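上述把RGB图像映射为CMY图像的3×3颜色校正矩阵计算，可以用如下示意性的Python代码草图表示（此处以对全部像素做最小二乘拟合作为一种等价的示意实现，并非专利限定的唯一实现方式）：

```python
import numpy as np

def color_correction_matrix(rgb, cmy):
    """拟合把RGB图像映射为CMY图像的3×3颜色校正矩阵M。

    rgb、cmy为形状(H, W, 3)的图像，返回M使得 cmy ≈ rgb @ M.T。
    """
    a = rgb.reshape(-1, 3)
    b = cmy.reshape(-1, 3)
    m, *_ = np.linalg.lstsq(a, b, rcond=None)  # 最小二乘求解
    return m.T

rng = np.random.default_rng(0)
rgb = rng.random((4, 4, 3))
# 构造一个已知的转换矩阵作为真值（理想情况下C=G+B、M=R+B、Y=R+G）
true_m = np.array([[0.0, 1.0, 1.0],
                   [1.0, 0.0, 1.0],
                   [1.0, 1.0, 0.0]])
cmy = rgb @ true_m.T
m = color_correction_matrix(rgb, cmy)
print(np.allclose(m, true_m))  # True
# 输入白平衡模型前，可将3×3矩阵重组为9×1向量：m.reshape(-1)
```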
步骤S460、将颜色校正矩阵输入至白平衡模型得到白平衡参数。
其中,白平衡模型可以根据输入的颜色校正矩阵得到对应的白平衡参数;白平衡模型可以是一个预先训练的全连接神经网络模型,白平衡模型的训练方法可以参见后续图8所示。
应理解,白平衡参数可以是指光源的颜色信息;比如,可以是指光源的RGB值;将光源的颜色信息应用到图像中可以消除光源对拍摄物体颜色产生的影响,从而避免拍摄物体的颜色失真;根据光源的颜色信息可以对图像中拍摄物体的颜色信息进行修正,从而提高图像中拍摄物体颜色的准确性。
步骤S470、根据白平衡参数对Raw图像进行白平衡处理。
可选地,根据白平衡参数对Raw图像进行白平衡处理(图像处理的一个示例),得到处理后的图像(第四图像的一个示例)。
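根据白平衡参数（即估计的光源RGB值）对图像做白平衡处理，一种常见做法是以绿色通道为基准对各通道施加增益；如下为示意性的Python代码草图（增益的具体计算方式为假设的常见实现，并非专利限定）：

```python
import numpy as np

def apply_white_balance(rgb, light_rgb):
    """根据估计的光源RGB值对图像做白平衡处理（示意性实现）。

    以绿色通道为基准，对各通道乘以增益 gain_c = light_g / light_c，
    从而反向消除光源颜色对图像的影响。
    """
    light = np.asarray(light_rgb, dtype=np.float64)
    gains = light[1] / light  # [g/r, 1, g/b]
    return np.clip(rgb * gains, 0.0, 1.0)

# 偏红光源下灰色物体的成像，处理后应还原为三通道相等的中性灰
light = [0.8, 0.5, 0.4]                      # 估计的光源RGB值
pixel = np.array([[[0.8, 0.5, 0.4]]]) * 0.6  # 灰色物体反射光源颜色
balanced = apply_white_balance(pixel, light)
print(balanced)  # 三通道均约为0.3
```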
在本申请的实施例中,通过多光谱色彩过滤器阵列传感器采集第一图像,通过对第一图像进行分解处理与去马赛克处理可以得到不同颜色模式的第二图像与第三图像;根据第二图像与第三图像可以得到颜色校正矩阵,将颜色校正矩阵输入至预先训练的白平衡模型可以得到白平衡参数;通过白平衡参数对第一图像进行图像处理,得到第四图像;由于多光谱色彩过滤器阵列传感器采集的图像(例如,6通道Raw图)中的颜色信息多于单帧图像(例如,3通道图像)的颜色信息,因此通过第一图像得到的白平衡参数能够提高白平衡处理的准确性,提高图像的颜色准确性。
图8是本申请实施例提供的白平衡模型的训练方法的示意图;该训练方法可以由图2所示的电子设备执行;该训练方法500包括步骤S510至步骤S560,下面分别对步骤S510至步骤S560进行详细的描述。
步骤S510、获取不同光源场景的训练数据。
可选地,对于同一个拍摄场景,训练数据可以包括同一光源场景中,不包括色卡的第一样本图像与包括色卡的第二样本图像,第二样本图像用于标定不同光源的真值(Ground truth)。
需要说明的是,获取样本图像时色卡可以放置在主要光源的照射下进行拍摄。
可选地，对于出现颜色闪烁和颜色偏色的场景，说明当前场景的自动白平衡效果较差；可以针对该场景重点进行样本图像的采集。
步骤S520、对第一样本图像进行分解,得到第一样本拜耳阵列图像与第二样本拜耳阵列图像。
示例性地,可以根据第一样本图像的RGB掩膜(mask)抽取出对应的像素(pixel)组成RGGB拜耳图像,得到第一样本拜耳阵列图像;可以根据第一样本图像的CYM掩膜(mask)抽取出对应的像素(pixel)组成CYYM拜耳图像,得到第二样本拜耳阵列图像。
示例性地，可以根据第一样本图像的RGB掩膜（mask）抽取出对应的像素（pixel）组成RGGB拜耳图像，得到第一样本拜耳阵列图像；可以根据第一样本图像的CYGM掩膜（mask）抽取出对应的像素（pixel）组成CYGM拜耳图像，得到第二样本拜耳阵列图像。
应理解，第一样本拜耳阵列图像可以是指RGGB图像，第二样本拜耳阵列图像可以是指CYYM图像、CYGM图像或者其他颜色模式的图像；第二样本拜耳阵列图像与第一样本拜耳阵列图像满足光谱响应曲线中任意两两曲线不同，本申请对第二样本拜耳阵列图像的颜色模式不作任何限定。
步骤S530、对第一样本拜耳阵列图像与第二样本拜耳阵列图像进行去马赛克处理,得到第三样本图像与第四样本图像。
其中,第三样本图像可以是指第一颜色模式的图像,第四样本图像可以是指第二颜色模式的图像。
例如,第一颜色模式可以是指RGB颜色模式,第二颜色模式可以是指CMY颜色模式;或者,第二颜色模式还可以是指其他颜色模式。
示例性地,第一样本拜耳阵列图像可以是RGGB图像,对RGGB图像进行去马赛克处理,得到RGB图像;第二样本拜耳阵列图像可以是CYYM图像,对CYYM图像进行去马赛克处理,得到CYM图像。
示例性地,第一样本拜耳阵列图像可以是RGGB图像,对RGGB图像进行去马赛克处理,得到RGB图像;第二样本拜耳阵列图像可以是CYGM图像,对CYGM图像进行去马赛克处理,得到CYGM图像。
步骤S540、根据第三样本图像与第四样本图像,得到样本颜色校正矩阵。
例如,通过将第三样本图像中的像素转换为第四样本图像中的像素,可以得到样本颜色校正矩阵。
示例性地,第三样本图像可以是RGB图像,第四样本图像可以是CMY图像;对于RGB图像中的每个像素可以对应一组RGB值,CMY图像中的每个像素可以对应一组CMY值;将R像素对应的RGB值转换为C像素对应的CMY值可以得到一个3×1的矩阵;同理,将G像素对应的RGB值转换为M像素对应的CMY值可以得到一个3×1的矩阵;将B像素对应的RGB值转换为Y像素对应的CMY值可以得到一个3×1的矩阵;通过3个一维矩阵得到一个3×3的样本颜色校正矩阵。
可选地,为了便于对白平衡模型的训练,可以将3×3的样本颜色校正矩阵进行重组,得到一维向量即9×1的样本颜色校正矩阵。
步骤S550、对第二样本图像中色卡包括的中性色块对应的颜色值进行标定,将该颜色值标定为光源的颜色值。
应理解，由于中性色块的中性材料对各种波长的光反射是均匀的，因此任何颜色的光照射在中性色块上就反射该颜色的光；比如，红光照射在色卡的中性色块上就反射红光；绿光照射在色卡的中性色块上就反射绿光；因此可以采用色卡的中性色块来标记光源的颜色。
示例性地,中性色块可以是指色卡中的灰色色块;比如,将目标图像中灰色色块的RGB像素值标定为该光源的光源颜色,即该光源的RGB像素值。
示例性地,光源的颜色值的标定过程包括以下步骤:
步骤一：对第二样本图像进行分解，得到第三样本拜耳阵列图像（RGGB拜耳阵列图像）与第四样本拜耳阵列图像；
步骤二:对第三样本拜耳阵列图像与第四样本拜耳阵列图像进行去马赛克处理,得到RGB图像(第五样本图像的一个示例)与其他颜色模式的图像;
步骤三：将RGB图像中包括的色卡的中性色块对应的RGB像素值标定为光源的颜色值（第一像素信息的一个示例）。
示例性地,中性色块可以是指色卡中的灰色色块;比如,将目标图像中灰色色块的RGB像素值标定为该光源的光源颜色,即该光源的RGB像素值。
可选地,在一种可能的实现方式中,可以是对第二样本图像进行去马赛克算法处理,得到处理后的图像;在处理后的图像中获取色卡的中性色块对应的像素值中RGB的像素值。
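上述以中性色块像素值标定光源颜色值的过程，可以用如下示意性的Python代码草图表示（其中中性色块在图像中的位置假设为已知）：

```python
import numpy as np

def label_light_source(rgb_image, patch_box):
    """把色卡中性（灰色）色块区域的平均RGB值标定为光源的颜色值。

    patch_box为(y0, y1, x0, x1)，即中性色块在图像中的位置（假设已知）。
    """
    y0, y1, x0, x1 = patch_box
    patch = rgb_image[y0:y1, x0:x1].reshape(-1, 3)
    return patch.mean(axis=0)

# 偏红光源照射下的灰色色块：其反射光即为光源颜色
img = np.zeros((6, 6, 3))
img[2:4, 2:4] = [0.8, 0.5, 0.4]
light_rgb = label_light_source(img, (2, 4, 2, 4))
print(light_rgb)  # [0.8 0.5 0.4]
```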
步骤S560、通过以样本颜色校正矩阵为输入数据,以光源的颜色值为目标值对白平衡模型进行训练,得到训练后的白平衡模型。
示例性地,将样本颜色校正矩阵输入白平衡模型,得到预测白平衡参数(预测像素信息的一个示例);根据预测白平衡参数与光源的颜色值进行比较;通过反向传播算法对白平衡模型的参数进行迭代,直至白平衡模型收敛,得到训练后的白平衡模型。
例如,可以通过预测白平衡参数与光源的颜色值之间的角误差损失(Angular Error Loss)反向迭代训练白平衡模型。
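上述角误差损失（Angular Error Loss）的计算可以示意如下（假设预测白平衡参数与光源的颜色值均为三维的光源RGB向量）：

```python
import numpy as np

def angular_error(pred_rgb, true_rgb):
    """预测光源RGB与标定光源RGB之间的角误差（单位：度）。"""
    p = pred_rgb / np.linalg.norm(pred_rgb)
    t = true_rgb / np.linalg.norm(true_rgb)
    cos = np.clip(np.dot(p, t), -1.0, 1.0)  # 防止浮点误差越界
    return np.degrees(np.arccos(cos))

pred = np.array([0.9, 1.0, 0.8])
true = np.array([0.9, 1.0, 0.8])
print(angular_error(pred, true))  # 约为0
print(angular_error(np.array([1.0, 0.0, 0.0]),
                    np.array([0.0, 1.0, 0.0])))  # 约为90
```

角误差只比较颜色方向而与亮度无关，因此适合作为光源颜色估计的损失函数。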
可选地,在本申请的实施例中白平衡模型可以是全连接神经网络。
在本申请的实施例中,通过多光谱色彩过滤器阵列传感器采集同一光源,相同场景下不包括色卡的第一样本图像与包括色卡的第二样本图像;通过对第一样本图像处理可以得到训练白平衡模型的输入数据;通过第二样本图像可以得到训练白平衡模型的目标数据;通过本申请实施例的白平衡模型的训练方法得到的训练后的白平衡模型可以用于白平衡处理,根据白平衡模型输出的白平衡参数可以对多光谱色彩过滤器阵列传感器采集的Raw图进行白平衡处理;在本申请实施例中,由于训练白平衡模型的训练数据是通过多光谱色彩过滤器阵列传感器采集的图像,多光谱色彩过滤器阵列传感器采集的图像(例如,6通道Raw图)中的颜色信息多于单帧图像(例如,3通道图像)的颜色信息,因此训练得到的白平衡模型输出的用于白平衡处理的参数能够提高白平衡处理的准确性,提高图像的颜色准确性。
图9是本申请实施例提供的白平衡处理方法的效果示意图。
如图9所示，图9中的(a)是通过现有技术中的白平衡处理方法得到的输出图像；图9中的(b)是通过本申请实施例提供的白平衡处理方法得到的输出图像；从图9中的(a)所示的输出图像可以看出，色卡601的颜色出现了严重失真；与图9中的(a)所示的输出图像相比，图9中的(b)所示的输出图像对色卡601的颜色还原性较高，即通过本申请实施例提供的白平衡处理方法对图像进行白平衡处理，能够提高图像的颜色准确性。
在一个示例中，可以在电子设备的相机应用程序中开启色彩还原模式，则电子设备可以通过本申请实施例提供的白平衡模型得到白平衡参数，并根据白平衡参数对多光谱色彩过滤器阵列传感器采集的Raw图像进行白平衡处理，从而输出处理后的图像或者视频。
图10示出了电子设备的一种图形用户界面(graphical user interface,GUI)。
图10中的(a)所示的GUI为电子设备的桌面610；当电子设备检测到用户点击桌面610上的相机应用（application，APP）的图标620的操作后，可以启动相机应用，显示如图10中的(b)所示的另一GUI；图10中的(b)所示的GUI可以是相机APP在拍照模式下的显示界面，该GUI中可以包括拍摄界面630；拍摄界面630中可以包括取景框631与控件；比如，拍摄界面630中可以包括用于指示拍摄的控件632与用于指示设置的控件633；在预览状态下，该取景框631内可以实时显示预览图像；其中，预览状态下可以是指用户打开相机且未按下拍照/录像按钮之前，此时取景框内可以实时显示预览图。
在电子设备检测到用户点击设置的控件633的操作后，显示如图10中的(c)所示的设置界面；在设置界面中可以包括色彩还原模式控件634；检测到用户点击色彩还原模式控件634的操作后，电子设备可以开启色彩还原模式；在电子设备开启色彩还原模式后，可以通过本申请实施例提供的白平衡模型得到白平衡参数，并根据白平衡参数对多光谱色彩过滤器阵列传感器采集的Raw图像进行白平衡处理，从而输出处理后的图像。
在一个示例中,如图11所示在拍照模式下,拍摄界面630中还可以包括控件635,控件635用于指示开启/关闭色彩还原模式;在电子设备检测到用户点击控件635的操作后,电子设备可以开启色彩还原模式,通过本申请实施例提供的白平衡模型得到白平衡参数,并根据白平衡参数对多光谱色彩过滤器阵列传感器采集的Raw图像进行白平衡处理,从而输出处理后的图像或者视频。
应理解,上述举例说明是为了帮助本领域技术人员理解本申请实施例,而非要将本申请实施例限于所例示的具体数值或具体场景。本领域技术人员根据所给出的上述举例说明,显然可以进行各种等价的修改或变化,这样的修改或变化也落入本申请实施例的范围内。
上文结合图1至图11详细描述了本申请实施例提供的白平衡模型的训练方法与白平衡处理方法;下面将结合图12至图14详细描述本申请的装置实施例。应理解,本申请实施例中的装置可以执行前述本申请实施例的各种方法,即以下各种产品的具体工作过程,可以参考前述方法实施例中的对应过程。
图12是本申请实施例提供的一种电子设备的结构示意图。该电子设备700可以执行如图8所示的训练方法;该电子设备700包括获取模块710与处理模块720。
其中，所述获取模块710用于获取训练数据，其中，所述训练数据包括第一样本图像与第二样本图像，所述第一样本图像与所述第二样本图像是指在同一光源场景下多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像，所述第二样本图像中包括色卡；所述处理模块720用于对所述第一样本图像进行分解处理与去马赛克处理，得到第三样本图像与第四样本图像，所述第三样本图像为第一颜色模式的图像，所述第四样本图像为第二颜色模式的图像；根据所述第三样本图像与所述第四样本图像得到样本颜色校正矩阵，所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量；对所述第二样本图像进行分解处理与去马赛克处理，得到第五样本图像，所述第五样本图像为所述第一颜色模式的图像；以所述样本颜色校正矩阵为输入数据，以所述第五样本图像中的第一像素信息为目标数据训练白平衡模型，得到训练后的白平衡模型，其中，所述白平衡模型用于计算白平衡处理的参数，所述第一像素信息是指所述第五样本图像中所述色卡包括的中性色块对应的像素值。
可选地,作为一个实施例,所述处理模块720具体用于:
将所述样本颜色校正矩阵输入所述白平衡模型,得到预测像素信息;
根据所述预测像素信息与所述第一像素信息训练所述白平衡模型,得到所述训练后的白平衡模型。
可选地，作为一个实施例，所述训练后的白平衡模型的参数是根据所述预测像素信息与所述第一像素信息之间的差异通过反向传播算法进行迭代得到的。
可选地,作为一个实施例,所述多光谱色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器。
可选地,作为一个实施例,所述白平衡模型为全连接神经网络。
需要说明的是,上述电子设备700以功能模块的形式体现。这里的术语“模块”可以通过软件和/或硬件形式实现,对此不作具体限定。
例如,“模块”可以是实现上述功能的软件程序、硬件电路或二者结合。所述硬件电路可能包括应用特有集成电路(application specific integrated circuit,ASIC)、电子电路、用于执行一个或多个软件或固件程序的处理器(例如共享处理器、专有处理器或组处理器等)和存储器、合并逻辑电路和/或其它支持所描述的功能的合适组件。
图13是本申请实施例提供的一种电子设备的结构示意图。该电子设备800可以执行如图7所示的白平衡处理方法;该电子设备800包括显示模块810与处理模块820。
其中,所述显示模块810用于显示第一界面,所述第一界面包括第一控件;处理模块820用于检测到对所述第一控件的第一操作;响应于所述第一操作,获取第一图像,所述第一图像是指多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像;对所述第一图像进行分解处理与去马赛克处理,得到第二图像与第三图像,所述第二图像是第一颜色模式的图像,所述第三图像是第二颜色模式的图像;根据所述第二图像与所述第三图像得到颜色校正矩阵,所述颜色校正矩阵用于表示将所述第二图像转换为所述第三图像的像素变化量;将颜色校正矩阵输入至白平衡模型,得到白平衡参数;根据所述白平衡参数对所述第一图像进行图像处理,得到第四图像;其中,所述白平衡模型用于计算白平衡处理的参数,所述白平衡模型是通过以样本颜色校正矩阵为输入数据,以第一像素信息为目标数据训练得到的,所述样本颜色校正矩阵是根据第三样本图像与第四样本图像得到的,所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量,所述第三样本图像与所述第四样本图像是对第一样本图像进行所述分解处理与所述去马赛克处理得到的,所述第一像素信息是指第五样本图像中色卡包括的中性色块对应的像素值,所述第五样本图像是对第二样本图像进行所述分解处理与所述去马赛克处理得到的,所述第一样本图像与所述第二样本图像是指在同一光源场景下所述多光谱色彩过滤器阵列传感器采集的所述第一颜色空间的图像,所述第二样本图像中包括所述色卡。
可选地,作为一个实施例,所述处理模块820具体用于:
对所述第一图像进行所述分解处理,得到第一拜耳阵列图像与第二拜耳阵列图像,所述第一拜耳阵列图像为所述第一颜色模式的拜耳阵列图像,所述第二拜耳阵列图像为所述第二颜色模式的拜耳阵列图像;
对所述第一拜耳阵列图像进行所述去马赛克处理,得到所述第二图像;
对所述第二拜耳阵列图像进行所述去马赛克处理,得到所述第三图像。
可选地,作为一个实施例,所述第二图像包括第一像素、第二像素与第三像素,所述第三图像包括第四像素、第五像素与第六像素,所述处理模块820具体用于:
根据所述第一像素与所述第四像素之间的差值得到第一向量;
根据所述第二像素与所述第五像素之间的差值得到第二向量;
根据所述第三像素与所述第六像素之间的差值得到第三向量；
由所述第一向量、所述第二向量与所述第三向量组成所述颜色校正矩阵。
可选地，作为一个实施例，所述白平衡模型的参数是根据预测像素信息与所述第一像素信息之间的差异通过反向传播算法进行迭代得到的，所述预测像素信息是指将所述样本颜色校正矩阵输入所述白平衡模型得到的输出信息。
可选地,作为一个实施例,所述多光谱色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器。
可选地,作为一个实施例,所述白平衡模型为全连接神经网络。
可选地,作为一个实施例,所述第一界面是指所述电子设备的主屏界面,所述主屏界面包括相机应用程序,所述第一控件是指所述相机应用程序对应的控件。
可选地,作为一个实施例,所述第一界面是指拍照界面,所述第一控件是指用于指示拍照的控件。
可选地,作为一个实施例,所述第一界面是指拍摄视频界面,所述第一控件是指用于指示拍摄视频的控件。
需要说明的是,上述电子设备800以功能模块的形式体现。这里的术语“模块”可以通过软件和/或硬件形式实现,对此不作具体限定。
例如,“模块”可以是实现上述功能的软件程序、硬件电路或二者结合。所述硬件电路可能包括应用特有集成电路(application specific integrated circuit,ASIC)、电子电路、用于执行一个或多个软件或固件程序的处理器(例如共享处理器、专有处理器或组处理器等)和存储器、合并逻辑电路和/或其它支持所描述的功能的合适组件。
因此,在本申请的实施例中描述的各示例的单元,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
图14示出了本申请提供的一种电子设备的结构示意图。图14中的虚线表示该单元或该模块为可选的;电子设备900可以用于实现上述方法实施例中描述的方法。
电子设备900包括一个或多个处理器901,该一个或多个处理器901可支持电子设备900实现方法实施例中的白平衡模型的训练方法,或者白平衡处理方法。处理器901可以是通用处理器或者专用处理器。例如,处理器901可以是中央处理器(central processing unit,CPU)、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现场可编程门阵列(field programmable gate array,FPGA)或者其它可编程逻辑器件,如分立门、晶体管逻辑器件或分立硬件组件。
处理器901可以用于对电子设备900进行控制，执行软件程序，处理软件程序的数据。电子设备900还可以包括通信单元905，用以实现信号的输入（接收）和输出（发送）。
例如,电子设备900可以是芯片,通信单元905可以是该芯片的输入和/或输出电路,或者,通信单元905可以是该芯片的通信接口,该芯片可以作为终端设备或其它电子设备的组成部分。
又例如,电子设备900可以是终端设备,通信单元905可以是该终端设备的收发器,或者,通信单元905可以是该终端设备的收发电路。
电子设备900中可以包括一个或多个存储器902,其上存有程序904,程序904可被处理器901运行,生成指令903,使得处理器901根据指令903执行上述方法实施例中描述的训练方法,或者白平衡处理方法。
可选地,存储器902中还可以存储有数据。可选地,处理器901还可以读取存储器902中存储的数据,该数据可以与程序904存储在相同的存储地址,该数据也可以与程序904存储在不同的存储地址。
处理器901和存储器902可以单独设置,也可以集成在一起,例如,集成在终端设备的系统级芯片(system on chip,SOC)上。
示例性地,存储器902可以用于存储本申请实施例中提供的白平衡模型的训练方法的相关程序904,处理器901可以用于在执行训练白平衡模型时调用存储器902中存储的白平衡模型的训练方法的相关程序904,执行本申请实施例的白平衡模型的训练方法;例如,获取训练数据,其中,所述训练数据包括第一样本图像与第二样本图像,所述第一样本图像与所述第二样本图像是指在同一光源场景下多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像,所述第二样本图像中包括色卡;对所述第一样本图像进行分解处理与去马赛克处理,得到第三样本图像与第四样本图像,所述第三样本图像为第一颜色模式的图像,所述第四样本图像为第二颜色模式的图像;根据所述第三样本图像与所述第四样本图像得到样本颜色校正矩阵,所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量;对所述第二样本图像进行分解处理与去马赛克处理,得到第五样本图像,所述第五样本图像为所述第一颜色模式的图像;以所述样本颜色校正矩阵为输入数据,以所述第五样本图像中的第一像素信息为目标数据训练白平衡模型,得到训练后的白平衡模型,其中,所述白平衡模型用于计算白平衡处理的参数,所述第一像素信息是指所述第五样本图像中所述色卡包括的中性色块对应的像素值。
示例性地，存储器902可以用于存储本申请实施例中提供的白平衡处理方法的相关程序904，处理器901可以用于在执行白平衡处理时调用存储器902中存储的白平衡处理方法的相关程序904，执行本申请实施例的白平衡处理方法；例如，显示第一界面，所述第一界面包括第一控件；检测到对所述第一控件的第一操作；响应于所述第一操作，获取第一图像，所述第一图像是指多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像；对所述第一图像进行分解处理与去马赛克处理，得到第二图像与第三图像，所述第二图像是第一颜色模式的图像，所述第三图像是第二颜色模式的图像；根据所述第二图像与所述第三图像得到颜色校正矩阵，所述颜色校正矩阵用于表示将所述第二图像转换为所述第三图像的像素变化量；将颜色校正矩阵输入至白平衡模型，得到白平衡参数；根据所述白平衡参数对所述第一图像进行图像处理，得到第四图像；其中，所述白平衡模型用于计算白平衡处理的参数，所述白平衡模型是通过以样本颜色校正矩阵为输入数据，以第一像素信息为目标数据训练得到的，所述样本颜色校正矩阵是根据第三样本图像与第四样本图像得到的，所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量，所述第三样本图像与所述第四样本图像是对第一样本图像进行所述分解处理与所述去马赛克处理得到的，所述第一像素信息是指第五样本图像中色卡包括的中性色块对应的像素值，所述第五样本图像是对第二样本图像进行所述分解处理与所述去马赛克处理得到的，所述第一样本图像与所述第二样本图像是指在同一光源场景下所述多光谱色彩过滤器阵列传感器采集的所述第一颜色空间的图像，所述第二样本图像中包括所述色卡。
本申请还提供了一种计算机程序产品,该计算机程序产品被处理器901执行时实现本申请中任一方法实施例所述的训练方法或者白平衡处理方法。
该计算机程序产品可以存储在存储器902中,例如是程序904,程序904经过预处理、编译、汇编和链接等处理过程最终被转换为能够被处理器901执行的可执行目标文件。
本申请还提供了一种计算机可读存储介质,其上存储有计算机程序,该计算机程序被计算机执行时实现本申请中任一方法实施例所述的白平衡处理方法。该计算机程序可以是高级语言程序,也可以是可执行目标程序。
该计算机可读存储介质例如是存储器902。存储器902可以是易失性存储器或非易失性存储器,或者,存储器902可以同时包括易失性存储器和非易失性存储器。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(dynamic RAM,DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。
本领域的技术人员可以清楚地了解到,为了描述的方便和简洁,上述描述的装置和设备的具体工作过程以及产生的技术效果,可以参考前述方法实施例中对应的过程和技术效果,在此不再赘述。
在本申请所提供的几个实施例中,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的方法实施例的一些特征可以忽略,或不执行。以上所描述的装置实施例仅仅是示意性的,单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,多个单元或组件可以结合或者可以集成到另一个系统。另外,各单元之间的耦合或各个组件之间的耦合可以是直接耦合,也可以是间接耦合,上述耦合包括电的、机械的或其它形式的连接。
应理解，在本申请的各种实施例中，各过程的序号的大小并不意味着执行顺序的先后，各过程的执行顺序应以其功能和内在逻辑确定，而不应对本申请的实施例的实施过程构成任何限定。
另外,本文中术语“系统”和“网络”在本文中常被可互换使用。本文中的术语“和/或”,仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,本文中字符“/”,一般表示前后关联对象是一种“或”的关系。
总之,以上所述仅为本申请技术方案的较佳实施例而已,并非用于限定本申请的保护范围。凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (13)

  1. 一种白平衡处理方法,其特征在于,应用于电子设备,包括:
    显示第一界面,所述第一界面包括第一控件;
    检测到对所述第一控件的第一操作;
    响应于所述第一操作,获取第一图像,所述第一图像是指多光谱色彩过滤器阵列传感器采集的第一颜色空间的图像;
    对所述第一图像进行分解处理与去马赛克处理,得到第二图像与第三图像,所述第二图像是第一颜色模式的图像,所述第三图像是第二颜色模式的图像;
    根据所述第二图像与所述第三图像得到颜色校正矩阵,所述颜色校正矩阵用于表示将所述第二图像转换为所述第三图像的像素变化量;
    将所述颜色校正矩阵输入至白平衡模型,得到白平衡参数;
    根据所述白平衡参数对所述第一图像进行图像处理,得到第四图像;
    其中,所述白平衡模型用于计算白平衡处理的参数,所述白平衡模型是通过以样本颜色校正矩阵为输入数据,以第一像素信息为目标数据训练得到的,所述样本颜色校正矩阵是根据第三样本图像与第四样本图像得到的,所述样本颜色校正矩阵用于表示将所述第三样本图像转换为所述第四样本图像的像素变化量,所述第三样本图像与所述第四样本图像是对第一样本图像进行所述分解处理与所述去马赛克处理得到的,所述第一像素信息是指第五样本图像中色卡包括的中性色块对应的像素值,所述第五样本图像是对第二样本图像进行所述分解处理与所述去马赛克处理得到的,所述第一样本图像与所述第二样本图像是指在同一光源场景下所述多光谱色彩过滤器阵列传感器采集的所述第一颜色空间的图像,所述第二样本图像中包括所述色卡。
  2. 如权利要求1所述的白平衡处理方法,其特征在于,所述对所述第一图像进行分解处理与去马赛克处理,得到第二图像与第三图像,包括:
    对所述第一图像进行所述分解处理,得到第一拜耳阵列图像与第二拜耳阵列图像,所述第一拜耳阵列图像为所述第一颜色模式的拜耳阵列图像,所述第二拜耳阵列图像为所述第二颜色模式的拜耳阵列图像;
    对所述第一拜耳阵列图像进行所述去马赛克处理,得到所述第二图像;
    对所述第二拜耳阵列图像进行所述去马赛克处理,得到所述第三图像。
  3. 如权利要求1或2所述的白平衡处理方法,其特征在于,所述第二图像包括第一像素、第二像素与第三像素,所述第三图像包括第四像素、第五像素与第六像素,所述根据所述第二图像与所述第三图像得到颜色校正矩阵,包括:
    根据所述第一像素与所述第四像素之间的差值得到第一向量;
    根据所述第二像素与所述第五像素之间的差值得到第二向量;
    根据所述第三像素与所述第六像素之间的差值得到第三向量；
    由所述第一向量、所述第二向量与所述第三向量组成所述颜色校正矩阵。
  4. 如权利要求1至3中任一项所述的白平衡处理方法，其特征在于，所述白平衡模型的参数是根据预测像素信息与所述第一像素信息之间的差异通过反向传播算法进行迭代得到的，所述预测像素信息是指将所述样本颜色校正矩阵输入所述白平衡模型得到的输出信息。
  5. 如权利要求1至4中任一项所述的白平衡处理方法,其特征在于,所述多光谱色彩过滤器阵列传感器是指在像素传感器上方覆盖马赛克滤色镜阵列的传感器。
  6. 如权利要求1至5中任一项所述的白平衡处理方法,其特征在于,所述白平衡模型为全连接神经网络。
  7. 如权利要求1至6中任一项所述的白平衡处理方法,其特征在于,所述第一界面是指所述电子设备的主屏界面,所述主屏界面包括相机应用程序,所述第一控件是指所述相机应用程序对应的控件。
  8. 如权利要求1至6中任一项所述的白平衡处理方法,其特征在于,所述第一界面是指拍照界面,所述第一控件是指用于指示拍照的控件。
  9. 如权利要求1至6中任一项所述的白平衡处理方法,其特征在于,所述第一界面是指拍摄视频界面,所述第一控件是指用于指示拍摄视频的控件。
  10. 一种电子设备,其特征在于,应用于白平衡处理,所述电子设备包括:
    一个或多个处理器和存储器;
    所述存储器与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行如权利要求1至9中任一项所述的白平衡处理方法。
  11. 一种芯片系统,其特征在于,所述芯片系统应用于电子设备,所述芯片系统包括一个或多个处理器,所述处理器用于调用计算机指令以使得所述电子设备执行如权利要求1至9中任一项所述的方法。
  12. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储了计算机程序,当所述计算机程序被处理器执行时,使得处理器执行权利要求1至9中任一项所述的方法。
  13. 一种计算机程序产品,其特征在于,所述计算机程序产品包括计算机程序代码,当所述计算机程序代码被处理器执行时,使得处理器执行权利要求1至9中任一项所述的方法。
PCT/CN2022/117586 2021-09-15 2022-09-07 白平衡处理方法与电子设备 WO2023040725A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/006,180 US20240129446A1 (en) 2021-09-15 2022-09-07 White Balance Processing Method and Electronic Device
EP22839975.4A EP4175275A4 (en) 2021-09-15 2022-09-07 WHITE BALANCE PROCESSING METHOD AND ELECTRONIC DEVICE

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202111081629 2021-09-15
CN202111081629.X 2021-09-15
CN202111560454.0 2021-12-20
CN202111560454.0A CN115835034B (zh) White balance processing method and electronic device

Publications (1)

Publication Number Publication Date
WO2023040725A1 (zh) 2023-03-23

Family ID: 85382660

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/117586 WO2023040725A1 (zh) 2021-09-15 2022-09-07 White balance processing method and electronic device

Country Status (4)

Country Link
US (1) US20240129446A1 (zh)
EP (1) EP4175275A4 (zh)
CN (1) CN115835034B (zh)
WO (1) WO2023040725A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102404582A (zh) * 2010-09-01 2012-04-04 Apple Inc. Flexible color space selection for automatic white balance processing
CN107578390A (zh) * 2017-09-14 2018-01-12 Changsha Quandu Imaging Technology Co., Ltd. Method and apparatus for image white balance correction using a neural network
CN109523485A (zh) * 2018-11-19 2019-03-26 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image color correction method and apparatus, storage medium, and mobile terminal
US20210058596A1 (en) * 2019-08-22 2021-02-25 Mahmoud Afifi Systems and methods for sensor-independent illuminant determination
CN112565728A (zh) * 2020-12-22 2021-03-26 BOE Technology Group Co., Ltd. White balance adjustment method, system, and apparatus

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4534340B2 (ja) * 2000-10-31 2010-09-01 Sony Corporation Color reproduction correction device
US8588522B2 (en) * 2011-04-13 2013-11-19 Hewlett-Packard Development Company, L.P. Method and system for dynamic color correction
JP5744945B2 (ja) * 2013-03-25 2015-07-08 Canon Inc. Image processing apparatus and method, and imaging apparatus
CN108377373A (zh) * 2018-05-10 2018-08-07 Hangzhou Xiongmai Integrated Circuit Technology Co., Ltd. Pixel-based color restoration apparatus and method
GB201903816D0 (en) * 2019-03-20 2019-05-01 Spectral Edge Ltd Multispectral image decorrelation method and system
CN111314684B (zh) * 2020-04-13 2021-06-29 Hangzhou Nationalchip Science & Technology Co., Ltd. White balance correction method based on metamerism

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4175275A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116761082A (zh) * 2023-08-22 2023-09-15 Honor Device Co., Ltd. Image processing method and apparatus
CN116761082B (zh) * 2023-08-22 2023-11-14 Honor Device Co., Ltd. Image processing method and apparatus

Also Published As

Publication number Publication date
CN115835034B (zh) 2024-04-05
EP4175275A1 (en) 2023-05-03
US20240129446A1 (en) 2024-04-18
CN115835034A (zh) 2023-03-21
EP4175275A4 (en) 2024-02-07

Similar Documents

Publication Publication Date Title
CN115550570B (zh) Image processing method and electronic device
CN114693580B (zh) Image processing method and related device
US11825179B2 (en) Auto exposure for spherical images
CN115802183B (zh) Image processing method and related device
WO2023040725A1 (zh) White balance processing method and electronic device
WO2023060921A1 (zh) Image processing method and electronic device
CN116437198B (zh) Image processing method and electronic device
WO2023124202A1 (zh) Image processing method and electronic device
CN117135471A (zh) Image processing method and electronic device
CN115550575B (zh) Image processing method and related device
CN115767290A (zh) Image processing method and electronic device
US20230342977A1 (en) Method for Determining Chromaticity Information and Related Electronic Device
CN116668862A (zh) Image processing method and electronic device
US20230058472A1 (en) Sensor prioritization for composite image capture
CN116258633A (zh) Image reflection removal method, and training method and apparatus for an image reflection removal model
CN116128739A (zh) Training method for a downsampling model, image processing method and apparatus
CN114298889A (zh) Image processing circuit and image processing method
CN115955611B (zh) Image processing method and electronic device
CN115767287B (zh) Image processing method and electronic device
JP2020191546A (ja) Image processing apparatus, image processing method, and program
EP4231621A1 (en) Image processing method and electronic device
CN116029914B (zh) Image processing method and electronic device
CN116668838B (zh) Image processing method and electronic device
CN115633262B (zh) Image processing method and electronic device
WO2023160190A1 (zh) Automatic exposure method and electronic device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 18006180

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2022839975

Country of ref document: EP

Effective date: 20230119

NENP Non-entry into the national phase

Ref country code: DE