WO2021179142A1 - Image processing method and related apparatus - Google Patents

Image processing method and related apparatus

Info

Publication number
WO2021179142A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
adjustment factor
adjusted
channel data
chrominance
Prior art date
Application number
PCT/CN2020/078476
Other languages
English (en)
French (fr)
Inventor
李蒙
胡慧
郑成林
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司
Priority to PCT/CN2020/078476
Priority to CN202080096698.5A (publication CN115088252A)
Publication of WO2021179142A1

Classifications

    • H  ELECTRICITY
    • H04  ELECTRIC COMMUNICATION TECHNIQUE
    • H04N  PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00  Details of colour television systems
    • H04N 9/64  Circuits for processing colour signals

Definitions

  • the present invention relates to the field of image processing technology, in particular to an image processing method and related devices.
  • the image signal processor (Image Signal Processor, ISP) is an important component of the camera device.
  • the camera device can obtain the Bayer image corresponding to the target scene through the lens.
  • the Bayer image is converted from analog to digital to obtain the digital image signal (i.e., RAW data), and the RAW data is then passed through the ISP for a series of calculation optimizations, such as noise reduction, color adjustment, brightness adjustment, and exposure adjustment, to finally generate the target image for display on the display screen.
  • Deep learning is mainly based on various methods of artificial neural networks to achieve computational optimization, and its applications are becoming more and more extensive.
  • in the prior art, multiple processing steps such as noise reduction, color adjustment, and brightness adjustment are usually executed serially, for example, through a single neural network that implements multiple processing steps executed in a serial manner to process RAW data into the target image, or through multiple neural networks connected in series (each neural network implementing a different processing step) to process RAW data into the target image.
  • in the above approaches that process images through one neural network or multiple serial neural networks, the relevant parameters of each neural network are fixed, so the effect of the obtained target image is also fixed and cannot be adjusted when the effect is poor or when it is necessary to switch between different image effects.
  • in addition, when images are processed through multiple serial neural networks, the input data of a later neural network depends entirely on the output data of the previous neural network, so the later neural networks become more difficult to train.
  • the embodiment of the invention discloses an image processing method and related devices, which can simplify the image processing process and improve the quality of the output image.
  • an embodiment of the present application provides an image processing method.
  • the method includes: acquiring a linear RGB image, a global processing correlation matrix, and a scale adjustment factor.
  • the scale adjustment factor includes a contrast adjustment factor and a chroma adjustment factor.
  • the contrast adjustment factor is used to adjust contrast and noise reduction, and the chroma adjustment factor is used to adjust saturation and noise reduction; perform global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright color separation image;
  • adjust the luminance channel data of the bright-color separated image according to the contrast adjustment factor, and adjust the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor, to obtain an adjusted bright-color separated image.
  • in the above method, the linear RGB image is subjected to color conversion matrix conversion, gamma correction and color space conversion matrix conversion to obtain a bright-color separated image, and then the luminance channel data and the chrominance channel data of the bright-color separated image are synchronously optimized based on the corresponding contrast adjustment factor and chrominance adjustment factor to obtain the adjusted bright-color separated image.
  • on the one hand, since the processing of the chrominance channel data and the processing of the luminance channel data are performed in parallel, the processing efficiency is high, which is also conducive to the adjustment of the corresponding data.
  • on the other hand, the contrast adjustment and noise reduction can be completed at one time based on the contrast adjustment factor, and the saturation adjustment and noise reduction can be completed at one time based on the chrominance adjustment factor; this way of processing the luminance channel data and the chrominance channel data is simpler and more efficient.
  • the global processing correlation matrix includes a color correction matrix, an automatic white balance matrix, a gamma correction coefficient, and a color space conversion matrix;
  • performing global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright-color separated image includes: performing white balance calibration on the linear RGB image according to the automatic white balance matrix to obtain a white-balance-calibrated linear RGB image; performing color correction on the white-balance-calibrated linear RGB image according to the color correction matrix to obtain a color-corrected linear RGB image; performing gamma correction on the color-corrected linear RGB image according to the gamma correction coefficient to obtain a non-linear RGB image; and performing color space conversion on the non-linear RGB image according to the color space conversion matrix to obtain a bright-color separated image.
  • adjusting the luminance channel data of the bright-color separated image according to the contrast adjustment factor, and adjusting the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain an adjusted bright-color separated image, includes: performing contrast and noise reduction adjustment on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data; performing saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data; and determining the adjusted bright-color separated image according to the adjusted luminance channel data and the adjusted chrominance channel data.
  • performing contrast and noise reduction adjustment on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data includes: determining intermediate data of the luminance channel according to the luminance channel data in the bright-color separated image, the contrast adjustment factor of the luminance channel, and a configured second optimization parameter corresponding to the luminance channel; and determining the adjusted luminance channel data according to the intermediate data of the luminance channel and a target parameter, where the target parameter is a parameter obtained by optimizing the detail enhancement factor of the luminance channel according to a configured third optimization parameter.
  • in this method, the second optimization parameter and the third optimization parameter are introduced and manually configured, so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
  • performing saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data includes: determining adjusted at least two channels of chrominance channel data according to at least two channels of chrominance channel data in the bright-color separated image, the respective chrominance adjustment factors of the at least two chrominance channels, and configured at least two first optimization parameters respectively corresponding to the at least two chrominance channels.
  • in this method, the first optimization parameter is introduced and manually configured, so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
  • the bright color separation image is a YUV image
  • the linear RGB image linearRGB and the adjusted bright-color separated image YUV′ satisfy the following relationship:
  • linearRGB_awb = linearRGB * T_awb
  • linearRGB_ccm = linearRGB_awb * T_ccm
  • sRGB = (linearRGB_ccm)^(1.0/2.2)
  • YUV = sRGB * T_csc
  • Y″ = Y * Deta_Y * YRatio
  • Sharpness″ = Deta_S * Sharpness
  • Y′ = Y″ + Sharpness″
  • U′ = U * Deta_U * URatio
  • V′ = V * Deta_V * URatio
  • where T_awb is the automatic white balance matrix, T_ccm is the color correction matrix, 1.0/2.2 is the correction parameter of the gamma correction, sRGB is the non-linear RGB image, YUV is the bright-color separated image, T_csc is the color space conversion matrix, Y is the luminance channel data of the bright-color separated image YUV, U is one chrominance channel data of the bright-color separated image YUV, V is another chrominance channel data of the bright-color separated image YUV, YRatio is the contrast adjustment factor of the luminance channel, URatio is the chrominance adjustment factor of the one chrominance channel, VRatio is the chrominance adjustment factor of the other chrominance channel, Sharpness is the detail enhancement factor of the luminance channel, Deta_Y is the configured third optimization parameter corresponding to the luminance channel, Deta_U is the configured first optimization parameter corresponding to the one chrominance channel, Deta_V is the configured first optimization parameter corresponding to the other chrominance channel, Deta_S is the configured third optimization parameter corresponding to the detail enhancement factor Sharpness, Y″ is the intermediate data of the luminance channel, Sharpness″ is the target parameter, U′ is the adjusted data of the one chrominance channel, V′ is the adjusted data of the other chrominance channel, Y′ is the adjusted data of the luminance channel, and Y′, U′ and V′ form the adjusted bright-color separated image YUV′.
  • acquiring a linear RGB image, a global processing correlation matrix and a scale adjustment factor includes: performing demosaic processing and/or denoising processing on a source image through a low-level network to obtain the linear RGB image; extracting the global processing correlation matrix from the source image through a global processing network; and extracting the scale adjustment factor from the source image through a scale adjustment network.
  • specifically, the low-level network, the global processing network, and the scale adjustment network can be executed in parallel, so they are independent of each other and do not affect each other, which minimizes the output error of the networks.
  • the low-level network, the global processing network, and the scale adjustment network are all convolutional neural networks.
  • an embodiment of the present application provides an image processing device that includes a processor and a memory, where the memory is used to store a computer program, and the processor is used to call the computer program to perform the following operations: obtain a linear RGB image, a global processing correlation matrix, and a scale adjustment factor.
  • the scale adjustment factor includes a contrast adjustment factor and a chroma adjustment factor.
  • the contrast adjustment factor is used to adjust contrast and noise reduction, and the chrominance adjustment factor is used to adjust saturation and noise reduction; perform global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright-color separated image; adjust the luminance channel data of the bright-color separated image according to the contrast adjustment factor, and adjust the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor, to obtain an adjusted bright-color separated image.
  • in the above operations, the linear RGB image is subjected to color conversion matrix conversion, gamma correction and color space conversion matrix conversion to obtain a bright-color separated image, and then the luminance channel data and the chrominance channel data of the bright-color separated image are synchronously optimized based on the corresponding contrast adjustment factor and chrominance adjustment factor to obtain the adjusted bright-color separated image.
  • on the one hand, since the processing of the chrominance channel data and the processing of the luminance channel data are performed in parallel, the processing efficiency is high, which is also conducive to the adjustment of the corresponding data.
  • on the other hand, the contrast adjustment and noise reduction can be completed at one time based on the contrast adjustment factor, and the saturation adjustment and noise reduction can be completed at one time based on the chrominance adjustment factor; this way of processing the luminance channel data and the chrominance channel data is simpler and more efficient.
  • the global processing correlation matrix includes a color correction matrix, an automatic white balance matrix, a gamma correction coefficient, and a color space conversion matrix;
  • the processor is specifically configured to: perform white balance calibration on the linear RGB image according to the automatic white balance matrix to obtain a white-balance-calibrated linear RGB image; perform color correction on the white-balance-calibrated linear RGB image according to the color correction matrix to obtain a color-corrected linear RGB image; perform gamma correction on the color-corrected linear RGB image according to the gamma correction coefficient to obtain a non-linear RGB image; and perform color space conversion on the non-linear RGB image according to the color space conversion matrix to obtain a bright-color separated image.
  • in terms of adjusting the luminance channel data of the bright-color separated image according to the contrast adjustment factor and adjusting the chrominance channel data according to the chrominance adjustment factor to obtain an adjusted bright-color separated image, the processor is specifically configured to: perform contrast and noise reduction adjustment on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data; perform saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data; and determine the adjusted bright-color separated image according to the adjusted luminance channel data and the adjusted chrominance channel data.
  • in terms of performing contrast and noise reduction adjustment on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data, the processor is specifically configured to: determine intermediate data of the luminance channel according to the luminance channel data in the bright-color separated image, the contrast adjustment factor of the luminance channel, and a configured second optimization parameter corresponding to the luminance channel; and determine the adjusted luminance channel data according to the intermediate data of the luminance channel and a target parameter, where the target parameter is a parameter obtained by optimizing the detail enhancement factor of the luminance channel according to a configured third optimization parameter.
  • in this method, the second optimization parameter and the third optimization parameter are introduced and manually configured, so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
  • in terms of performing saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data, the processor is specifically configured to: determine adjusted at least two channels of chrominance channel data according to at least two channels of chrominance channel data in the bright-color separated image, the respective chrominance adjustment factors of the at least two chrominance channels, and configured at least two first optimization parameters respectively corresponding to the at least two chrominance channels.
  • in this method, the first optimization parameter is introduced and manually configured, so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
  • the bright color separation image is a YUV image
  • the linear RGB image linearRGB and the adjusted bright-color separated image YUV′ satisfy the following relationship:
  • linearRGB_awb = linearRGB * T_awb
  • linearRGB_ccm = linearRGB_awb * T_ccm
  • sRGB = (linearRGB_ccm)^(1.0/2.2)
  • YUV = sRGB * T_csc
  • Y″ = Y * Deta_Y * YRatio
  • Sharpness″ = Deta_S * Sharpness
  • Y′ = Y″ + Sharpness″
  • U′ = U * Deta_U * URatio
  • V′ = V * Deta_V * URatio
  • where T_awb is the automatic white balance matrix, T_ccm is the color correction matrix, 1.0/2.2 is the correction parameter of the gamma correction, sRGB is the non-linear RGB image, YUV is the bright-color separated image, T_csc is the color space conversion matrix, Y is the luminance channel data of the bright-color separated image YUV, U is one chrominance channel data of the bright-color separated image YUV, V is another chrominance channel data of the bright-color separated image YUV, YRatio is the contrast adjustment factor of the luminance channel, URatio is the chrominance adjustment factor of the one chrominance channel, VRatio is the chrominance adjustment factor of the other chrominance channel, Sharpness is the detail enhancement factor of the luminance channel, Deta_Y is the configured third optimization parameter corresponding to the luminance channel, Deta_U is the configured first optimization parameter corresponding to the one chrominance channel, Deta_V is the configured first optimization parameter corresponding to the other chrominance channel, Deta_S is the configured third optimization parameter corresponding to the detail enhancement factor Sharpness, Y″ is the intermediate data of the luminance channel, Sharpness″ is the target parameter, U′ is the adjusted data of the one chrominance channel, V′ is the adjusted data of the other chrominance channel, Y′ is the adjusted data of the luminance channel, and Y′, U′ and V′ form the adjusted bright-color separated image YUV′.
  • the processor is specifically configured to: perform demosaic processing and/or denoising processing on a source image through a low-level network to obtain the linear RGB image; extract the global processing correlation matrix from the source image through a global processing network; and extract the scale adjustment factor from the source image through a scale adjustment network.
  • specifically, the low-level network, the global processing network, and the scale adjustment network can be executed in parallel, so they are independent of each other and do not affect each other, which minimizes the output error of the networks.
  • the low-level network, the global processing network, and the scale adjustment network are all convolutional neural networks.
  • an embodiment of the present application provides an image processing device.
  • the device includes: an acquisition unit configured to acquire a linear RGB image, a global processing correlation matrix, and a scale adjustment factor, the scale adjustment factor including a contrast adjustment factor and a chrominance adjustment factor, where the contrast adjustment factor is used to adjust contrast and noise reduction, and the chrominance adjustment factor is used to adjust saturation and noise reduction;
  • a global processing unit configured to perform global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright-color separated image; and
  • an adjustment unit configured to adjust the luminance channel data of the bright-color separated image according to the contrast adjustment factor, and adjust the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor, to obtain an adjusted bright-color separated image.
  • in the above device, the linear RGB image is subjected to color conversion matrix conversion, gamma correction and color space conversion matrix conversion to obtain a bright-color separated image, and then the luminance channel data and the chrominance channel data of the bright-color separated image are synchronously optimized based on the corresponding contrast adjustment factor and chrominance adjustment factor to obtain the adjusted bright-color separated image.
  • on the one hand, since the processing of the chrominance channel data and the processing of the luminance channel data are performed in parallel, the processing efficiency is high, which is also conducive to the adjustment of the corresponding data.
  • on the other hand, the contrast adjustment and noise reduction can be completed at one time based on the contrast adjustment factor, and the saturation adjustment and noise reduction can be completed at one time based on the chrominance adjustment factor; this way of processing the luminance channel data and the chrominance channel data is simpler and more efficient.
  • the global processing correlation matrix includes a color correction matrix, an automatic white balance matrix, a gamma correction coefficient, and a color space conversion matrix;
  • the global processing unit is specifically configured to: perform white balance calibration on the linear RGB image according to the automatic white balance matrix to obtain a white-balance-calibrated linear RGB image; perform color correction on the white-balance-calibrated linear RGB image according to the color correction matrix to obtain a color-corrected linear RGB image; perform gamma correction on the color-corrected linear RGB image according to the gamma correction coefficient to obtain a non-linear RGB image; and perform color space conversion on the non-linear RGB image according to the color space conversion matrix to obtain a bright-color separated image.
  • in terms of adjusting the luminance channel data of the bright-color separated image according to the contrast adjustment factor and adjusting the chrominance channel data according to the chrominance adjustment factor to obtain an adjusted bright-color separated image, the adjustment unit is specifically configured to: perform contrast and noise reduction adjustment on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data; perform saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data; and determine the adjusted bright-color separated image according to the adjusted luminance channel data and the adjusted chrominance channel data.
  • in terms of performing contrast and noise reduction adjustment on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data, the adjustment unit is specifically configured to: determine intermediate data of the luminance channel according to the luminance channel data in the bright-color separated image, the contrast adjustment factor of the luminance channel, and a configured second optimization parameter corresponding to the luminance channel; and determine the adjusted luminance channel data according to the intermediate data of the luminance channel and a target parameter, where the target parameter is a parameter obtained by optimizing the detail enhancement factor of the luminance channel according to a configured third optimization parameter.
  • in this method, the second optimization parameter and the third optimization parameter are introduced and manually configured, so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
  • in terms of performing saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data, the adjustment unit is specifically configured to: determine adjusted at least two channels of chrominance channel data according to at least two channels of chrominance channel data in the bright-color separated image, the respective chrominance adjustment factors of the at least two chrominance channels, and configured at least two first optimization parameters respectively corresponding to the at least two chrominance channels.
  • in this method, the first optimization parameter is introduced and manually configured, so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
  • the bright color separation image is a YUV image
  • the linear RGB image linearRGB and the adjusted bright-color separated image YUV′ satisfy the following relationship:
  • linearRGB_awb = linearRGB * T_awb
  • linearRGB_ccm = linearRGB_awb * T_ccm
  • sRGB = (linearRGB_ccm)^(1.0/2.2)
  • YUV = sRGB * T_csc
  • Y″ = Y * Deta_Y * YRatio
  • Sharpness″ = Deta_S * Sharpness
  • Y′ = Y″ + Sharpness″
  • U′ = U * Deta_U * URatio
  • V′ = V * Deta_V * URatio
  • where T_awb is the automatic white balance matrix, T_ccm is the color correction matrix, 1.0/2.2 is the correction parameter of the gamma correction, sRGB is the non-linear RGB image, YUV is the bright-color separated image, T_csc is the color space conversion matrix, Y is the luminance channel data of the bright-color separated image YUV, U is one chrominance channel data of the bright-color separated image YUV, V is another chrominance channel data of the bright-color separated image YUV, YRatio is the contrast adjustment factor of the luminance channel, URatio is the chrominance adjustment factor of the one chrominance channel, VRatio is the chrominance adjustment factor of the other chrominance channel, Sharpness is the detail enhancement factor of the luminance channel, Deta_Y is the configured third optimization parameter corresponding to the luminance channel, Deta_U is the configured first optimization parameter corresponding to the one chrominance channel, Deta_V is the configured first optimization parameter corresponding to the other chrominance channel, Deta_S is the configured third optimization parameter corresponding to the detail enhancement factor Sharpness, Y″ is the intermediate data of the luminance channel, Sharpness″ is the target parameter, U′ is the adjusted data of the one chrominance channel, V′ is the adjusted data of the other chrominance channel, Y′ is the adjusted data of the luminance channel, and Y′, U′ and V′ form the adjusted bright-color separated image YUV′.
  • the acquisition unit is specifically configured to: perform demosaic processing and/or denoising processing on a source image through a low-level network to obtain the linear RGB image; extract the global processing correlation matrix from the source image through a global processing network; and extract the scale adjustment factor from the source image through a scale adjustment network.
  • specifically, the low-level network, the global processing network, and the scale adjustment network can be executed in parallel, so they are independent of each other and do not affect each other, which minimizes the output error of the networks.
  • the low-level network, the global processing network, and the scale adjustment network are all convolutional neural networks.
  • embodiments of the present application provide a computer-readable storage medium, where the computer-readable storage medium is used to store a computer program, and when the computer program runs on a processor, the method described in the first aspect or any one of the possible implementations of the first aspect is implemented.
  • embodiments of the present application provide a computer program product; when the computer program product runs on a processor, the method described in the first aspect or any one of the possible implementations of the first aspect is implemented.
  • FIG. 1 is a schematic structural diagram of an image processing device provided by an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of an image processing method provided by an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of another image processing method provided by an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of another image processing method provided by an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of another image processing method provided by an embodiment of the present invention.
  • Fig. 6 is a schematic structural diagram of another image processing device provided by an embodiment of the present invention.
  • in the embodiments of the present application, "at least one" refers to one or more, and "multiple" refers to two or more.
  • "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist at the same time, or B exists alone, where A and B may be singular or plural.
  • "at least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or multiple items.
  • for example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c or a-b-c, where a, b, and c may be single or multiple.
  • the character "/" generally indicates an "or" relationship between the associated objects before and after it.
  • in addition, words such as "first" and "second" do not limit the quantity or execution order.
  • Fig. 1 is a schematic structural diagram of an image processing device provided by an embodiment of the application.
  • the image processing device may be a mobile phone, a tablet computer, a computer, a notebook computer, a video camera, a camera, a wearable device, a vehicle-mounted device, or a terminal device.
  • the above-mentioned devices are collectively referred to as image processing devices in this application.
  • the image processing device is a mobile phone as an example for description.
  • the mobile phone includes: a memory 101, a processor 102, a sensor component 103, a multimedia component 104, an audio component 105, a power supply component 106, and so on.
  • the memory 101 can be used to store data, software programs, and modules; it mainly includes a storage program area and a storage data area, where the storage program area can store an operating system and at least one application program required for a function, such as a sound playback function, an image playback function, etc. ;
  • the storage data area can store data created based on the use of the mobile phone, such as audio data, image data, phone book, etc.
  • the mobile phone may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the processor 102 is the control center of the mobile phone; it uses various interfaces and lines to connect the various parts of the entire device, and performs the various functions of the mobile phone and processes data, so as to monitor the mobile phone as a whole.
  • the processor 102 may have a single-processor structure or a multi-processor structure, and may be a single-threaded processor or a multi-threaded processor; in some feasible embodiments, the processor 102 may include a central processing unit, a general-purpose processor, a digital signal processor, a microcontroller, or a microprocessor, etc.
  • the processor 102 may further include other hardware circuits or accelerators, such as application specific integrated circuits, field programmable gate arrays or other programmable logic devices, transistor logic devices, hardware components, or any combination thereof.
  • the processor 102 may also be a combination for realizing computing functions, for example, including a combination of one or more microprocessors, a combination of a digital signal processor and a microprocessor, and so on.
  • the sensor component 103 includes one or more sensors, which are used to provide various aspects of the state evaluation for the mobile phone.
  • the sensor component 103 may include a light sensor, such as a Complementary Metal-Oxide-Semiconductor (CMOS) or Charge Coupled Device (CCD) image sensor, which is used to detect the distance between an external object and the mobile phone, or used in imaging applications, that is, as an integral part of a camera or video camera.
  • the sensor component 103 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor; through the sensor component 103, the acceleration/deceleration, orientation and open/close state of the mobile phone, the relative positioning of components, or the temperature change of the mobile phone, etc., can be detected.
  • the multimedia component 104 provides a screen with an output interface between the mobile phone and the user.
  • the screen may be a touch panel, and when the screen is a touch panel, the screen may be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touch, sliding, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure related to the touch or slide operation.
  • the multimedia component 104 further includes at least one camera.
  • the multimedia component 104 includes a front camera and/or a rear camera. When the mobile phone is in an operating mode, such as shooting mode or video mode, the front camera and/or the rear camera can receive external multimedia data.
  • Each front camera and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
  • the audio component 105 may provide an audio interface between the user and the mobile phone.
  • the audio component 105 may include an audio circuit, a speaker, and a microphone.
  • the audio circuit can transmit the electrical signal converted from the received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit and converted into audio data, and the audio data is then output to be sent to, for example, another mobile phone, or output to the processor 102 for further processing.
  • the power supply component 106 is used to provide power to various components of the mobile phone.
  • the power supply component 106 may include a power management system, one or more power supplies, and other components associated with the generation, management, and distribution of power by the mobile phone.
  • the mobile phone may also include a wireless fidelity (Wireless Fidelity, WiFi) module, a Bluetooth module, etc., which will not be repeated in the embodiment of the present application.
  • the structure shown in FIG. 1 does not constitute a limitation on the mobile phone, which may include more or fewer components than shown in the figure, or combine some components, or have a different component arrangement.
  • FIG. 2 is a schematic flowchart of an image processing method according to an embodiment of the application. The method may be executed by the image processing device shown in FIG. 1. Referring to FIG. 2, the method may include the following steps.
  • Step S201: Obtain a linear RGB image, a global processing correlation matrix, and a scale adjustment factor from the source image.
  • the source image is a RAW image, which may represent raw image data, and specifically may be a Bayer image corresponding to the target scene, or a RAW image in Bayer format obtained after the Bayer image is converted from analog to digital.
  • the Bayer image can be obtained by the sensor component 103 in the image processing device shown in FIG. 1.
  • a RAW image, with height H and width W, can include multiple pixel array units; one pixel array unit can include 2 green (G) pixels, 1 blue (B) pixel and 1 red (R) pixel, for example, 4 pixels forming one pixel array unit.
  • a pixel of a RAW image has only one color, that is, one of red, green, or blue, and each pixel has a pixel value.
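  • as an illustration of the pixel array unit described above (and not code from the patent), the following Python sketch splits a Bayer RAW frame into its four color planes; the RGGB layout and the function name are assumptions made for the example:
```python
import numpy as np

def split_bayer_rggb(raw: np.ndarray):
    """Split an H x W Bayer RAW frame into its four colour planes.

    Assumes an RGGB layout for each 2x2 pixel array unit:
        R  G
        G  B
    Other layouts (BGGR, GRBG, ...) only change the index offsets.
    """
    r  = raw[0::2, 0::2]   # red pixel of each 2x2 unit
    g1 = raw[0::2, 1::2]   # first green pixel
    g2 = raw[1::2, 0::2]   # second green pixel
    b  = raw[1::2, 1::2]   # blue pixel
    return r, g1, g2, b

# Toy usage: a 4x4 RAW frame contains four 2x2 pixel array units.
raw = np.arange(16, dtype=np.uint16).reshape(4, 4)
r, g1, g2, b = split_bayer_rggb(raw)
print(r.shape, g1.shape, g2.shape, b.shape)  # (2, 2) each
```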
  • the linear RGB image, the global processing correlation matrix, and the scale adjustment factor listed here can be obtained in parallel, that is, the acquisition of any one of them does not depend on whether the other listed parameters have been obtained; in addition, other parameters or data can be obtained besides the items listed here.
  • the various data exemplified here can be obtained through multiple data processing networks in parallel; for example, as shown in Figure 3, they can be processed respectively through a low-level network (LowLevel Net, LL Net), a global processing network (also called a color processing network, color Net), and a scale adjustment network (also called a brightness processing network, lum Net).
  • the low-level network LL Net performs demosaic processing and bayer denoising processing on the source image to obtain a linear RGB image.
  • a linear RGB image is an RGB image in which the color change of the pixel point can be represented by the linear change of the pixel value data.
  • the RGB image can also be called three primary color image data.
  • the global processing network extracts the color information in the source image and completes image white balance, color correction (CCM), and RGB2YUV conversion, thereby outputting the global processing correlation matrix, for example, the color correction matrix T_ccm, the automatic white balance matrix T_awb, the gamma correction coefficient 1.0/2.2, and the color space conversion matrix T_csc, etc.
  • the scale adjustment network extracts image scale adjustment factors, such as the contrast adjustment factor and the chrominance adjustment factors, from the source image (i.e., the digital image signal RAW in Bayer format); specifically, the scale adjustment network handles contrast adjustment, noise reduction and detail enhancement for the luminance channel of the source image, and noise reduction and saturation adjustment for the chrominance channels, thereby outputting the contrast adjustment factor Yratio corresponding to the luminance channel and the chrominance adjustment factors corresponding to the chrominance channels.
  • the chrominance channels may include multiple channels; for example, when two chrominance channels are used, the chrominance adjustment factor corresponding to one chrominance channel is denoted Uratio, and the chrominance adjustment factor corresponding to the other chrominance channel is denoted Vratio; of course, this stage may also generate other factors related to chrominance and brightness, which are not listed one by one here.
  • each of the multiple data processing networks in the embodiments of the present application may be a convolutional neural network (Convolutional Neural Network, CNN).
  • a convolutional neural network can learn many characteristics of source images in order to predict the corresponding characteristics of a newly input source image; this prediction method is more efficient, and the prediction is more accurate when there are more training samples, but the predictions made for the source image may be inaccurate in some scenes (such as dark night, underwater, etc.).
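  • purely as an illustration of the three parallel branches mentioned above, and not the patent's actual network design, the following PyTorch sketch shows a low-level branch producing a linear RGB image, a global branch regressing matrix entries, and a scale branch predicting per-image factors; all layer sizes, class names, and output dimensions are assumptions:
```python
import torch
import torch.nn as nn

class LowLevelNet(nn.Module):
    """Illustrative stand-in for LL Net: RAW (1 channel) -> linear RGB (3 channels)."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1),
        )
    def forward(self, raw):
        return self.body(raw)

class GlobalNet(nn.Module):
    """Illustrative stand-in for the global processing (color) network.
    Regresses entries of the global matrices (e.g. 3x3 AWB + 3x3 CCM + 3x3 CSC = 27 values)."""
    def __init__(self, n_out=27):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, n_out),
        )
    def forward(self, raw):
        return self.body(raw)

class ScaleNet(nn.Module):
    """Illustrative stand-in for the scale adjustment (luminance) network.
    Predicts per-image factors such as YRatio, URatio, VRatio, Sharpness."""
    def __init__(self, n_out=4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, n_out),
        )
    def forward(self, raw):
        return self.body(raw)

raw = torch.rand(1, 1, 64, 64)        # toy packed RAW frame
linear_rgb = LowLevelNet()(raw)       # branch 1
global_mats = GlobalNet()(raw)        # branch 2
scale_factors = ScaleNet()(raw)       # branch 3: the branches share only the input
print(linear_rgb.shape, global_mats.shape, scale_factors.shape)
```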
  • Step S202 Perform global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright color separated image.
  • global pixel processing refers to processing all pixels of the image data, and global pixel processing can be used to adjust certain image characteristics of the overall image, such as the color, contrast, and exposure of the overall image.
  • specifically, white balance calibration is performed on the linear RGB image according to the automatic white balance matrix T_awb to obtain the white-balance-calibrated linear RGB image linearRGB_awb; then color correction is performed on linearRGB_awb according to the color correction matrix T_ccm to obtain the color-corrected linear RGB image linearRGB_ccm; then gamma correction is performed on linearRGB_ccm according to the gamma correction coefficient 1.0/2.2 to obtain the non-linear RGB image sRGB, where the correction coefficient can also be 1/2.2 or 1.0/2.0 or 1.0/3.0 or other values; finally, color space conversion is performed on the non-linear RGB image sRGB according to the color space conversion matrix T_csc to obtain a bright-color separated image, which can be, for example, a YUV image.
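  • a minimal NumPy sketch of the global processing chain just described (white balance, color correction, gamma correction, color space conversion); the matrix values are placeholders and the row-vector convention is an assumption, only the order of operations follows the text:
```python
import numpy as np

def global_process(linear_rgb, T_awb, T_ccm, T_csc, gamma=1.0 / 2.2):
    """Apply the global processing chain described in Step S202.

    linear_rgb: H x W x 3 array with values in [0, 1].
    T_awb, T_ccm, T_csc: 3x3 matrices applied to RGB row vectors.
    Returns the bright-colour separated (e.g. YUV) image.
    """
    awb = linear_rgb @ T_awb                   # white balance calibration
    ccm = awb @ T_ccm                          # colour correction
    srgb = np.clip(ccm, 0.0, 1.0) ** gamma     # gamma correction -> non-linear RGB
    yuv = srgb @ T_csc                         # colour space conversion
    return yuv

# Placeholder matrices for illustration only.
T_awb = np.diag([1.8, 1.0, 1.6])               # per-channel white balance gains
T_ccm = np.eye(3)                              # identity colour correction
T_csc = np.array([[0.299, -0.169,  0.500],     # one common RGB -> YUV form,
                  [0.587, -0.331, -0.419],     # transposed for row-vector use
                  [0.114,  0.500, -0.081]])
linear_rgb = np.random.rand(4, 4, 3)
yuv = global_process(linear_rgb, T_awb, T_ccm, T_csc)
print(yuv.shape)  # (4, 4, 3): Y, U, V planes in the last axis
```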
  • Step S203: Adjust the bright-color separated image according to the scale adjustment factor to obtain the adjusted bright-color separated image.
  • the scale adjustment factor may include the contrast adjustment factor of the luminance channel and the chrominance adjustment factors of the chrominance channels, where the contrast adjustment factor is used for contrast adjustment and noise reduction, and the chrominance adjustment factor is used for saturation adjustment and noise reduction.
  • the adjusted bright-color separated image can be obtained as follows: perform contrast and noise reduction processing on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data; perform saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data; and then determine the adjusted bright-color separated image according to the adjusted luminance channel data and the adjusted chrominance channel data.
  • for the luminance channel data: input the luminance channel data Y of the bright-color separated image, the contrast adjustment factor YRatio of the luminance channel of the bright-color separated image, and the detail enhancement factor Sharpness of the luminance channel of the bright-color separated image into the corresponding calculation network, and the output of the calculation network is the luminance channel data Y′ of the bright-color separated image after contrast adjustment and noise reduction.
  • for the chrominance channel data: if the bright-color separated image has two chrominance channels, input one chrominance channel data U of the bright-color separated image and the chrominance adjustment factor URatio corresponding to that chrominance channel into the corresponding calculation network, and the output of the calculation network is used as the chrominance channel data U′ after saturation adjustment and noise reduction of the bright-color separated image; the other chrominance channel is handled in the same way, for example:
  • V′ = V*URatio  (Formula 1-7)
  • the adjusted bright-color separated image YUV′ is then determined according to the adjusted luminance channel data Y′ and the adjusted chrominance channel data U′ and V′.
  • for the luminance channel data: substitute the luminance channel data Y of the bright-color separated image, the contrast adjustment factor YRatio of the luminance channel of the bright-color separated image, and the configured second optimization parameter Deta_Y corresponding to the luminance channel of the bright-color separated image into the corresponding calculation network, and use the output of this calculation network as the intermediate data Y″; optimize the detail enhancement factor Sharpness of the luminance channel of the bright-color separated image according to the configured third optimization parameter Deta_S to obtain the target parameter Sharpness″; then input the intermediate data Y″ and the target parameter Sharpness″ into the corresponding calculation network, and the output of the calculation network is the luminance channel data Y′ of the bright-color separated image after contrast adjustment and noise reduction, for example: Y′ = Y″ + Sharpness″.
  • for the chrominance channel data: if the bright-color separated image has two chrominance channels, input one chrominance channel data U of the bright-color separated image, the chrominance adjustment factor URatio corresponding to that chrominance channel, and the configured first optimization parameter Deta_U corresponding to that chrominance channel into the corresponding calculation network, and the output of the calculation network is used as the chrominance channel data U′ after saturation adjustment and noise reduction of the bright-color separated image, for example: U′ = U*Deta_U*URatio.
  • input the other chrominance channel data V of the bright-color separated image, the chrominance adjustment factor VRatio corresponding to the other chrominance channel, and the configured first optimization parameter Deta_V corresponding to the other chrominance channel into the corresponding calculation network, and the output of the calculation network is used as the other chrominance channel data V′ after saturation adjustment and noise reduction of the bright-color separated image, for example:
  • V′ = V*Deta_V*URatio  (Formula 1-12)
  • the adjusted bright-color separated image YUV′ is then determined according to the adjusted luminance channel data Y′, the chrominance channel data U′ after saturation adjustment and noise reduction, and the other chrominance channel data V′ after saturation adjustment and noise reduction.
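  • the per-channel adjustment just described reduces to elementwise operations; the sketch below follows the printed relations Y″ = Y*Deta_Y*YRatio, Sharpness″ = Deta_S*Sharpness, Y′ = Y″ + Sharpness″, U′ = U*Deta_U*URatio and V′ = V*Deta_V*URatio (the last one reproduced as printed; VRatio may be intended), and treating the factors as scalars is a simplifying assumption:
```python
import numpy as np

def adjust_yuv(yuv, y_ratio, u_ratio, v_ratio, sharpness,
               deta_y=1.0, deta_u=1.0, deta_v=1.0, deta_s=1.0):
    """Adjust a bright-colour separated (YUV) image as in Step S203.

    yuv: H x W x 3 array (Y, U, V in the last axis).
    y_ratio / u_ratio / v_ratio: contrast and chrominance adjustment factors.
    sharpness: detail enhancement term for the luminance channel.
    deta_*: manually configured optimisation parameters.
    """
    y, u, v = yuv[..., 0], yuv[..., 1], yuv[..., 2]
    y_mid = y * deta_y * y_ratio       # Y''  : intermediate luminance data
    sharp_t = deta_s * sharpness       # Sharpness'' : target parameter
    y_adj = y_mid + sharp_t            # Y'   : adjusted luminance channel
    u_adj = u * deta_u * u_ratio       # U'   : adjusted chrominance channel
    v_adj = v * deta_v * u_ratio       # V'   : follows the printed relation (URatio);
                                       #        v_ratio would replace u_ratio if VRatio is intended
    return np.stack([y_adj, u_adj, v_adj], axis=-1)

yuv = np.random.rand(4, 4, 3)
yuv_adj = adjust_yuv(yuv, y_ratio=1.1, u_ratio=0.95, v_ratio=0.95, sharpness=0.02)
print(yuv_adj.shape)  # (4, 4, 3)
```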
  • the first optimization parameter, the second optimization parameter, and the third optimization parameter are manually pre-configured parameters, so that the image processing effect is not only determined by the data output by the deep learning networks but is also affected by parameters actively set by the user, which caters to the user's particular needs for image processing effects.
  • for example, the processing effect of the deep learning networks on source images captured in scenes such as dark night or under water may be far from the actual effect seen by the human eye.
  • therefore, these parameters can be set manually so that, after the source images of scenes such as dark night and water are processed, the final effect presented to the user is closer to what the user sees in the real dark-night or underwater scene.
  • for example, the second optimization parameter Deta_Y = 1 and the third optimization parameter Deta_S = 1 may be configured; in another example, the second optimization parameter Deta_Y = 0.8 and the third optimization parameter Deta_S = 0.8; in yet another example, the second optimization parameter Deta_Y = 0.75 and the third optimization parameter Deta_S = 0.7.
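  • the value pairs listed above can be kept as named presets selected at run time; the preset names below and any mapping of a preset to a particular scene (normal, dark night, underwater) are illustrative assumptions, only the numeric pairs come from the text:
```python
# Hypothetical presets built from the value pairs listed above; which preset suits
# a normal, dark-night or underwater scene is left to the user or product engineer.
PRESETS = {
    "preset_a": {"deta_y": 1.0,  "deta_s": 1.0},
    "preset_b": {"deta_y": 0.8,  "deta_s": 0.8},
    "preset_c": {"deta_y": 0.75, "deta_s": 0.7},
}

def pick_preset(name: str) -> dict:
    """Return the manually configured optimisation parameters for a preset."""
    return PRESETS[name]

params = pick_preset("preset_b")
# e.g. yuv_adj = adjust_yuv(yuv, y_ratio, u_ratio, v_ratio, sharpness, **params)
print(params)
```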
  • the users mentioned in the embodiments of the present application may be R&D personnel, production personnel, or technical support personnel of related products (such as the above-mentioned image processing equipment).
  • the users mentioned in the embodiments of the present application may be customers who use related products (such as the aforementioned image processing equipment).
  • in the embodiments of the present application, color conversion matrix conversion, gamma correction, and color space conversion matrix conversion are performed on a linear RGB image to obtain a bright-color separated image, and then the luminance channel data and the chrominance channel data of the bright-color separated image are synchronously optimized based on the corresponding contrast adjustment factor and chrominance adjustment factor to obtain the adjusted bright-color separated image.
  • on the one hand, since the processing of the chrominance channel data and the processing of the luminance channel data are performed in parallel, the processing efficiency is high, which is also conducive to the adjustment of the corresponding data.
  • on the other hand, the contrast adjustment and noise reduction can be completed at one time based on the contrast adjustment factor, and the saturation adjustment and noise reduction can be completed at one time based on the chrominance adjustment factor; this way of processing the luminance channel data and the chrominance channel data is simpler and more efficient.
  • in addition, the first optimization parameter, the second optimization parameter, and the like can also be introduced, so that the obtained image can meet the user's requirements for the display effect more quickly and more accurately; in particular, when these parameters are combined with deep-learning-based image processing, the advantages of deep neural networks can be fully exploited, while the problem of unsatisfactory deep neural network outputs in certain specific scenes can be alleviated by manually configuring the relevant parameters.
  • FIG. 6 is a schematic structural diagram of an image processing device 60 provided by an embodiment of the present invention.
  • the device 60 may include an acquiring unit 601, a global processing unit 602, and an adjusting unit 603.
  • the detailed description of each unit is as follows.
  • the acquisition unit 601 is configured to acquire a linear RGB image, a global processing correlation matrix, and a scale adjustment factor, where the scale adjustment factor includes a contrast adjustment factor and a chrominance adjustment factor, the contrast adjustment factor is used to adjust contrast and noise reduction, and the chrominance adjustment factor is used to adjust saturation and noise reduction;
  • the global processing unit 602 is configured to perform global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright-color separated image; and
  • the adjustment unit 603 is configured to adjust the luminance channel data of the bright-color separated image according to the contrast adjustment factor and adjust the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain the adjusted bright-color separated image.
  • in the above device, the linear RGB image is subjected to color conversion matrix conversion, gamma correction and color space conversion matrix conversion to obtain a bright-color separated image, and then the luminance channel data and the chrominance channel data of the bright-color separated image are synchronously optimized based on the corresponding contrast adjustment factor and chrominance adjustment factor to obtain the adjusted bright-color separated image.
  • on the one hand, since the processing of the chrominance channel data and the processing of the luminance channel data are performed in parallel, the processing efficiency is high, which is also conducive to the adjustment of the corresponding data.
  • on the other hand, the contrast adjustment and noise reduction can be completed at one time based on the contrast adjustment factor, and the saturation adjustment and noise reduction can be completed at one time based on the chrominance adjustment factor; this way of processing the luminance channel data and the chrominance channel data is simpler and more efficient.
  • the global processing correlation matrix includes a color correction matrix, an automatic white balance matrix, a gamma correction coefficient, and a color space conversion matrix; in terms of performing global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright-color separated image, the global processing unit is specifically configured to: perform white balance calibration on the linear RGB image according to the automatic white balance matrix; perform color correction according to the color correction matrix; perform gamma correction according to the gamma correction coefficient to obtain a non-linear RGB image; and perform color space conversion according to the color space conversion matrix to obtain a bright-color separated image.
  • in terms of adjusting the luminance channel data of the bright-color separated image according to the contrast adjustment factor and adjusting the chrominance channel data according to the chrominance adjustment factor, the adjustment unit is specifically configured to: perform contrast and noise reduction adjustment on the luminance channel data according to the contrast adjustment factor; perform saturation and noise reduction adjustment on the chrominance channel data according to the chrominance adjustment factor; and determine an adjusted bright-color separated image according to the adjusted luminance channel data and the adjusted chrominance channel data.
  • in terms of performing contrast and noise reduction adjustments on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data, the adjustment unit is specifically configured to: determine intermediate data of the luminance channel according to the luminance channel data in the bright-color separated image, the contrast adjustment factor of the luminance channel, and a configured second optimization parameter corresponding to the luminance channel; and determine the adjusted luminance channel data according to the intermediate data of the luminance channel and a target parameter, the target parameter being a parameter obtained by optimizing the detail enhancement factor of the luminance channel according to a configured third optimization parameter.
  • in this method, the second optimization parameter and the third optimization parameter are introduced and manually configured, so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
  • in terms of performing saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data, the adjustment unit is specifically configured to: determine adjusted at least two channels of chrominance channel data according to at least two channels of chrominance channel data in the bright-color separated image, the respective chrominance adjustment factors of the at least two chrominance channels, and at least two first optimization parameters configured to respectively correspond to the at least two chrominance channels.
  • in this method, the first optimization parameter is introduced and manually configured, so that the obtained image can more quickly and accurately meet the user's requirements for the display effect.
  • the bright color separation image is a YUV image
  • the linear RGB image linearRGB and the adjusted bright-color separated image YUV′ satisfy the following relationship:
  • linearRGB_awb = linearRGB * T_awb
  • linearRGB_ccm = linearRGB_awb * T_ccm
  • sRGB = (linearRGB_ccm)^(1.0/2.2)
  • YUV = sRGB * T_csc
  • Y″ = Y * Deta_Y * YRatio
  • Sharpness″ = Deta_S * Sharpness
  • Y′ = Y″ + Sharpness″
  • U′ = U * Deta_U * URatio
  • V′ = V * Deta_V * URatio
  • where T_awb is the automatic white balance matrix, T_ccm is the color correction matrix, 1.0/2.2 is the correction parameter of the gamma correction, sRGB is the non-linear RGB image, YUV is the bright-color separated image, T_csc is the color space conversion matrix, Y is the luminance channel data of the bright-color separated image YUV, U is one chrominance channel data of the bright-color separated image YUV, V is another chrominance channel data of the bright-color separated image YUV, YRatio is the contrast adjustment factor of the luminance channel, URatio is the chrominance adjustment factor of the one chrominance channel, VRatio is the chrominance adjustment factor of the other chrominance channel, Sharpness is the detail enhancement factor of the luminance channel, Deta_Y is the configured third optimization parameter corresponding to the luminance channel, Deta_U is the configured first optimization parameter corresponding to the one chrominance channel, Deta_V is the configured first optimization parameter corresponding to the other chrominance channel, Deta_S is the configured third optimization parameter corresponding to the detail enhancement factor Sharpness, Y″ is the intermediate data of the luminance channel, Sharpness″ is the target parameter, U′ is the adjusted data of the one chrominance channel, V′ is the adjusted data of the other chrominance channel, Y′ is the adjusted data of the luminance channel, and Y′, U′ and V′ form the adjusted bright-color separated image YUV′.
  • the acquisition unit is specifically configured to: perform demosaic processing and/or denoising processing on a source image through a low-level network to obtain the linear RGB image; extract the global processing correlation matrix from the source image through a global processing network; and extract the scale adjustment factor from the source image through a scale adjustment network.
  • specifically, the low-level network, the global processing network, and the scale adjustment network can be executed in parallel, so they are independent of each other and do not affect each other, which minimizes the output error of the networks.
  • the low-level network, the global processing network, and the scale adjustment network are all convolutional neural networks.
  • for the detailed description of each unit and the beneficial effects brought about by the coordinated operation of these units, reference can also be made to the corresponding description of the method embodiment shown in FIG. 2.
  • An embodiment of the present invention also provides a chip system, the chip system includes at least one processor, a memory, and an interface circuit.
  • the memory, the transceiver, and the at least one processor are interconnected through lines, and instructions are stored in the at least one memory; when the instructions are executed by the processor, the method flow shown in FIG. 2 is implemented.
  • the embodiment of the present invention also provides a computer-readable storage medium in which a computer program is stored, and when it runs on a processor, the method flow shown in FIG. 2 is realized.
  • the embodiment of the present invention also provides a computer program product.
  • when the computer program product runs on a processor, the method flow shown in FIG. 2 is realized.
  • the above processes can be completed by a computer program instructing relevant hardware.
  • the program can be stored in a computer-readable storage medium, and when the program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage media include: ROM, random access memory (RAM), magnetic disks, optical disks, and other media that can store program code.

Abstract

An embodiment of the present application provides an image processing method and related apparatus. The method includes: acquiring a linear RGB image, a global processing correlation matrix, and a scale adjustment factor, the scale adjustment factor including a contrast adjustment factor and a chrominance adjustment factor, where the contrast adjustment factor is used to adjust contrast and noise reduction, and the chrominance adjustment factor is used to adjust saturation and noise reduction; performing global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright-color separated image; and adjusting the luminance channel data of the bright-color separated image according to the contrast adjustment factor, and adjusting the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor, to obtain an adjusted bright-color separated image. By adopting the embodiments of the present invention, the image processing process can be simplified and the quality of the output image can be improved.

Description

Image processing method and related apparatus
Technical Field
The present invention relates to the field of image processing technology, and in particular, to an image processing method and related apparatus.
Background
Images, as the visual basis of human perception of the world, are an important means for humans to obtain, express and transmit information. The image signal processor (Image Signal Processor, ISP) is an important component of a photographing device. When we take a photo, the photographing device can obtain a Bayer image corresponding to the target scene through the lens, and the Bayer image is converted from analog to digital to obtain a digital image signal (i.e., RAW data); the RAW data goes through a series of calculation optimizations in the ISP, such as noise reduction, color adjustment, brightness adjustment, and exposure adjustment, to finally generate the target image displayed on the display screen.
At present, image processing is usually implemented using deep learning technology. Deep learning mainly implements computational optimization based on various artificial neural network methods, and its applications are becoming more and more extensive. In the prior art, multiple processing steps such as noise reduction, color adjustment, and brightness adjustment are usually executed serially, for example, the processing from RAW data to the target image is implemented through a single neural network that can implement multiple processing steps (the multiple processing steps being executed serially), or through multiple neural networks connected in series (each neural network implementing a different processing step).
However, in the above manner of processing images through one neural network or multiple serial neural networks, the relevant parameters of each neural network are fixed, so the effect of the obtained target image is also fixed and cannot be adjusted when the effect of the target image is poor or when different image effects need to be switched. In addition, in the above manner of processing images through multiple serial neural networks, the input data of a later neural network depends entirely on the output data of the previous neural network, that is, the dependence between the front and rear neural networks is strong, which makes the later neural networks more difficult to train.
Summary of the Invention
The embodiments of the present invention disclose an image processing method and related apparatus, which can simplify the image processing process and improve the quality of the output image.
In a first aspect, an embodiment of the present application provides an image processing method. The method includes: acquiring a linear RGB image, a global processing correlation matrix, and a scale adjustment factor, the scale adjustment factor including a contrast adjustment factor and a chrominance adjustment factor, where the contrast adjustment factor is used to adjust contrast and noise reduction, and the chrominance adjustment factor is used to adjust saturation and noise reduction; performing global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright-color separated image; and adjusting the luminance channel data of the bright-color separated image according to the contrast adjustment factor, and adjusting the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor, to obtain an adjusted bright-color separated image.
In the above method, color conversion matrix conversion, gamma correction, and color space conversion matrix conversion are performed on the linear RGB image to obtain a bright-color separated image, and then the luminance channel data and the chrominance channel data of the bright-color separated image are synchronously optimized based on the corresponding contrast adjustment factor and chrominance adjustment factor to obtain the adjusted bright-color separated image. On the one hand, since the processing of the chrominance channel data and the processing of the luminance channel data are performed in parallel, the processing efficiency is high, which is also conducive to the adjustment of the corresponding data. On the other hand, the contrast adjustment and noise reduction can be completed at one time based on the contrast adjustment factor, and the saturation adjustment and noise reduction can be completed at one time based on the chrominance adjustment factor; this way of processing the luminance channel data and the chrominance channel data is simpler and more efficient.
With reference to the first aspect, in a first possible implementation of the first aspect, the global processing correlation matrix includes a color correction matrix, an automatic white balance matrix, a gamma correction coefficient, and a color space conversion matrix; performing global processing on the linear RGB image according to the global processing correlation matrix to obtain a bright-color separated image includes: performing white balance calibration on the linear RGB image according to the automatic white balance matrix to obtain a white-balance-calibrated linear RGB image; performing color correction on the white-balance-calibrated linear RGB image according to the color correction matrix to obtain a color-corrected linear RGB image; performing gamma correction on the color-corrected linear RGB image according to the gamma correction coefficient to obtain a non-linear RGB image; and performing color space conversion on the non-linear RGB image according to the color space conversion matrix to obtain a bright-color separated image.
With reference to the first aspect or any one of the foregoing possible implementations of the first aspect, in a second possible implementation of the first aspect, adjusting the luminance channel data of the bright-color separated image according to the contrast adjustment factor and adjusting the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain an adjusted bright-color separated image includes: performing contrast and noise reduction adjustment on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data; performing saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data; and determining the adjusted bright-color separated image according to the adjusted luminance channel data and the adjusted chrominance channel data.
With reference to the first aspect or any one of the foregoing possible implementations of the first aspect, in a third possible implementation of the first aspect, performing contrast and noise reduction adjustment on the luminance channel data of the bright-color separated image according to the contrast adjustment factor to obtain adjusted luminance channel data includes: determining intermediate data of the luminance channel according to the luminance channel data in the bright-color separated image, the contrast adjustment factor of the luminance channel, and a configured second optimization parameter corresponding to the luminance channel; and determining the adjusted luminance channel data according to the intermediate data of the luminance channel and a target parameter, where the target parameter is a parameter obtained by optimizing the detail enhancement factor of the luminance channel according to a configured third optimization parameter.
In this method, the second optimization parameter and the third optimization parameter are introduced, and the second optimization parameter and the third optimization parameter are manually configured so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
With reference to the first aspect or any one of the foregoing possible implementations of the first aspect, in a fourth possible implementation of the first aspect, performing saturation and noise reduction adjustment on the chrominance channel data of the bright-color separated image according to the chrominance adjustment factor to obtain adjusted chrominance channel data includes: determining adjusted at least two channels of chrominance channel data according to at least two channels of chrominance channel data in the bright-color separated image, the respective chrominance adjustment factors of the at least two chrominance channels, and configured at least two first optimization parameters respectively corresponding to the at least two chrominance channels.
In this method, the first optimization parameter is introduced, and the first optimization parameter is manually configured so that the obtained image can meet the user's requirements for the display effect more quickly and accurately.
结合第一方面，或者第一方面的上述任一种可能的实现方式，在第一方面的第五种可能的实现方式中，所述亮色分离图像为YUV图像，所述线性RGB图像linearRGB与调整后的亮色分离图像YUV′满足如下关系：
linearRGB_awb=linearRGB*T_awb
linearRGB_ccm=linearRGB_awb*T_ccm
sRGB=(linearRGB_ccm)^(1.0/2.2)
YUV=sRGB*T_csc
Y″=Y*Deta_Y*YRatio
Sharpeness″=Deta_S*Sharpeness
Y′=Y″+Sharpeness″
U′=U*Deta_U*URatio
V′=V*Deta_V*VRatio
其中，T_awb为自动白平衡矩阵，T_ccm为颜色矫正矩阵，1.0/2.2为伽玛矫正的矫正参数，sRGB为非线性RGB图像，YUV为亮色分离图像，T_csc为色彩空间转换矩阵，Y为所述亮色分离图像YUV的亮度通道数据，U为所述亮色分离图像YUV的一路色度通道数据，V为所述亮色分离图像YUV的又一路色度通道数据，YRatio为所述亮度通道的对比度调整因子，URatio为所述一路色度通道的色度调整因子，VRatio为所述又一路色度通道的色度调整因子，Sharpeness为所述亮度通道的细节增强因子，Deta_Y为配置的与所述亮度通道对应的第二优化参数，Deta_U为配置的与所述一路色度通道对应的第一优化参数，Deta_V为配置的与所述又一路色度通道对应的第一优化参数，Deta_S为配置的与所述细节增强因子Sharpeness对应的第三优化参数，Y″为所述亮度通道的中间数据，Sharpeness″为目标参数，U′为所述一路色度通道调整后的数据，V′为所述又一路色度通道调整后的数据，Y′为所述亮度通道调整后的数据，所述Y′、U′、V′组成调整后的亮色分离图像YUV′。
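For readers who find code easier to follow than the symbol list above, the full set of relations can be read as a single per-pixel routine. The following Python/NumPy fragment is only an illustrative sketch under assumed conventions (row-vector pixels multiplied by 3×3 matrices, values normalized to [0, 1], per-pixel ratio and sharpness maps); the function name and parameter names are illustrative and do not come from the application.

```python
import numpy as np

def adjust_image(linear_rgb, T_awb, T_ccm, T_csc,
                 y_ratio, u_ratio, v_ratio, sharpness,
                 deta_y=1.0, deta_u=1.0, deta_v=1.0, deta_s=1.0):
    """Sketch of the relations above for an H x W x 3 linear RGB image."""
    rgb_awb = linear_rgb @ T_awb                       # linearRGB_awb = linearRGB * T_awb
    rgb_ccm = rgb_awb @ T_ccm                          # linearRGB_ccm = linearRGB_awb * T_ccm
    srgb = np.clip(rgb_ccm, 0.0, None) ** (1.0 / 2.2)  # sRGB = (linearRGB_ccm)^(1.0/2.2), negatives clipped
    yuv = srgb @ T_csc                                 # YUV = sRGB * T_csc

    y, u, v = yuv[..., 0], yuv[..., 1], yuv[..., 2]
    y_mid = y * deta_y * y_ratio                       # Y'' = Y * Deta_Y * YRatio
    sharp = deta_s * sharpness                         # Sharpness'' = Deta_S * Sharpness
    y_out = y_mid + sharp                              # Y' = Y'' + Sharpness''
    u_out = u * deta_u * u_ratio                       # U' = U * Deta_U * URatio
    v_out = v * deta_v * v_ratio                       # V' = V * Deta_V * VRatio
    return np.stack([y_out, u_out, v_out], axis=-1)    # adjusted YUV'
```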
结合第一方面,或者第一方面的上述任一种可能的实现方式,在第一方面的第六种可能的实现方式中,所述获取线性RGB图像、全局处理相关矩阵和比例调整因子,包括:通过低层网络对源图像进行去马赛克处理和/或去噪处理得到所述线性RGB图像;通过全局处理网络从所述源图像中提取所述全局处理相关矩阵;通过比例调整网络从所述源图像提取所述比例调整因子。
具体地,低层网络、全局处理网络和比例调整网络可以并行执行,因此相互之间互不依赖互不影响,能够尽量减少网络的输出误差。
结合第一方面,或者第一方面的上述任一种可能的实现方式,在第一方面的第七种可能的实现方式中,所述低层网络、所述全局处理网络和所述比例调整网络均为卷积神经网络。
可以理解,将卷积神经网络与上述第一优化参数、第二优化参数等进行结合,不仅可以很好地发挥深度神经网络的优势,又可以通过人为配置相关参数来缓解深度神经网络在某些特定场景下输出结果不理想的问题。
第二方面,本申请实施例提供一种图像处理设备,该设备包括处理器、存储器,其中,所述存储器用于存储计算机程序,所述处理器用于调用所述计算机程序来执行如下操作:获取线性RGB图像、全局处理相关矩阵和比例调整因子,所述比例调整因子包括对比度调整因子和色度调整因子,其中,所述对比度调整因子用于调整对比度和降噪,所述色度调整因子用于调整饱和度和降噪;根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像;根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进 行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像。
在上述方法中，对线性RGB图像进行颜色转换矩阵的转换、伽玛校正和色彩空间转换矩阵转换得到亮色分离图像，然后基于对应的对比度调整因子、色度调整因子对亮色分离图像的亮度通道数据和色度通道数据同步进行优化，得到调整后的亮色分离图像。一方面，由于对色度通道数据的处理和对亮度通道数据的处理都是并行的，因此处理效率较高，也有利于对相应数据的调整。另一方面，基于对比度调整因子可以一次性完成对比度调整和降噪处理，基于色度调整因子可以一次性完成饱和度调整和降噪处理；这种针对亮度通道数据和色度通道数据的处理方式更简单、高效。
结合第二方面,在第二方面的第一种可能的实现方式中,所述全局处理相关矩阵包括颜色矫正矩阵、自动白平衡矩阵、伽玛校正系数和色彩空间转换矩阵;在根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像方面,所述处理器具体用于:根据所述自动白平衡矩阵对所述线性RGB图像进行白平衡校准,得到白平衡校准后的线性RGB图像;根据所述颜色矫正矩阵对所述白平衡校准后的线性RGB图像进行颜色矫正,得到颜色矫正后的线性RGB图像;根据所述伽玛校正系数对所述颜色矫正后的线性RGB图像进行伽玛校正,得到非线性RGB图像;根据所述色彩空间转换矩阵对所述非线性RGB图像进行色彩空间转换,得到亮色分离图像。
结合第二方面,或者第二方面的上述任一种可能的实现方式,在第二方面的第二种可能的实现方式中,在根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像方面,所述处理器具体用于:根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据;根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据;根据所述调整后的亮度通道数据和所述调整后的色度通道数据,确定调整后的亮色分离图像。
结合第二方面,或者第二方面的上述任一种可能的实现方式,在第二方面的第三种可能的实现方式中,在根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据方面,所述处理器具体用于:根据所述亮色分离图像中的亮度通道数据、所述亮度通道的对比度调整因子、配置的与所述亮度通道对应的第二优化参数确定亮度通道的中间数据;根据所述亮度通道的中间数据和目标参数确定调整后的亮度通道数据,所述目标参数为根据配置的第三优化参数对所述亮度通道的细节增强因子进行优化得到的参数。在该方法中,引入第二优化参数、第三优化参数,通过人为配置第二优化参数和第三优化参数使得获得的图像能够更快速、更准确地满足用户对显示效果的要求。
结合第二方面,或者第二方面的上述任一种可能的实现方式,在第二方面的第四种可能的实现方式中,在根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据方面,所述处理器具体用于:根据所述亮色分离图像中的至少两路色度通道数据、所述至少两路色度通道各自的色度调整因子、配置的 与所述至少两路色度通道分别对应的至少两个第一优化参数确定调整后的至少两路色度通道数据。
在该方法中,引入第一优化参数,通过人为配置第一优化参数使得获得的图像能够更快速、更准确地满足用户对显示效果的要求。
结合第二方面，或者第二方面的上述任一种可能的实现方式，在第二方面的第五种可能的实现方式中，所述亮色分离图像为YUV图像，所述线性RGB图像linearRGB与调整后的亮色分离图像YUV′满足如下关系：
linearRGB_awb=linearRGB*T_awb
linearRGB_ccm=linearRGB_awb*T_ccm
sRGB=(linearRGB_ccm)^(1.0/2.2)
YUV=sRGB*T_csc
Y″=Y*Deta_Y*YRatio
Sharpeness″=Deta_S*Sharpeness
Y′=Y″+Sharpeness″
U′=U*Deta_U*URatio
V′=V*Deta_V*VRatio
其中，T_awb为自动白平衡矩阵，T_ccm为颜色矫正矩阵，1.0/2.2为伽玛矫正的矫正参数，sRGB为非线性RGB图像，YUV为亮色分离图像，T_csc为色彩空间转换矩阵，Y为所述亮色分离图像YUV的亮度通道数据，U为所述亮色分离图像YUV的一路色度通道数据，V为所述亮色分离图像YUV的又一路色度通道数据，YRatio为所述亮度通道的对比度调整因子，URatio为所述一路色度通道的色度调整因子，VRatio为所述又一路色度通道的色度调整因子，Sharpeness为所述亮度通道的细节增强因子，Deta_Y为配置的与所述亮度通道对应的第二优化参数，Deta_U为配置的与所述一路色度通道对应的第一优化参数，Deta_V为配置的与所述又一路色度通道对应的第一优化参数，Deta_S为配置的与所述细节增强因子Sharpeness对应的第三优化参数，Y″为所述亮度通道的中间数据，Sharpeness″为目标参数，U′为所述一路色度通道调整后的数据，V′为所述又一路色度通道调整后的数据，Y′为所述亮度通道调整后的数据，所述Y′、U′、V′组成调整后的亮色分离图像YUV′。
结合第二方面，或者第二方面的上述任一种可能的实现方式，在第二方面的第六种可能的实现方式中，在获取线性RGB图像、全局处理相关矩阵和比例调整因子方面，所述处理器具体用于：通过低层网络对源图像进行去马赛克处理和/或去噪处理得到所述线性RGB图像；通过全局处理网络从所述源图像中提取所述全局处理相关矩阵；通过比例调整网络从所述源图像提取所述比例调整因子。
具体地,低层网络、全局处理网络和比例调整网络可以并行执行,因此相互之间互不依赖互不影响,能够尽量减少网络的输出误差。
结合第二方面,或者第二方面的上述任一种可能的实现方式,在第二方面的第七种可能的实现方式中,所述低层网络、所述全局处理网络和所述比例调整网络均为卷积神经网络。
可以理解，将卷积神经网络与上述第一优化参数、第二优化参数等进行结合，不仅可以很好地发挥深度神经网络的优势，又可以通过人为配置相关参数来缓解深度神经网络在某些特定场景下输出结果不理想的问题。
第三方面,本申请实施例提供一种图像处理设备,该设备包括:获取单元,用于获取线性RGB图像、全局处理相关矩阵和比例调整因子,所述比例调整因子包括对比度调整因子和色度调整因子,其中,所述对比度调整因子用于调整对比度和降噪,所述色度调整因子用于调整饱和度和降噪;全局处理单元,用于根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像;调整单元,用于根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像。
在上述设备中，对线性RGB图像进行颜色转换矩阵的转换、伽玛校正和色彩空间转换矩阵转换得到亮色分离图像，然后基于对应的对比度调整因子、色度调整因子对亮色分离图像的亮度通道数据和色度通道数据同步进行优化，得到调整后的亮色分离图像。一方面，由于对色度通道数据的处理和对亮度通道数据的处理都是并行的，因此处理效率较高，也有利于对相应数据的调整。另一方面，基于对比度调整因子可以一次性完成对比度调整和降噪处理，基于色度调整因子可以一次性完成饱和度调整和降噪处理；这种针对亮度通道数据和色度通道数据的处理方式更简单、高效。
结合第三方面,在第三方面的第一种可能的实现方式中,所述全局处理相关矩阵包括颜色矫正矩阵、自动白平衡矩阵、伽玛校正系数和色彩空间转换矩阵;在根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像方面,所述全局处理单元具体用于:根据所述自动白平衡矩阵对所述线性RGB图像进行白平衡校准,得到白平衡校准后的线性RGB图像;根据所述颜色矫正矩阵对所述白平衡校准后的线性RGB图像进行颜色矫正,得到颜色矫正后的线性RGB图像;根据所述伽玛校正系数对所述颜色矫正后的线性RGB图像进行伽玛校正,得到非线性RGB图像;根据所述色彩空间转换矩阵对所述非线性RGB图像进行色彩空间转换,得到亮色分离图像。
结合第三方面,或者第三方面的上述任一种可能的实现方式,在第三方面的第二种可能的实现方式中,在根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像方面,所述调整单元具体用于:根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据;根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据;根据所述调整后的亮度通道数据和所述调整后的色度通道数据,确定调整后的亮色分离图像。
结合第三方面,或者第三方面的上述任一种可能的实现方式,在第三方面的第三种可能的实现方式中,在根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据,所述调整单元具体用于:根据所述亮色分离图像中的亮度通道数据、所述亮度通道的对比度调整因子、配置的与所述亮度通道对应的第二优化参数确定亮度通道的中间数据;根据所述亮度通道的中间数据和目标参数确定调整后的亮度通道数据,所述目标参数为根据配置的第三优化参数对所述亮度通道的细节 增强因子进行优化得到的参数。
在该设备中,引入第二优化参数、第三优化参数,通过人为配置第二优化参数和第三优化参数使得获得的图像能够更快速、更准确地满足用户对显示效果的要求。
结合第三方面,或者第三方面的上述任一种可能的实现方式,在第三方面的第四种可能的实现方式中,在根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据,所述调整单元具体用于:根据所述亮色分离图像中的至少两路色度通道数据、所述至少两路色度通道各自的色度调整因子、配置的与所述至少两路色度通道分别对应的至少两个第一优化参数确定调整后的至少两路色度通道数据。
在该设备中,引入第一优化参数,通过人为配置第一优化参数使得获得的图像能够更快速、更准确地满足用户对显示效果的要求。
结合第三方面，或者第三方面的上述任一种可能的实现方式，在第三方面的第五种可能的实现方式中，所述亮色分离图像为YUV图像，所述线性RGB图像linearRGB与调整后的亮色分离图像YUV′满足如下关系：
linearRGB_awb=linearRGB*T_awb
linearRGB_ccm=linearRGB_awb*T_ccm
sRGB=(linearRGB_ccm)^(1.0/2.2)
YUV=sRGB*T_csc
Y″=Y*Deta_Y*YRatio
Sharpeness″=Deta_S*Sharpeness
Y′=Y″+Sharpeness″
U′=U*Deta_U*URatio
V′=V*Deta_V*VRatio
其中，T_awb为自动白平衡矩阵，T_ccm为颜色矫正矩阵，1.0/2.2为伽玛矫正的矫正参数，sRGB为非线性RGB图像，YUV为亮色分离图像，T_csc为色彩空间转换矩阵，Y为所述亮色分离图像YUV的亮度通道数据，U为所述亮色分离图像YUV的一路色度通道数据，V为所述亮色分离图像YUV的又一路色度通道数据，YRatio为所述亮度通道的对比度调整因子，URatio为所述一路色度通道的色度调整因子，VRatio为所述又一路色度通道的色度调整因子，Sharpeness为所述亮度通道的细节增强因子，Deta_Y为配置的与所述亮度通道对应的第二优化参数，Deta_U为配置的与所述一路色度通道对应的第一优化参数，Deta_V为配置的与所述又一路色度通道对应的第一优化参数，Deta_S为配置的与所述细节增强因子Sharpeness对应的第三优化参数，Y″为所述亮度通道的中间数据，Sharpeness″为目标参数，U′为所述一路色度通道调整后的数据，V′为所述又一路色度通道调整后的数据，Y′为所述亮度通道调整后的数据，所述Y′、U′、V′组成调整后的亮色分离图像YUV′。
结合第三方面，或者第三方面的上述任一种可能的实现方式，在第三方面的第六种可能的实现方式中，在获取线性RGB图像、全局处理相关矩阵和比例调整因子方面，所述获取单元具体用于：通过低层网络对源图像进行去马赛克处理和/或去噪处理得到所述线性RGB图像；通过全局处理网络从所述源图像中提取所述全局处理相关矩阵；通过比例调整网络从所述源图像提取所述比例调整因子。
具体地,低层网络、全局处理网络和比例调整网络可以并行执行,因此相互之间互不依赖互不影响,能够尽量减少网络的输出误差。
结合第三方面,或者第三方面的上述任一种可能的实现方式,在第三方面的第七种可能的实现方式中,所述低层网络、所述全局处理网络和所述比例调整网络均为卷积神经网络。
可以理解,将卷积神经网络与上述第一优化参数、第二优化参数等进行结合,不仅可以很好地发挥深度神经网络的优势,又可以通过人为配置相关参数来缓解深度神经网络在某些特定场景下输出结果不理想的问题。
第四方面,本申请实施例提供一种计算机可读存储介质,所述计算机可读存储介质用于存储计算机程序,当所述计算机程序在处理器上运行时,实现第一方面或者第一方面的任一种可能的实现方式所描述的方法。
第五方面,本申请实施例提供一种计算机程序产品,当所述计算机程序产品在处理器上运行时,实现第一方面或者第一方面的任一种可能的实现方式所描述的方法。
附图说明
图1是本发明实施例提供的一种图像处理设备的结构示意图;
图2是本发明实施例提供的一种图像处理方法的流程示意图;
图3是本发明实施例提供的又一种图像处理方法的流程示意图;
图4是本发明实施例提供的又一种图像处理方法的流程示意图;
图5是本发明实施例提供的又一种图像处理方法的流程示意图;
图6是本发明实施例提供的又一种图像处理设备的结构示意图。
具体实施方式
本申请中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或复数。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c或a-b-c,其中a、b和c可以是单个,也可以是多个。字符“/”一般表示前后关联对象是“或”的关系。另外在本申请的实施例中,“第一”、“第二”等字样并不对数量和执行次序进行限定。
需要说明的是,本申请中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其他实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
图1为本申请实施例提供的一种图像处理设备的结构示意图，该图像处理设备可以为手机、平板电脑、计算机、笔记本电脑、摄像机、照相机、可穿戴设备、车载设备、或终端设备等。为方便描述，本申请中将上面提到的设备统称为图像处理设备。本申请实施例以该图像处理设备为手机为例进行说明，该手机包括：存储器101、处理器102、传感器组件103、多媒体组件104、音频组件105和电源组件106等。
下面结合图1对手机的各个构成部件进行具体的介绍:
存储器101可用于存储数据、软件程序以及模块;主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序,比如声音播放功能、图像播放功能等;存储数据区可存储根据手机的使用所创建的数据,比如音频数据、图像数据、电话本等。此外,手机可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。
处理器102是手机的控制中心,利用各种接口和线路连接整个设备的各个部分,通过运行或执行存储在存储器101内的软件程序和/或模块,以及调用存储在存储器101内的数据,执行手机的各种功能和处理数据,从而对手机进行整体监控。在一些可行的实施例中,处理器102可以是单处理器结构、多处理器结构、单线程处理器以及多线程处理器等;在一些可行的实施例中,处理器102可以包括中央处理器单元、通用处理器、数字信号处理器、微控制器或微处理器等。除此以外,处理器102还可进一步包括其他硬件电路或加速器,如专用集成电路、现场可编程门阵列或者其他可编程逻辑器件、晶体管逻辑器件、硬件部件或者其任意组合。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。处理器102也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,数字信号处理器和微处理器的组合等。
传感器组件103包括一个或多个传感器,用于为手机提供各个方面的状态评估。其中,传感器组件103可以包括光传感器,如互补金属氧化物半导体(Complementary Metal-Oxide-Semiconductor,CMOS)或电荷耦合器件(Charge Coupled Device,CCD)图像传感器,用于检测外部物体与手机的距离,或者在成像应用中使用,即成为相机或摄像头的组成部分。此外,传感器组件103还可以包括加速度传感器,陀螺仪传感器,磁传感器,压力传感器或温度传感器,通过传感器组件103可以检测到手机的加速/减速、方位、打开/关闭状态,组件的相对定位,或手机的温度变化等。
多媒体组件104在手机和用户之间的提供一个输出接口的屏幕,该屏幕可以为触摸面板,且当该屏幕为触摸面板时,屏幕可以被实现为触摸屏,以接收来自用户的输入信号。触摸面板包括一个或多个触摸传感器以感测触摸、滑动和触摸面板上的手势。所述触摸传感器可以不仅感测触摸或滑动动作的边界,而且还检测与所述触摸或滑动操作相关的持续时间和压力。此外,多媒体组件104还包括至少一个摄像头,比如,多媒体组件104包括一个前置摄像头和/或后置摄像头。当手机处于操作模式,如拍摄模式或视频模式时,前置摄像头和/或后置摄像头可以接收外部的多媒体数据。每个前置摄像头和后置摄像头可以是一个固定的光学透镜系统或具有焦距和光学变焦能力。
音频组件105可提供用户与手机之间的音频接口,比如,音频组件105可以包括音频电路、扬声器和麦克风。音频电路可将接收到的音频数据转换后的电信号,传输到扬声器,由扬声器转换为声音信号输出;另一方面,麦克风将收集的声音信号转换为电信号,由音频电路接收后转换为音频数据,再输出音频数据以发送给比如另一手机,或者将音频数据 输出至处理器102以便进一步处理。
电源组件106用于为手机的各个组件提供电源,电源组件106可以包括电源管理系统,一个或多个电源,及其他与手机生成、管理和分配电力相关联的组件。
可选的,手机还可包括无线保真(Wireless Fidelity,WiFi)模块、蓝牙模块等,本申请实施例在此不再赘述。本领域技术人员可以理解,图1中示出的手机结构并不构成对手机的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
图2为本申请实施例提供一种图像处理方法的流程示意图,该方法可以由图1所示的图像处理设备来执行,参见图2,该方法可以包括以下步骤。
步骤S201:从源图像中获取线性RGB图像、全局处理相关矩阵、比例调整因子。
其中，源图像即为RAW图像，可以表示未加工的图像数据，具体可以为目标景物对应的拜耳（Bayer）图像，或者是该拜耳图像经过模拟到数字转化后得到的Bayer格式的RAW图像。拜耳图像可由上述图1所示的图像处理设备中的传感器组件103来获取，处理器102将该拜耳图像经过模拟到数字的一系列转换，得到RAW图像，即Bayer格式的RAW数据。
RAW图像可包括多个像素阵列单元，一个像素阵列单元可包括2个绿色（green，G）像素点、1个蓝色（blue，B）像素点和1个红色（red，R）像素点，例如4个像素点即为一个像素阵列单元。RAW图像的尺寸可记为H×W，其中H表示RAW图像的高度，W表示RAW图像的宽度。RAW图像的一个像素点只有一个颜色，即红色、绿色或蓝色中的一种，每个像素点都有一个像素值。
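As an aside for readers working with such data, a 2×2 pixel-array unit of an RGGB-arranged Bayer mosaic can be unpacked with simple strided slicing. The snippet below is a minimal NumPy sketch that assumes an RGGB layout and an even-sized frame; the actual channel order depends on the sensor and is not specified by the application.

```python
import numpy as np

def split_rggb(raw):
    """Split an H x W Bayer mosaic (assumed RGGB layout) into its
    R, G1, G2 and B sample planes, each of size H/2 x W/2."""
    r  = raw[0::2, 0::2]   # red samples
    g1 = raw[0::2, 1::2]   # green samples on the red rows
    g2 = raw[1::2, 0::2]   # green samples on the blue rows
    b  = raw[1::2, 1::2]   # blue samples
    return r, g1, g2, b
```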
可选的,这里例举的线性RGB图像、全局处理相关矩阵、比例调整因子可以是并行获取的,即其中任意一项的获取不依赖于这里例举的其他参数是否已获取;另外,除这里例举的几项之外还可以获取其他参数或者数据。可选的,这里例举的各项数据可以具体为通过多个数据处理网络并行得到,例如,如图3所示,可以分别通过低层网络(LowLevel Net,LL Net)、全局处理网络(也称颜色处理网络(color Net))和比例调整网络(也称亮度处理网络(lumnet Net))处理得到,这些网络的处理流程具体如下:
低层网络LL Net对源图像进行去马赛克(Demosaic)处理和拜耳(bayer)去噪处理得到线性(linear)RGB图像。
线性(linear)RGB图像为像素点颜色的变化可以用像素值数据的线性变化来表示的RGB图像。其中,RGB图像也可以称为三原色图像数据,其一个像素点是由红色(Red,R)、绿色(Green,G)和蓝色(Blue,B)三种颜色构成的混合色,R、G、B各占一个字节,取值范围在0~255;经过组合,一个像素点可代表的颜色数为256×256×256个。例如,黑色:R=G=B=0,白色:R=G=B=255,黄色:R=G=255,B=0等。
全局处理网络提取所述源图像中的颜色信息，完成图像白平衡、颜色矫正（Color Correction Matrix，CCM）、RGB2YUV，从而输出全局处理相关矩阵，例如，颜色矫正矩阵T_ccm、自动白平衡矩阵T_awb、伽玛校正系数1.0/2.2，以及色彩空间转换矩阵T_csc等。
比例调整网络从源图像（即Bayer格式的数字图像信号RAW）中提取图像比例调整因子，例如对比度调整因子和色度调整因子。具体地，比例调整网络处理源图像中部分亮度通道的对比度调整、降噪、细节增强等功能，以及色度通道的降噪、饱和度调整等功能，从而输出对应于亮度通道的对比度调整因子YRatio、对应于亮度通道的细节增强因子Sharpeness、对应于色度通道的色度调整因子UVRatio，其中，色度通道可以包括多路，例如，假如色度通道的数量为两路，对应于其中一路色度通道的色度调整因子表示为URatio，对应于其中另一路色度通道的色度调整因子表示为VRatio；当然，这个环节还可能会生成与色度、亮度相关的其他因子，其他因子此处不一一举例。
可选的,本申请实施例中的多个数据处理网络,例如所述低层网络、所述全局处理网络和所述比例调整网络等,可以为卷积神经网络(Convolutional Neural Network)。卷积神经网络可以学习源图像的很多特性从而对新输入的源图像进行相应特性的预测,这种预测的方式效率比较高,在训练的样本比较多的情况下预测效果也比较准确,但是针对某些场景(如黑夜、水中等)中的源图像进行预测时又可能出现不准确的情况。
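To make the independence of the three networks concrete, a driver routine might look like the sketch below. `ll_net`, `color_net` and `lum_net` are placeholders for the trained low-level, global-processing and ratio-adjustment networks; their internals and actual interfaces are not specified here, and the tuple shapes are assumptions for illustration only.

```python
def run_parallel_networks(raw, ll_net, color_net, lum_net):
    """Each call consumes only the RAW frame, so the three networks have no
    mutual dependency and could be dispatched to separate threads or queues."""
    linear_rgb = ll_net(raw)                              # demosaic + Bayer denoise
    t_awb, t_ccm, gamma, t_csc = color_net(raw)           # global-processing matrices / coefficient
    y_ratio, sharpness, u_ratio, v_ratio = lum_net(raw)   # ratio-adjustment factors
    return linear_rgb, (t_awb, t_ccm, gamma, t_csc), (y_ratio, sharpness, u_ratio, v_ratio)
```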
步骤S202:根据全局处理相关矩阵对线性RGB图像进行全局处理获得亮色分离图像。
其中,全局像素处理是指对图像数据的所有像素点均进行处理,全局像素处理可用于调整整体图像的某个图像特征,比如整体图像的颜色、对比度、曝光度等。
可选的，根据自动白平衡矩阵T_awb对线性linear RGB图像进行白平衡校准，得到白平衡校准后的线性RGB图像linearRGB_awb；再根据颜色矫正矩阵T_ccm对所述白平衡校准后的线性RGB图像linearRGB_awb进行颜色矫正，得到颜色矫正后的线性RGB图像linearRGB_ccm；然后根据伽玛（gamma）校正系数1.0/2.2对颜色矫正后的线性RGB图像linearRGB_ccm进行伽玛校正，得到非线性RGB图像sRGB，其中，校正系数也可以是1/2.2或者1.0/2.0或者1.0/3.0或者其他数值；接着根据色彩空间转换矩阵T_csc对非线性RGB图像sRGB进行色彩空间转换，得到亮色分离图像，例如，可以为YUV图像。
为了便于理解,下面通过公式对处理过程进行举例说明。
linearRGB_awb=linearRGB*T_awb   公式1-1
linearRGB_ccm=linearRGB_awb*T_ccm   公式1-2
sRGB=(linearRGB_ccm)^(1.0/2.2)    公式1-3
YUV=sRGB*T_csc    公式1-4
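As a concrete numerical illustration of formulas 1-1 to 1-4, the snippet below applies the four steps to a single pixel. The white-balance gains, color-correction matrix and BT.601-style RGB-to-YUV matrix are example values chosen purely for illustration, not values taken from the application.

```python
import numpy as np

T_awb = np.diag([1.9, 1.0, 1.6])                   # example white-balance gains
T_ccm = np.array([[ 1.5, -0.3, -0.2],
                  [-0.2,  1.4, -0.2],
                  [-0.1, -0.4,  1.5]])             # example color-correction matrix
T_csc = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]]).T     # BT.601-like RGB -> YUV, row-vector convention

linear_rgb = np.array([0.20, 0.35, 0.10])          # one linear RGB pixel in [0, 1]
rgb_awb = linear_rgb @ T_awb                       # formula 1-1
rgb_ccm = rgb_awb @ T_ccm                          # formula 1-2
srgb = np.clip(rgb_ccm, 0.0, 1.0) ** (1.0 / 2.2)   # formula 1-3
yuv = srgb @ T_csc                                 # formula 1-4
print(yuv)                                         # luma/chroma values of the pixel
```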
步骤S203:根据比例调整因子对亮色分离图像进行调整,得到调整后的亮色分离图像。
具体地,比例调整因子可以包括亮度通道的对比度调整因子和色度通道的色度调整因子,其中,所述对比度调整因子用于调整对比度和降噪,所述色度调整因子用于调整饱和度和降噪;得到调整后的亮色分离图像的方式可以如下:根据对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪处理,得到调整后的亮度通道数据;根据色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪处理,得到调整后的色度通道数据;然后根据所述调整后的亮度通道数据和所述调整后的色度通道数据,确定调整后的亮色分离图像。
为了便于理解,下面以该亮色分离图像为YUV图像为例,例举两种可选方案。
方案一,可以结合图4所示流程进行理解。
关于亮度通道数据：将亮色分离图像的亮度通道数据Y、亮色分离图像的亮度通道的对比度调整因子YRatio、亮色分离图像的亮度通道的细节增强因子Sharpeness输入到相应计算网络中，计算网络的输出作为亮色分离图像的调整对比度和降噪后的亮度通道数据Y′，例如：
Y′=Y*YRatio+Sharpeness   公式1-5
关于色度通道数据：假若亮色分离图像存在两路色度通道，将该亮色分离图像的一路色度通道数据U，以及该一路色度通道对应的色度调整因子URatio输入到相应计算网络中，计算网络的输出作为该亮色分离图像调整饱和度和降噪后的一路色度通道U′，例如：
U′=U*URatio        公式1-6
以及将该亮色分离图像的另一路色度通道数据V、该另一路色度通道对应的色度调整因子VRatio输入到相应计算网络中，计算网络的输出作为该亮色分离图像调整饱和度和降噪后的另一路色度通道V′，例如：
V′=V*VRatio     公式1-7
然后根据所述调整对比度和降噪后的亮度通道数据Y′、调整饱和度和降噪后的一路色度通道U′和调整饱和度和降噪后的另一路色度通道V′,确定调整后的亮色分离图像YUV′。
方案二,可以结合图5所示流程进行理解。
关于亮度通道数据：将亮色分离图像的亮度通道数据Y、亮色分离图像的亮度通道的对比度调整因子YRatio、亮色分离图像的亮度通道对应的第二优化参数Deta_Y代入到相应计算网络中，该计算网络的输出作为中间数据Y″；以及根据配置的第三优化参数Deta_S对该亮色分离图像亮度通道的细节增强因子Sharpeness进行优化得到目标参数Sharpeness″；再将中间数据Y″和目标参数Sharpeness″输入到计算网络中，该计算网络的输出即为亮色分离图像的调整对比度和降噪后的亮度通道数据Y′，例如：
Y″=Y*Deta_Y*YRatio     公式1-8
Sharpeness″=Deta_S*Sharpeness  公式1-9
Y′=Y″+Sharpeness″     公式1-10
关于色度通道数据：假若亮色分离图像存在两路色度通道，将该亮色分离图像的一路色度通道数据U、该一路色度通道对应的色度调整因子URatio和配置的与该一路色度通道对应的第一优化参数Deta_U输入到相应计算网络中，该计算网络的输出作为该亮色分离图像调整饱和度和降噪后的一路色度通道U′，例如：
U′=U*Deta_U*URatio      公式1-11
以及将该亮色分离图像的另一路色度通道数据V、该另一路色度通道对应的色度调整因子VRatio和配置的与该另一路色度通道对应的第一优化参数Deta_V输入到相应计算网络中，该计算网络的输出作为该亮色分离图像调整饱和度和降噪后的另一路色度通道V′，例如：
V′=V*Deta_V*VRatio      公式1-12
然后根据所述调整对比度和降噪后的亮度通道数据Y′、调整饱和度和降噪后的一路色度通道U′,以及调整饱和度和降噪后的另一路色度通道V′,确定调整后的亮色分离图像YUV′。
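Scheme two can be illustrated on a toy 2×2 luma/chroma image; all numbers below are made up for demonstration, and the Deta_* values simply reuse the night-scene example given further below. With every Deta_* set to 1, the same code reduces to scheme one (formulas 1-5 to 1-7).

```python
import numpy as np

# Toy 2x2 Y/U/V planes and per-pixel factors from the ratio-adjustment network (illustrative values).
Y = np.array([[0.50, 0.62], [0.30, 0.45]])
U = np.array([[0.02, -0.04], [0.05, 0.01]])
V = np.array([[-0.03, 0.06], [0.02, -0.01]])
YRatio     = np.array([[1.10, 0.95], [1.20, 1.00]])
URatio     = np.array([[0.90, 1.05], [1.00, 0.85]])
VRatio     = np.array([[1.00, 0.92], [1.10, 0.96]])
Sharpeness = np.array([[0.02, 0.00], [0.01, 0.03]])

# Manually configured optimization parameters (night-scene example values).
Deta_Y, Deta_U, Deta_V, Deta_S = 0.8, 0.8, 0.9, 0.8

Y2 = Y * Deta_Y * YRatio            # formula 1-8
S2 = Deta_S * Sharpeness            # formula 1-9
Y_out = Y2 + S2                     # formula 1-10
U_out = U * Deta_U * URatio         # formula 1-11
V_out = V * Deta_V * VRatio         # formula 1-12
YUV_out = np.stack([Y_out, U_out, V_out], axis=-1)   # adjusted YUV'
```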
需要说明的是,在有些图像数据处理中,会仅仅以深度学习网络的输出作为输入来进行图像处理,然而在针对某些特殊场景下的源图像进行处理时,深度学习网络的处理效果往往达不到用户的需求,本申请提出第一优化参数、第二优化参数、第三优化参数是人为预先配置的参数,就是为了让图像处理效果不仅仅局限于深度学习网络输出的数据的影响,还受用户主动设置的参数的影响,从而迎合用户对图像处理效果的特殊需求。例如,深度 学习网络对黑夜、水中等场景中的源图像进行处理的效果,与人眼实际看到的效果相差甚远,那么当引入上述第一优化参数、第二优化参数、第三优化参数这些参数后,就可以人为对这些参数进行设置以使得对黑夜、水中等场景中的源图像进行处理后,最终呈现给用户的效果,更接近用户看到黑夜、水中这些实景时的效果。
为了便于理解,下面针对优化参数的人为设置情况进行举例说明:
针对晴天,对应于一路色度通道的第一优化参数为Deta_U=1、对应于另一路色度通道的第一优化参数Deta_V=1、第二优化参数Deta_Y=1、第三优化参数Deta_S=1。
针对雨天,对应于一路色度通道的第一优化参数为Deta_U=1、对应于另一路色度通道的第一优化参数Deta_V=1、第二优化参数Deta_Y=1、第三优化参数Deta_S=1。
针对黑夜,对应于一路色度通道的第一优化参数为Deta_U=0.8、对应于另一路色度通道的第一优化参数Deta_V=0.9、第二优化参数Deta_Y=0.8、第三优化参数Deta_S=0.8。
针对水中,对应于一路色度通道的第一优化参数为Deta_U=0.7、对应于另一路色度通道的第一优化参数Deta_V=0.8、第二优化参数Deta_Y=0.75、第三优化参数Deta_S=0.7。
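One possible way to hold such manually configured values is a per-scene preset table. The dictionary below simply records the example values listed above; the scene keys and the lookup helper are illustrative assumptions rather than part of the application.

```python
# Per-scene presets for the manually configured optimization parameters.
SCENE_PRESETS = {
    "sunny":      {"Deta_U": 1.0, "Deta_V": 1.0, "Deta_Y": 1.0,  "Deta_S": 1.0},
    "rainy":      {"Deta_U": 1.0, "Deta_V": 1.0, "Deta_Y": 1.0,  "Deta_S": 1.0},
    "night":      {"Deta_U": 0.8, "Deta_V": 0.9, "Deta_Y": 0.8,  "Deta_S": 0.8},
    "underwater": {"Deta_U": 0.7, "Deta_V": 0.8, "Deta_Y": 0.75, "Deta_S": 0.7},
}

def get_preset(scene: str) -> dict:
    """Return the configured parameters for a scene, falling back to the
    neutral all-ones setting when the scene is not recognized."""
    return SCENE_PRESETS.get(scene, SCENE_PRESETS["sunny"])
```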
可选的,本申请实施例中提及的用户可以为相关产品(如上述图像处理设备)的研发人员、或者生产人员、或者技术支持人员等。可选的,本申请实施例中提及的用户可以为使用相关产品(如上述图像处理设备)的客户。
本申请实施例中,已例举的运算方式大部分是以乘法运算、指数运算为例进行说明的,实际上还可以变换为加法运算、减法运算、除法运算等其他运算方式,并且除了每个公式中已例举的参数外,还可以增加更多其他参数,其他变换的情况此处不一一举例。
本申请实施例上述各个步骤中,除语义逻辑表明了先后执行顺序的步骤外,其他各个步骤之间并无先后顺序限制,即有些步骤可并行执行,也可以分先后执行,具体不做限定。
在图2所示的方法中，对线性RGB图像进行颜色转换矩阵的转换、伽玛校正和色彩空间转换矩阵转换得到亮色分离图像，然后基于对应的对比度调整因子、色度调整因子对亮色分离图像的亮度通道数据和色度通道数据同步进行优化，得到调整后的亮色分离图像。一方面，由于对色度通道数据的处理和对亮度通道数据的处理都是并行的，因此处理效率较高，也有利于对相应数据的调整。另一方面，基于对比度调整因子可以一次性完成对比度调整和降噪处理，基于色度调整因子可以一次性完成饱和度调整和降噪处理；这种针对亮度通道数据和色度通道数据的处理方式更简单、高效。
除此之外,还可引入第一优化参数、第二优化参数等,通过人为配置这些参数使得获得的图像能够更快速、更准确地满足用户对显示效果的要求,尤其是结合到基于深度学习的图像处理中时,不仅可以很好地发挥深度神经网络的优势,又可以通过人为配置相关参数来缓解深度神经网络在某些特定场景下输出结果不理想的问题。
上述详细阐述了本发明实施例的方法,下面提供了本发明实施例的装置。
请参见图6,图6是本发明实施例提供的一种图像处理设备60的结构示意图,该设备60可以包括获取单元601、全局处理单元602和调整单元603,各个单元的详细描述如下。
获取单元601,用于获取线性RGB图像、全局处理相关矩阵和比例调整因子,所述比例调整因子包括对比度调整因子和色度调整因子,其中,所述对比度调整因子用于调整对 比度和降噪,所述色度调整因子用于调整饱和度和降噪;
全局处理单元602,用于根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像;
调整单元603,用于根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像。
在上述设备中，对线性RGB图像进行颜色转换矩阵的转换、伽玛校正和色彩空间转换矩阵转换得到亮色分离图像，然后基于对应的对比度调整因子、色度调整因子对亮色分离图像的亮度通道数据和色度通道数据同步进行优化，得到调整后的亮色分离图像。一方面，由于对色度通道数据的处理和对亮度通道数据的处理都是并行的，因此处理效率较高，也有利于对相应数据的调整。另一方面，基于对比度调整因子可以一次性完成对比度调整和降噪处理，基于色度调整因子可以一次性完成饱和度调整和降噪处理；这种针对亮度通道数据和色度通道数据的处理方式更简单、高效。
在一种可能的实现方式中,所述全局处理相关矩阵包括颜色矫正矩阵、自动白平衡矩阵、伽玛校正系数和色彩空间转换矩阵;在根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像方面,所述全局处理单元具体用于:
根据所述自动白平衡矩阵对所述线性RGB图像进行白平衡校准,得到白平衡校准后的线性RGB图像;
根据所述颜色矫正矩阵对所述白平衡校准后的线性RGB图像进行颜色矫正,得到颜色矫正后的线性RGB图像;
根据所述伽玛校正系数对所述颜色矫正后的线性RGB图像进行伽玛校正,得到非线性RGB图像;
根据所述色彩空间转换矩阵对所述非线性RGB图像进行色彩空间转换,得到亮色分离图像。
在又一种可能的实现方式中,在根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像方面,所述调整单元具体用于:
根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据;
根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据;
根据所述调整后的亮度通道数据和所述调整后的色度通道数据,确定调整后的亮色分离图像。
在又一种可能的实现方式中,在根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据,所述调整单元具体用于:
根据所述亮色分离图像中的亮度通道数据、所述亮度通道的对比度调整因子、配置的与所述亮度通道对应的第二优化参数确定亮度通道的中间数据;
根据所述亮度通道的中间数据和目标参数确定调整后的亮度通道数据,所述目标参数 为根据配置的第三优化参数对所述亮度通道的细节增强因子进行优化得到的参数。
在该设备中,引入第二优化参数、第三优化参数,通过人为配置第二优化参数和第三优化参数使得获得的图像能够更快速、更准确地满足用户对显示效果的要求。
在又一种可能的实现方式中,在根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据,所述调整单元具体用于:
根据所述亮色分离图像中的至少两路色度通道数据、所述至少两路色度通道各自的色度调整因子、配置的与所述至少两路色度通道分别对应的至少两个第一优化参数确定调整后的至少两路色度通道数据。
在该设备中,引入第一优化参数,通过人为配置第一优化参数使得获得的图像能够更快速、更准确地满足用户对显示效果的要求。
在又一种可能的实现方式中，所述亮色分离图像为YUV图像，所述线性RGB图像linearRGB与调整后的亮色分离图像YUV′满足如下关系：
linearRGB_awb=linearRGB*T_awb
linearRGB_ccm=linearRGB_awb*T_ccm
sRGB=(linearRGB_ccm)^(1.0/2.2)
YUV=sRGB*T_csc
Y″=Y*Deta_Y*YRatio
Sharpeness″=Deta_S*Sharpeness
Y′=Y″+Sharpeness″
U′=U*Deta_U*URatio
V′=V*Deta_V*VRatio
其中，T_awb为自动白平衡矩阵，T_ccm为颜色矫正矩阵，1.0/2.2为伽玛矫正的矫正参数，sRGB为非线性RGB图像，YUV为亮色分离图像，T_csc为色彩空间转换矩阵，Y为所述亮色分离图像YUV的亮度通道数据，U为所述亮色分离图像YUV的一路色度通道数据，V为所述亮色分离图像YUV的又一路色度通道数据，YRatio为所述亮度通道的对比度调整因子，URatio为所述一路色度通道的色度调整因子，VRatio为所述又一路色度通道的色度调整因子，Sharpeness为所述亮度通道的细节增强因子，Deta_Y为配置的与所述亮度通道对应的第二优化参数，Deta_U为配置的与所述一路色度通道对应的第一优化参数，Deta_V为配置的与所述又一路色度通道对应的第一优化参数，Deta_S为配置的与所述细节增强因子Sharpeness对应的第三优化参数，Y″为所述亮度通道的中间数据，Sharpeness″为目标参数，U′为所述一路色度通道调整后的数据，V′为所述又一路色度通道调整后的数据，Y′为所述亮度通道调整后的数据，所述Y′、U′、V′组成调整后的亮色分离图像YUV′。
在又一种可能的实现方式中,在获取线性RGB图像、全局处理相关矩阵和比例调整因子方面,所述获取单元具体用于:
通过低层网络对源图像进行去马赛克处理和/或去噪处理得到所述线性RGB图像;
通过全局处理网络从所述源图像中提取所述全局处理相关矩阵;
通过比例调整网络从所述源图像提取所述比例调整因子。
具体地，低层网络、全局处理网络和比例调整网络可以并行执行，因此相互之间互不依赖互不影响，能够尽量减少网络的输出误差。
在又一种可能的实现方式中,所述低层网络、所述全局处理网络和所述比例调整网络均为卷积神经网络。
可以理解,将卷积神经网络与上述第一优化参数、第二优化参数等进行结合,不仅可以很好地发挥深度神经网络的优势,又可以通过人为配置相关参数来缓解深度神经网络在某些特定场景下输出结果不理想的问题。
需要说明的是,各个单元的实现及这些单元协同运行所带来的有益效果还可以对应参照图2所示的方法实施例的相应描述。
本发明实施例还提供一种芯片系统,所述芯片系统包括至少一个处理器,存储器和接口电路,所述存储器、所述收发器和所述至少一个处理器通过线路互联,所述至少一个存储器中存储有指令;所述指令被所述处理器执行时,图2所示的方法流程得以实现。
本发明实施例还提供一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,当其在处理器上运行时,图2所示的方法流程得以实现。
本发明实施例还提供一种计算机程序产品,当所述计算机程序产品在处理器上运行时,图2所示的方法流程得以实现。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。

Claims (18)

  1. 一种图像处理方法,其特征在于,包括:
    获取线性RGB图像、全局处理相关矩阵和比例调整因子,所述比例调整因子包括对比度调整因子和色度调整因子,其中,所述对比度调整因子用于调整对比度和降噪,所述色度调整因子用于调整饱和度和降噪;
    根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像;
    根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像。
  2. 根据权利要求1所述的方法,其特征在于,所述全局处理相关矩阵包括颜色矫正矩阵、自动白平衡矩阵、伽玛校正系数和色彩空间转换矩阵,所述根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像,包括:
    根据所述自动白平衡矩阵对所述线性RGB图像进行白平衡校准,得到白平衡校准后的线性RGB图像;
    根据所述颜色矫正矩阵对所述白平衡校准后的线性RGB图像进行颜色矫正,得到颜色矫正后的线性RGB图像;
    根据所述伽玛校正系数对所述颜色矫正后的线性RGB图像进行伽玛校正,得到非线性RGB图像;
    根据所述色彩空间转换矩阵对所述非线性RGB图像进行色彩空间转换,得到亮色分离图像。
  3. 根据权利要求1或2所述的方法,其特征在于,所述根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像,包括:
    根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据;
    根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据;
    根据所述调整后的亮度通道数据和所述调整后的色度通道数据,确定调整后的亮色分离图像。
  4. 根据权利要求3所述的方法,其特征在于,所述根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据,包括:
    根据所述亮色分离图像中的亮度通道数据、所述亮度通道的对比度调整因子、配置的与所述亮度通道对应的第二优化参数,确定亮度通道的中间数据;
    根据所述亮度通道的中间数据和目标参数确定调整后的亮度通道数据,所述目标参数为根据配置的第三优化参数,对所述亮度通道的细节增强因子进行优化得到的参数。
  5. 根据权利要求3或4所述的方法,其特征在于,所述根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据,包括:
    根据所述亮色分离图像中的至少两路色度通道数据、所述至少两路色度通道各自的色度调整因子、配置的与所述至少两路色度通道分别对应的至少两个第一优化参数,确定调整后的至少两路色度通道数据。
  6. 根据权利要求5所述的方法，其特征在于，所述亮色分离图像为YUV图像，所述线性RGB图像linearRGB与调整后的亮色分离图像YUV′满足如下关系：
    linearRGB_awb=linearRGB*T_awb
    linearRGB_ccm=linearRGB_awb*T_ccm
    sRGB=(linearRGB_ccm)^(1.0/2.2)
    YUV=sRGB*T_csc
    Y″=Y*Deta_Y*YRatio
    Sharpeness″=Deta_S*Sharpeness
    Y′=Y″+Sharpeness″
    U′=U*Deta_U*URatio
    V′=V*Deta_V*VRatio
    其中，T_awb为自动白平衡矩阵，T_ccm为颜色矫正矩阵，1.0/2.2为伽玛矫正的矫正参数，sRGB为非线性RGB图像，YUV为亮色分离图像，T_csc为色彩空间转换矩阵，Y为所述亮色分离图像YUV的亮度通道数据，U为所述亮色分离图像YUV的一路色度通道数据，V为所述亮色分离图像YUV的又一路色度通道数据，YRatio为所述亮度通道的对比度调整因子，URatio为所述一路色度通道的色度调整因子，VRatio为所述又一路色度通道的色度调整因子，Sharpeness为所述亮度通道的细节增强因子，Deta_Y为配置的与所述亮度通道对应的第二优化参数，Deta_U为配置的与所述一路色度通道对应的第一优化参数，Deta_V为配置的与所述又一路色度通道对应的第一优化参数，Deta_S为配置的与所述细节增强因子Sharpeness对应的第三优化参数，Y″为所述亮度通道的中间数据，Sharpeness″为目标参数，U′为所述一路色度通道调整后的数据，V′为所述又一路色度通道调整后的数据，Y′为所述亮度通道调整后的数据，所述Y′、U′、V′组成调整后的亮色分离图像YUV′。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述获取线性RGB图像、全局处理相关矩阵和比例调整因子,包括:
    通过低层网络对源图像进行去马赛克处理和/或去噪处理得到所述线性RGB图像;
    通过全局处理网络从所述源图像中提取所述全局处理相关矩阵;
    通过比例调整网络从所述源图像提取所述比例调整因子。
  8. 根据权利要求7所述的方法,其特征在于,所述低层网络、所述全局处理网络和所述比例调整网络均为卷积神经网络。
  9. 一种图像处理设备,其特征在于,包括处理器、存储器,其中,所述存储器用于存储计算机程序,所述处理器用于调用所述计算机程序来执行如下操作:
    获取线性RGB图像、全局处理相关矩阵和比例调整因子,所述比例调整因子包括对比度调整因子和色度调整因子,其中,所述对比度调整因子用于调整对比度和降噪,所述色度调整因子用于调整饱和度和降噪;
    根据所述全局处理相关矩阵对所述线性RGB图像进行全局处理获得亮色分离图像;
    根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行调整,及根据所述色度调整因子对所述亮色分离图像的色度通道数据进行调整,得到调整后的亮色分离图像。
  10. 根据权利要求9所述的设备,其特征在于,所述全局处理相关矩阵包括颜色矫正矩阵、自动白平衡矩阵、伽玛校正系数和色彩空间转换矩阵,所述处理器具体用于:
    根据所述自动白平衡矩阵对所述线性RGB图像进行白平衡校准,得到白平衡校准后的线性RGB图像;
    根据所述颜色矫正矩阵对所述白平衡校准后的线性RGB图像进行颜色矫正,得到颜色矫正后的线性RGB图像;
    根据所述伽玛校正系数对所述颜色矫正后的线性RGB图像进行伽玛校正,得到非线性RGB图像;
    根据所述色彩空间转换矩阵对所述非线性RGB图像进行色彩空间转换,得到亮色分离图像。
  11. 根据权利要求9或10所述的设备,其特征在于,所述处理器具体用于:
    根据所述对比度调整因子对所述亮色分离图像的亮度通道数据进行对比度和降噪调整,得到调整后的亮度通道数据;
    根据所述色度调整因子对所述亮色分离图像的色度通道数据进行饱和度和降噪调整,得到调整后的色度通道数据;
    根据所述调整后的亮度通道数据和所述调整后的色度通道数据,确定调整后的亮色分离图像。
  12. 根据权利要求11所述的设备,其特征在于,所述处理器具体用于:
    根据所述亮色分离图像中的亮度通道数据、所述亮度通道的对比度调整因子、配置的与所述亮度通道对应的第二优化参数,确定亮度通道的中间数据;
    根据所述亮度通道的中间数据和目标参数确定调整后的亮度通道数据,所述目标参数为根据配置的第三优化参数,对所述亮度通道的细节增强因子进行优化得到的参数。
  13. 根据权利要求12所述的设备,其特征在于,所述处理器具体用于:
    根据所述亮色分离图像中的至少两路色度通道数据、所述至少两路色度通道各自的色度调整因子、配置的与所述至少两路色度通道分别对应的至少两个第一优化参数确定调整后的至少两路色度通道数据。
  14. 根据权利要求13所述的设备，其特征在于，所述亮色分离图像为YUV图像，所述线性RGB图像linearRGB与调整后的亮色分离图像YUV′满足如下关系：
    linearRGB_awb=linearRGB*T_awb
    linearRGB_ccm=linearRGB_awb*T_ccm
    sRGB=(linearRGB_ccm)^(1.0/2.2)
    YUV=sRGB*T_csc
    Y″=Y*Deta_Y*YRatio
    Sharpeness″=Deta_S*Sharpeness
    Y′=Y″+Sharpeness″
    U′=U*Deta_U*URatio
    V′=V*Deta_V*VRatio
    其中，T_awb为自动白平衡矩阵，T_ccm为颜色矫正矩阵，1.0/2.2为伽玛矫正的矫正参数，sRGB为非线性RGB图像，YUV为亮色分离图像，T_csc为色彩空间转换矩阵，Y为所述亮色分离图像YUV的亮度通道数据，U为所述亮色分离图像YUV的一路色度通道数据，V为所述亮色分离图像YUV的又一路色度通道数据，YRatio为所述亮度通道的对比度调整因子，URatio为所述一路色度通道的色度调整因子，VRatio为所述又一路色度通道的色度调整因子，Sharpeness为所述亮度通道的细节增强因子，Deta_Y为配置的与所述亮度通道对应的第二优化参数，Deta_U为配置的与所述一路色度通道对应的第一优化参数，Deta_V为配置的与所述又一路色度通道对应的第一优化参数，Deta_S为配置的与所述细节增强因子Sharpeness对应的第三优化参数，Y″为所述亮度通道的中间数据，Sharpeness″为目标参数，U′为所述一路色度通道调整后的数据，V′为所述又一路色度通道调整后的数据，Y′为所述亮度通道调整后的数据，所述Y′、U′、V′组成调整后的亮色分离图像YUV′。
  15. 根据权利要求9-14任一项所述的设备，其特征在于，所述处理器具体用于：
    通过低层网络对源图像进行去马赛克处理和/或去噪处理得到所述线性RGB图像;
    通过全局处理网络从所述源图像中提取所述全局处理相关矩阵;
    通过比例调整网络从所述源图像提取所述比例调整因子。
  16. 根据权利要求15所述的设备,其特征在于,所述低层网络、所述全局处理网络和所述比例调整网络均为卷积神经网络。
  17. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质用于存储计算机程序,当所述计算机程序在处理器上运行时,实现权利要求1-8任一项所述的方法。
  18. 一种计算机程序产品,其特征在于,当所述计算机程序产品在处理器上运行时,实现权利要求1-8任一项所述的方法。
PCT/CN2020/078476 2020-03-09 2020-03-09 一种图像处理方法及相关装置 WO2021179142A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/078476 WO2021179142A1 (zh) 2020-03-09 2020-03-09 一种图像处理方法及相关装置
CN202080096698.5A CN115088252A (zh) 2020-03-09 2020-03-09 一种图像处理方法及相关装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/078476 WO2021179142A1 (zh) 2020-03-09 2020-03-09 一种图像处理方法及相关装置

Publications (1)

Publication Number Publication Date
WO2021179142A1 true WO2021179142A1 (zh) 2021-09-16

Family

ID=77670428

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/078476 WO2021179142A1 (zh) 2020-03-09 2020-03-09 一种图像处理方法及相关装置

Country Status (2)

Country Link
CN (1) CN115088252A (zh)
WO (1) WO2021179142A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023093291A1 (zh) * 2021-11-24 2023-06-01 腾讯科技(深圳)有限公司 图像处理方法、装置、计算机设备和计算机程序产品

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101009851A (zh) * 2007-01-19 2007-08-01 北京中星微电子有限公司 一种图像处理方法及其装置
US20120087582A1 (en) * 2010-10-06 2012-04-12 International Business Machines Corporation Method and system for resizing an image
US8508624B1 (en) * 2010-03-19 2013-08-13 Ambarella, Inc. Camera with color correction after luminance and chrominance separation
CN103686171A (zh) * 2013-12-19 2014-03-26 中山大学深圳研究院 一种自适应滤波分离的视频解码装置
CN107580163A (zh) * 2017-08-12 2018-01-12 四川精视科技有限公司 一种双镜头黑光摄像机
CN108900819A (zh) * 2018-08-20 2018-11-27 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备
CN109727215A (zh) * 2018-12-28 2019-05-07 Oppo广东移动通信有限公司 图像处理方法、装置、终端设备及存储介质

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109658341B (zh) * 2018-10-26 2021-01-01 深圳市华星光电技术有限公司 增强图像对比度的方法及其装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101009851A (zh) * 2007-01-19 2007-08-01 北京中星微电子有限公司 一种图像处理方法及其装置
US8508624B1 (en) * 2010-03-19 2013-08-13 Ambarella, Inc. Camera with color correction after luminance and chrominance separation
US20120087582A1 (en) * 2010-10-06 2012-04-12 International Business Machines Corporation Method and system for resizing an image
CN103686171A (zh) * 2013-12-19 2014-03-26 中山大学深圳研究院 一种自适应滤波分离的视频解码装置
CN107580163A (zh) * 2017-08-12 2018-01-12 四川精视科技有限公司 一种双镜头黑光摄像机
CN108900819A (zh) * 2018-08-20 2018-11-27 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备
CN109727215A (zh) * 2018-12-28 2019-05-07 Oppo广东移动通信有限公司 图像处理方法、装置、终端设备及存储介质

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023093291A1 (zh) * 2021-11-24 2023-06-01 腾讯科技(深圳)有限公司 图像处理方法、装置、计算机设备和计算机程序产品

Also Published As

Publication number Publication date
CN115088252A (zh) 2022-09-20

Similar Documents

Publication Publication Date Title
US10645268B2 (en) Image processing method and apparatus of terminal, and terminal
CN105409211B (zh) 用于图像处理的带皮肤色调校正的自动白平衡
US9942481B2 (en) Systems and methods for iterative adjustment of video-capture settings based on identified persona
WO2023016039A1 (zh) 视频处理方法、装置、电子设备和存储介质
KR102158844B1 (ko) 영상 처리 장치, 영상 처리 방법, 및 컴퓨터 판독가능 기록매체
WO2023016035A1 (zh) 视频处理方法、装置、电子设备和存储介质
WO2023016037A1 (zh) 视频处理方法、装置、电子设备和存储介质
WO2020215180A1 (zh) 图像处理方法、装置和电子设备
US9866764B2 (en) Method and apparatus for synchronizing auto exposure between chromatic pixels and panchromatic pixels in a camera system
US9654756B1 (en) Method and apparatus for interpolating pixel colors from color and panchromatic channels to color channels
CN114331916B (zh) 图像处理方法及电子设备
CN110807735A (zh) 图像处理方法、装置、终端设备及计算机可读存储介质
CN113727085B (zh) 一种白平衡处理方法、电子设备、芯片系统和存储介质
WO2021179142A1 (zh) 一种图像处理方法及相关装置
WO2024027287A1 (zh) 图像处理系统及方法、计算机可读介质和电子设备
WO2020215263A1 (zh) 一种图像处理方法及装置
KR102285756B1 (ko) 전자 시스템 및 영상 처리 방법
WO2023036034A1 (zh) 图像处理方法及其相关设备
WO2023016040A1 (zh) 视频处理方法、装置、电子设备和存储介质
WO2023016044A1 (zh) 视频处理方法、装置、电子设备和存储介质
CN115550575A (zh) 图像处理方法及其相关设备
CN116668838B (zh) 图像处理方法与电子设备
WO2023016043A1 (zh) 视频处理方法、装置、电子设备和存储介质
WO2024051110A1 (zh) 颜色数据的处理方法、光源系统、装置、设备和存储介质
US11405564B2 (en) Methods and systems for parameter alignment for an image capture device with multiple image capture devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20924264

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20924264

Country of ref document: EP

Kind code of ref document: A1