WO2023231583A1 - Image processing method and related device thereof - Google Patents

Image processing method and related device thereof

Info

Publication number
WO2023231583A1
WO2023231583A1 (application PCT/CN2023/087568)
Authority
WO
WIPO (PCT)
Prior art keywords
image
initial
initial image
color
processing method
Prior art date
Application number
PCT/CN2023/087568
Other languages
French (fr)
Chinese (zh)
Inventor
李子荣
毕涵
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Publication of WO2023231583A1

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/90
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N23/951: Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10024: Color image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Definitions

  • the present application relates to the field of image processing, and in particular, to an image processing method and related equipment.
  • CMOS image sensors are traditionally RGB (red, green, blue) sensors; in other words, such an image sensor can only receive the red channel signal, the green channel signal and the blue channel signal.
  • This application provides an image processing method and related equipment. By merging and reorganizing the channel signals of the original image, two frames of images are generated, one with better details and one with better color information; the two frames are then fused to generate a target image. This enables better restoration of image details and colors.
  • an image processing method which is applied to electronic devices.
  • the method includes:
  • the first initial image and the front-end processed image are fused to obtain a target image.
  • the image processing method provided by the embodiment of the present application obtains an original image including at least 4 channel signals, then merges and reorganizes the channel signals in the original image to generate a first initial image, a second initial image and a third initial image, and then performs front-end processing on the second initial image and the third initial image to obtain the front-end processed image. Since the first initial image has only undergone merging and reorganization, its detail richness is higher, while the color accuracy of the front-end processed image is higher. Based on this, the first initial image and the front-end processed image are used to generate the target image, so that better restoration of image details and colors can be achieved.
  • the original image is preprocessed to obtain a first initial image, a second initial image and a third initial image, including:
  • four-in-one pixel merging processing refers to a processing method in which four adjacent pixel values are weighted and averaged and output as a single pixel value
  • diagonal pixel merging processing refers to a processing method in which the pixel values of two pixels in the diagonal direction are weighted, averaged and output as a single pixel value.
  • a first initial image with high detail richness can be obtained; and by merging diagonal pixels of the original image, a second initial image and a third initial image including different color channel signals are obtained, so that the second initial image and the third initial image can provide better color information for subsequent color restoration.
  • the front-end processing at least includes: downsampling, noise reduction, automatic white balance and/or color correction, and upsampling.
  • front-end processing is used to process the colors of the second initial image and the third initial image, so that the processed front-end processed image can provide better colors for subsequent processing.
  • the energy function formula can be used to fuse the first initial image and the front-end processed image, so that the target image is expected to be close both to the gradient values of the first initial image and to the pixel values of the front-end processed image; in this way, the restored target image carries better detail texture information and better color information.
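The fusion described above can be sketched as a small energy minimization. The following is a minimal illustration, not the patent's actual implementation: a target image `T` is pulled toward the gradients of the detail-rich first initial image (`detail`) and the pixel values of the front-end processed image (`color`), with `lam` weighting the color-fidelity term. All names, the periodic boundary handling and the plain gradient-descent solver are assumptions for illustration.

```python
import numpy as np

def _lap(x):
    # Discrete Laplacian with periodic boundaries.
    return (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
            np.roll(x, 1, 1) + np.roll(x, -1, 1) - 4.0 * x)

def energy(T, detail, color, lam):
    # E(T) = ||grad T - grad detail||^2 + lam * ||T - color||^2,
    # with periodic forward differences for the gradient.
    dx = lambda x: np.roll(x, -1, 1) - x
    dy = lambda x: np.roll(x, -1, 0) - x
    return (np.sum((dx(T) - dx(detail)) ** 2) +
            np.sum((dy(T) - dy(detail)) ** 2) +
            lam * np.sum((T - color) ** 2))

def fuse(detail, color, lam=0.1, iters=500, step=0.1):
    # Gradient descent on E: the result keeps the texture (gradients) of
    # `detail` and the colors (pixel values) of `color`.
    T = detail.astype(np.float64).copy()
    lap_d = _lap(detail.astype(np.float64))
    for _ in range(iters):
        grad_E = -2.0 * (_lap(T) - lap_d) + 2.0 * lam * (T - color)
        T -= step * grad_E
    return T
```

With `step` below 2 divided by the largest Hessian eigenvalue (about 16 + 2·lam here), each iteration strictly decreases the energy.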
  • the image processing method further includes:
  • back-end processing is used to further enhance the detail and color of the target image.
  • the back-end processing includes: at least one of demosaicing, gamma correction, and style transformation.
  • the original image includes a red channel signal, a green channel signal, a blue channel signal, a yellow channel signal, a cyan channel signal and a magenta channel signal.
  • the first operation refers to an operation of clicking the camera application.
  • the first interface refers to a photographing interface of the electronic device
  • the first control refers to a control for instructing to photograph.
  • the first operation refers to an operation of clicking a control for instructing to take a photo.
  • the first interface refers to a video shooting interface of the electronic device
  • the first control refers to a control used to instruct video shooting.
  • the first operation refers to an operation of clicking a control indicating shooting a video.
  • the above description takes the first operation as a click operation as an example; the first operation may also include a voice instruction operation, or other operations instructing the electronic device to take a photo or video. The above is an example and does not limit the application in any way.
  • an electronic device including a module/unit for performing the first aspect or any method in the first aspect.
  • an electronic device including one or more processors and memories;
  • the memory is coupled to the one or more processors, the memory is used to store computer program code, the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform the first aspect or any method in the first aspect.
  • a chip system is provided.
  • the chip system is applied to an electronic device.
  • the chip system includes one or more processors.
  • the processor is used to call computer instructions to cause the electronic device to execute the first aspect or any method in the first aspect.
  • a computer-readable storage medium stores a computer program.
  • the computer program includes program instructions. When executed by a processor, the program instructions cause the processor to perform the first aspect or any method in the first aspect.
  • a computer program product includes: computer program code.
  • when the computer program code is run by an electronic device, it causes the electronic device to execute the first aspect or any method in the first aspect.
  • This application provides an image processing method and related equipment.
  • a first initial image, a second initial image and a third initial image are generated.
  • front-end processing is then performed on the second initial image and the third initial image to obtain the front-end processed image. Since the first initial image has only undergone merging and reorganization, its detail richness is higher, while the color accuracy of the front-end processed image is higher; based on this, the first initial image and the front-end processed image are used to generate the target image, so that better restoration of image details and colors can be achieved.
  • Figure 1 is an imaging schematic diagram of an RGBCMY sensor
  • Figure 2 is a spectral response curve of RGBCMY
  • Figure 3 is a schematic diagram of using 24 color blocks to determine the CCM matrix
  • Figure 4 is a comparison before and after using CCM matrix for processing
  • Figure 5 is a schematic diagram of an application scenario
  • FIG. 6 is a schematic flowchart of an image processing method provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of an original image provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of Qbin processing of an original image provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of a Dbin processing process provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of Dbin processing of an original image provided by an embodiment of the present application.
  • Figure 11 is a schematic flow chart of another image processing method provided by an embodiment of the present application.
  • Figure 12 is a schematic diagram of front-end processing provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram of the effect of the fusion process provided by the embodiment of the present application.
  • Figure 14 is a schematic flow chart of another image processing method provided by an embodiment of the present application.
  • Figure 15 is a schematic diagram of back-end processing provided by an embodiment of the present application.
  • Figure 16 is a schematic diagram of the effect of the image processing method provided by the embodiment of the present application.
  • Figure 17 is a schematic diagram of a display interface of an electronic device provided by an embodiment of the present application.
  • Figure 18 is a schematic diagram of a display interface of another electronic device provided by an embodiment of the present application.
  • Figure 19 is a schematic diagram of a hardware system suitable for the electronic device of the present application.
  • Figure 20 is a schematic diagram of a software system suitable for the electronic device of the present application.
  • Figure 21 is a schematic structural diagram of an image processing device provided by an embodiment of the present application.
  • Figure 22 is a schematic structural diagram of a chip system provided by an embodiment of the present application.
  • “first” and “second” are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the quantity of indicated technical features. Therefore, features defined as “first” and “second” may explicitly or implicitly include one or more of these features. In the description of this embodiment, unless otherwise specified, “plurality” means two or more.
  • RGB (red, green, blue) color space, or RGB domain, refers to a color model related to the structure of the human visual system, in which all colors are regarded as different combinations of red, green and blue. Red, green and blue are called the three primary colors. It should be understood that a primary color refers to a “basic color” that cannot be obtained by mixing other colors.
  • YUV color space or YUV domain refers to a color encoding method.
  • Y represents brightness
  • U and V represent chroma.
  • the above-mentioned RGB color space focuses on the human eye's perception of color, while the YUV color space focuses on the visual sensitivity to brightness.
  • the RGB color space and the YUV color space can be converted into each other.
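As a concrete illustration of this interconversion, the analog BT.601 coefficients (one common choice; other standards such as BT.709 use different coefficients) can be applied as a 3×3 matrix:

```python
import numpy as np

# BT.601 analog RGB -> YUV matrix; an illustrative choice of coefficients.
RGB2YUV = np.array([[ 0.299,    0.587,    0.114  ],
                    [-0.14713, -0.28886,  0.436  ],
                    [ 0.615,   -0.51499, -0.10001]])

def rgb_to_yuv(rgb):
    # rgb: (..., 3) array with values in [0, 1]; row 0 of the matrix is the
    # luma (Y) weighting, rows 1-2 give the chroma components U and V.
    return rgb @ RGB2YUV.T

def yuv_to_rgb(yuv):
    # Inverse transform, showing the two color spaces are interconvertible.
    return yuv @ np.linalg.inv(RGB2YUV).T
```

Note that pure white maps to Y near 1 with U and V near 0, matching the description of Y as brightness and U, V as chroma.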
  • Pixel value refers to a set of color components corresponding to each pixel in a color image located in the RGB color space.
  • each pixel corresponds to a set of three primary color components, where the three primary color components are the red component R, the green component G, and the blue component B respectively.
  • Bayer pattern color filter array: when an actual scene is converted into image data, the image sensor would ideally receive the information of three channel signals (red, green and blue) at every pixel and then synthesize the information of the three channel signals into a color image. However, this scheme requires three filters at each pixel position, which is expensive and difficult to produce. Therefore, the surface of the image sensor can instead be covered with a color filter array so that each pixel obtains the information of only one of the three channel signals.
  • a Bayer format color filter array refers to filters arranged in a checkerboard format. For example, the minimum repeating unit in the Bayer format color filter array is: one filter that obtains the red channel signal, two filters that obtain the green channel signal and one filter that obtains the blue channel signal, arranged in a 2×2 pattern.
  • Bayer image: the image output by the image sensor based on the Bayer format color filter array. Pixels of multiple colors in this image are arranged in a Bayer format, and each pixel only corresponds to the channel signal of one color. For example, since human vision is more sensitive to green, it can be set that green pixels (pixels corresponding to the green channel signal) account for 50% of all pixels, while blue pixels (pixels corresponding to the blue channel signal) and red pixels (pixels corresponding to the red channel signal) each account for 25% of all pixels. The smallest repeating unit of the Bayer format image is: one red pixel, two green pixels and one blue pixel arranged in a 2×2 manner. It should be understood that the RAW domain is the RAW color space, and a Bayer format image can be called an image located in the RAW domain.
  • Grayscale image: a single-channel image used to represent different brightness levels, where the brightest is completely white and the darkest is completely black. That is, each pixel in a grayscale image corresponds to a different degree of brightness between black and white. To describe the brightness change from brightest to darkest, the range is usually divided, for example, into 256 parts representing 256 levels of brightness, called 256 gray levels (the 0th gray level to the 255th gray level).
  • Spectral response which can also be called spectral sensitivity.
  • Spectral response represents the ability of the image sensor to convert incident light energy of different wavelengths into electrical energy. If the light energy incident on the image sensor at a certain wavelength is expressed as a number of photons, and the current generated by the image sensor and transmitted to the external circuit is expressed as a number of electrons, then the ability of each incident photon to be converted into electrons in the external circuit is called quantum efficiency (QE), expressed as a percentage.
  • the spectral responsivity of the image sensor depends on the quantum efficiency, as well as parameters such as wavelength and integration time.
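The definition above can be sketched numerically (an illustrative calculation only, not part of the patent): the number of incident photons follows from the optical power, integration time and photon energy h·c/λ, and QE is the electron count divided by the photon count.

```python
# Physical constants (SI units).
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

def quantum_efficiency(optical_power_w, wavelength_m, integration_s, electrons):
    # Photons collected = incident energy / energy per photon (h*c/lambda).
    photon_energy_j = H * C / wavelength_m
    photons = optical_power_w * integration_s / photon_energy_j
    # QE: fraction of incident photons converted to electrons, in percent.
    return 100.0 * electrons / photons
```

This also shows why spectral responsivity depends on wavelength and integration time as well as on QE itself.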
  • the human eye has the characteristic of color constancy: in most cases, the color of the same object seen under various light sources is consistent; for example, white paper looks white. In order to eliminate the impact of the light source on the imaging of the image sensor, simulate the color constancy of human vision, and ensure that white seen in any scene is truly white, it is necessary to correct the color temperature and automatically adjust the white balance to the appropriate position.
  • the filter colors of different color filter arrays constitute the camera color space (RAW domain or RAW color space). Therefore, the camera color space is not a universal color space.
  • a color filter array with a filter color of RGGB forms a camera color space of RAW RGB. If the Bayer format image or RAW image generated by the color filter array is directly displayed, the image will be greenish.
  • CCM: color correction matrix.
  • the CCM matrix is mainly used to convert image data obtained after automatic white balance into the standard color space (sRGB). Since there is a big difference between the spectral response of a CMOS sensor and the spectral response of the human eye to visible light, the color reproduction of the camera differs considerably from the color of the object perceived by an observer. Therefore, it is necessary to improve the color saturation of the object through the CCM matrix, making the color of the image captured by the camera closer to the perception of the human eye. The process of correction using the CCM matrix is the process of color correction.
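For an ordinary RGB sensor, applying the CCM is a per-pixel 3×3 matrix multiply. The matrix below is a made-up example whose rows sum to 1 so that white is preserved; real CCMs are fitted per sensor and per illuminant.

```python
import numpy as np

# Hypothetical 3x3 CCM (identity plus cross-talk correction); each row sums
# to 1.0 so that a white-balanced white pixel stays white.
CCM = np.array([[ 1.6, -0.4, -0.2],
                [-0.3,  1.5, -0.2],
                [-0.1, -0.5,  1.6]])

def apply_ccm(rgb, ccm=CCM):
    # Map white-balanced camera RGB to sRGB: each output pixel is ccm @ pixel.
    out = rgb @ ccm.T
    return np.clip(out, 0.0, 1.0)
```

The negative off-diagonal entries are what boost saturation; they are also why large fitted coefficients amplify noise, as discussed later.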
  • CMOS image sensors currently used for visible light imaging are traditional RGB sensors. Due to hardware limitations, this image sensor can only receive red channel signals, green channel signals, and blue channel signals. In this way, the number of spectral response channels of the image sensor is very limited, and a small number of spectral response channels will limit the color restoration capability of the image sensor and affect the color and other information of the restored image.
  • to expand the number of spectral response channels, CMOS sensors with a multispectral response, also known as multispectral sensors, can be used instead.
  • noise problems will occur when using multispectral sensors for imaging, and usually as the number of spectral response channels increases, the noise problems that occur during imaging will become more serious.
  • multispectral means that the spectral bands used for imaging include 2 or more bands. According to this definition, since the RGB sensor utilizes the three bands of red, green and blue, the RGB sensor is, strictly speaking, also multispectral. However, it should be noted that the visible-light multispectral response CMOS sensors referred to in this application actually mean multispectral sensors that have a larger number of spectral response channels than RGB sensors.
  • the multispectral sensor may be an RGBC sensor, an RGBM sensor, an RGBY sensor, an RGBCM sensor, an RGBCY sensor, an RGBMY sensor, an RGBCMY sensor, etc.
  • the RGBCMY sensor receives red channel signal, green channel signal, blue channel signal, cyan channel signal, magenta channel signal and yellow channel signal. The channel colors received by other sensors are deduced in sequence and will not be described again here.
  • the multispectral sensor can also be a sensor that receives signals from other color channels, and can be specifically selected and set according to needs.
  • the embodiments of the present application do not impose any restrictions on this.
  • Figure 1 provides an imaging schematic diagram of an RGBCMY sensor.
  • the color filter array covered on the surface of the RGBCMY sensor can obtain information from six color channel signals.
  • the minimum repeating unit in the Bayer format color filter array is: two filters that obtain the red channel signal, four filters that obtain the green channel signal, two filters that obtain the blue channel signal, two filters that obtain the cyan channel signal, two filters that obtain the magenta channel signal and four filters that obtain the yellow channel signal, arranged in a 4×4 matrix.
  • the minimum repeating unit of the Bayer format image obtained using the RGBCMY sensor is: two red pixels, four green pixels, two blue pixels, two cyan pixels, two magenta pixels and four yellow pixels, arranged in a 4×4 matrix.
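One 4×4 layout consistent with these counts (and with the diagonal color pairings described later for Dbin) is sketched below. The exact arrangement is defined by Figure 1, so treat this grid as an assumption for illustration.

```python
import numpy as np

# Hypothetical 4x4 minimal repeating unit with 2 R, 4 G, 2 B, 2 C, 2 M and
# 4 Y filters; each 2x2 block pairs two colors along its diagonals.
UNIT = np.array([['G', 'Y', 'B', 'C'],
                 ['Y', 'G', 'C', 'B'],
                 ['R', 'M', 'G', 'Y'],
                 ['M', 'R', 'Y', 'G']])

def tile_cfa(unit, h, w):
    # Repeat the minimal unit to cover an h x w sensor.
    return np.tile(unit, (h // unit.shape[0], w // unit.shape[1]))
```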
  • Figure 2 provides a schematic diagram of the spectral response curve of RGBCMY.
  • the horizontal axis represents the wavelength, and the vertical axis represents the spectral responsivity corresponding to different spectra.
  • the spectral response curve indicated by R represents the different spectral responsivity of red light at different wavelengths
  • the spectral response curve indicated by G represents the different spectral responsivity of green light at different wavelengths
  • the spectral response curve indicated by B represents the different spectral responsivity of blue light at different wavelengths
  • the spectral response curve indicated by C represents the different spectral responsivity of cyan light at different wavelengths
  • the spectral response curve indicated by M represents the different spectral responsivity of magenta light at different wavelengths.
  • the spectral response curve indicated by Y represents the different spectral responsivity of yellow light at different wavelengths.
  • take the RGBCMY sensor as an example: compared with the RGB sensor, due to the increase in the number of primary colors and in the number of spectral response channels, the RGBCMY sensor can generally achieve relatively better color restoration capability, that is, better color accuracy.
  • the Bayer format image acquired by the sensor is usually processed through automatic white balance and CCM matrix to restore the scene color.
  • for the Bayer format image acquired by the RGBCMY sensor, it is usually also possible to process the image through automatic white balance and the CCM matrix to restore the scene color.
  • the CCM matrix used in this process needs to be fitted in advance.
  • the CCM matrix corresponding to the RGBCMY sensor is a 6×3 matrix, which includes more parameter values.
  • however, when fitting the CCM matrix corresponding to the RGBCMY sensor, an over-fitting phenomenon is usually encountered, which causes some parameter values in the fitted CCM matrix to be too large.
  • as a result, the noise in the Bayer format image obtained by the RGBCMY sensor will be amplified, causing serious color noise problems.
  • color noise refers to colored noise.
  • Figure 3 provides a schematic diagram of determining the CCM matrix using 24 color blocks.
  • a 24-color card is generated from the image data acquired by the RGBCMY sensor after automatic white balance processing and demosaicing (DM).
  • the corresponding CCM matrix at a color temperature of 6500K can be obtained.
  • this CCM matrix is the coefficient matrix by which the 24 colors shown in (a) in Figure 3 need to be multiplied.
  • each color shown in (a) in Figure 3 corresponds to the six primary color values of R, G, B, C, M and Y, while each color shown in (b) in Figure 3 only corresponds to the three primary color values of R, G and B.
  • the fitted CCM matrix is a 6×3 matrix, that is, it includes 18 parameter values. Since over-fitting is usually encountered in the fitting process, some of the 18 parameter values included in the CCM matrix become too large, which leads to the noise of the image processed with the fitted CCM matrix being amplified during actual processing.
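The fitting itself can be sketched as a least-squares problem: given the 24 measured six-channel values and the 24 target RGB values, solve for the 6×3 matrix. A ridge (Tikhonov) variant is a standard way to suppress the over-large coefficients that over-fitting produces; the data in the sketch below is synthetic and the function names are illustrative, not the patent's.

```python
import numpy as np

def fit_ccm(measured, target):
    # Plain least squares: argmin_M ||measured @ M - target||_F^2.
    # measured: (24, 6) sensor responses; target: (24, 3) reference sRGB.
    M, *_ = np.linalg.lstsq(measured, target, rcond=None)
    return M

def fit_ccm_ridge(measured, target, lam=1e-2):
    # Ridge regression shrinks the fitted coefficients, one standard way
    # to counter the over-large entries that over-fitting produces.
    A = measured
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ target)
```

With only 24 patches constraining 18 parameters, the plain fit is poorly conditioned in practice, which is exactly the over-fitting issue the text describes.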
  • Figure 4 provides a before-and-after comparison of processing using the CCM matrix.
  • the image on the left is a Bayer format image acquired by the RGB sensor
  • the image on the right is an image processed using the corresponding 3×3 CCM matrix
  • compared to the image on the left, the noise in the image on the right is amplified, but not very obviously.
  • the image on the left is a Bayer format image obtained by the RGBCMY sensor
  • the image on the right is an image processed using the corresponding 6×3 CCM matrix
  • compared to the image on the left, the noise in the image on the right is amplified; and compared to (a) in Figure 4, the noise problem is more serious.
  • embodiments of the present application provide an image processing method that merges and reorganizes the channel signals of the original image to generate two frames of images, one with better details and one with better color information, and then fuses the two frames of images to generate the target image, so that better restoration of the details and colors of the target image can be achieved.
  • the image processing method provided by the embodiment of the present application can be applied to the field of photography.
  • Figure 5 shows a schematic diagram of an application scenario provided by the embodiment of the present application.
  • the electronic device is a mobile phone, which includes a multispectral sensor other than an RGB sensor.
  • the electronic device in response to the user's operation, can start the camera application and display a graphical user interface (GUI) as shown in Figure 5.
  • GUI graphical user interface
  • the GUI can be called a first interface.
  • the first interface includes multiple shooting mode options and first controls.
  • the multiple shooting modes include, for example, photo taking mode, video recording mode, etc.
  • the first control is, for example, the shooting key 11 , and the shooting key 11 is used to indicate that the current shooting mode is one of the multiple shooting modes.
  • when the user starts the camera application and wants to take pictures of outdoor grass and trees at night, the user clicks the shooting key 11 on the first interface; after the electronic device detects the user's click operation on the shooting key 11, in response to the click operation, it runs the program corresponding to the image processing method provided by the embodiment of the present application to obtain an image.
  • the multispectral sensor included in this electronic device is not an RGB sensor but, for example, an RGBCMY sensor.
  • the spectral response range of this electronic device is expanded relative to the existing technology; that is to say, the color restoration capability is improved.
  • however, because the CCM matrix may have an over-fitting problem, the noise of the image may be amplified after processing with the CCM matrix.
  • when the electronic device performs processing using the image processing method provided by the embodiment of the present application, it can ensure color restoration while reducing noise, thereby improving the quality of the captured image or video.
  • FIG. 6 shows a schematic flowchart of an image processing method provided by an embodiment of the present application. As shown in Figure 6, the embodiment of the present application provides an image processing method 1.
  • the image processing method 1 includes the following S11 to S16.
  • the first control is, for example, the shooting key 11 shown in FIG. 5
  • the first operation is, for example, a click operation.
  • the first operation can also be other operations, and the embodiment of the present application does not impose any limitation on this.
  • the original image includes channel signals of at least four colors.
  • the original image is a Bayer format image, or is located in the RAW domain.
  • Figure 7 shows a schematic diagram of an original image.
  • as shown in (a) in Figure 7, the original image may include channel signals of 4 colors (for example, t1, t2, t3 and t4); or, as shown in (b) in Figure 7, the original image may include channel signals of 5 colors (for example, t1, t2, t3, t4 and t5); or, as shown in (c) in Figure 7, the original image may include channel signals of 6 colors (for example, t1, t2, t3, t4, t5 and t6).
  • the original image may also include channel signals of more colors, and the embodiments of the present application do not impose any limitation on this.
  • the arrangement of channel signals included in the original image can be set and modified as needed.
  • the arrangement shown in FIG. 7 is only an example, and the embodiment of the present application does not impose any restrictions on this.
  • 1 frame, 2 frames, or more than 2 frames of original images may be acquired. Specifically, it can be obtained as needed, and the embodiments of this application do not impose any restrictions on this.
  • the multi-frame original image can be collected using a multispectral sensor included in the electronic device itself or obtained from other devices.
  • the specific settings can be set as needed, and the embodiments of the present application do not impose any restrictions on this.
  • when using its own multispectral sensor to acquire multiple frames of original images, the multispectral sensor can output the multiple frames simultaneously or serially; this can be selected and set as needed, and the embodiments of the present application do not impose any restrictions on this.
  • multiple frames of original images are output from the multispectral sensor, they can be output simultaneously or serially, but no matter how they are output, the multiple frames of original images are actually images generated by taking the same shot of the scene to be shot.
  • the scene to be shot refers to all objects in the camera's shooting perspective.
  • the scene to be shot can also be called the target scene, or it can also be understood as the scene that the user expects to shoot.
  • the preprocessing is used to merge and recombine multiple color channel signals included in the original image.
  • the preprocessing can include four-in-one pixel binning (quarter binning, Qbin) processing and diagonal pixel binning (diagonal binning, Dbin) processing.
  • other methods can also be used for merging and reorganization, and specific settings and changes can be made as needed.
  • the embodiments of the present application do not impose any restrictions on this.
  • Qbin processing refers to a processing method in which four adjacent pixel values are weighted and averaged and output as a single pixel value.
  • Dbin processing refers to a processing method in which two pixel values in the diagonal direction are weighted, averaged and output as a single pixel value.
  • the weights assigned during weighting can be set and modified as needed, and the embodiments of the present application do not impose any restrictions on this. For example, the weights can all be set to 1.
  • the above S14 may include:
  • FIG. 8 shows a schematic diagram of Qbin processing of original images.
  • the original image includes channel signals of 6 colors.
  • the channel signals of the 6 colors are respectively the red channel signal (R), green channel signal (G), blue channel signal (B), cyan channel signal (C), magenta channel signal (M) and yellow channel signal (Y); these 6 colors are arranged in a 4×4 arrangement and repeat with the minimum repeating unit shown in Figure 1.
  • for example, the pixel values corresponding to the four adjacent pixel channel signals, namely the pixel channel signal B in the 1st row, 3rd column; the pixel channel signal C in the 1st row, 4th column; the pixel channel signal C in the 2nd row, 3rd column; and the pixel channel signal B in the 2nd row, 4th column, are weighted and averaged to obtain a single pixel value, for example, T2.
  • the T2 is the pixel value of the pixel in the first row and the second column corresponding to the first initial image.
  • the first initial image can be determined based on the original image.
  • Each pixel in the first initial image corresponds to a pixel value. Therefore, the first initial image can still be considered as an image in the RAW domain.
• the size of each side of the first initial image is half that of the original image, and the area of the entire image is one quarter that of the original image.
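As an illustration of the Qbin idea (not the patent's exact implementation), the following minimal sketch sets all weights to 1, so the weighted average reduces to a plain mean over each 2×2 block:

```python
import numpy as np

def qbin(raw: np.ndarray) -> np.ndarray:
    """Four-in-one pixel binning: merge each 2x2 block into one pixel.

    With all weights set to 1, the weighted average is a plain mean;
    each side of the output is half that of the input, so the output
    area is one quarter of the input area.
    """
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

raw = np.arange(16, dtype=np.float64).reshape(4, 4)
first_initial = qbin(raw)  # 2x2 "first initial image"
```

Non-uniform weights would replace the mean with a weighted sum over the four samples of each block.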
  • Figure 9 shows a schematic diagram of Dbin processing
• Figure 10 shows a schematic diagram of Dbin processing of the original image.
• the weight is assumed to be 1, and so on for the others.
  • the pixel value of the pixel in row 1 and column 1 in the second initial image is the average of the pixel values of the pixel in row 1 and column 1 and the pixel in row 2 and column 2 in the original image, for example, it is the pixel corresponding to the green channel signal.
• the pixel value of the pixel in row 1, column 1 in the third initial image is the average of the pixel values of the pixel in row 1, column 2 and the pixel in row 2, column 1 in the original image, for example, the pixel value corresponding to the yellow channel signal, and so on for the others.
  • the second initial image and the third initial image are also images in the RAW domain.
• the pixel value in the second initial image is obtained by the weighted average of the two pixels in the upper left corner and the lower right corner of four adjacent pixels in the original image, while the pixel value in the third initial image is obtained by the weighted average of the two pixels in the lower left corner and the upper right corner of those four pixels, so the pixel values of the second initial image and the third initial image are different.
• since one pixel value of the second initial image and of the third initial image is obtained by a weighted average of two pixel values in the diagonal direction among four adjacent pixels of the original image, the size of each side of the second initial image and of the third initial image is half that of the original image, and the size of the entire image is one quarter that of the original image.
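The diagonal binning described above can be sketched as follows; the equal weights and the sample values are illustrative only:

```python
import numpy as np

def dbin(raw: np.ndarray):
    """Diagonal pixel binning with all weights set to 1.

    Within every 2x2 block of the original image, the second initial
    image averages the upper-left and lower-right pixels, and the third
    initial image averages the lower-left and upper-right pixels, so
    each output has half the side length of the input.
    """
    second = (raw[0::2, 0::2] + raw[1::2, 1::2]) / 2.0  # main diagonal
    third = (raw[1::2, 0::2] + raw[0::2, 1::2]) / 2.0   # anti-diagonal
    return second, third

raw = np.array([[1.0, 2.0, 3.0, 4.0],
                [5.0, 7.0, 8.0, 6.0],
                [9.0, 10.0, 11.0, 12.0],
                [13.0, 14.0, 15.0, 16.0]])
second_initial, third_initial = dbin(raw)
```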
• Qbin processing and Dbin processing can be performed in the same image signal processor (image signal processor, ISP), performed separately in two image signal processors, or performed in a multispectral sensor; the specific arrangement can be set as needed, and the embodiments of the present application do not impose any restrictions on this.
  • Front-end processing only means that this step is before fusion, so it is called “front-end” processing and has no other meaning.
  • Front-end processing may also be called first processing, etc., and the embodiments of this application do not impose any restrictions on this.
  • the front-end processing is used to process the colors of the second initial image and the third initial image, so that the front-end processed image obtained after processing can provide better colors for subsequent processing.
• since the channel signals included in the second initial image and the third initial image have different colors, the color information of the front-end processed image obtained from the second initial image and the third initial image is better preserved; that is to say, the front-end processed image can provide better color reproduction capability for subsequent processing.
• the front-end processing provided by the embodiment of the present application can be performed in the same ISP as the above-mentioned Qbin processing and/or Dbin processing, performed in a separate ISP, or performed in a multispectral sensor; the specific arrangement can be set as needed, and the embodiments of the present application do not impose any restrictions on this.
• the front-end processing may at least include: downsampling (down-scale sampling), noise reduction (denoise), automatic white balance and/or color correction, and upsampling (up-sampling).
  • downsampling is used to split and recombine the channel signals included in the image to reduce the size of the image.
  • Noise reduction is used to reduce noise in images. Common methods include mean filtering, Gaussian filtering, bilateral filtering, etc. Of course, other methods can also be used for noise reduction, and the embodiments of the present application do not impose any limitations on this.
  • Automatic white balance is used to correct the down-sampled and noise-reduced image to the D65 reference light source so that its white color appears truly white.
  • Color correction is used to calibrate the accuracy of colors other than white.
• it is equivalent to using a color correction matrix (CCM) to correct multiple color channel signals into three color channel signals, for example, a red channel signal, a green channel signal, and a blue channel signal.
• the CCM matrix used when performing color correction can be a previously fitted CCM matrix.
  • the CCM matrix under the D65 reference light source can be determined by interpolating the CCM matrices corresponding to other color temperatures.
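A minimal sketch of applying white-balance gains followed by a CCM; the gain values and matrix entries below are invented for illustration, not calibrated values from the patent:

```python
import numpy as np

# Illustrative per-channel white-balance gains and a 3x3 CCM whose rows
# each sum to 1 (assumed values; real ones are fitted per light source).
wb_gains = np.array([1.8, 1.0, 1.5])  # R, G, B gains
ccm = np.array([[1.6, -0.4, -0.2],
                [-0.3, 1.5, -0.2],
                [-0.1, -0.5, 1.6]])

def awb_then_ccm(rgb: np.ndarray) -> np.ndarray:
    """Scale each channel by its gain, then apply the CCM per pixel."""
    balanced = rgb * wb_gains
    return balanced @ ccm.T  # each pixel becomes ccm @ [R, G, B]

# A patch that is neutral grey after white balance stays neutral,
# because every CCM row sums to one.
grey = np.full((2, 2, 3), 0.5) / wb_gains
corrected = awb_then_ccm(grey)
```

Interpolating between CCMs fitted at different color temperatures, as the text describes for D65, would simply blend two such matrices element-wise.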
  • Upsampling is used to enlarge the image size. It should be understood that due to the previous downsampling, the image size is reduced, so accordingly, upsampling needs to be performed to enlarge the image size and restore the image size to facilitate subsequent fusion.
  • FIG 12 shows a schematic diagram of front-end processing.
  • front-end processing includes: downsampling, noise reduction, automatic white balance, color correction, and upsampling in the processing order.
• the three color channels included in the second initial image can be split, and then the red channel signals recombined to generate one frame of a monochrome channel image including only the red channel signal, the green channel signals recombined to generate one frame including only the green channel signal, and the blue channel signals recombined to generate one frame including only the blue channel signal.
• the three color channels included in the third initial image can be split, and then the yellow channel signals recombined to generate one frame of a monochrome channel image including only the yellow channel signal, the cyan channel signals recombined to generate one frame including only the cyan channel signal, and the magenta channel signals recombined to generate one frame including only the magenta channel signal.
• the side length of each frame of monochrome channel image is one half that of the original second initial image or third initial image; in other words, the overall area of each frame of monochrome channel image is one quarter that of the original second initial image or third initial image.
  • one frame of a monochromatic channel image including only red channel signals, one frame of monochromatic channel images including only green channel signals, and one frame of monochromatic channel images including only blue channel signals can be obtained.
  • multiple frames of monochromatic channel images are all of the same size.
• the three frames of monochrome channel images can be upsampled, and the red channel signal, green channel signal, and blue channel signal they include spliced and recombined to determine a front-end processed image including three color channel signals.
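The split-then-recombine step can be sketched as follows for channels that each occupy one fixed position of a 2×2 period; the layout is an assumption for illustration:

```python
import numpy as np

def split_channels(mosaic: np.ndarray) -> dict:
    """Split a 2x2-periodic mosaic into four monochrome channel planes.

    Each plane has half the side length of the input, matching the
    description that every monochrome channel image covers one quarter
    of the original area.
    """
    return {(dy, dx): mosaic[dy::2, dx::2]
            for dy in (0, 1) for dx in (0, 1)}

def recombine(planes: dict) -> np.ndarray:
    """Re-interleave the channel planes back into one full-size mosaic."""
    h, w = planes[(0, 0)].shape
    out = np.empty((2 * h, 2 * w), dtype=planes[(0, 0)].dtype)
    for (dy, dx), plane in planes.items():
        out[dy::2, dx::2] = plane
    return out

mosaic = np.arange(16).reshape(4, 4)
planes = split_channels(mosaic)  # four quarter-area planes
restored = recombine(planes)     # back to the full-size mosaic
```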
  • front-end processed images can include better color information.
  • the front-end processed image is a Bayer format image, that is to say, the front-end processed image is an image located in the RAW domain.
• front-end processing can also include at least one of dynamic dead pixel compensation (defect pixel correction, DPC), lens shading correction (lens shading correction, LSC), and wide dynamic range adjustment (wide range compression, WDR).
  • dynamic bad pixel compensation is used to solve the defects in the array formed by the light collection points on the multi-spectral sensor, or the errors in the process of converting the light signal; usually by taking the average value of other surrounding pixels in the brightness domain to eliminate bad pixels.
  • lens shading correction is used to eliminate the problem of inconsistency between the color and brightness around the image and the center of the image due to the lens optical system.
• Wide dynamic range adjustment refers to the following: when high-brightness areas illuminated by strong light sources (sunlight, lamps, reflections, etc.) and relatively low-brightness areas such as shadows and backlit regions coexist in the image, the bright areas become white from overexposure while the dark areas become black from underexposure, seriously affecting image quality. Therefore, the brightest and darkest areas can be adjusted within the same scene, for example, brightening the dark areas and darkening the bright areas, so that the processed image shows more details in both the dark areas and the bright areas.
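As a toy illustration of that bright/dark adjustment (a single global curve, whereas practical WDR is usually locally adaptive):

```python
import numpy as np

def wdr_adjust(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Toy wide-dynamic-range adjustment on values normalised to [0, 1].

    A power curve with exponent < 1 lifts dark areas while compressing
    bright areas toward 1, revealing detail at both ends. The exponent
    value here is arbitrary, chosen only for demonstration.
    """
    return np.clip(img, 0.0, 1.0) ** strength

# dark pixel 0.04 is lifted, bright pixel 0.81 is only mildly changed
adjusted = wdr_adjust(np.array([0.04, 0.81]))
```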
  • the front-end processing may include one or more of the above-mentioned processing steps.
  • the front-end processing includes multiple processing steps, the order of the multiple processing steps may be adjusted as needed.
  • This embodiment of the present application does not impose any limitation on this.
  • the front-end processing may also include other steps, which may be added as needed. The embodiments of the present application do not impose any restrictions on this.
• since the first initial image is directly recombined from the channel signals of the original image without any other processing, it carries more texture details; therefore, to ensure richness of detail, the first initial image can be used in the fusion processing, so that the restored image carries more details while the colors of the scene are restored.
• the front-end processed image is obtained from the second initial image and the third initial image after a series of color processing; some details are missing, but good color information is retained. Therefore, to ensure color richness, the front-end processed image can be used in the fusion processing, so that the restored image has good color when the scene colors are restored.
  • the resolution of the first initial image is relatively high and can be called a high resolution (HR) image.
  • the resolution of the front-end processed image is relatively low and can be called a low resolution (LR) image.
• when the first initial image and the front-end processed image are fused, the pixel values corresponding to the same position can be added or multiplied according to different weights, or a network model can be used for fusion; of course, other methods can also be used for fusion processing, which can be selected and set as needed, and the embodiments of this application do not impose any restrictions on this.
  • f is the pixel value of the target image
  • g is the pixel value of the first initial image
  • d is the pixel value of the front-end processed image
• f_x and f_y are the gradient values of the target image in the x direction and y direction
• g_x and g_y are the gradient values of the first initial image in the x direction and y direction
• w_d, w_x, and w_y are weight values
  • E(f) is the energy function of f.
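The energy function itself does not survive in this text; based on the variables defined above, a typical gradient-constrained fusion objective (a reconstruction, not quoted from the source) would be:

```latex
E(f) = \sum_{p} \left[ w_d \big(f(p) - d(p)\big)^2
     + w_x \big(f_x(p) - g_x(p)\big)^2
     + w_y \big(f_y(p) - g_y(p)\big)^2 \right]
```

Minimizing E(f) over all pixels p pulls the pixel values of the target image toward those of the front-end processed image d, and its gradients toward those of the first initial image g.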
• the high-resolution first initial image is equivalent to a constraint map for the gradient values of the target image; since the first initial image has a higher degree of detail, the closer the gradient values of the target image are to those of the first initial image, the better. Here, the gradient value reflects the rate of change of the image data.
  • the low-resolution front-end processed image is equivalent to the constraint map of the pixel values of the target image. Since the color richness of the front-end processed image is higher, the closer the pixel value of the target image is to the pixel value of the front-end processed image, the better.
• the target image is thereby kept close to the gradient values of the first initial image and to the pixel values of the front-end processed image; therefore, the restored target image carries better detail texture information and color information.
  • FIG. 13 is a schematic diagram of the effect of the fusion process provided by the embodiment of the present application.
• since this fusion method can combine the detail information of the first initial image and the color information of the front-end processed image, the resulting target image has better color than the first initial image, and richer texture information and higher resolution than the front-end processed image.
• the target image will be displayed on the interface of the electronic device as a captured image, or will only be stored.
  • the specific selection can be made as needed, and the embodiments of the present application do not impose any restrictions on this.
• the image processing method provided by the embodiment of the present application obtains an original image including channel signals of at least 4 colors, merges and recombines the channel signals in the original image to generate a first initial image, a second initial image, and a third initial image, and then performs front-end processing on the second initial image and the third initial image to obtain the front-end processed image. Since the first initial image has only undergone merging and recombination, its richness of detail is higher, while the color accuracy of the front-end processed image is higher. On this basis, the first initial image and the front-end processed image are used to generate the target image, so that better restoration of image details and colors can be achieved.
  • Figure 14 provides a schematic flow chart of another image processing method. As shown in Figure 14, the above method may also include S17.
  • back-end processing only means that this step is located after the fusion, so it is called “back-end” processing and has no other meaning.
  • Back-end processing may also be called second processing, etc., and the embodiments of this application do not impose any restrictions on this.
  • backend processing can include: demosaicing.
• demosaicing is used to complement the single-channel signal in each pixel into a multi-channel signal, i.e., a color image in the RGB domain is reconstructed from the image in the RAW domain.
• before demosaicing, a certain pixel in the image corresponds to only one color channel signal, for example only the red channel signal; after demosaicing, the pixel corresponds to three color channel signals, namely the red, green, and blue channel signals, i.e., the green and blue channel signals are supplemented. The supplementation of pixels of other colors is analogous and will not be described again here.
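A minimal sketch of the neighbour-averaging idea behind supplementing missing channel signals (real demosaicing algorithms are typically edge-aware; this is only an illustration):

```python
import numpy as np

def interpolate_missing(plane: np.ndarray, known: np.ndarray) -> np.ndarray:
    """Fill pixels where `known` is False with the mean of their known
    4-neighbours - the averaging idea behind simple demosaicing.

    `plane` holds one colour channel with valid samples only where
    `known` is True.
    """
    out = plane.astype(np.float64).copy()
    h, w = plane.shape
    for y in range(h):
        for x in range(w):
            if known[y, x]:
                continue
            neigh = [plane[ny, nx]
                     for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                     if 0 <= ny < h and 0 <= nx < w and known[ny, nx]]
            if neigh:
                out[y, x] = sum(neigh) / len(neigh)
    return out

# Toy green plane sampled in a checkerboard pattern (zeros are unknown).
plane = np.array([[0.0, 4.0, 0.0],
                  [2.0, 0.0, 6.0],
                  [0.0, 4.0, 0.0]])
filled = interpolate_missing(plane, plane > 0)
```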
• the back-end processing may also include at least one of gamma correction, style transformation (3-dimensional look-up table, 3DLUT), and RGB domain to YUV domain conversion.
  • gamma correction is used to adjust the brightness, contrast, dynamic range, etc. of the image by adjusting the gamma curve
  • style transformation indicates the style transformation of the color, that is, using a color filter to change the original image style into other image styles.
  • Common styles include movie style, Japanese style, spooky style, etc.
  • RGB domain to YUV domain conversion refers to converting an image in the RGB domain into an image in the YUV domain.
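Gamma correction and the RGB-to-YUV conversion can be sketched as follows, using the standard BT.601 coefficients (an assumption; the text does not specify the conversion matrix):

```python
import numpy as np

def gamma_correct(rgb: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Apply a gamma curve to linear values normalised to [0, 1]."""
    return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)

# BT.601 RGB -> YUV matrix: Y carries brightness, U and V carry chroma,
# which is why YUV storage/transmission can save bandwidth.
RGB_TO_YUV = np.array([[0.299, 0.587, 0.114],
                       [-0.14713, -0.28886, 0.436],
                       [0.615, -0.51499, -0.10001]])

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert RGB pixels (last axis) to YUV."""
    return rgb @ RGB_TO_YUV.T

white_yuv = rgb_to_yuv(np.array([1.0, 1.0, 1.0]))
# For white, Y is 1 and the chroma components U and V are (near) zero.
```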
  • the back-end processing may include one or more of the above-mentioned processing steps.
  • the back-end processing includes multiple processing steps, the order of the multiple processing steps may be adjusted as needed.
  • the embodiments of the present application do not impose any limitation on this.
  • the back-end processing may also include other steps, which may be added as needed. The embodiments of this application do not impose any restrictions on this.
• back-end processing can be performed in the same image signal processor as the preprocessing, front-end processing, and/or fusion processing, or it can be performed separately in another image signal processor; this can be set as needed, and the embodiments of this application do not impose any restrictions on this.
  • Figure 15 shows a schematic diagram of back-end processing.
  • the back-end processing includes: demosaic, gamma correction, style transformation and RGB domain conversion to YUV domain in processing order.
  • the target image is converted from the RAW domain to the YUV domain, which can reduce the amount of subsequent transmission data and save bandwidth.
  • color images are in the YUV domain.
  • the color image may be displayed on the interface of the electronic device 100 as a captured image, or may only be stored. Specifically, the color image may be set as needed, and the embodiment of the present application does not impose any limitation on this.
• a target image containing better detail information and better color information is generated by fusing the first initial image and the front-end processed image, and then back-end processing is performed on the fused target image to further adjust its colors and details, thereby achieving better restoration of image details and colors.
  • FIG. 16 is a schematic diagram of the effect of the image processing method provided by the embodiment of the present application.
  • the image processing method provided by the embodiment of the present application is introduced in detail above. The following describes how the user activates the image processing method provided by the embodiment of the present application in conjunction with the display interface of the electronic device.
  • FIG. 17 is a schematic diagram of a display interface of an electronic device provided by an embodiment of the present application.
  • the electronic device 100 displays a shooting interface as shown in (a) of FIG. 17 .
  • the user can perform a sliding operation on the interface so that the shooting key 11 indicates the shooting option "more".
  • the electronic device 100 displays a shooting interface as shown in (b) of Figure 17 , on which multiple shooting mode options are displayed, such as: professional mode, panorama mode, HDR mode, time-lapse photography mode, watermark mode, detailed color restoration mode, etc.
  • shooting mode options are only examples, and can be set and modified as needed. The embodiments of the present application do not impose any restrictions on this.
  • the electronic device 100 can enable the program related to the image processing method provided by the embodiment of the present application during shooting.
  • FIG. 18 is a schematic diagram of a display interface of another electronic device provided by an embodiment of the present application.
• the electronic device 100 displays a shooting interface as shown in (a) of Figure 18, with a "Settings" button displayed in the upper right corner of the shooting interface. Users can click the "Settings" button on this interface to enter the settings interface and configure related functions.
  • the electronic device 100 displays a setting interface as shown in (b) of FIG. 18 .
  • Multiple functions are displayed on this interface.
• the photo ratio is used to set the aspect ratio used in photo mode.
  • voice-activated photography is used to set whether to trigger by sound in the photo mode
  • video resolution is used to adjust the video resolution
  • video frame rate is used to adjust the video frame rate.
  • the electronic device 100 can activate the program related to the image processing method provided by the embodiment of the present application when shooting.
• the above are only two examples of how the user enables the image processing method provided by the embodiment of the present application from the display interface of the electronic device. The image processing method may also be enabled in other ways, or it may be used by default during shooting; the embodiment of the present application does not impose any restrictions on this.
  • FIG 19 shows a hardware system suitable for the electronic device of the present application.
  • the electronic device 100 may be used to implement the image processing method described in the above method embodiment.
• the electronic device 100 may be a mobile phone, a smart screen, a tablet, a wearable electronic device, a vehicle-mounted electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, etc.; the embodiment of the present application does not place any restrictions on the specific type of the electronic device 100.
• the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
• the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure shown in FIG. 19 does not constitute a specific limitation on the electronic device 100.
• the electronic device 100 may include more or fewer components than those shown in FIG. 19, or a combination of some of the components shown in FIG. 19, or sub-components of some of the components shown in FIG. 19.
  • the components shown in Figure 19 may be implemented in hardware, software, or a combination of software and hardware.
  • Processor 110 may include one or more processing units.
• the processor 110 may include at least one of the following processing units: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU).
  • different processing units can be independent devices or integrated devices.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
• the memory in processor 110 is a cache memory, which may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated access and reduces the waiting time of the processor 110, thus improving the efficiency of the system.
• the processor 110 may display a first interface including a first control; detect a first operation on the first control; in response to the first operation, obtain an original image including channel signals of at least 4 colors; preprocess the original image to obtain the first initial image, the second initial image, and the third initial image; perform front-end processing on the second initial image and the third initial image to obtain the front-end processed image; and fuse the first initial image and the front-end processed image to obtain the target image.
  • connection relationship between the modules shown in FIG. 19 is only a schematic illustration and does not constitute a limitation on the connection relationship between the modules of the electronic device 100 .
  • each module of the electronic device 100 may also adopt a combination of various connection methods in the above embodiments.
  • the wireless communication function of the electronic device 100 can be implemented through antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, modem processor, baseband processor and other components.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
• each antenna in the electronic device 100 may be used to cover a single communication band or multiple communication bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the electronic device 100 may implement display functions through a GPU, a display screen 194, and an application processor.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display 194 may be used to display images or videos.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can algorithmically optimize the noise, brightness and color of the image. ISP can also optimize parameters such as exposure and color temperature of the shooting scene.
  • the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard red green blue (RGB), YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3 and MPEG4.
  • the hardware system of the electronic device 100 is described in detail above, and the software system of the electronic device 100 is introduced below.
  • Figure 20 is a schematic diagram of a software system of an electronic device provided by an embodiment of the present application.
  • the system architecture may include an application layer 210, an application framework layer 220, a hardware abstraction layer 230, a driver layer 240 and a hardware layer 250.
  • the application layer 210 may include a camera application or other applications.
  • Other applications include but are not limited to: camera, gallery and other applications.
  • the application framework layer 220 can provide an application programming interface (API) and programming framework to applications in the application layer; the application framework layer can include some predefined functions.
  • API application programming interface
  • the application framework layer 220 may include a camera access interface; the camera access interface may include camera management and camera equipment; where camera management may be used to provide an access interface for managing cameras; and the camera device may be used to provide an interface for accessing cameras.
  • Hardware abstraction layer 230 is used to abstract hardware.
  • the hardware abstraction layer can include the camera abstraction layer and other hardware device abstraction layers; the camera hardware abstraction layer can call the camera algorithm in the camera algorithm library.
  • the hardware abstraction layer 230 includes a camera hardware abstraction layer 2301 and a camera algorithm library;
  • the camera algorithm library may include software algorithms; for example, Algorithm 1, Algorithm 2, etc. may be software algorithms for image processing.
  • the driver layer 240 is used to provide drivers for different hardware devices.
  • the driver layer may include camera device drivers, digital signal processor drivers, and graphics processor drivers.
  • the hardware layer 250 may include multiple image sensors, multiple image signal processors, digital signal processors, graphics processors, and other hardware devices.
  • the hardware layer 250 includes a sensor and an image signal processor; the sensor may include sensor 1, sensor 2, depth sensor (time of flight, TOF), multispectral sensor, etc.
  • the image signal processor may include image signal processor 1, image signal processor 2, etc.
  • the hardware abstraction layer 230 connects the application layer 210 and the application framework layer 220 above it with the driver layer 240 and the hardware layer 250 below it.
  • in the camera hardware interface layer of the hardware abstraction layer 230, manufacturers can customize functions according to their needs. Compared with the hardware abstraction layer interface, the camera hardware interface layer is more efficient, more flexible, and lower-latency, and can also make richer calls to the ISP and GPU to implement image processing.
  • the image input to the hardware abstraction layer 230 may come from an image sensor or a stored picture.
  • the scheduling layer in the hardware abstraction layer 230 includes general functional interfaces for implementing management and control.
  • the camera service layer in the hardware abstraction layer 230 is used to access interfaces of ISP and other hardware.
  • the following exemplifies the workflow of the software and hardware of the electronic device 100 with reference to a photographing scene.
  • the camera application in the application layer may be displayed on the screen of the electronic device 100 in the form of an icon.
  • the electronic device 100 starts to run the camera application.
  • the camera application calls the interface corresponding to the camera application in the application framework layer 220, then starts the camera driver by calling the hardware abstraction layer 230, and turns on the camera containing the multispectral sensor on the electronic device 100. The multispectral sensor collects raw images at a certain operating frequency; the collected images are processed inside the multispectral sensor or transmitted to one or more image signal processors, and the processed target images or color images are then saved and/or transmitted to the display for presentation.
  • FIG. 21 is a schematic diagram of an image processing device 300 provided by an embodiment of the present application.
  • the image processing device 300 includes a display unit 310 , an acquisition unit 320 and a processing unit 330 .
  • the display unit 310 is used to display a first interface, and the first interface includes a first control.
  • the obtaining unit 320 is used to detect the first operation on the first control.
  • the processing unit 330 is configured to acquire an original image in response to the first operation, where the original image includes channel signals of at least 4 colors.
  • the processing unit 330 is also used to pre-process the original image to obtain a first initial image, a second initial image, and a third initial image; perform front-end processing on the second initial image and the third initial image to obtain a front-end processed image; and fuse the first initial image and the front-end processed image to obtain a target image.
  • image processing device 300 is embodied in the form of functional units.
  • unit here can be implemented in the form of software and/or hardware, and is not specifically limited.
  • a "unit” may be a software program, a hardware circuit, or a combination of both that implements the above functions.
  • the hardware circuit may include an application specific integrated circuit (ASIC), an electronic circuit, a processor (such as a shared processor, a dedicated processor, or a group processor) and memory for executing one or more software or firmware programs, merged logic circuitry, and/or other suitable components to support the described functionality.
  • the units of each example described in the embodiments of the present application can be implemented by electronic hardware, or a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each specific application, but such implementations should not be considered beyond the scope of this application.
  • Embodiments of the present application also provide a computer-readable storage medium in which computer instructions are stored; when the computer instructions are run on the image processing device 300, they cause the image processing device 300 to execute the image processing method shown above.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) means.
  • the computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, tape), optical media, or semiconductor media (eg, solid state disk (SSD)), etc.
  • Embodiments of the present application also provide a computer program product containing computer instructions, which when run on the image processing device 300 enables the image processing device 300 to execute the image processing method shown above.
  • Figure 22 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • the chip shown in Figure 22 can be a general-purpose processor or a special-purpose processor.
  • the chip includes a processor 401.
  • the processor 401 is used to support the image processing device 300 in executing the technical solutions shown above.
  • the chip also includes a transceiver 402, which operates under the control of the processor 401 and is used to support the image processing device 300 in executing the technical solutions shown above.
  • the chip shown in Figure 22 may also include: a storage medium 403.
  • the chip shown in Figure 22 can be implemented using the following circuits or devices: one or more field programmable gate arrays (FPGA), programmable logic devices (PLD), controllers, state machines, gate logic, discrete hardware components, any other suitable circuit, or any combination of circuits capable of performing the various functions described throughout this application.
  • the electronic devices, image processing device 300, computer storage media, computer program products, and chips provided by the embodiments of the present application are all used to execute the methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects corresponding to the methods provided above, which will not be repeated here.
  • the terms "preset" and "predefined" can be realized by pre-saving, in a device (for example, an electronic device), corresponding codes, tables, or other means that can be used to indicate relevant information; this application does not limit their specific implementation.

Abstract

The present application relates to the field of image processing, and provides an image processing method and a related device thereof. The image processing method comprises: displaying a first interface, the first interface comprising a first control; detecting a first operation on the first control; in response to the first operation, obtaining an original image; preprocessing the original image to obtain a first initial image, a second initial image, and a third initial image; performing front-end processing on the second initial image and the third initial image to obtain front-end processed images; and performing fusion processing on the first initial image and the front-end processed images to obtain a target image. In this way, better restoration of details and colors of the images can be realized.

Description

Image processing method and related device thereof
This application claims priority to the Chinese patent application filed with the China National Intellectual Property Administration on May 31, 2022, with application number 202210606523.5 and entitled "Image processing method and related device thereof", the entire content of which is incorporated herein by reference.
Technical field
The present application relates to the field of image processing, and in particular, to an image processing method and a related device thereof.
Background
Most of the complementary metal oxide semiconductor (CMOS) image sensors currently used for visible-light imaging are traditional RGB (red, green, blue) sensors; that is, such an image sensor can only receive a red channel signal, a green channel signal, and a blue channel signal.
Because the small number of spectral response channels limits the upper bound of color restoration in imaging, some visible-light CMOS image sensors with multispectral response, also called multispectral sensors, have appeared on the market in the hope of solving the problem of imaging color restoration. However, noise problems arise when imaging with such a multispectral sensor, and usually the more spectral response channels there are, the more severe the noise becomes. At present there is no mature processing scheme that makes good use of such a multispectral sensor to achieve accurate color restoration while reducing noise. Therefore, a new processing scheme is urgently needed.
Summary
This application provides an image processing method and a related device thereof. By merging and recombining the channel signals of an original image, two frames of images are generated, one with better detail and one with better color information, and the two frames are then fused to generate a target image, so that better restoration of image details and colors can be achieved.
To achieve the above purpose, this application adopts the following technical solutions.
In a first aspect, an image processing method is provided, applied to an electronic device. The method includes:
displaying a first interface, the first interface including a first control;
detecting a first operation on the first control;
in response to the first operation, acquiring an original image, the original image including channel signals of at least 4 colors;
preprocessing the original image to obtain a first initial image, a second initial image, and a third initial image, where the preprocessing is used to merge and recombine the channel signals of the multiple colors included in the original image;
performing front-end processing on the second initial image and the third initial image to obtain a front-end processed image; and
fusing the first initial image and the front-end processed image to obtain a target image.
The image processing method provided by the embodiments of this application acquires an original image including channel signals of at least 4 colors, merges and recombines the channel signals in the original image to generate a first initial image, a second initial image, and a third initial image, and then performs front-end processing on the second initial image and the third initial image to obtain a front-end processed image. Because the first initial image has only undergone merging and recombination, its detail richness is high, while the front-end processed image has high color accuracy. On this basis, the first initial image and the front-end processed image are used to generate the target image, so that better restoration of image details and colors can be achieved.
In a possible implementation of the first aspect, preprocessing the original image to obtain the first initial image, the second initial image, and the third initial image includes:
performing four-in-one pixel binning on the original image to obtain the first initial image; and
performing diagonal pixel binning on the original image to obtain the second initial image and the third initial image.
It should be understood that four-in-one pixel binning refers to taking the weighted average of four adjacent pixel values and outputting it as a single pixel value, and diagonal pixel binning refers to taking the weighted average of two pixel values in a diagonal direction and outputting it as a single pixel value.
In this implementation, by applying four-in-one pixel binning to the multi-color channel signals in the original image, a first initial image with high detail richness can be obtained; and by applying diagonal pixel binning to the original image, a second initial image and a third initial image containing different color channel signals can be obtained, so that the second initial image and the third initial image can provide better color information for subsequent color restoration.
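As an illustrative sketch only (not the exact sensor implementation of this application, which may use unequal weights and color-aware block layouts), the two binning modes described above can be expressed with equal-weight averages over 2×2 blocks:

```python
import numpy as np

def qbin(raw):
    """Four-in-one (Qbin) binning: combine each 2x2 block of pixel
    values into one output pixel by equal-weight averaging."""
    h, w = raw.shape
    return (raw[0:h:2, 0:w:2] + raw[0:h:2, 1:w:2] +
            raw[1:h:2, 0:w:2] + raw[1:h:2, 1:w:2]) / 4.0

def dbin(raw):
    """Diagonal (Dbin) binning: average the two pixels on each diagonal
    of a 2x2 block, yielding two half-resolution images whose pixels
    come from different color channels of the mosaic."""
    h, w = raw.shape
    main = (raw[0:h:2, 0:w:2] + raw[1:h:2, 1:w:2]) / 2.0  # main diagonal
    anti = (raw[0:h:2, 1:w:2] + raw[1:h:2, 0:w:2]) / 2.0  # anti-diagonal
    return main, anti
```

Both functions halve the resolution; Qbin yields one image from all four samples, while Dbin splits the same block into two images along its diagonals.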
In a possible implementation of the first aspect, the front-end processing includes at least: downsampling, noise reduction, automatic white balance and/or color correction, and upsampling.
In this implementation, the front-end processing is used to process the colors of the second initial image and the third initial image, so that the resulting front-end processed image can provide better colors for subsequent processing.
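One hedged way to picture the front-end chain (downsample, denoise, white balance, color correction, upsample) is the toy pipeline below; the gain and matrix values are placeholders for illustration, not calibrated parameters from this application:

```python
import numpy as np

def front_end(rgb, wb_gains=(2.0, 1.0, 1.6), ccm=np.eye(3)):
    """Toy front-end chain on an HxWx3 float image in [0, 1]."""
    h, w, _ = rgb.shape
    # 1. Downsample by 2 with average pooling (suppresses noise, cuts cost).
    small = 0.25 * (rgb[0:h:2, 0:w:2] + rgb[0:h:2, 1:w:2] +
                    rgb[1:h:2, 0:w:2] + rgb[1:h:2, 1:w:2])
    # 2. Light denoise: 3x3 mean filter with edge padding.
    pad = np.pad(small, ((1, 1), (1, 1), (0, 0)), mode="edge")
    sh, sw, _ = small.shape
    den = sum(pad[i:i + sh, j:j + sw] for i in range(3) for j in range(3)) / 9.0
    # 3. Automatic white balance: per-channel gains.
    den = den * np.asarray(wb_gains)
    # 4. Color correction matrix maps sensor RGB toward display RGB.
    den = den @ np.asarray(ccm).T
    # 5. Upsample back to full resolution (nearest neighbour) and clip.
    return np.clip(den, 0.0, 1.0).repeat(2, axis=0).repeat(2, axis=1)
```

Running the color steps at reduced resolution and upsampling afterwards mirrors the downsample/upsample bracketing of the front-end processing described above.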
In a possible implementation of the first aspect, fusing the first initial image and the front-end processed image to obtain the target image includes: fusing the first initial image and the front-end processed image using the following formula: E(f) = w_d·(f − d)^2 + w_x·(f_x − g_x)^2 + w_y·(f_y − g_y)^2, where f is the pixel value of the target image, g is the pixel value of the first initial image, d is the pixel value of the front-end processed image, f_x and f_y are the gradient values of the target image in the x direction and the y direction, g_x and g_y are the gradient values of the first initial image, w_d, w_x, and w_y are weight values, and E(f) is the energy function of f; and determining the minimum of E(f) to obtain the target image.
In this implementation, the energy function formula can be used to fuse the first initial image and the front-end processed image, so that the target image is close to both the gradient values of the first initial image and the pixel values of the front-end processed image, and the restored target image therefore carries better detail texture information and color information.
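For illustration only, the minimization of E(f) can be sketched with plain gradient descent; a real implementation would more likely solve the corresponding linear (screened Poisson) system directly, and the step size and iteration count here are assumptions:

```python
import numpy as np

def fuse(g, d, wd=1.0, wx=1.0, wy=1.0, iters=300, step=0.05):
    """Gradient descent on E(f) = wd*(f-d)^2 + wx*(fx-gx)^2 + wy*(fy-gy)^2,
    pulling f toward the pixel values of d and the gradients of g."""
    def dx(a):   # forward difference along x
        out = np.zeros_like(a); out[:, :-1] = a[:, 1:] - a[:, :-1]; return out
    def dy(a):   # forward difference along y
        out = np.zeros_like(a); out[:-1, :] = a[1:, :] - a[:-1, :]; return out
    def dxT(a):  # adjoint of dx (a negative-divergence component)
        out = np.zeros_like(a); out[:, :-1] -= a[:, :-1]; out[:, 1:] += a[:, :-1]; return out
    def dyT(a):  # adjoint of dy
        out = np.zeros_like(a); out[:-1, :] -= a[:-1, :]; out[1:, :] += a[:-1, :]; return out

    gx, gy = dx(g), dy(g)
    f = d.astype(float).copy()
    for _ in range(iters):
        grad = 2 * wd * (f - d) + 2 * wx * dxT(dx(f) - gx) + 2 * wy * dyT(dy(f) - gy)
        f -= step * grad
    return f
```

The gradient of each squared term is obtained via the adjoint of the difference operator, so each iteration nudges f toward d in value and toward g in gradient, exactly the trade-off the energy function expresses.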
In a possible implementation of the first aspect, the image processing method further includes:
performing back-end processing on the target image to obtain a color image.
In this implementation, the back-end processing is used to further enhance the details and colors of the target image.
In a possible implementation of the first aspect, the back-end processing includes at least one of demosaicing, gamma correction, and style transformation.
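As one hedged example of a back-end step, gamma correction can be sketched as a simple power law; the actual sRGB curve is piecewise, and the exponent used here is an assumption for illustration:

```python
import numpy as np

def gamma_correct(img, gamma=2.2):
    """Encode a linear-light image in [0, 1] with a display gamma curve."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)
```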
In a possible implementation of the first aspect, the original image includes a red channel signal, a green channel signal, a blue channel signal, a yellow channel signal, a cyan channel signal, and a magenta channel signal.
Optionally, the first operation refers to an operation of tapping the camera application.
In a possible implementation of the first aspect, the first interface refers to a photographing interface of the electronic device, and the first control refers to a control for instructing photographing.
Optionally, the first operation refers to an operation of tapping the control for instructing photographing. In a possible implementation of the first aspect, the first interface refers to a video shooting interface of the electronic device, and the first control refers to a control for instructing video shooting.
Optionally, the first operation refers to an operation of tapping the control for instructing video shooting.
The above takes the first operation being a tap operation as an example; the first operation may also include a voice instruction operation, or another operation that instructs the electronic device to take a photo or shoot a video. The above is an example and does not limit this application in any way.
In a second aspect, an electronic device is provided, including modules/units for performing the method in the first aspect or any implementation of the first aspect.
In a third aspect, an electronic device is provided, including one or more processors and a memory; the memory is coupled to the one or more processors and is used to store computer program code, the computer program code including computer instructions; the one or more processors invoke the computer instructions to cause the electronic device to perform the method in the first aspect or any implementation of the first aspect.
In a fourth aspect, a chip system is provided, applied to an electronic device. The chip system includes one or more processors, and the processors are used to invoke computer instructions to cause the electronic device to perform the method in the first aspect or any implementation of the first aspect.
In a fifth aspect, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program; the computer program includes program instructions that, when executed by a processor, cause the processor to perform the method in the first aspect or any implementation of the first aspect.
In a sixth aspect, a computer program product is provided. The computer program product includes computer program code that, when run by an electronic device, causes the electronic device to perform the method in the first aspect or any implementation of the first aspect.
This application provides an image processing method and a related device thereof. An original image including channel signals of at least 4 colors is acquired, and the channel signals in the original image are merged and recombined to generate a first initial image, a second initial image, and a third initial image; front-end processing is then performed on the second initial image and the third initial image to obtain a front-end processed image. Because the first initial image has only undergone merging and recombination, its detail richness is high, while the front-end processed image has high color accuracy. On this basis, the first initial image and the front-end processed image are used to generate the target image, so that better restoration of image details and colors can be achieved.
Description of the drawings
Figure 1 is an imaging schematic diagram of an RGBCMY sensor;
Figure 2 is a spectral response curve of RGBCMY;
Figure 3 is a schematic diagram of determining a CCM matrix using 24 color patches;
Figure 4 is a before-and-after comparison of processing with a CCM matrix;
Figure 5 is a schematic diagram of an application scenario;
Figure 6 is a schematic flowchart of an image processing method provided by an embodiment of this application;
Figure 7 is a schematic diagram of an original image provided by an embodiment of this application;
Figure 8 is a schematic diagram of Qbin processing of an original image provided by an embodiment of this application;
Figure 9 is a Dbin processing procedure provided by an embodiment of this application;
Figure 10 is a schematic diagram of Dbin processing of an original image provided by an embodiment of this application;
Figure 11 is a schematic flowchart of another image processing method provided by an embodiment of this application;
Figure 12 is a schematic diagram of front-end processing provided by an embodiment of this application;
Figure 13 is a schematic diagram of the effect of the fusion processing provided by an embodiment of this application;
Figure 14 is a schematic flowchart of yet another image processing method provided by an embodiment of this application;
Figure 15 is a schematic diagram of back-end processing provided by an embodiment of this application;
Figure 16 is a schematic diagram of the effect of the image processing method provided by an embodiment of this application;
Figure 17 is a schematic diagram of a display interface of an electronic device provided by an embodiment of this application;
Figure 18 is a schematic diagram of a display interface of another electronic device provided by an embodiment of this application;
Figure 19 is a schematic diagram of a hardware system applicable to the electronic device of this application;
Figure 20 is a schematic diagram of a software system applicable to the electronic device of this application;
Figure 21 is a schematic structural diagram of an image processing device provided by an embodiment of this application;
Figure 22 is a schematic structural diagram of a chip system provided by an embodiment of this application.
Detailed description
The technical solutions in this application are described below with reference to the accompanying drawings.
In the description of the embodiments of this application, unless otherwise stated, "/" means "or"; for example, A/B can mean A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships can exist; for example, "A and/or B" can mean: A alone, both A and B, or B alone.
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of such features. In the description of the embodiments, unless otherwise specified, "multiple" means two or more.
First, some terms used in the embodiments of this application are explained to facilitate understanding by those skilled in the art.
1. RGB (red, green, blue) color space, or RGB domain, refers to a color model related to the structure of the human visual system. Based on the structure of the human eye, all colors are treated as different combinations of red, green, and blue. Red, green, and blue are called the three primary colors. It should be understood that a primary color is a "basic color" that cannot be obtained by mixing other colors.
2. YUV color space, or YUV domain, refers to a color encoding method in which Y represents luminance and U and V represent chrominance. The RGB color space focuses on the human eye's perception of color, while the YUV color space focuses on vision's sensitivity to brightness; the RGB color space and the YUV color space can be converted into each other.
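The mutual conversion mentioned above is a fixed linear transform. As an illustrative sketch, the full-range BT.601 coefficients (one common YUV definition; other standards such as BT.709 use different coefficients) can be used:

```python
import numpy as np

# Full-range BT.601 RGB -> YUV matrix.
RGB2YUV = np.array([[ 0.299,    0.587,    0.114  ],
                    [-0.14713, -0.28886,  0.436  ],
                    [ 0.615,   -0.51499, -0.10001]])

def rgb_to_yuv(rgb):
    """rgb: (..., 3) array; returns the corresponding YUV values."""
    return rgb @ RGB2YUV.T

def yuv_to_rgb(yuv):
    """Inverse linear transform back to RGB."""
    return yuv @ np.linalg.inv(RGB2YUV).T
```

Note that for a pure gray input (R = G = B) the U and V components are (numerically almost) zero, matching the idea that chrominance carries only the color deviation from gray.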
3. Pixel value refers to the set of color components corresponding to each pixel in a color image in the RGB color space. For example, each pixel corresponds to a set of three primary color components, where the three primary color components are the red component R, the green component G, and the blue component B.
4. Bayer pattern color filter array (CFA). When an image is converted from an actual scene into image data, the image sensor usually receives the information of three channel signals, a red channel signal, a green channel signal, and a blue channel signal, separately, and then synthesizes the information of the three channel signals into a color image. However, in that scheme three filters are needed at each pixel position, which is expensive and difficult to manufacture; therefore, the surface of the image sensor can instead be covered with a color filter array to obtain the information of the three channel signals. A Bayer pattern color filter array means the filters are arranged in a checkerboard format. For example, the minimum repeating unit of the Bayer pattern color filter array is one filter acquiring the red channel signal, two filters acquiring the green channel signal, and one filter acquiring the blue channel signal, arranged in a 2×2 pattern.
6、灰阶图像(gray image),灰阶图像是单通道图像,用于表示不同亮度程度,最亮为全白,最暗为全黑。也就是说,灰阶图像中的每个像素对应黑色到白色之间的不同程度的亮度。通常为了对最亮到最暗之间的亮度变化进行描述,将其进行划分,例如划分为256份,即代表256个等级的亮度,并称之为256个灰阶(第0灰阶~第255灰阶)。6. Grayscale image. Grayscale image is a single-channel image, used to represent different brightness levels. The brightest is completely white, and the darkest is completely black. That is, each pixel in a grayscale image corresponds to a different degree of brightness between black and white. Usually in order to describe the brightness change from the brightest to the darkest, it is divided, for example, into 256 parts, which represents 256 levels of brightness, and is called 256 gray levels (the 0th gray level to the 0th gray level). 255 grayscale).
7、光谱响应度(spectral response),也可以称为光谱灵敏度,光谱响应度表示图像传感器对不同波长入射光能转换成电能的能力。其中,若将某一波长的光入射到图像传感器的光能量转换成光子数目,而图像传感器产生、传递到外部电路的电流以电子数来表示,则代表每一入射的光子能够转换成传输到外部电路的电子的能力,称为量子效率(quantum efficiency,QE),单位以百分比来表示,图像传感器的光谱响应度则取决于该量子效率、以及波长和积分时间等参数。7. Spectral response (spectral response), which can also be called spectral sensitivity. Spectral response represents the ability of the image sensor to convert incident light energy of different wavelengths into electrical energy. Among them, if the light energy of a certain wavelength of light incident on the image sensor is converted into the number of photons, and the current generated by the image sensor and transmitted to the external circuit is expressed in the number of electrons, it means that each incident photon can be converted into The ability of electrons in the external circuit is called quantum efficiency (QE), and the unit is expressed in percentage. The spectral responsivity of the image sensor depends on the quantum efficiency, as well as parameters such as wavelength and integration time.
8. Auto white balance (AWB)
The human eye exhibits color constancy: in most cases the same object appears to have a consistent color under various light sources; for example, white paper looks white. Therefore, in order to eliminate the influence of the light source on the image sensor's output, simulate the color constancy of human vision, and ensure that white seen in any scene is rendered as true white, the color temperature needs to be corrected and the white balance automatically adjusted to the appropriate setting.
Different cameras have their own color filter arrays, and the filter colors of different color filter arrays constitute the camera color space (RAW domain or RAW color space); therefore, the camera color space is not a universal color space. For example, a color filter array with an RGGB filter pattern forms a RAW RGB camera color space; if the Bayer image (RAW image) generated by this color filter array were displayed directly, it would appear greenish. Displays generally render in the standard color space (sRGB), whose reference illuminant is D65, so the auto white balance algorithm needs to correct the RAW-domain image to the D65 reference illuminant. D65 refers to a standard illuminant with a color temperature of 6500 K, under which white is generally specified as R=G=B.
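As an illustrative sketch of the white-balance idea only (a simple gray-world correction, not the AWB algorithm of the embodiments; the assumption that the scene averages to gray is ours), per-channel gains can be computed so that achromatic regions end up with R = G = B, matching the D65 white convention noted above:

```python
import numpy as np

def gray_world_awb(raw_rgb):
    # Gray-world assumption: the scene averages to gray, so scale the
    # R and B channels until their means match the green mean. Neutral
    # regions then satisfy R = G = B.
    means = raw_rgb.reshape(-1, 3).mean(axis=0)
    gains = means[1] / means          # per-channel gains; G gain is 1
    return np.clip(raw_rgb * gains, 0.0, 1.0)

# A uniformly greenish patch is pulled back to neutral gray:
patch = np.full((2, 2, 3), [0.3, 0.6, 0.3])
balanced = gray_world_awb(patch)      # every pixel becomes [0.6, 0.6, 0.6]
```

Real AWB pipelines estimate the illuminant with far more robust statistics, but the gain-per-channel structure is the same.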
9. Color correction: because there is a certain gap between the colors of the image captured by the camera and the colors people expect, the colors need to be corrected. Since auto white balance has already calibrated white, the colors other than white can be calibrated through color correction.
10. Color correction matrix (CCM)
The CCM is mainly used to convert the image data obtained after auto white balance into the standard color space (sRGB). Because the spectral response of a CMOS sensor differs considerably from the human eye's spectral response to visible light, the camera's color reproduction differs greatly from the object colors perceived by an observer. Therefore, the CCM is needed to improve color saturation so that the colors of the image captured by the camera are closer to human perception. The process of correction using the CCM is the color correction process.
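As a minimal sketch of how a CCM is applied (the matrix values below are hypothetical, not a calibrated CCM), color correction is a per-pixel matrix multiplication of the white-balanced RGB values; making each row sum to 1 keeps neutral gray neutral:

```python
import numpy as np

# Hypothetical 3x3 CCM; each row sums to 1 so white/gray is preserved.
# The negative off-diagonal entries are what boost saturation; they are
# also why large fitted coefficients amplify noise.
ccm = np.array([[ 1.6, -0.4, -0.2],
                [-0.3,  1.5, -0.2],
                [-0.1, -0.5,  1.6]])

def apply_ccm(rgb, ccm):
    # out = M @ in for every pixel, written as one batched product.
    h, w, _ = rgb.shape
    out = rgb.reshape(-1, 3) @ ccm.T
    return np.clip(out, 0.0, 1.0).reshape(h, w, 3)

gray = np.full((1, 1, 3), 0.5)
corrected = apply_ccm(gray, ccm)      # stays [0.5, 0.5, 0.5]
```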
The above is a brief introduction to the terms involved in the embodiments of this application; they will not be described again below.
Most CMOS image sensors currently used for visible-light imaging are conventional RGB sensors. Due to hardware limitations, such an image sensor can receive only the red, green, and blue channel signals. The number of its spectral response channels is therefore very limited, and a small number of spectral response channels restricts the sensor's color restoration capability, affecting the color and other information of the restored image.
Because the small number of spectral response channels of RGB sensors constrains the upper limit of imaging color restoration, some visible-light CMOS sensors with a multispectral response, also called multispectral sensors, have appeared on the market in the hope of solving the color restoration problem. However, imaging with a multispectral sensor introduces noise problems, and these generally become more severe as the number of spectral response channels increases. At present there is no mature processing scheme that makes good use of such a sensor to achieve both accurate color restoration and reduced noise.
It should be understood that "multispectral" means that the spectral bands used for imaging include two or more bands. By this definition, an RGB sensor, which uses the three bands of red, green, and blue, strictly speaking also has a multispectral response. It should be noted, however, that the visible-light CMOS sensor with a multispectral response referred to in this application is a multispectral sensor with more spectral response channels than an RGB sensor.
For example, the multispectral sensor may be an RGBC sensor, an RGBM sensor, an RGBY sensor, an RGBCM sensor, an RGBCY sensor, an RGBMY sensor, an RGBCMY sensor, or the like. It should be understood that an RGBCMY sensor receives the red channel signal, green channel signal, blue channel signal, cyan channel signal, magenta channel signal, and yellow channel signal; the channel colors received by the other sensors can be deduced analogously and are not described again here.
Of course, the multispectral sensor may also be a sensor that receives signals of other color channels, selected and configured as needed; the embodiments of this application impose no limitation on this.
Illustratively, FIG. 1 provides a schematic diagram of imaging by an RGBCMY sensor. The color filter array covering the surface of the RGBCMY sensor can acquire information for six color channel signals. For example, the minimal repeating unit of this Bayer-pattern color filter array is: two filters acquiring the red channel signal, four filters acquiring the green channel signal, two filters acquiring the blue channel signal, two filters acquiring the cyan channel signal, two filters acquiring the magenta channel signal, and four filters acquiring the yellow channel signal, arranged as a 4×4 matrix.
Correspondingly, as shown in FIG. 1, the minimal repeating unit of a Bayer image acquired by the RGBCMY sensor is: two red pixels, four green pixels, two blue pixels, two cyan pixels, two magenta pixels, and four yellow pixels, arranged as a 4×4 matrix.
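A possible layout of such a 4×4 minimal repeating unit can be written out programmatically. The placement below is hypothetical (FIG. 1 is not reproduced here); only the per-color counts — 2 R, 4 G, 2 B, 2 C, 2 M, 4 Y — come from the text:

```python
import numpy as np
from collections import Counter

# Hypothetical 4x4 repeating unit with 2 R, 4 G, 2 B, 2 C, 2 M, 4 Y filters.
unit = np.array([["G", "Y", "B", "C"],
                 ["Y", "G", "C", "B"],
                 ["R", "M", "G", "Y"],
                 ["M", "R", "Y", "G"]])

cfa = np.tile(unit, (2, 2))           # an 8x8 patch of the sensor mosaic
counts = Counter(unit.ravel())        # G and Y appear 4 times, the rest 2
```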
FIG. 2 provides a schematic diagram of the RGBCMY spectral response curves. The horizontal axis represents wavelength, and the vertical axis represents the corresponding spectral responsivity. The curve labeled R represents the spectral responsivity of red light at different wavelengths, the curve labeled G that of green light, the curve labeled B that of blue light, the curve labeled C that of cyan light, the curve labeled M that of magenta light, and the curve labeled Y that of yellow light.
Taking the RGBCMY sensor as an example: compared with an RGB sensor, because the number of primary colors and hence the number of spectral response channels is larger, an RGBCMY sensor can generally achieve relatively better color restoration capability, that is, better color accuracy.
In the related art, the Bayer image acquired by a sensor is usually processed through auto white balance and a CCM to restore the scene colors; accordingly, a Bayer image acquired by an RGBCMY sensor can generally also be processed through auto white balance and a CCM to restore the scene colors. The CCM used in this processing needs to be fitted in advance.
However, whereas the CCM corresponding to an RGB sensor is a 3×3 matrix, the CCM corresponding to an RGBCMY sensor is a 6×3 matrix and includes more parameter values. Moreover, for an RGBCMY sensor, overfitting is usually encountered when fitting the CCM, causing some parameter values in the fitted CCM to be excessively large. When such a CCM is actually used for processing, the noise in the Bayer image acquired by the RGBCMY sensor is amplified, producing a serious chroma noise problem (chroma noise refers to colored noise).
FIG. 3 provides a schematic diagram of determining a CCM using a 24-patch color chart.
Illustratively, taking a color temperature of 6500 K as an example, (a) in FIG. 3 shows the 24-patch color chart generated at 6500 K by performing auto white balance and demosaicing (DM) on the image data acquired by the RGBCMY sensor, and (b) in FIG. 3 shows the standard 24-patch color chart at 6500 K.
By performing matrix fitting with the 24-patch color charts shown in (a) and (b) of FIG. 3, the CCM corresponding to a color temperature of 6500 K can be obtained. This CCM is the coefficient matrix by which the chart shown in (a) of FIG. 3 must be multiplied in order to be corrected into the standard chart shown in (b) of FIG. 3.
Because each color shown in (a) of FIG. 3 corresponds to the six primary color values R, G, B, C, M, and Y, while each color shown in (b) of FIG. 3 corresponds to only the three primary color values R, G, and B, the fitted CCM is a 6×3 matrix, that is, it includes 18 parameter values. Because overfitting is usually encountered during this fitting, some of the 18 parameter values in the CCM become excessively large, which in turn causes the noise of the image processed with the fitted CCM to be amplified in actual processing.
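As a hedged sketch of how such a 6×3 matrix might be fitted: the patch data below are random stand-ins for the 24-patch chart measurements, and the ridge penalty is shown only as one standard way to curb the overfitting just described, not necessarily the method used in practice:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for the calibration data: 24 patches, with 6 sensor primaries
# (R, G, B, C, M, Y) on one side and 3 target sRGB values on the other.
sensor = rng.uniform(size=(24, 6))    # like the chart in FIG. 3 (a)
target = rng.uniform(size=(24, 3))    # like the chart in FIG. 3 (b)

# Plain least-squares fit of the 6x3 CCM (18 parameters):
ccm, *_ = np.linalg.lstsq(sensor, target, rcond=None)

# Ridge-regularized fit: penalizing coefficient magnitude shrinks the
# solution, trading a little fitting error for less noise amplification.
lam = 0.1
ccm_ridge = np.linalg.solve(sensor.T @ sensor + lam * np.eye(6),
                            sensor.T @ target)
```

Since noise in the input channels is amplified roughly in proportion to the coefficient magnitudes, the shrunken matrix is gentler on noise at the cost of slightly worse chart fit.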
FIG. 4 provides a comparison of images before and after processing with a CCM.
Illustratively, as shown in (a) of FIG. 4, the image on the left is a Bayer image acquired by an RGB sensor, and the image on the right is the result of processing it with the corresponding 3×3 CCM. In the right-hand image the noise is amplified relative to the left-hand image, but not very noticeably.
As shown in (b) of FIG. 4, the image on the left is a Bayer image acquired by an RGBCMY sensor, and the image on the right is the result of processing it with the corresponding 6×3 CCM. In the right-hand image the noise is amplified relative to the left-hand image, and relative to (a) of FIG. 4 the noise problem is more serious.
Therefore, a new processing scheme that can effectively solve all of the above problems is urgently needed.
In view of this, embodiments of this application provide an image processing method in which the channel signals of an original image are merged and recombined to generate two frames of images, one with better detail and one with better color information, and the two frames are then fused to generate a target image, so that both the detail and the color of the target image are well restored.
An application scenario of the image processing method provided in the embodiments of this application is described below by way of example with reference to FIG. 5.
The image processing method provided in the embodiments of this application can be applied to the field of photography, for example, to capturing images or recording videos in low-light environments.
FIG. 5 shows a schematic diagram of an application scenario provided by an embodiment of this application. In one example, the electronic device is illustrated as a mobile phone that includes a multispectral sensor other than an RGB sensor.
As shown in FIG. 5, in response to a user operation, the electronic device can start a camera application and display the graphical user interface (GUI) shown in FIG. 5, which may be called a first interface. The first interface includes multiple shooting-mode options and a first control. The multiple shooting modes include, for example, a photo mode and a video mode; the first control is, for example, the shoot button 11, which indicates that the current shooting mode is one of the multiple shooting modes.
Illustratively, as shown in FIG. 5, when the user starts the camera application and wants to photograph outdoor grass and trees at night, the user taps the shoot button 11 on the first interface. After detecting the user's tap on the shoot button 11, the electronic device, in response to that tap, runs the program corresponding to the image processing method provided by the embodiments of this application to acquire an image.
It should be understood that because the multispectral sensor included in the electronic device is not an RGB sensor but is, for example, an RGBCMY sensor, the spectral response range of the electronic device is expanded relative to the prior art; that is, its color restoration capability is improved. However, because the CCM may suffer from overfitting, image noise may be amplified after processing with the CCM. In this regard, if the electronic device performs processing using the image processing method provided by the embodiments of this application, it can ensure color fidelity while also reducing noise, thereby improving the quality of the captured images or videos.
It should be understood that the above is an example of an application scenario and does not limit the application scenarios of this application in any way.
The image processing method provided by the embodiments of this application is described in detail below with reference to the accompanying drawings.
FIG. 6 shows a schematic flowchart of an image processing method provided by an embodiment of this application. As shown in FIG. 6, an embodiment of this application provides an image processing method 1, which includes the following S11 to S16.
S11. Display a first interface, the first interface including a first control.
S12. Detect a first operation on the first control.
The first control is, for example, the shoot button 11 shown in FIG. 5, and the first operation is, for example, a tap operation. Of course, the first operation may also be another operation; the embodiments of this application impose no limitation on this.
S13. In response to the first operation, acquire an original image. The original image includes channel signals of at least four colors.
The original image is a Bayer image; in other words, it is in the RAW domain.
FIG. 7 shows a schematic diagram of an original image.
For example, as shown in (a) of FIG. 7, the original image may include channel signals of four colors (for example, t1, t2, t3, and t4); or, as shown in (b) of FIG. 7, the original image may include channel signals of five colors (for example, t1, t2, t3, t4, and t5); or, as shown in (c) of FIG. 7, the original image may include channel signals of six colors (for example, t1, t2, t3, t4, t5, and t6). Of course, the original image may also include channel signals of more colors; the embodiments of this application impose no limitation on this.
The arrangement of the channel signals included in the original image can be set and modified as needed; the arrangement shown in FIG. 7 is only an example, and the embodiments of this application impose no limitation on it.
In response to the first operation, one, two, or more than two frames of the original image may be acquired, as needed; the embodiments of this application impose no limitation on this.
It should be understood that the multiple frames of the original image may be captured by a multispectral sensor included in the electronic device itself or obtained from another device, configured as needed; the embodiments of this application impose no limitation on this.
It should be understood that when multiple frames of the original image are acquired by the device's own multispectral sensor, the multispectral sensor may output the multiple frames simultaneously or serially, as selected and configured as needed; the embodiments of this application impose no limitation on this.
It should also be understood that although the multiple frames of the original image may be output from the multispectral sensor simultaneously or serially, in either case they are images generated by one and the same capture of the scene to be shot. The scene to be shot refers to all objects within the camera's shooting field of view; it may also be called the target scene and can be understood as the scene the user expects to capture.
S14. Preprocess the original image to obtain a first initial image, a second initial image, and a third initial image.
The preprocessing is used to merge and recombine the channel signals of the multiple colors included in the original image. For example, the preprocessing may include four-in-one pixel binning (quarter binning, Qbin) and diagonal binning (Dbin). Of course, the merging and recombination may also be performed in other ways, set and changed as needed; the embodiments of this application impose no limitation on this.
It should be noted that Qbin refers to a processing mode in which four adjacent pixel values are weighted-averaged and output as a single pixel value, and Dbin refers to a processing mode in which two pixel values in a diagonal direction are weighted-averaged and output as a single pixel value. The weights used in the averaging can be set and modified as needed; the embodiments of this application impose no limitation on this. For example, the weights may all be set to 1.
Optionally, as shown in FIG. 11, the above S14 may include:
S141. Perform Qbin processing on the original image to obtain the first initial image.
S142. Perform Dbin processing on the original image to obtain the second initial image and the third initial image.
Illustratively, FIG. 8 shows a schematic diagram of performing Qbin processing on an original image.
As shown in (a) of FIG. 8, an original image provided by an embodiment of this application includes channel signals of six colors: the red channel signal (R), the green channel signal (G), the blue channel signal (B), the cyan channel signal (C), the magenta channel signal (M), and the yellow channel signal (Y). The six colors are arranged in a 4×4 pattern, repeated with the minimal repeating unit shown in FIG. 1.
As shown in (b) of FIG. 8, performing Qbin processing on the original image amounts to weighted-averaging the pixel values corresponding to four adjacent pixel channel signals, namely the pixel channel signal G in row 1, column 1, the pixel channel signal Y in row 1, column 2, the pixel channel signal Y in row 2, column 1, and the pixel channel signal G in row 2, column 2, to obtain a single pixel value, for example T1, which is the pixel value of the pixel in row 1, column 1 of the first initial image.
Then, the pixel values corresponding to the four adjacent pixel channel signals, namely the pixel channel signal B in row 1, column 3, the pixel channel signal C in row 1, column 4, the pixel channel signal C in row 2, column 3, and the pixel channel signal B in row 2, column 4, can be weighted-averaged to obtain a single pixel value, for example T2, which is the pixel value of the pixel in row 1, column 2 of the first initial image.
And so on, so that the first initial image can be determined from the original image. Each pixel in the first initial image corresponds to one pixel value; therefore, the first initial image can still be regarded as an image in the RAW domain.
It should be understood that because each pixel value of the first initial image is obtained by weighted-averaging four adjacent pixel values of the original image, each side of the first initial image is half the length of the corresponding side of the original image, and the area of the whole image is one quarter that of the original image.
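The Qbin computation described above (equal weights of 1) can be sketched as follows; the numeric pixel values are hypothetical stand-ins for the mosaic samples:

```python
import numpy as np

def qbin(raw):
    # Average every non-overlapping 2x2 block of the mosaic into one
    # output pixel; each output side is half the input side, so the
    # output area is a quarter of the input area.
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

raw = np.arange(16, dtype=float).reshape(4, 4)
small = qbin(raw)                     # shape (2, 2)
# small[0, 0] == (0 + 1 + 4 + 5) / 4 == 2.5, and so on per block.
```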
Illustratively, FIG. 9 shows a schematic diagram of Dbin processing, and FIG. 10 shows a schematic diagram of performing Dbin processing on an original image.
As shown in (a) of FIG. 9, when Dbin processing is performed on the image, let the pixel values of the pixels in row 1, column 1; row 1, column 2; row 2, column 1; and row 2, column 2 be p11, p12, p21, and p22, respectively. After Dbin processing, as shown in (b) of FIG. 9, the pixel value of the pixel in row 1, column 1 of the first frame is b1 = (p11 + p22)/2; that is, each pixel value of the first frame is the average of the pixel values of the top-left and bottom-right pixels among the corresponding four adjacent pixels of the image shown in (a) of FIG. 9 (the weights here are assumed to be 1), and so on. As shown in (c) of FIG. 9, the pixel value of the pixel in row 1, column 1 of the second frame is c1 = (p12 + p21)/2; that is, each pixel value of the second frame is the average of the pixel values of the bottom-left and top-right pixels among the corresponding four adjacent pixels of the image shown in (a) of FIG. 9 (the weights here are assumed to be 1), and so on.
In combination with the above, as shown in (a) of FIG. 10, for the original image including channel signals of six colors provided by the embodiments of this application, after Dbin processing, as shown in (b) of FIG. 10, the pixel value of the pixel in row 1, column 1 of the second initial image is the average of the pixel values of the pixel in row 1, column 1 and the pixel in row 2, column 2 of the original image, for example a pixel value corresponding to the green channel signal, and so on. As shown in (c) of FIG. 10, the pixel value of the pixel in row 1, column 1 of the third initial image is the average of the pixel values of the pixel in row 1, column 2 and the pixel in row 2, column 1 of the original image, for example a pixel value corresponding to the yellow channel signal, and so on.
Because each pixel in the second initial image and the third initial image still corresponds to one pixel value, the second initial image and the third initial image are also images in the RAW domain.
It should be understood that because the pixel values of the second initial image are obtained by weighted-averaging the top-left and bottom-right pixels of each group of four adjacent pixels of the original image, while the pixel values of the third initial image are obtained by weighted-averaging the bottom-left and top-right pixels, the pixel values of the second initial image and the third initial image differ.
It should also be understood that because each pixel value of the second initial image and the third initial image is obtained by weighted-averaging two diagonally adjacent pixel values among four adjacent pixels of the original image, each side of the second initial image and of the third initial image is half the length of the corresponding side of the original image, and the size of the whole image is one quarter that of the original image.
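The Dbin computation described above can be sketched as follows (weights of 1, hypothetical pixel values): each 2×2 block yields one pixel for the second initial image from its main-diagonal pair and one pixel for the third initial image from its anti-diagonal pair:

```python
import numpy as np

def dbin(raw):
    tl = raw[0::2, 0::2]              # top-left of each 2x2 block
    tr = raw[0::2, 1::2]              # top-right
    bl = raw[1::2, 0::2]              # bottom-left
    br = raw[1::2, 1::2]              # bottom-right
    second = (tl + br) / 2            # e.g. averages the two G samples
    third = (tr + bl) / 2             # e.g. averages the two Y samples
    return second, third              # each half the input size per side

raw = np.array([[1.0, 2.0],
                [3.0, 5.0]])
second, third = dbin(raw)             # second = [[3.0]], third = [[2.5]]
```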
此外,在上述处理过程中,Qbin处理和Dbin处理可以在同一图像信号处理器(image signal processing,ISP)中进行处理,或者,也可以分开在两个图像信号处理器中进行处理,或者,也可以在多光谱传感器中进行处理,具体可以根据需要进行设置,本申请实施例对此不进行任何限制。In addition, in the above processing process, Qbin processing and Dbin processing can be processed in the same image signal processor (image signal processing, ISP), or they can be processed separately in two image signal processors, or they can also be processed in two image signal processors. The processing can be performed in a multispectral sensor, and the specific settings can be set as needed. The embodiments of the present application do not impose any restrictions on this.
S15. Perform front-end processing on the second initial image and the third initial image to obtain a front-end processed image.
It should be noted here that the term "front-end" processing in this application only indicates that this step precedes the fusion step; it carries no other meaning. Front-end processing may also be called first processing, etc., and the embodiments of this application impose no restriction on the name.
The front-end processing operates on the colors of the second and third initial images, so that the resulting front-end processed image provides better color for subsequent processing.
It should be understood that, because the channel signals contained in the second and third initial images have different colors, the front-end processed image derived from them preserves color information better; in other words, the front-end processed image provides better color reproduction for subsequent processing.
The front-end processing provided by the embodiments of this application may be performed in the same ISP as the Qbin processing or the Dbin processing described above, in the same ISP as both of them, in a separate ISP, or in the multispectral sensor; this can be configured as needed, and the embodiments of this application impose no restriction on it.
Optionally, the front-end processing may at least include: downsampling (down-scale sample), noise reduction (denoise), automatic white balance and/or color correction, and upsampling (up-sample).
Downsampling splits and regroups the channel signals contained in the image, reducing the image size.
Noise reduction reduces noise in the image; common methods include mean filtering, Gaussian filtering, and bilateral filtering. Of course, other methods may also be used for noise reduction, and the embodiments of this application impose no restriction on this.
Automatic white balance corrects the downsampled and denoised image to the D65 reference illuminant, so that white areas appear truly white.
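The patent does not specify which white-balance algorithm is used. As an illustrative substitute for the D65 correction described above, a gray-world estimate scales each channel so that the channel means become equal, approximating correction toward a neutral illuminant:

```python
def gray_world_awb(pixels):
    """Gray-world white-balance sketch: compute per-channel means
    over a list of (R, G, B) pixels and scale each channel so the
    means match their average. This is an illustrative assumption,
    not the patent's method; the patent only states that the image
    is corrected to the D65 reference illuminant."""
    n = len(pixels)
    means = [sum(p[i] for p in pixels) / n for i in range(3)]
    gray = sum(means) / 3
    gains = [gray / m for m in means]
    return [tuple(p[i] * gains[i] for i in range(3)) for p in pixels]
```

A reddish flat field such as (2, 1, 1) is mapped to a neutral value, since the red channel is attenuated and the green and blue channels amplified until all three means coincide.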
Color correction calibrates the accuracy of colors other than white. Here, it amounts to using a color correction matrix (CCM) to map multiple color channel signals onto three color channel signals, for example a red channel signal, a green channel signal, and a blue channel signal.
The CCM used for color correction may be a previously fitted matrix. When no CCM is available for the D65 reference illuminant, the D65 CCM may be determined by interpolating between the CCMs fitted for other color temperatures.
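The interpolation between fitted CCMs can be sketched as follows. Linear interpolation in correlated color temperature (CCT) is an assumption made here for illustration; the patent does not specify the interpolation scheme, and production pipelines often interpolate in mired (10^6/CCT) space instead:

```python
def interpolate_ccm(ccm_lo, ccm_hi, cct_lo, cct_hi, cct_target):
    """Linearly interpolate two pre-fitted 3x3 color correction
    matrices, calibrated at color temperatures cct_lo and cct_hi,
    to estimate the CCM at cct_target (e.g. about 6500 K for D65)."""
    t = (cct_target - cct_lo) / (cct_hi - cct_lo)
    return [[(1.0 - t) * ccm_lo[i][j] + t * ccm_hi[i][j]
             for j in range(3)] for i in range(3)]
```

For example, with matrices fitted at 5000 K and 8000 K, the estimate at 6500 K is the element-wise midpoint of the two.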
Upsampling enlarges the image size. It should be understood that, because the earlier downsampling shrinks the image, a corresponding upsampling step is needed to restore the image size for the subsequent fusion.
Figure 12 shows a schematic diagram of the front-end processing. For example, as shown in Figure 12, the front-end processing comprises, in processing order: downsampling, noise reduction, automatic white balance, color correction, and upsampling.
When downsampling the second and third initial images obtained in Figure 11, the three color channels contained in the second initial image can be split apart, and then the red channel signals regrouped into one frame of a monochrome channel image containing only red channel signals, the green channel signals regrouped into one frame containing only green channel signals, and the blue channel signals regrouped into one frame containing only blue channel signals. Similarly, the three color channels contained in the third initial image can be split, and the yellow, cyan, and magenta channel signals each regrouped into one frame of a monochrome channel image containing only that channel's signals.
Thus, after downsampling the second and third initial images, six frames of monochrome channel images, each containing a different color channel signal, are obtained; the side length of each monochrome frame is one half that of the original second or third initial image, that is, the overall area of each frame is one quarter of the original.
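The channel split described above can be sketched as extracting the four phase-offset sub-planes of a 2×2-periodic mosaic; each plane keeps every second row and column, so each side is halved. The RGGB ordering in the example below is an assumption for illustration (the third initial image would use a cyan/magenta/yellow layout instead):

```python
def split_mosaic_planes(mosaic, layout):
    """Split a 2x2-periodic color mosaic into per-phase sub-planes.
    `layout` names the channels at the offsets (top-left, top-right,
    bottom-left, bottom-right) of each 2x2 cell. Each returned plane
    has half the side length of the input mosaic."""
    offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]
    return [(name, [row[dc::2] for row in mosaic[dr::2]])
            for name, (dr, dc) in zip(layout, offsets)]
```

For a 2×2 mosaic [[1, 2], [3, 4]] with layout ("R", "G", "G", "B"), this yields four 1×1 planes, one per sample, matching the quarter-area result stated above.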
Noise reduction is then applied to the six monochrome channel frames to suppress both the original noise and the noise introduced during downsampling; automatic white balance and color correction are then applied to the denoised frames. This yields one frame of a monochrome channel image containing only red channel signals, one containing only green channel signals, and one containing only blue channel signals.
Here, the multiple monochrome channel images all have the same size.
It can be understood that, because the three monochrome channel frames obtained after color correction are smaller than the first initial image, upsampling can be used, for the convenience of the subsequent fusion, to splice and regroup their red, green, and blue channel signals into a front-end processed image containing all three color channel signals. The front-end processed image therefore carries good color information. It should be understood that the front-end processed image is a Bayer-format image, that is, an image in the RAW domain.
On this basis, the front-end processing may further include at least one of: defect pixel correction (DPC), lens shading correction (LSC), and wide dynamic range compression (WDR).
It should be understood that defect pixel correction addresses defects in the array formed by the light-collecting points on the multispectral sensor, or errors introduced when the light signal is converted; bad pixels are usually eliminated by averaging the surrounding pixels in the luminance domain.
It should be understood that lens shading correction removes the mismatch in color and brightness between the periphery of the image and its center caused by the lens optical system.
Wide dynamic range compression addresses the following: when high-brightness regions lit by strong light sources (sunlight, lamps, reflections, etc.) coexist in an image with regions of relatively low brightness such as shadows or backlit areas, bright regions become white through overexposure while dark regions become black through underexposure, seriously degrading image quality. The brightest and darkest regions of the same scene can therefore be adjusted, for example brightening the dark regions and darkening the bright ones, so that the processed image shows more detail in both.
It should be understood that the front-end processing may include one or more of the above processing steps; when it includes multiple steps, their order can be adjusted as needed, and the embodiments of this application impose no restriction on it. The front-end processing may also include other steps, which can be added as needed; the embodiments of this application impose no restriction on this either.
S16. Fuse the first initial image and the front-end processed image to obtain the target image.
It should be understood that, because the first initial image is obtained directly by regrouping the channel signals of the original image without any other processing, it carries more texture detail. Therefore, to preserve detail richness, the first initial image can be used in the fusion, so that the restored image carries more detail while the scene colors are recovered.
The front-end processed image, by contrast, is obtained from the second and third initial images through a series of color processing steps; some detail is lost, but good color information is retained. Therefore, to preserve color richness, the front-end processed image can be used in the fusion, so that the restored image has good color when the scene colors are recovered.
The first initial image has a relatively high resolution and may be called the high-resolution (HR) image; the front-end processed image has a relatively low resolution and may be called the low-resolution (LR) image.
To fuse the first initial image and the front-end processed image, the pixel values at corresponding positions can be added, or multiplied with different weights; alternatively, a network model can be used for the fusion. Of course, other fusion methods can also be used, selected and configured as needed, and the embodiments of this application impose no restriction on this.
Optionally, the first initial image and the front-end processed image can be fused using the following formula:
E(f) = w_d(f − d)² + w_x(f_x − g_x)² + w_y(f_y − g_y)²;
where f is the pixel value of the target image, g is the pixel value of the first initial image, d is the pixel value of the front-end processed image, f_x and f_y are the gradient values of the target image in the x and y directions, g_x and g_y are the gradient values of the first initial image, w_d, w_x, and w_y are weight values, and E(f) is the energy function of f.
The minimum of E(f) is determined to obtain the target image.
It should be understood that the high-resolution first initial image effectively serves as a constraint map for the gradient values of the target image: because the first initial image is richer in detail, the closer the target image's gradient values are to those of the first initial image, the better. The gradient values reflect the rate of change of the image data.
The low-resolution front-end processed image effectively serves as a constraint map for the pixel values of the target image: because the front-end processed image is richer in color, the closer the target image's pixel values are to those of the front-end processed image, the better.
It should be understood that the smaller the sum of the three terms on the right-hand side of the above formula, the smaller the resulting value of E(f); when that sum is minimized, E(f) attains its minimum. At that point the target image is close both to the gradient values of the first initial image and to the pixel values of the front-end processed image, so the restored target image carries good detailed texture information as well as good color information.
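The patent does not prescribe a solver for minimizing E(f). As a minimal one-dimensional sketch (so f_x becomes a forward difference and the w_y term drops out), plain gradient descent on the discretized energy looks like the following; the weights, step size, and iteration count are illustrative assumptions:

```python
def fuse_1d(d, g, wd=1.0, wx=1.0, lr=0.1, iters=2000):
    """Gradient-descent sketch of the fusion energy in 1-D:
    E(f) = sum_i wd*(f_i - d_i)^2
         + sum_i wx*((f_{i+1} - f_i) - (g_{i+1} - g_i))^2,
    pulling pixel values toward the color-accurate image d while
    pulling gradients toward the detail-rich image g."""
    n = len(d)
    f = list(d)  # initialize from the color reference
    for _ in range(iters):
        # derivative of the pixel-fidelity term
        grad = [2 * wd * (f[i] - d[i]) for i in range(n)]
        # derivative of the gradient-fidelity term (forward difference)
        for i in range(n - 1):
            r = 2 * wx * ((f[i + 1] - f[i]) - (g[i + 1] - g[i]))
            grad[i + 1] += r
            grad[i] -= r
        f = [f[i] - lr * grad[i] for i in range(n)]
    return f
```

When d and g already agree in gradient, the solution is simply f = d; when they conflict, the weights w_d and w_x trade off color fidelity against detail fidelity, exactly as described above.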
By way of example, Figure 13 is a schematic diagram of the effect of the fusion processing provided by an embodiment of this application.
Part (a) of Figure 13 shows the first initial image obtained after Qbin processing, which is rich in detail; part (b) shows the front-end processed image obtained after Dbin processing and front-end processing, which is rich in color information but of lower resolution. On this basis, fusing them with the fusion method described in S16 yields the target image shown in part (c) of Figure 13.
Because this fusion method combines the detail information of the first initial image with the color information of the front-end processed image, the resulting target image has better color than the first initial image, and richer texture information and higher resolution than the front-end processed image.
It should be understood that the target image will be displayed on the interface of the electronic device as the captured image, or will only be stored; this can be chosen as needed, and the embodiments of this application impose no restriction on it.
It should also be understood that the above procedure is only an example; its order can be adjusted as needed, and of course steps can be added or removed; the embodiments of this application impose no restriction on this.
In the image processing method provided by the embodiments of this application, an original image containing at least four channel signals is acquired; the channel signals in the original image are then merged and regrouped to generate a first initial image, a second initial image, and a third initial image; the second and third initial images are then front-end processed to obtain a front-end processed image. Because the first initial image has only undergone merging and regrouping, its detail richness is high, while the front-end processed image has high color accuracy. On this basis, a target image is generated from the first initial image and the front-end processed image, so that both image detail and color can be well restored.
On the above basis, Figure 14 provides a schematic flowchart of another image processing method. As shown in Figure 14, the above method may further include S17.
S17. Perform back-end processing on the target image to obtain a color image.
It should be noted here that the term "back-end" processing in this application only indicates that this step follows the fusion step; it carries no other meaning. Back-end processing may also be called second processing, etc., and the embodiments of this application impose no restriction on the name.
Optionally, the back-end processing may include: demosaicing.
In this application, demosaicing supplements the single-channel signal at each pixel into a multi-channel signal, that is, it reconstructs an RGB-domain color image from an image in the RAW domain.
For example, for a target image containing red, green, and blue channel signals, before demosaicing a given pixel in the image corresponds to only one color channel signal, for example only the red channel signal; after demosaicing, the pixel corresponds to all three color channel signals, namely red, green, and blue. In other words, for a pixel that had only a red channel signal, green and blue channel signals are supplemented. Pixels of other colors are supplemented analogously and are not described again here.
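A minimal per-cell demosaic sketch for an RGGB Bayer mosaic is shown below: every pixel in a 2×2 cell receives the cell's red and blue samples plus the mean of its two green samples. Real ISPs use bilinear or edge-aware interpolation; this only illustrates the principle of completing a single-channel pixel into an (R, G, B) triple:

```python
def demosaic_cell(mosaic):
    """Per-2x2-cell demosaic sketch for an RGGB Bayer mosaic with
    even dimensions. Each output pixel is an (R, G, B) tuple copied
    from the samples of its own cell; the green value is the mean
    of the cell's two green samples."""
    h, w = len(mosaic), len(mosaic[0])
    out = [[None] * w for _ in range(h)]
    for r in range(0, h, 2):
        for c in range(0, w, 2):
            red = mosaic[r][c]
            green = (mosaic[r][c + 1] + mosaic[r + 1][c]) / 2
            blue = mosaic[r + 1][c + 1]
            for dr in (0, 1):
                for dc in (0, 1):
                    out[r + dr][c + dc] = (red, green, blue)
    return out
```

For the mosaic [[10, 20], [30, 40]], every output pixel becomes (10, 25.0, 40): the single red sample, the averaged greens, and the single blue sample.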
Optionally, the back-end processing may further include at least one of: gamma correction, style transformation (3-dimensional look-up table, 3DLUT), and RGB-to-YUV domain conversion.
Gamma correction adjusts the brightness, contrast, dynamic range, and so on of the image by adjusting the gamma curve. Style transformation refers to changing the color style, that is, using a color filter to turn the original image style into another style, such as a film style, a Japanese style, or a somber style. RGB-to-YUV domain conversion converts an image in the RGB domain into an image in the YUV domain.
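For the RGB-to-YUV conversion step, one common choice is the BT.601 full-range transform sketched below; the patent does not name a specific conversion matrix, so this particular matrix is an assumption:

```python
def rgb_to_yuv(r, g, b):
    """BT.601 full-range RGB -> YUV conversion: Y is the weighted
    luma, U and V are blue- and red-difference chroma components."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b
    v = 0.615 * r - 0.51499 * g - 0.10001 * b
    return y, u, v
```

A neutral gray or white input (equal R, G, B) maps to near-zero U and V, which is why the YUV representation separates brightness from color and allows the chroma planes to be subsampled to save bandwidth.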
It should be understood that the back-end processing may include one or more of the above processing steps; when it includes multiple steps, their order can be adjusted as needed, and the embodiments of this application impose no restriction on it. The back-end processing may also include other steps, which can be added as needed; the embodiments of this application impose no restriction on this either.
It should be understood that the back-end processing may be performed in the same image signal processor as the pre-processing, the front-end processing, and/or the fusion processing, or separately in another image signal processor; this can be configured as needed, and the embodiments of this application impose no restriction on it.
Figure 15 shows a schematic diagram of the back-end processing. For example, as shown in Figure 15, the back-end processing comprises, in processing order: demosaicing, gamma correction, style transformation, and RGB-to-YUV conversion.
It should be understood that after the back-end processing the target image is converted from the RAW domain to the YUV domain, which reduces the amount of data transmitted subsequently and saves bandwidth.
It should be understood that the color image is in the YUV domain. The color image may be displayed on the interface of the electronic device 100 as the captured image, or only stored; this can be configured as needed, and the embodiments of this application impose no restriction on it.
In this embodiment, the first initial image and the front-end processed image are fused to generate a target image containing good detail information and good color information; back-end processing is then applied to the fused target image to further adjust its color and detail, so that both image detail and color can be well restored.
It should also be understood that the above procedure is only an example; its order can be adjusted as needed, and of course steps can be added or removed; the embodiments of this application impose no restriction on this.
By way of example, Figure 16 is a schematic diagram of the effect of the image processing method provided by an embodiment of this application.
Part (a) of Figure 16 shows a color image obtained without using the image processing method provided by the embodiments of this application; part (c) of Figure 16 shows a portion of (a).
Part (b) of Figure 16 shows the color image obtained after processing with the image processing method provided by the embodiments of this application; part (d) of Figure 16 shows a portion of (b). Compared with (a) and (c), both the detail and the color restoration of the image processed by the method provided by the embodiments of this application are relatively better.
The image processing method provided by the embodiments of this application has been described in detail above. The following describes, with reference to the display interface of the electronic device, how a user enables the image processing method provided by the embodiments of this application.
Figure 17 is a schematic diagram of a display interface of an electronic device provided by an embodiment of this application.
By way of example, in response to a user's tap operation, when the electronic device 100 runs the camera application, the electronic device 100 displays the shooting interface shown in part (a) of Figure 17. The user can swipe on this interface so that the shooting key 11 points at the shooting option "More".
In response to the user tapping the shooting option "More", the electronic device 100 displays the shooting interface shown in part (b) of Figure 17, on which multiple shooting mode options are displayed, for example: professional mode, panorama mode, HDR mode, time-lapse mode, watermark mode, and detail color restoration mode. It should be understood that these shooting mode options are only examples and can be set and modified as needed; the embodiments of this application impose no restriction on them.
In response to the user tapping the "detail color restoration" mode, the electronic device 100 can enable, during shooting, the program related to the image processing method provided by the embodiments of this application.
Figure 18 is a schematic diagram of a display interface of another electronic device provided by an embodiment of this application.
By way of example, in response to a user's tap operation, when the electronic device 100 runs the camera application, the electronic device 100 displays the shooting interface shown in part (a) of Figure 18, with a "Settings" button displayed in the upper-right corner. The user can tap the "Settings" button on this interface to enter the settings interface and configure related functions.
In response to the user tapping the "Settings" button, the electronic device 100 displays the settings interface shown in part (b) of Figure 18, on which multiple functions are displayed. For example, photo aspect ratio sets the aspect ratio in photo mode, voice-activated capture sets whether capture is triggered by sound in photo mode, video resolution adjusts the video resolution, and video frame rate adjusts the video frame rate; in addition, there are general functions such as reference grid, level, and detail color restoration.
In response to the user dragging the switch button corresponding to "detail color restoration", the electronic device 100 can enable, during shooting, the program related to the image processing method provided by the embodiments of this application.
It should be understood that the above are only two examples of how a user enables the image processing method provided by the embodiments of this application from the display interface of the electronic device; of course, the method can also be enabled in other ways, or it can be used directly by default during shooting, and the embodiments of this application impose no restriction on this.
The image processing method and the related display interfaces and effect diagrams provided by the embodiments of this application have been described in detail above with reference to Figures 1 to 18; the electronic device, apparatus, and chip provided by the embodiments of this application are described in detail below with reference to Figures 19 to 22. It should be understood that the electronic device, apparatus, and chip in the embodiments of this application can execute the various image processing methods of the foregoing embodiments of this application; that is, for the specific working processes of the various products below, reference can be made to the corresponding processes in the foregoing method embodiments.
图19示出了一种适用于本申请的电子设备的硬件系统。电子设备100可用于实现上述方法实施例中描述的图像处理方法。Figure 19 shows a hardware system suitable for the electronic device of the present application. The electronic device 100 may be used to implement the image processing method described in the above method embodiment.
电子设备100可以是手机、智慧屏、平板电脑、可穿戴电子设备、车载电子设备、增强现实(augmented reality,AR)设备、虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个 人数字助理(personal digital assistant,PDA)、投影仪等等,本申请实施例对电子设备100的具体类型不作任何限制。The electronic device 100 may be a mobile phone, a smart screen, a tablet, a wearable electronic device, a vehicle-mounted electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, or a super mobile personal computer ( ultra-mobile personal computer (UMPC), netbook, personal computer Personal digital assistant (personal digital assistant, PDA), projector, etc., the embodiment of the present application does not place any restrictions on the specific type of the electronic device 100.
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
需要说明的是,图19所示的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图19所示的部件更多或更少的部件,或者,电子设备100可以包括图19所示的部件中某些部件的组合,或者,电子设备100可以包括图19所示的部件中某些部件的子部件。图19所示的部件可以以硬件、软件、或软件和硬件的组合实现。It should be noted that the structure shown in FIG. 19 does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than those shown in FIG. 19, or the electronic device 100 may include a combination of some of the components shown in FIG. 19, or the electronic device 100 may include sub-components of some of the components shown in FIG. 19. The components shown in FIG. 19 may be implemented in hardware, software, or a combination of software and hardware.
处理器110可以包括一个或多个处理单元。例如,处理器110可以包括以下处理单元中的至少一个:应用处理器(application processor,AP)、调制解调处理器、图形处理器(graphics processing unit,GPU)、图像信号处理器(image signal processor,ISP)、控制器、视频编解码器、数字信号处理器(digital signal processor,DSP)、基带处理器、神经网络处理器(neural-network processing unit,NPU)。其中,不同的处理单元可以是独立的器件,也可以是集成的器件。The processor 110 may include one or more processing units. For example, the processor 110 may include at least one of the following processing units: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU). The different processing units may be independent devices or integrated devices.
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。The controller may be the nerve center and command center of the electronic device 100 . The controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。The processor 110 may also be provided with a memory for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and thus improves system efficiency.
在本申请实施例中,处理器110可以执行显示第一界面,第一界面包括第一控件;检测到对第一控件的第一操作;响应于第一操作,获取原始图像,原始图像包括至少4种颜色的通道信号;对原始图像进行预处理,得到第一初始图像、第二初始图像和第三初始图像;然后,对第二初始图像和第三初始图像进行前端处理,得到前端处理图像;对第一初始图像和前端处理图像进行融合处理,得到目标图像。In this embodiment of the present application, the processor 110 may: display a first interface, the first interface including a first control; detect a first operation on the first control; in response to the first operation, obtain an original image, the original image including channel signals of at least 4 colors; preprocess the original image to obtain a first initial image, a second initial image and a third initial image; then perform front-end processing on the second initial image and the third initial image to obtain a front-end processed image; and fuse the first initial image and the front-end processed image to obtain a target image.
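上述处理流程可用如下示意代码勾勒。该示例仅为示意性草图:四合一合并与对角线合并用最简单的2×2块平均实现,前端处理与融合均以占位的均值运算代替,所有函数名均为本示例的假设,并非本申请的实际实现。The flow described above can be sketched as follows. This is an illustrative sketch only: the four-in-one and diagonal binning are written as simple 2×2 block averages, the front-end processing and fusion steps are replaced by placeholder averaging, and all function names are assumptions of this example rather than the actual implementation of the present application.

```python
import numpy as np

def four_in_one_binning(raw):
    # "四合一像素合并": average the four pixels of every 2x2 block.
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def diagonal_binning(raw):
    # "对角线像素合并": average each 2x2 block along its two diagonals,
    # yielding two half-resolution images.
    tl, tr = raw[0::2, 0::2], raw[0::2, 1::2]
    bl, br = raw[1::2, 0::2], raw[1::2, 1::2]
    return (tl + br) / 2.0, (tr + bl) / 2.0  # "\" diagonal, "/" diagonal

def front_end(second, third):
    # Placeholder for the front-end chain (downsampling, noise reduction,
    # AWB/colour correction, upsampling); here simply an average.
    return (second + third) / 2.0

def fuse(first, fe):
    # Placeholder for the energy-based fusion of the application; a plain
    # average stands in for minimising E(f).
    return (first + fe) / 2.0

def process(raw):
    first = four_in_one_binning(raw)       # 第一初始图像
    second, third = diagonal_binning(raw)  # 第二、第三初始图像
    return fuse(first, front_end(second, third))  # 目标图像
```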
图19所示的各模块间的连接关系只是示意性说明,并不构成对电子设备100的各模块间的连接关系的限定。可选地,电子设备100的各模块也可以采用上述实施例中多种连接方式的组合。The connection relationship between the modules shown in FIG. 19 is only a schematic illustration and does not constitute a limitation on the connection relationship between the modules of the electronic device 100 . Optionally, each module of the electronic device 100 may also adopt a combination of various connection methods in the above embodiments.
电子设备100的无线通信功能可以通过天线1、天线2、移动通信模块150、无线通信模块160、调制解调处理器以及基带处理器等器件实现。The wireless communication function of the electronic device 100 can be implemented through antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, modem processor, baseband processor and other components.
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single communication band or multiple communication bands. Different antennas can also be multiplexed to improve antenna utilization. For example, antenna 1 can be multiplexed as a diversity antenna for a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
电子设备100可以通过GPU、显示屏194以及应用处理器实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。The electronic device 100 may implement display functions through a GPU, a display screen 194, and an application processor. The GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
显示屏194可以用于显示图像或视频。Display 194 may be used to display images or videos.
电子设备100可以通过ISP、摄像头193、视频编解码器、GPU、显示屏194以及应用处理器等实现拍摄功能。The electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP可以对图像的噪点、亮度和色彩进行算法优化,ISP还可以优化拍摄场景的曝光和色温等参数。在一些实施例中,ISP可以设置在摄像头193中。The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye. ISP can algorithmically optimize the noise, brightness and color of the image. ISP can also optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的红绿蓝(red green blue,RGB),YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。Camera 193 is used to capture still images or video. The object passes through the lens to produce an optical image that is projected onto the photosensitive element. The photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal. ISP outputs digital image signals to DSP for processing. DSP converts digital image signals into standard red green blue (RGB), YUV and other format image signals. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1、MPEG2、MPEG3和MPEG4。Video codecs are used to compress or decompress digital video. Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3 and MPEG4.
上文详细描述了电子设备100的硬件系统,下面介绍电子设备100的软件系统。The hardware system of the electronic device 100 is described in detail above, and the software system of the electronic device 100 is introduced below.
图20是本申请实施例提供的电子设备的软件系统的示意图。Figure 20 is a schematic diagram of a software system of an electronic device provided by an embodiment of the present application.
如图20所示,系统架构中可以包括应用层210、应用框架层220、硬件抽象层230、驱动层240以及硬件层250。As shown in Figure 20, the system architecture may include an application layer 210, an application framework layer 220, a hardware abstraction layer 230, a driver layer 240 and a hardware layer 250.
应用层210可以包括相机应用程序或者其他应用程序,其他应用程序包括但不限于:相机、图库等应用程序。The application layer 210 may include a camera application or other applications. Other applications include but are not limited to: camera, gallery and other applications.
应用框架层220可以向应用层的应用程序提供应用程序编程接口(application programming interface,API)和编程框架;应用框架层可以包括一些预定义的函数。The application framework layer 220 can provide an application programming interface (API) and programming framework to applications in the application layer; the application framework layer can include some predefined functions.
例如,应用框架层220可以包括相机访问接口;相机访问接口中可以包括相机管理与相机设备;其中,相机管理可以用于提供管理相机的访问接口;相机设备可以用于提供访问相机的接口。 For example, the application framework layer 220 may include a camera access interface; the camera access interface may include camera management and camera equipment; where camera management may be used to provide an access interface for managing cameras; and the camera device may be used to provide an interface for accessing cameras.
硬件抽象层230用于将硬件抽象化。比如,硬件抽象层可以包括相机抽象层以及其他硬件设备抽象层;相机硬件抽象层可以调用相机算法库中的相机算法。The hardware abstraction layer 230 is used to abstract the hardware. For example, the hardware abstraction layer may include a camera abstraction layer and abstraction layers for other hardware devices; the camera hardware abstraction layer may call camera algorithms in the camera algorithm library.
例如,硬件抽象层230中包括相机硬件抽象层2301与相机算法库;相机算法库中可以包括软件算法;比如,算法1、算法2等可以是用于图像处理的软件算法。For example, the hardware abstraction layer 230 includes a camera hardware abstraction layer 2301 and a camera algorithm library; the camera algorithm library may include software algorithms; for example, Algorithm 1, Algorithm 2, etc. may be software algorithms for image processing.
驱动层240用于为不同硬件设备提供驱动。例如,驱动层可以包括相机设备驱动、数字信号处理器驱动和图形处理器驱动。The driver layer 240 is used to provide drivers for different hardware devices. For example, the driver layer may include camera device drivers, digital signal processor drivers, and graphics processor drivers.
硬件层250可以包括多个图像传感器(sensor)、多个图像信号处理器、数字信号处理器、图形处理器以及其他硬件设备。The hardware layer 250 may include multiple image sensors, multiple image signal processors, digital signal processors, graphics processors, and other hardware devices.
例如,硬件层250包括传感器和图像信号处理器;传感器中可以包括传感器1、传感器2、深度传感器(time of flight,TOF)、多光谱传感器等。图像信号处理器中可以包括图像信号处理器1、图像信号处理器2等。For example, the hardware layer 250 includes a sensor and an image signal processor; the sensor may include sensor 1, sensor 2, depth sensor (time of flight, TOF), multispectral sensor, etc. The image signal processor may include image signal processor 1, image signal processor 2, etc.
在本申请中,通过调用硬件抽象层230中的硬件抽象层接口,可以实现硬件抽象层230上方的应用程序层210、应用程序框架层220与下方的驱动层240、硬件层250的连接,实现摄像头数据传输及功能控制。In this application, by calling the hardware abstraction layer interface in the hardware abstraction layer 230, the application layer 210 and application framework layer 220 above the hardware abstraction layer 230 can be connected with the driver layer 240 and hardware layer 250 below it, implementing camera data transmission and function control.
其中,在硬件抽象层230中的摄像头硬件接口层中,厂商可以根据需求在此定制功能。摄像头硬件接口层相比硬件抽象层接口,更加高效、灵活、低延迟,也能更加丰富地调用ISP和GPU,来实现图像处理。其中,输入硬件抽象层230中的图像可以来自图像传感器,也可以来自存储的图片。In the camera hardware interface layer in the hardware abstraction layer 230, manufacturers can customize functions according to their needs. Compared with the hardware abstraction layer interface, the camera hardware interface layer is more efficient, flexible, and low-latency, and can also make richer calls to the ISP and GPU to implement image processing. The image input to the hardware abstraction layer 230 may come from an image sensor or from a stored picture.
硬件抽象层230中的调度层,包含了通用功能性接口,用于实现管理和控制。The scheduling layer in the hardware abstraction layer 230 includes general functional interfaces for implementing management and control.
硬件抽象层230中的摄像头服务层,用于访问ISP和其他硬件的接口。The camera service layer in the hardware abstraction layer 230 is used to access interfaces of ISP and other hardware.
下面结合捕获拍照场景,示例性说明电子设备100软件以及硬件的工作流程。The following exemplifies the workflow of the software and hardware of the electronic device 100 in conjunction with capturing the photographing scene.
应用程序层中的相机应用可以以图标的方式显示在电子设备100的屏幕上。当相机应用的图标被用户点击以进行触发时,电子设备100开始运行相机应用。当相机应用运行在电子设备100上时,相机应用调用应用框架层220中的相机应用对应的接口,然后,通过调用硬件抽象层230启动摄像头驱动,开启电子设备100上的包含多光谱传感器的摄像头,并通过多光谱传感器采集原始图像。此时,多光谱传感器可按一定工作频率进行采集,并将采集的图像在多光谱传感器内部或传输至1路或多路图像信号处理器中进行处理,然后,再将处理后的目标图像或彩色图像进行保存和/或传输至显示屏进行显示。The camera application in the application layer may be displayed on the screen of the electronic device 100 in the form of an icon. When the icon of the camera application is clicked by the user, the electronic device 100 starts to run the camera application. When the camera application runs on the electronic device 100, the camera application calls the interface corresponding to the camera application in the application framework layer 220, then starts the camera driver by calling the hardware abstraction layer 230, turns on the camera containing the multispectral sensor on the electronic device 100, and collects an original image through the multispectral sensor. At this time, the multispectral sensor can capture at a certain operating frequency, and the captured images are processed inside the multispectral sensor or transmitted to one or more image signal processors for processing; the processed target image or color image is then saved and/or transmitted to the display screen for display.
下面介绍本申请实施例提供的一种用于实现上述图像处理方法的图像处理装置300。图21是本申请实施例提供的图像处理装置300的示意图。An image processing device 300 for implementing the above image processing method provided by an embodiment of the present application is introduced below. FIG. 21 is a schematic diagram of an image processing device 300 provided by an embodiment of the present application.
如图21所示,图像处理装置300包括显示单元310、获取单元320和处理单元330。As shown in FIG. 21 , the image processing device 300 includes a display unit 310 , an acquisition unit 320 and a processing unit 330 .
其中,显示单元310用于显示第一界面,第一界面包括第一控件。The display unit 310 is used to display a first interface, and the first interface includes a first control.
获取单元320用于检测到对第一控件的第一操作。The obtaining unit 320 is used to detect the first operation on the first control.
处理单元330用于响应于第一操作,获取原始图像,原始图像包括至少4种颜色的通道信号。The processing unit 330 is configured to acquire an original image in response to the first operation, where the original image includes channel signals of at least 4 colors.
处理单元330还用于对原始图像进行预处理,得到第一初始图像、第二初始图像和第三初始图像;对第二初始图像和第三初始图像进行前端处理,得到前端处理图像;将第一初始图像和前端处理图像进行融合处理,得到目标图像。The processing unit 330 is also used to preprocess the original image to obtain a first initial image, a second initial image and a third initial image; perform front-end processing on the second initial image and the third initial image to obtain a front-end processed image; and fuse the first initial image and the front-end processed image to obtain a target image.
需要说明的是,上述图像处理装置300以功能单元的形式体现。这里的术语“单元”可以通过软件和/或硬件形式实现,对此不作具体限定。It should be noted that the above image processing device 300 is embodied in the form of functional units. The term "unit" here can be implemented in the form of software and/or hardware, and is not specifically limited.
例如,“单元”可以是实现上述功能的软件程序、硬件电路或二者结合。所述硬件电路可能包括应用特有集成电路(application specific integrated circuit,ASIC)、电子电路、用于执行一个或多个软件或固件程序的处理器(例如共享处理器、专有处理器或组处理器等)和存储器、合并逻辑电路和/或其它支持所描述的功能的合适组件。For example, a "unit" may be a software program, a hardware circuit, or a combination of both that implements the above functions. The hardware circuit may include an application specific integrated circuit (ASIC), an electronic circuit, a processor (such as a shared processor, a dedicated processor, or a group processor) for executing one or more software or firmware programs. etc.) and memory, merged logic circuitry, and/or other suitable components to support the described functionality.
因此,在本申请的实施例中描述的各示例的单元,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。Therefore, the units of each example described in the embodiments of the present application can be implemented by electronic hardware, or a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each specific application, but such implementations should not be considered beyond the scope of this application.
本申请实施例还提供一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机指令;当所述计算机指令在图像处理装置300上运行时,使得该图像处理装置300执行前述所示的图像处理方法。Embodiments of the present application also provide a computer-readable storage medium storing computer instructions; when the computer instructions are run on the image processing device 300, the image processing device 300 is caused to execute the image processing method described above.
所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或者数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质、或者半导体介质(例如固态硬盘(solid state disk,SSD))等。The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or a data center integrating one or more available media. The available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media, or semiconductor media (e.g., solid state disk (SSD)), etc.
本申请实施例还提供了一种包含计算机指令的计算机程序产品,当其在图像处理装置300上运行时,使得图像处理装置300可以执行前述所示的图像处理方法。Embodiments of the present application also provide a computer program product containing computer instructions, which when run on the image processing device 300 enables the image processing device 300 to execute the image processing method shown above.
图22为本申请实施例提供的一种芯片的结构示意图。图22所示的芯片可以为通用处理器,也可以为专用处理器。该芯片包括处理器401。其中,处理器401用于支持图像处理装置300执行前述所示的技术方案。Figure 22 is a schematic structural diagram of a chip provided by an embodiment of the present application. The chip shown in Figure 22 can be a general-purpose processor or a special-purpose processor. The chip includes a processor 401. The processor 401 is used to support the image processing device 300 in executing the technical solutions shown above.
可选的,该芯片还包括收发器402,收发器402用于接受处理器401的控制,用于支持图像处理装置300执行前述所示的技术方案。Optionally, the chip also includes a transceiver 402, which is used to accept the control of the processor 401 and to support the image processing device 300 in executing the technical solution shown above.
可选的,图22所示的芯片还可以包括:存储介质403。Optionally, the chip shown in Figure 22 may also include: a storage medium 403.
需要说明的是,图22所示的芯片可以使用下述电路或者器件来实现:一个或多个现场可编程门阵列(field programmable gate array,FPGA)、可编程逻辑器件(programmable logic device,PLD)、控制器、状态机、门逻辑、分立硬件部件、任何其他适合的电路、或者能够执行本申请通篇所描述的各种功能的电路的任意组合。It should be noted that the chip shown in Figure 22 can be implemented using the following circuits or devices: one or more field programmable gate arrays (FPGA), programmable logic devices (PLD) , controller, state machine, gate logic, discrete hardware components, any other suitable circuit, or any combination of circuits capable of performing the various functions described throughout this application.
上述本申请实施例提供的电子设备、图像处理装置300、计算机存储介质、计算机程序产品、芯片均用于执行上文所提供的方法,因此,其所能达到的有益效果可参考上文所提供的方法对应的有益效果,在此不再赘述。The electronic device, the image processing device 300, the computer storage medium, the computer program product and the chip provided by the embodiments of the present application are all used to execute the methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects corresponding to the methods provided above, which are not repeated here.
应理解,上述只是为了帮助本领域技术人员更好地理解本申请实施例,而非要限制本申请实施例的范围。本领域技术人员根据所给出的上述示例,显然可以进行各种等价的修改或变化,例如,上述检测方法的各个实施例中某些步骤可以是不必须的,或者可以新加入某些步骤等,或者上述任意两种或者任意多种实施例的组合。这样的修改、变化或者组合后的方案也落入本申请实施例的范围内。It should be understood that the above is only intended to help those skilled in the art better understand the embodiments of the present application, and is not intended to limit the scope of the embodiments of the present application. Those skilled in the art can obviously make various equivalent modifications or changes based on the examples given above; for example, some steps in the embodiments of the above detection method may be unnecessary, some new steps may be added, or any two or more of the above embodiments may be combined. Such modified, changed or combined solutions also fall within the scope of the embodiments of the present application.
还应理解,上文对本申请实施例的描述着重于强调各个实施例之间的不同之处,未提到的相同或相似之处可以互相参考,为了简洁,这里不再赘述。It should also be understood that the above description of the embodiments of the present application focuses on the differences between the embodiments; for the same or similar parts that are not mentioned, reference may be made to one another, and for the sake of brevity they are not repeated here.
还应理解,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。It should also be understood that the size of the serial numbers of the above-mentioned processes does not mean the order of execution. The execution order of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation process of the embodiment of the present application.
还应理解,本申请实施例中,“预先设定”、“预先定义”可以通过在设备(例如,包括电子设备)中预先保存相应的代码、表格或其他可用于指示相关信息的方式来实现,本申请对于其具体的实现方式不做限定。It should also be understood that in the embodiments of the present application, "preset" and "predefined" may be implemented by pre-storing, in a device (for example, including an electronic device), corresponding code, tables or other means that can be used to indicate relevant information; the present application does not limit the specific implementation thereof.
还应理解,本申请实施例中的方式、情况、类别以及实施例的划分仅是为了描述的方便,不应构成特别的限定,各种方式、类别、情况以及实施例中的特征在不矛盾的情况下可以相结合。It should also be understood that the division into modes, situations, categories and embodiments in the embodiments of the present application is only for convenience of description and should not constitute any special limitation; features in the various modes, categories, situations and embodiments may be combined with one another provided there is no contradiction.
还应理解,在本申请的各个实施例中,如果没有特殊说明以及逻辑冲突,不同的实施例之间的术语和/或描述具有一致性、且可以相互引用,不同的实施例中的技术特征根据其内在的逻辑关系可以组合形成新的实施例。It should also be understood that in the various embodiments of the present application, unless otherwise specified or in the case of a logical conflict, the terms and/or descriptions in different embodiments are consistent and may be referenced by one another, and technical features in different embodiments may be combined according to their internal logical relationships to form new embodiments.
最后应说明的是:以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。Finally, it should be noted that the above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto; any change or substitution within the technical scope disclosed in the present application shall be covered within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

  1. 一种图像处理方法,其特征在于,应用于电子设备,所述方法包括:An image processing method, characterized in that it is applied to electronic equipment, and the method includes:
    显示第一界面,所述第一界面包括第一控件;Display a first interface, the first interface including a first control;
    检测到对所述第一控件的第一操作;detecting a first operation on the first control;
    响应于所述第一操作,获取原始图像,所述原始图像包括至少4种颜色的通道信号;In response to the first operation, acquiring an original image, the original image including channel signals of at least 4 colors;
    对所述原始图像进行预处理,得到第一初始图像、第二初始图像和第三初始图像,所述预处理用于对所述原始图像中包括的多种颜色的通道信号进行合并重组;Perform preprocessing on the original image to obtain a first initial image, a second initial image and a third initial image, where the preprocessing is used to merge and recombine channel signals of multiple colors included in the original image;
    对所述第二初始图像和所述第三初始图像进行前端处理,得到前端处理图像;Perform front-end processing on the second initial image and the third initial image to obtain a front-end processed image;
    将所述第一初始图像和所述前端处理图像进行融合处理,得到目标图像。The first initial image and the front-end processed image are fused to obtain a target image.
  2. 根据权利要求1所述的图像处理方法,其特征在于,对所述原始图像进行预处理,得到第一初始图像、第二初始图像和第三初始图像,包括:The image processing method according to claim 1, characterized in that preprocessing the original image to obtain the first initial image, the second initial image and the third initial image includes:
    对所述原始图像进行四合一像素合并处理,得到所述第一初始图像;Perform a four-in-one pixel merging process on the original image to obtain the first initial image;
    对所述原始图像进行对角线像素合并处理,得到所述第二初始图像和所述第三初始图像。Perform diagonal pixel merging processing on the original image to obtain the second initial image and the third initial image.
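权利要求2所述的两种像素合并方式可按如下方式示意(仅为示意:假设同色像素按2×2块排布,索引约定为本示例的假设):The two pixel-binning operations of claim 2 can be illustrated as follows (illustrative only: a 2×2 same-color block layout and the index convention are assumptions of this example):

```python
import numpy as np

def bin_2x2(raw):
    # 四合一:one output pixel = mean of the four pixels of each 2x2 block.
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def bin_diag(raw):
    # 对角线:one image from the "\" diagonals, one from the "/" diagonals.
    tl, tr = raw[0::2, 0::2], raw[0::2, 1::2]
    bl, br = raw[1::2, 0::2], raw[1::2, 1::2]
    return (tl + br) / 2.0, (tr + bl) / 2.0

block = np.array([[10., 20.],
                  [30., 60.]])
m = bin_2x2(block)        # (10+20+30+60)/4 = 30
d1, d2 = bin_diag(block)  # (10+60)/2 = 35, (20+30)/2 = 25
assert m[0, 0] == 30.0 and d1[0, 0] == 35.0 and d2[0, 0] == 25.0
```

四合一合并保留全部能量信息,而两路对角线合并得到的两幅图像对同一块取样互补的像素对,因此可为后续融合提供互补信息。The four-in-one merge keeps all of the block's energy, while the two diagonal merges sample complementary pixel pairs of the same block, providing complementary inputs for the subsequent fusion.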
  3. 根据权利要求1或2所述的图像处理方法,其特征在于,所述前端处理至少包括:下采样、降噪、自动白平衡和/或颜色校正、上采样。The image processing method according to claim 1 or 2, characterized in that the front-end processing at least includes: downsampling, noise reduction, automatic white balance and/or color correction, and upsampling.
  4. 根据权利要求1至3中任一项所述的图像处理方法,其特征在于,将所述第一初始图像和所述前端处理图像进行融合处理,得到目标图像,包括:The image processing method according to any one of claims 1 to 3, characterized in that, the first initial image and the front-end processed image are fused to obtain a target image, including:
    将所述第一初始图像和所述前端处理图像,利用以下公式进行融合处理:The first initial image and the front-end processed image are fused using the following formula:
    E(f) = w_d(f − d)² + w_x(f_x − g_x)² + w_y(f_y − g_y)²;
    其中,f为所述目标图像的像素值,g为所述第一初始图像的像素值,d为所述前端处理图像的像素值,f_x和f_y为所述目标图像在x方向和y方向上的梯度值,g_x和g_y为所述第一初始图像在x方向和y方向上的梯度值,w_d、w_x、w_y为权重值,E(f)为f的能量函数;where f is the pixel value of the target image, g is the pixel value of the first initial image, d is the pixel value of the front-end processed image, f_x and f_y are the gradient values of the target image in the x and y directions, g_x and g_y are the gradient values of the first initial image in the x and y directions, w_d, w_x and w_y are weight values, and E(f) is the energy function of f;
    确定E(f)的最小值,得到所述目标图像。The minimum of E(f) is determined to obtain the target image.
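上述能量函数的最小化可以用梯度下降做一个数值求解的草图(求解器的选择、步长与边界处理均为本示例的假设;由于E(f)是二次的,实际中也可以直接求解对应的线性方程组):The minimisation of the above energy function can be sketched numerically by gradient descent (the choice of solver, the step size and the boundary handling are assumptions of this example; since E(f) is quadratic, the corresponding linear system could equally be solved directly):

```python
import numpy as np

def grad_x(a):  # forward difference in x, zero at the right edge
    out = np.zeros_like(a)
    out[:, :-1] = a[:, 1:] - a[:, :-1]
    return out

def grad_y(a):  # forward difference in y, zero at the bottom edge
    out = np.zeros_like(a)
    out[:-1, :] = a[1:, :] - a[:-1, :]
    return out

def energy(f, g, d, wd=1.0, wx=1.0, wy=1.0):
    # E(f) = wd*(f-d)^2 + wx*(fx-gx)^2 + wy*(fy-gy)^2, summed over pixels
    return np.sum(wd * (f - d) ** 2
                  + wx * (grad_x(f) - grad_x(g)) ** 2
                  + wy * (grad_y(f) - grad_y(g)) ** 2)

def fuse(g, d, wd=1.0, wx=1.0, wy=1.0, iters=500, lr=0.05):
    f = d.astype(float).copy()  # start from the front-end processed image
    for _ in range(iters):
        rx = wx * (grad_x(f) - grad_x(g))  # weighted x-gradient residual
        ry = wy * (grad_y(f) - grad_y(g))  # weighted y-gradient residual
        # dE/df: data term plus the transpose of the difference operators
        grad = 2 * wd * (f - d)
        grad[:, :-1] -= 2 * rx[:, :-1]
        grad[:, 1:]  += 2 * rx[:, :-1]
        grad[:-1, :] -= 2 * ry[:-1, :]
        grad[1:, :]  += 2 * ry[:-1, :]
        f -= lr * grad
    return f
```

当w_d较大时,结果更贴近前端处理图像d的像素值;当w_x、w_y较大时,结果更多保留第一初始图像g的梯度(细节)。With a larger w_d the result stays closer to the pixel values of the front-end processed image d; with larger w_x and w_y it preserves more of the gradients (detail) of the first initial image g.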
  5. 根据权利要求4所述的图像处理方法,其特征在于,所述图像处理方法还包括:The image processing method according to claim 4, characterized in that the image processing method further includes:
    对所述目标图像进行后端处理,得到彩色图像。Perform back-end processing on the target image to obtain a color image.
  6. 根据权利要求5所述的图像处理方法,其特征在于,所述后端处理包括:去马赛克、伽马校正、风格变换中的至少一项。The image processing method according to claim 5, characterized in that the back-end processing includes: at least one of demosaicing, gamma correction, and style transformation.
  7. 根据权利要求1至6中任一项所述的图像处理方法,其特征在于,所述原始图像包括红色通道信号、绿色通道信号、蓝色通道信号、黄色通道信号、青色通道信号和品红色通道信号。The image processing method according to any one of claims 1 to 6, characterized in that the original image includes a red channel signal, a green channel signal, a blue channel signal, a yellow channel signal, a cyan channel signal and a magenta channel signal.
  8. 一种电子设备,其特征在于,所述电子设备包括:An electronic device, characterized in that the electronic device includes:
    一个或多个处理器和存储器;one or more processors and memories;
    所述存储器与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行如权利要求1至7中任一项所述的图像处理方法。The memory is coupled to the one or more processors, the memory is used to store computer program code, the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to execute the image processing method according to any one of claims 1 to 7.
  9. 一种芯片系统,其特征在于,所述芯片系统应用于电子设备,所述芯片系统包括一个或多个处理器,所述处理器用于调用计算机指令以使得所述电子设备执行如权利要求1至7中任一项所述的图像处理方法。A chip system, characterized in that the chip system is applied to an electronic device, and the chip system includes one or more processors, the processor being configured to invoke computer instructions to cause the electronic device to execute the image processing method according to any one of claims 1 to 7.
  10. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储了计算机程序,当所述计算机程序被处理器执行时,使得处理器执行权利要求1至7中任一项所述的图像处理方法。A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to execute the image processing method according to any one of claims 1 to 7.
  11. 根据权利要求1所述的图像处理方法,其特征在于,所述方法包括:The image processing method according to claim 1, characterized in that the method includes:
    显示第一界面,所述第一界面包括第一控件;Display a first interface, the first interface including a first control;
    检测到对所述第一控件的第一操作;detecting a first operation on the first control;
    响应于所述第一操作,获取原始图像,所述原始图像包括至少4种颜色的通道信号;In response to the first operation, acquiring an original image, the original image including channel signals of at least 4 colors;
    对所述原始图像进行四合一像素合并处理,得到第一初始图像;Perform a four-in-one pixel merging process on the original image to obtain a first initial image;
    对所述原始图像进行不同对角线方向上的对角线像素合并处理,分别得到第二初始图像和第三初始图像;Perform diagonal pixel merging processing in different diagonal directions on the original image to obtain a second initial image and a third initial image respectively;
    对所述第二初始图像和所述第三初始图像进行前端处理,得到前端处理图像;所述前端处理至少包括:下采样、降噪、自动白平衡和/或颜色校正、上采样;Perform front-end processing on the second initial image and the third initial image to obtain a front-end processed image; the front-end processing at least includes: downsampling, noise reduction, automatic white balance and/or color correction, and upsampling;
    将所述第一初始图像和所述前端处理图像进行融合处理,得到目标图像。The first initial image and the front-end processed image are fused to obtain a target image.
  12. The image processing method according to claim 11, wherein fusing the first initial image and the front-end processed image to obtain the target image comprises:
    fusing the first initial image and the front-end processed image using the following formula:
    E(f) = w_d(f − d)² + w_x(f_x − g_x)² + w_y(f_y − g_y)²;
    where f is the pixel value of the target image, g is the pixel value of the first initial image, d is the pixel value of the front-end processed image, f_x and f_y are the gradient values of the target image in the x and y directions, g_x and g_y are the gradient values of the first initial image in the x and y directions, w_d, w_x and w_y are weight values, and E(f) is the energy function of f;
    determining the minimum value of E(f) to obtain the target image.
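Minimizing E(f) over all pixels is a linear least-squares problem: the data term pulls f toward the front-end processed image d, while the gradient terms pull the gradients of f toward those of the first initial image g. A minimal dense sketch follows; forward-difference gradients and the weight defaults are assumptions (the claim fixes neither the discretization nor the weights), and a real implementation would use a sparse or iterative solver:

```python
import numpy as np

def fuse(d, g, wd=0.1, wx=1.0, wy=1.0):
    """Minimize E(f) = wd*(f-d)^2 + wx*(fx-gx)^2 + wy*(fy-gy)^2
    by stacking all terms into one linear least-squares system."""
    h, w = d.shape
    n = h * w
    idx = np.arange(n).reshape(h, w)
    blocks, targets = [], []

    # Data term: sqrt(wd) * f = sqrt(wd) * d, one equation per pixel.
    blocks.append(np.sqrt(wd) * np.eye(n))
    targets.append(np.sqrt(wd) * d.ravel())

    # x-gradient term: sqrt(wx) * (f[i,j+1] - f[i,j]) = sqrt(wx) * gx[i,j].
    Ax = np.zeros((h * (w - 1), n))
    gx = np.sqrt(wx) * (g[:, 1:] - g[:, :-1]).ravel()
    for r, (i, j) in enumerate((i, j) for i in range(h) for j in range(w - 1)):
        Ax[r, idx[i, j + 1]] = np.sqrt(wx)
        Ax[r, idx[i, j]] = -np.sqrt(wx)
    blocks.append(Ax)
    targets.append(gx)

    # y-gradient term: sqrt(wy) * (f[i+1,j] - f[i,j]) = sqrt(wy) * gy[i,j].
    Ay = np.zeros(((h - 1) * w, n))
    gy = np.sqrt(wy) * (g[1:, :] - g[:-1, :]).ravel()
    for r, (i, j) in enumerate((i, j) for i in range(h - 1) for j in range(w)):
        Ay[r, idx[i + 1, j]] = np.sqrt(wy)
        Ay[r, idx[i, j]] = -np.sqrt(wy)
    blocks.append(Ay)
    targets.append(gy)

    A = np.vstack(blocks)
    b = np.concatenate(targets)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f.reshape(h, w)
```

Because the data term makes the system full rank, the minimizer is unique; when d and g agree, the solution is exactly d (zero energy). Note that only the gradients of g enter the system, so a constant offset added to g leaves the result unchanged.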
  13. The image processing method according to claim 12, wherein the image processing method further comprises:
    performing back-end processing on the target image to obtain a color image, the back-end processing comprising at least one of demosaicing, gamma correction, and style transformation.
  14. The image processing method according to any one of claims 11 to 13, wherein the original image comprises a red channel signal, a green channel signal, a blue channel signal, a yellow channel signal, a cyan channel signal, and a magenta channel signal.
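As a simple illustration of the gamma-correction step listed in claim 13, a plain power-law encoding curve can be applied to a linear image; the exact transfer function is an assumption here, since the patent does not specify one:

```python
import numpy as np

def gamma_correct(img, gamma=2.2):
    """Apply a power-law gamma encoding to a linear image in [0, 1]."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)

encoded = gamma_correct(np.array([0.0, 0.5, 1.0]))
```

The curve brightens midtones (0.5 maps to roughly 0.73 at gamma 2.2) while leaving black and white fixed, which is why it is applied after fusion, once the linear-domain processing is complete.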
PCT/CN2023/087568 2022-05-31 2023-04-11 Image processing method and related device thereof WO2023231583A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210606523.5A CN114693580B (en) 2022-05-31 2022-05-31 Image processing method and related device
CN202210606523.5 2022-05-31

Publications (1)

Publication Number Publication Date
WO2023231583A1 true WO2023231583A1 (en) 2023-12-07

Family

ID=82131399

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/087568 WO2023231583A1 (en) 2022-05-31 2023-04-11 Image processing method and related device thereof

Country Status (2)

Country Link
CN (1) CN114693580B (en)
WO (1) WO2023231583A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114693580B (en) * 2022-05-31 2022-10-18 荣耀终端有限公司 Image processing method and related device
CN115908221B (en) * 2023-03-08 2023-12-08 荣耀终端有限公司 Image processing method, electronic device and storage medium
CN116630204B (en) * 2023-07-19 2023-09-26 南京佳格耕耘科技有限公司 Remote sensing image online analysis processing system
CN117459836A (en) * 2023-12-05 2024-01-26 荣耀终端有限公司 Image processing method, device and storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
CN110876027A (en) * 2018-08-29 2020-03-10 三星电子株式会社 Image sensor, electronic device including the same, and image scaling processing method
CN111131798A (en) * 2019-10-18 2020-05-08 华为技术有限公司 Image processing method, image processing apparatus, and imaging apparatus
US20200304732A1 (en) * 2019-03-20 2020-09-24 Apple Inc. Multispectral image decorrelation method and system
CN112261391A (en) * 2020-10-26 2021-01-22 Oppo广东移动通信有限公司 Image processing method, camera assembly and mobile terminal
CN113411506A (en) * 2020-03-16 2021-09-17 索尼半导体解决方案公司 Imaging element and electronic apparatus
CN113676675A (en) * 2021-08-16 2021-11-19 Oppo广东移动通信有限公司 Image generation method and device, electronic equipment and computer-readable storage medium
CN114331916A (en) * 2022-03-07 2022-04-12 荣耀终端有限公司 Image processing method and electronic device
CN114693580A (en) * 2022-05-31 2022-07-01 荣耀终端有限公司 Image processing method and related device

Family Cites Families (13)

Publication number Priority date Publication date Assignee Title
CN101834974B (en) * 2009-03-09 2013-12-18 博立码杰通讯(深圳)有限公司 Multispectral photoreceptive device and sampling method thereof
US8638342B2 (en) * 2009-10-20 2014-01-28 Apple Inc. System and method for demosaicing image data using weighted gradients
CN105794203B (en) * 2013-12-04 2020-03-20 拉姆伯斯公司 High dynamic range image sensor
CN108419062B (en) * 2017-02-10 2020-10-02 杭州海康威视数字技术股份有限公司 Image fusion apparatus and image fusion method
CN112767290B (en) * 2019-11-01 2022-11-11 RealMe重庆移动通信有限公司 Image fusion method, image fusion device, storage medium and terminal device
CN110944160B (en) * 2019-11-06 2022-11-04 维沃移动通信有限公司 Image processing method and electronic equipment
CN113141475B (en) * 2020-01-17 2024-02-02 思特威(上海)电子科技股份有限公司 Imaging system and pixel merging method
CN111405204B (en) * 2020-03-11 2022-07-26 Oppo广东移动通信有限公司 Image acquisition method, imaging device, electronic device, and readable storage medium
JP6864942B1 (en) * 2020-11-18 2021-04-28 株式会社SensAI Imaging system
CN113676708B (en) * 2021-07-01 2023-11-14 Oppo广东移动通信有限公司 Image generation method, device, electronic equipment and computer readable storage medium
CN113676635B (en) * 2021-08-16 2023-05-05 Oppo广东移动通信有限公司 Method and device for generating high dynamic range image, electronic equipment and storage medium
CN114125242A (en) * 2021-12-01 2022-03-01 Oppo广东移动通信有限公司 Image sensor, camera module, electronic equipment, image generation method and device
CN114549383A (en) * 2022-02-23 2022-05-27 浙江大华技术股份有限公司 Image enhancement method, device, equipment and medium based on deep learning


Also Published As

Publication number Publication date
CN114693580A (en) 2022-07-01
CN114693580B (en) 2022-10-18

Similar Documents

Publication Publication Date Title
WO2023231583A1 (en) Image processing method and related device thereof
WO2023016039A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2023036034A1 (en) Image processing method and related device thereof
CN113850367B (en) Network model training method, image processing method and related equipment thereof
WO2023016035A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2023016037A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2023124123A1 (en) Image processing method and related device thereof
CN115550575B (en) Image processing method and related device
CN113824914A (en) Video processing method and device, electronic equipment and storage medium
EP4195679A1 (en) Image processing method and electronic device
US20090324127A1 (en) Method and System for Automatic Red-Eye Correction
WO2023040725A1 (en) White balance processing method and electronic device
WO2023016040A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2023016044A1 (en) Video processing method and apparatus, electronic device, and storage medium
CN109218604A (en) Image capture unit, image brilliance modulating method and image processor
CN103167183B (en) Translucent camera aperture processing method, system and mobile terminal
WO2023016042A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2023016041A1 (en) Video processing method and apparatus, electronic device, and storage medium
US20230017498A1 (en) Flexible region of interest color processing for cameras
WO2023016043A1 (en) Video processing method and apparatus, electronic device, and storage medium
CN115955611B (en) Image processing method and electronic equipment
CN114640834B (en) Image processing method and related device
CN116051368B (en) Image processing method and related device
WO2023124165A1 (en) Image processing method and related electronic device
US20240137650A1 (en) Video Processing Method and Apparatus, Electronic Device, and Storage Medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23814779

Country of ref document: EP

Kind code of ref document: A1