WO2023231583A1 - Image processing method and related device - Google Patents

Image processing method and related device

Info

Publication number
WO2023231583A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
initial
initial image
color
processing method
Prior art date
Application number
PCT/CN2023/087568
Other languages
English (en)
Chinese (zh)
Inventor
李子荣 (Li Zirong)
毕涵 (Bi Han)
Original Assignee
Honor Device Co., Ltd. (荣耀终端有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co., Ltd.
Publication of WO2023231583A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/90 - Dynamic range modification of images or parts thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 - Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 - Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20221 - Image fusion; Image merging

Definitions

  • the present application relates to the field of image processing, and in particular, to an image processing method and related equipment.
  • Most CMOS image sensors currently used are traditional RGB (red, green, blue) sensors. In other words, such an image sensor can only receive a red channel signal, a green channel signal and a blue channel signal.
  • This application provides an image processing method and related equipment. By merging and reorganizing the channel signals of the original image, two frames of images are generated, one with richer detail and the other with better color information, and the two frames of images are then fused to generate a target image. This enables better restoration of image details and colors.
  • In a first aspect, an image processing method is provided, which is applied to an electronic device.
  • the method includes:
  • the first initial image and the front-end processed image are fused to obtain a target image.
  • The image processing method provided by the embodiment of the present application obtains an original image including channel signals of at least 4 colors, then merges and reorganizes the channel signals in the original image to generate a first initial image, a second initial image and a third initial image, and then performs front-end processing on the second initial image and the third initial image to obtain a front-end processed image. Since the first initial image has only undergone merging and reorganization, its detail richness is higher, while the color accuracy of the front-end processed image is higher. Based on this, the first initial image and the front-end processed image are used to generate the target image, so that image details and colors can be better restored.
  • the original image is preprocessed to obtain a first initial image, a second initial image and a third initial image, including:
  • four-in-one pixel merging processing refers to a processing method in which four adjacent pixel values are weighted and averaged and output as a single pixel value
  • diagonal pixel merging processing refers to a processing method in which the weighted average of two pixel values in the diagonal direction is output as a single pixel value.
  • By performing four-in-one pixel merging on the original image, a first initial image with high detail richness can be obtained; and by performing diagonal pixel merging on the original image, a second initial image and a third initial image including different color channel signals are obtained, so that the second initial image and the third initial image can provide better color information for subsequent color restoration.
  • the front-end processing at least includes: downsampling, noise reduction, automatic white balance and/or color correction, and upsampling.
  • front-end processing is used to process the colors of the second initial image and the third initial image, so that the processed front-end processed image can provide better colors for subsequent processing.
  • the energy function formula can be used to fuse the first initial image and the front-end processed image, so that the target image is expected to be close to both the gradient values of the first initial image and the pixel values of the front-end processed image; in this way, the restored target image carries better detailed texture information and color information.
  • the image processing method further includes:
  • back-end processing is used to further enhance the detail and color of the target image.
  • the back-end processing includes: at least one of demosaicing, gamma correction, and style transformation.
  • the original image includes a red channel signal, a green channel signal, a blue channel signal, a yellow channel signal, a cyan channel signal and a magenta channel signal.
  • the first operation refers to an operation of clicking the camera application.
  • the first interface refers to a photographing interface of the electronic device
  • the first control refers to a control for instructing to photograph.
  • the first operation refers to an operation of clicking a control for instructing to take a photo.
  • the first interface refers to a video shooting interface of the electronic device
  • the first control refers to a control used to instruct video shooting.
  • the first operation refers to an operation of clicking a control indicating shooting a video.
  • The above description takes the first operation as a click operation as an example; the first operation may also include a voice instruction operation, or other operations instructing the electronic device to take photos or videos. The above is an example and does not limit the application in any way.
  • an electronic device including a module/unit for performing the first aspect or any method in the first aspect.
  • an electronic device including one or more processors and memories;
  • the memory is coupled to the one or more processors, the memory is used to store computer program code, the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform the first aspect or any method in the first aspect.
  • a chip system is provided.
  • the chip system is applied to an electronic device.
  • the chip system includes one or more processors.
  • the processor is used to call computer instructions to cause the electronic device to execute the first aspect or any method in the first aspect.
  • a computer-readable storage medium stores a computer program.
  • the computer program includes program instructions which, when executed by a processor, cause the processor to perform the first aspect or any method in the first aspect.
  • a computer program product includes: computer program code.
  • when the computer program code is run by an electronic device, it causes the electronic device to execute the first aspect or any method in the first aspect.
  • This application provides an image processing method and related equipment.
  • a first initial image, a second initial image and a third initial image are generated.
  • front-end processing is then performed on the second initial image and the third initial image to obtain a front-end processed image. Since the first initial image has only undergone merging and reorganization, its richness of detail is higher, while the color accuracy of the front-end processed image is higher. Based on this, the first initial image and the front-end processed image are used to generate the target image, so that better restoration of image details and colors can be achieved.
  • Figure 1 is an imaging schematic diagram of an RGBCMY sensor
  • Figure 2 is a spectral response curve of RGBCMY
  • Figure 3 is a schematic diagram of using 24 color blocks to determine the CCM matrix
  • Figure 4 is a comparison before and after using CCM matrix for processing
  • Figure 5 is a schematic diagram of an application scenario
  • FIG. 6 is a schematic flowchart of an image processing method provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of an original image provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of Qbin processing of an original image provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of a Dbin processing process provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of Dbin processing of an original image provided by an embodiment of the present application.
  • Figure 11 is a schematic flow chart of another image processing method provided by an embodiment of the present application.
  • Figure 12 is a schematic diagram of front-end processing provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram of the effect of the fusion process provided by the embodiment of the present application.
  • Figure 14 is a schematic flow chart of another image processing method provided by an embodiment of the present application.
  • Figure 15 is a schematic diagram of back-end processing provided by an embodiment of the present application.
  • Figure 16 is a schematic diagram of the effect of the image processing method provided by the embodiment of the present application.
  • Figure 17 is a schematic diagram of a display interface of an electronic device provided by an embodiment of the present application.
  • Figure 18 is a schematic diagram of a display interface of another electronic device provided by an embodiment of the present application.
  • Figure 19 is a schematic diagram of a hardware system suitable for the electronic device of the present application.
  • Figure 20 is a schematic diagram of a software system suitable for the electronic device of the present application.
  • Figure 21 is a schematic structural diagram of an image processing device provided by an embodiment of the present application.
  • Figure 22 is a schematic structural diagram of a chip system provided by an embodiment of the present application.
  • first and second are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the quantity of indicated technical features. Therefore, features defined as “first” and “second” may explicitly or implicitly include one or more of these features. In the description of this embodiment, unless otherwise specified, “plurality” means two or more.
  • RGB (red, green, blue) color space or RGB domain refers to a color model related to the structure of the human visual system, in which all colors are regarded as different combinations of red, green and blue. Red, green and blue are called the three primary colors. It should be understood that a primary color refers to a "basic color" that cannot be obtained by mixing other colors.
  • YUV color space or YUV domain refers to a color encoding method.
  • Y represents brightness
  • U and V represent chroma.
  • the above-mentioned RGB color space focuses on the human eye's perception of color, while the YUV color space focuses on the visual sensitivity to brightness.
  • the RGB color space and the YUV color space can be converted into each other.
  • Pixel value refers to a set of color components corresponding to each pixel in a color image located in the RGB color space.
  • each pixel corresponds to a set of three primary color components, where the three primary color components are the red component R, the green component G, and the blue component B respectively.
  • Bayer pattern color filter array: when an image of an actual scene is converted into image data, the image sensor usually receives the information of three channel signals, the red channel signal, the green channel signal and the blue channel signal, respectively, and then synthesizes the information of the three channel signals into a color image. However, in this scheme, three filters would be required at each pixel position, which is expensive and difficult to produce. Therefore, the surface of the image sensor can be covered with a color filter array to obtain the information of the three channel signals.
  • Bayer format color filter array refers to filters arranged in a checkerboard format. For example, the minimum repeating unit in the Bayer format color filter array is: one filter to obtain the red channel signal, two filters to obtain the green channel signal, and one filter to obtain the blue channel signal, arranged in a 2×2 pattern.
  • Bayer image, that is, the image output by the image sensor based on the Bayer format color filter array. Pixels of multiple colors in this image are arranged in a Bayer format, and each pixel in the Bayer format image corresponds to the channel signal of only one color. For example, since human vision is more sensitive to green, it can be set that green pixels (pixels corresponding to green channel signals) account for 50% of all pixels, while blue pixels (pixels corresponding to blue channel signals) and red pixels (pixels corresponding to red channel signals) each account for 25% of all pixels. The smallest repeating unit of the Bayer format image is then: one red pixel, two green pixels and one blue pixel arranged in a 2×2 manner. It should be understood that the RAW domain is the RAW color space, and a Bayer format image can be called an image located in the RAW domain.
  • Grayscale image: a single-channel image used to represent different brightness levels, where the brightest is completely white and the darkest is completely black. That is, each pixel in a grayscale image corresponds to a different degree of brightness between black and white. Usually, in order to describe the brightness change from the brightest to the darkest, the range is divided, for example, into 256 parts representing 256 levels of brightness, called 256 gray levels (the 0th gray level to the 255th gray level).
  • Spectral response which can also be called spectral sensitivity.
  • Spectral response represents the ability of the image sensor to convert incident light energy of different wavelengths into electrical energy. If the light energy of light of a certain wavelength incident on the image sensor is expressed as a number of photons, and the current generated by the image sensor and transmitted to the external circuit is expressed as a number of electrons, then the ability of each incident photon to be converted into an electron in the external circuit is called quantum efficiency (QE), expressed as a percentage.
  • the spectral responsivity of the image sensor depends on the quantum efficiency, as well as parameters such as wavelength and integration time.
  • the human eye has the characteristic of color constancy: in most cases, the color of the same object seen under various light sources is consistent; for example, white paper always looks white. In order to eliminate the influence of the light source on the imaging of the image sensor, simulate the color constancy of human vision, and ensure that the white seen in any scene is truly white, it is necessary to correct the color temperature and automatically adjust the white balance to an appropriate position.
  • the filter colors of different color filter arrays constitute the camera color space (RAW domain or RAW color space). Therefore, the camera color space is not a universal color space.
  • a color filter array with a filter color of RGGB forms a camera color space of RAW RGB. If the Bayer format image or RAW image generated by the color filter array is directly displayed, the image will be greenish.
  • CCM: color correction matrix.
  • the CCM matrix is mainly used to convert image data obtained after automatic white balance into a standard color space (sRGB). Since there is a big difference between the spectral response of a CMOS sensor and the spectral response of the human eye to visible light, the color reproduction of the camera differs greatly from the color of the object perceived by an observer. Therefore, it is necessary to improve the color saturation of the object through the CCM matrix, so that the color of the image captured by the camera is closer to the perception of the human eye. The process of correction using the CCM matrix is the process of color correction.
  • CMOS image sensors currently used for visible light imaging are traditional RGB sensors. Due to hardware limitations, this image sensor can only receive red channel signals, green channel signals, and blue channel signals. In this way, the number of spectral response channels of the image sensor is very limited, and a small number of spectral response channels will limit the color restoration capability of the image sensor and affect the color and other information of the restored image.
  • To address this, CMOS sensors with more spectral response channels, also known as multispectral sensors, can be used.
  • noise problems will occur when using multispectral sensors for imaging, and usually as the number of spectral response channels increases, the noise problems that occur during imaging will become more serious.
  • multispectral means that the spectral bands used for imaging include 2 or more bands. According to this definition, since the RGB sensor utilizes the three bands of red, green and blue, the RGB sensor is also, strictly speaking, multispectral. However, it should be noted that the multispectral visible-light CMOS sensors referred to in this application actually mean multispectral sensors that have a larger number of spectral response channels than RGB sensors.
  • the multispectral sensor may be an RGBC sensor, an RGBM sensor, an RGBY sensor, an RGBCM sensor, an RGBCY sensor, an RGBMY sensor, an RGBCMY sensor, etc.
  • the RGBCMY sensor receives a red channel signal, a green channel signal, a blue channel signal, a cyan channel signal, a magenta channel signal and a yellow channel signal. The channel colors received by the other sensors can be deduced by analogy and will not be described again here.
  • the multispectral sensor can also be a sensor that receives signals from other color channels, and can be specifically selected and set according to needs.
  • the embodiments of the present application do not impose any restrictions on this.
  • Figure 1 provides an imaging schematic diagram of an RGBCMY sensor.
  • the color filter array covered on the surface of the RGBCMY sensor can obtain information from six color channel signals.
  • the minimum repeating unit in the Bayer format color filter array is: two filters to obtain the red channel signal, four filters to obtain the green channel signal, two filters to obtain the blue channel signal, two filters to obtain the cyan channel signal, two filters to obtain the magenta channel signal, and four filters to obtain the yellow channel signal, arranged in a 4×4 matrix.
  • the minimum repeating unit of the Bayer format image obtained using the RGBCMY sensor is: two red pixels, four green pixels, two blue pixels, two cyan pixels, two magenta pixels and four yellow pixels, arranged in a 4×4 matrix.
  • Figure 2 provides a schematic diagram of the spectral response curve of RGBCMY.
  • the horizontal axis represents the wavelength, and the vertical axis represents the spectral responsivity corresponding to different spectra.
  • the spectral response curve indicated by R represents the different spectral responsivity of red light at different wavelengths
  • the spectral response curve indicated by G represents the different spectral responsivity of green light at different wavelengths
  • the spectral response curve indicated by B represents the different spectral responsivity of blue light at different wavelengths
  • the spectral response curve indicated by C represents the different spectral responsivity of cyan light at different wavelengths
  • the spectral response curve indicated by M represents the different spectral responsivity of magenta light at different wavelengths.
  • the spectral response curve indicated by Y represents the different spectral responsivity of yellow light at different wavelengths.
  • Take the RGBCMY sensor as an example: compared with the RGB sensor, due to the increased number of primary colors and spectral response channels, the RGBCMY sensor can generally achieve better color restoration capability, that is, higher color accuracy.
  • the Bayer format image acquired by the sensor is usually processed through automatic white balance and CCM matrix to restore the scene color.
  • For the Bayer format image acquired by the RGBCMY sensor, it is likewise possible to process the image through automatic white balance and the CCM matrix to restore the scene color.
  • the CCM matrix used in this process needs to be fitted in advance.
  • the CCM matrix corresponding to the RGBCMY sensor is a 6×3 matrix, which includes more parameter values.
  • when fitting the CCM matrix corresponding to the RGBCMY sensor, an over-fitting phenomenon is usually encountered, which causes some parameter values in the fitted CCM matrix to be too large.
  • As a result, the noise in the Bayer format image obtained by the RGBCMY sensor will be amplified, causing serious color noise problems.
  • color noise refers to colored noise.
  • Figure 3 provides a schematic diagram of determining the CCM matrix using 24 color blocks.
  • a 24-color card image is generated from the image data acquired by the RGBCMY sensor after automatic white balance processing and demosaicing (DM).
  • By fitting, the CCM matrix corresponding to a color temperature of 6500 K can be obtained. This CCM matrix is the coefficient matrix by which the 24 colors shown in (a) in Figure 3 need to be multiplied.
  • each color shown in (a) in Figure 3 corresponds to the six primary color values of R, G, B, C, M and Y, while each color shown in (b) in Figure 3 corresponds to only the three primary color values of R, G and B. Therefore, the fitted CCM matrix is a 6×3 matrix, that is, the fitted CCM matrix includes 18 parameter values. Since an over-fitting phenomenon is usually encountered in the fitting process, some of the 18 parameter values included in the CCM matrix become too large, which leads to the noise of images processed with the fitted CCM matrix being amplified during actual processing.
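  • To make this fitting step concrete, the following is a minimal sketch, assuming the fit is posed as an ordinary least-squares problem over the 24 patches; the matrices S and T, their random placeholder values, and the use of numpy.linalg.lstsq are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

# S: 24x6 white-balanced sensor responses (R, G, B, C, M, Y), one row per patch.
# T: 24x3 target sRGB values of the same patches under the 6500 K illuminant.
# Random placeholders stand in for real calibration measurements.
rng = np.random.default_rng(0)
S = rng.uniform(0.0, 1.0, size=(24, 6))
T = rng.uniform(0.0, 1.0, size=(24, 3))

# Ordinary least squares: find the 6x3 matrix M minimizing ||S @ M - T||^2.
M, residuals, rank, _ = np.linalg.lstsq(S, T, rcond=None)
print(M.shape)  # (6, 3) -> the 18 parameter values mentioned in the text

# Applying the fitted CCM to one white-balanced RGBCMY pixel vector:
pixel_srgb = S[0] @ M
```

  • The over-fitting described above shows up as disproportionately large entries in M; one common general-purpose mitigation (not claimed by the patent) is ridge regularization, i.e. solving (SᵀS + λI)M = SᵀT for a small λ > 0.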
  • Figure 4 provides a before-and-after comparison of processing using the CCM matrix.
  • the image on the left is a Bayer format image acquired by the RGB sensor
  • the image on the right is an image processed using the corresponding 3 ⁇ 3 CCM matrix
  • in the image on the right, compared to the image on the left, although the noise is amplified, it is not very obvious.
  • the image on the left is a Bayer format image obtained by the RGBCMY sensor
  • the image on the right is an image processed using the corresponding 6 ⁇ 3 CCM matrix
  • in the image on the right, the noise is amplified relative to the image on the left; compared with (a) in Figure 4, the noise problem is more serious.
  • embodiments of the present application provide an image processing method that combines and reorganizes the channel signals of the original image to generate two frames of images, with better details and better color information respectively, and then fuses the two frames of images to generate the target image, which can achieve better restoration of the details and colors of the target image.
  • the image processing method provided by the embodiment of the present application can be applied to the field of photography.
  • Figure 5 shows a schematic diagram of an application scenario provided by the embodiment of the present application.
  • the electronic device is a mobile phone, which includes a multispectral sensor other than an RGB sensor.
  • the electronic device in response to the user's operation, can start the camera application and display a graphical user interface (GUI) as shown in Figure 5.
  • the GUI interface can be called a first interface.
  • the first interface includes multiple shooting mode options and first controls.
  • the multiple shooting modes include, for example, photo taking mode, video recording mode, etc.
  • the first control is, for example, the shooting key 11 , and the shooting key 11 is used to indicate that the current shooting mode is one of the multiple shooting modes.
  • When the user starts the camera application and wants to take pictures of outdoor grass and trees at night, the user clicks the shooting key 11 on the first interface; after detecting the user's click on the shooting key 11, the electronic device, in response to the click operation, runs the program corresponding to the image processing method provided by the embodiment of the present application to obtain an image.
  • the multispectral sensor included in this electronic device is not an RGB sensor; it is, for example, an RGBCMY sensor.
  • the spectral response range of this electronic device has been expanded relative to the existing technology; that is to say, the color restoration capability has been improved.
  • since the CCM matrix may have an over-fitting problem, after processing with the CCM matrix the noise of the image may be amplified.
  • when the electronic device processes images using the image processing method provided by the embodiment of the present application, it can ensure color restoration while reducing noise, thereby improving the quality of the captured image or video.
  • FIG. 6 shows a schematic flowchart of an image processing method provided by an embodiment of the present application. As shown in Figure 6, the embodiment of the present application provides an image processing method 1.
  • the image processing method 1 includes the following S11 to S16.
  • the first control is, for example, the shooting key 11 shown in FIG. 5
  • the first operation is, for example, a click operation.
  • the first operation can also be other operations, and the embodiment of the present application does not impose any limitation on this.
  • the original image includes channel signals of at least four colors.
  • the original image is a Bayer format image, or is located in the RAW domain.
  • Figure 7 shows a schematic diagram of an original image.
  • As shown in (a) in Figure 7, the original image may include channel signals of 4 colors (for example, t1, t2, t3 and t4); or, as shown in (b) in Figure 7, the original image may include channel signals of 5 colors (for example, t1, t2, t3, t4 and t5); or, as shown in (c) in Figure 7, the original image may include channel signals of 6 colors (for example, t1, t2, t3, t4, t5 and t6).
  • the original image may also include channel signals of more colors, and the embodiments of the present application do not impose any limitation on this.
  • the arrangement of channel signals included in the original image can be set and modified as needed.
  • the arrangement shown in FIG. 7 is only an example, and the embodiment of the present application does not impose any restrictions on this.
  • 1 frame, 2 frames, or more than 2 frames of original images may be acquired. Specifically, it can be obtained as needed, and the embodiments of this application do not impose any restrictions on this.
  • the multi-frame original image can be collected using a multispectral sensor included in the electronic device itself or obtained from other devices.
  • the specific settings can be set as needed, and the embodiments of the present application do not impose any restrictions on this.
  • When the electronic device uses its own multispectral sensor to acquire multiple frames of original images, the multiple frames of original images can be output simultaneously or serially. This can be selected and set as needed, and the embodiments of the present application do not impose any restrictions on this.
  • When multiple frames of original images are output from the multispectral sensor, they can be output simultaneously or serially; but no matter how they are output, the multiple frames of original images are actually images generated by shooting the same scene to be shot.
  • the scene to be shot refers to all objects in the camera's shooting perspective.
  • the scene to be shot can also be called the target scene, or it can also be understood as the scene that the user expects to shoot.
  • the preprocessing is used to merge and recombine multiple color channel signals included in the original image.
  • the preprocessing can include four-in-one pixel binning (quarter binning, Qbin) processing and diagonal pixel binning (diagonal binning, Dbin) processing.
  • other methods can also be used for merging and reorganization, and specific settings and changes can be made as needed.
  • the embodiments of the present application do not impose any restrictions on this.
  • Qbin processing refers to a processing method in which four adjacent pixel values are weighted and averaged and output as a single pixel value.
  • Dbin processing refers to a processing method in which two pixel values in the diagonal direction are weighted and averaged and output as a single pixel value.
  • the weights assigned during weighting can be set and modified as needed, and the embodiments of the present application do not impose any restrictions on this. For example, the weights can all be set to 1.
  • the above S14 may include:
  • FIG. 8 shows a schematic diagram of Qbin processing of original images.
  • the original image includes channel signals of 6 colors.
  • the channel signals of the 6 colors are respectively the red channel signal (R), green channel signal (G), blue channel signal (B), cyan channel signal (C), magenta channel signal (M) and yellow channel signal (Y); these 6 colors are arranged in a 4×4 arrangement and repeat with the minimum repeating unit shown in Figure 1.
  • the pixel values corresponding to four adjacent pixel channel signals, the channel signal B in the 1st row and 3rd column, the channel signal C in the 1st row and 4th column, the channel signal C in the 2nd row and 3rd column, and the channel signal B in the 2nd row and 4th column, can be weighted and averaged to obtain a single pixel value, for example, T2.
  • T2 is then the pixel value of the pixel in the first row and second column of the first initial image.
  • the first initial image can be determined based on the original image.
  • Each pixel in the first initial image corresponds to a pixel value. Therefore, the first initial image can still be considered as an image in the RAW domain.
  • the size of each side of the first initial image is half that of the original image, and the area of the entire image is one quarter of that of the original image.
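  • As an illustration of the Qbin step just described, here is a minimal NumPy sketch that averages each non-overlapping 2×2 block into one pixel; equal weights are assumed, per the note above that the weights can all be set to 1, and all names are illustrative.

```python
import numpy as np

def qbin(raw: np.ndarray, weights=None) -> np.ndarray:
    """Four-in-one pixel binning: each non-overlapping 2x2 block of the RAW
    image is reduced to a single pixel by a weighted average."""
    h, w = raw.shape
    blocks = raw[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    if weights is None:
        return blocks.mean(axis=(1, 3))          # equal weights
    wts = np.asarray(weights, dtype=float).reshape(2, 2)
    return (blocks * wts[None, :, None, :]).sum(axis=(1, 3)) / wts.sum()

raw = np.arange(16.0).reshape(4, 4)   # stand-in for one 4x4 repeating unit
first_initial = qbin(raw)             # 2x2: each side halved, area quartered
```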
  • Figure 9 shows a schematic diagram of Dbin processing
  • Figure 10 shows a schematic diagram of Dbin processing of an original image.
  • the weight is assumed to be 1, and so on for the others.
  • the pixel value of the pixel in row 1 and column 1 of the second initial image is the average of the pixel values of the pixel in row 1, column 1 and the pixel in row 2, column 2 of the original image; for example, it is the pixel value corresponding to the green channel signal.
  • the pixel value of the pixel in row 1, column 1 of the third initial image is the average of the pixel values of the pixel in row 1, column 2 and the pixel in row 2, column 1 of the original image; for example, it is the pixel value corresponding to the yellow channel signal, and so on for the others.
  • the second initial image and the third initial image are also images in the RAW domain.
  • the pixel values in the second initial image are obtained by the weighted average of the two pixels in the upper left and lower right corners of each group of four adjacent pixels in the original image, while the pixel values in the third initial image are obtained by the weighted average of the two pixels in the lower left and upper right corners of the four adjacent pixels, so the pixel values of the second initial image and the third initial image are different.
  • since one pixel value of the second initial image or the third initial image is obtained by a weighted average of two pixel values in the diagonal direction among four adjacent pixels of the original image, the size of each side of the second initial image and the third initial image is half that of the original image, and the area of the entire image is one quarter of that of the original image.
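  • Similarly, a minimal sketch of the Dbin step, with all weights set to 1 as assumed above: within each non-overlapping 2×2 block, the top-left/bottom-right pair is averaged for the second initial image and the top-right/bottom-left pair for the third initial image (names are illustrative).

```python
import numpy as np

def dbin(raw: np.ndarray):
    """Diagonal binning: average each 2x2 block's two diagonal pixel pairs,
    yielding the second and third initial images."""
    h, w = raw.shape
    b = raw[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    second = (b[:, 0, :, 0] + b[:, 1, :, 1]) / 2.0  # upper-left + lower-right
    third = (b[:, 0, :, 1] + b[:, 1, :, 0]) / 2.0   # upper-right + lower-left
    return second, third

raw = np.arange(16.0).reshape(4, 4)
second_initial, third_initial = dbin(raw)  # both 2x2, area one quarter
```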
  • Qbin processing and Dbin processing can be performed in the same image signal processor (image signal processor, ISP), or separately in two image signal processors, or they can also be performed in the multispectral sensor; the specific settings can be made as needed, and the embodiments of the present application do not impose any restrictions on this.
  • Front-end processing only means that this step is before fusion, so it is called “front-end” processing and has no other meaning.
  • Front-end processing may also be called first processing, etc., and the embodiments of this application do not impose any restrictions on this.
  • the front-end processing is used to process the colors of the second initial image and the third initial image, so that the front-end processed image obtained after processing can provide better colors for subsequent processing.
  • Since the channel signals included in the second initial image and the third initial image have different colors, the color information of the front-end processed image obtained from the second initial image and the third initial image is better preserved; that is to say, the front-end processed image can provide better color reproduction capability for subsequent processing.
  • the front-end processing provided by the embodiment of the present application can be performed in the same ISP as the above-mentioned Qbin processing and/or Dbin processing, or it can be performed separately in a different ISP.
  • the processing can also be carried out in a multispectral sensor.
  • the specific settings can be set according to needs.
  • the embodiments of the present application do not impose any restrictions on this.
  • the front-end processing may at least include: down sampling (down scale sample), noise reduction (denoise), automatic white balance and/or color correction, and up sampling (up sample).
  • downsampling is used to split and recombine the channel signals included in the image to reduce the size of the image.
  • Noise reduction is used to reduce noise in images. Common methods include mean filtering, Gaussian filtering, bilateral filtering, etc. Of course, other methods can also be used for noise reduction, and the embodiments of the present application do not impose any limitations on this.
  • Automatic white balance is used to correct the down-sampled and noise-reduced image to the D65 reference light source so that its white color appears truly white.
  • Color correction is used to calibrate the accuracy of colors other than white.
  • it is equivalent to using the CCM matrix to correct multiple color channel signals into three color channel signals, for example, a red channel signal, a green channel signal, and a blue channel signal respectively.
  • the CCM matrix used when performing color correction can be a previously fitted CCM matrix.
  • the CCM matrix under the D65 reference light source can be determined by interpolating the CCM matrices corresponding to other color temperatures.
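  • The interpolation rule is not spelled out here; the following is a minimal sketch assuming linear interpolation in reciprocal color temperature (mired) between two pre-fitted matrices. Both the mired convention and all names are assumptions, not details from the patent.

```python
import numpy as np

def interp_ccm(cct, cct_lo, ccm_lo, cct_hi, ccm_hi):
    """Blend two pre-fitted CCMs for a target correlated color temperature,
    linearly in reciprocal color temperature (1e6 / CCT, i.e. mired)."""
    m, m_lo, m_hi = 1e6 / cct, 1e6 / cct_lo, 1e6 / cct_hi
    t = np.clip((m - m_lo) / (m_hi - m_lo), 0.0, 1.0)
    return (1.0 - t) * ccm_lo + t * ccm_hi

# e.g. matrices calibrated at 5000 K and 7500 K, queried at D65 (~6504 K);
# the two placeholder matrices below are arbitrary stand-ins.
ccm_d65 = interp_ccm(6504.0, 5000.0, np.eye(6, 3), 7500.0, 0.1 * np.ones((6, 3)))
```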
  • Upsampling is used to enlarge the image size. It should be understood that due to the previous downsampling, the image size is reduced, so accordingly, upsampling needs to be performed to enlarge the image size and restore the image size to facilitate subsequent fusion.
  • FIG 12 shows a schematic diagram of front-end processing.
  • front-end processing includes: downsampling, noise reduction, automatic white balance, color correction, and upsampling in the processing order.
  • the three color channels included in the second initial image can be split, and then the red channel signals are recombined together to generate a frame of a monochrome channel image including only the red channel signal; the green and blue channel signals are treated in the same way.
  • the three color channels included in the third initial image can be split, and then the yellow channel signals are recombined together to generate a frame of a monochrome channel image including only the yellow channel signal, the cyan channel signals are recombined together to generate a frame of a monochrome channel image including only the cyan channel signal, and the magenta channel signals are recombined together to generate a frame of a monochrome channel image including only the magenta channel signal.
  • the size of each side of each frame of monochrome channel image is one half of that of the original second initial image or third initial image; in other words, the overall area of each frame of monochrome channel image is one quarter of that of the original second initial image or third initial image.
  • one frame of a monochromatic channel image including only red channel signals, one frame of monochromatic channel images including only green channel signals, and one frame of monochromatic channel images including only blue channel signals can be obtained.
  • multiple frames of monochromatic channel images are all of the same size.
  • the three frames of color channel images can then be upsampled.
  • the included red channel signal, green channel signal and blue channel signal are spliced and reorganized to determine a front-end processed image including three color channel signals.
  • front-end processed images can include better color information.
  • the front-end processed image is a Bayer format image, that is to say, the front-end processed image is an image located in the RAW domain.
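  • The split-and-recombine behavior of the downsampling and upsampling steps described above can be sketched as follows, assuming a 2×2 repeating unit in which each position holds one channel; the actual channel-to-position mapping follows the sensor's CFA layout, so this is illustrative only.

```python
import numpy as np

def split_channels(mosaic: np.ndarray) -> dict:
    """Split a 2x2-patterned mosaic into four quarter-size single-channel
    planes, one per position in the repeating unit (the "downsampling")."""
    return {(i, j): mosaic[i::2, j::2] for i in range(2) for j in range(2)}

def merge_channels(planes: dict) -> np.ndarray:
    """Re-splice the (possibly processed) planes into a full-size mosaic
    (the "upsampling" recombination)."""
    h, w = planes[(0, 0)].shape
    out = np.empty((2 * h, 2 * w), dtype=planes[(0, 0)].dtype)
    for (i, j), plane in planes.items():
        out[i::2, j::2] = plane
    return out

mosaic = np.arange(16.0).reshape(4, 4)
planes = split_channels(mosaic)            # four 2x2 planes
assert np.array_equal(merge_channels(planes), mosaic)
```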
  • front-end processing can also include: at least one of dynamic dead pixel compensation (defect pixel correction, DPC), lens shading correction (lens shading correction, LSC) and wide dynamic range adjustment (wide range compression, WDR) .
  • dynamic bad pixel compensation is used to address defects in the array formed by the light collection points on the multispectral sensor, or errors in the process of converting the light signal; bad pixels are usually eliminated by taking the average value of other surrounding pixels in the brightness domain.
  • lens shading correction is used to eliminate the problem of inconsistency between the color and brightness around the image and the center of the image due to the lens optical system.
  • Wide dynamic range adjustment refers to the following: when high-brightness areas illuminated by strong light sources (sunlight, lamps, reflections, etc.) and relatively low-brightness areas such as shadows and backlit regions coexist in the image, the bright areas become white due to overexposure while the dark areas become black due to underexposure, seriously affecting image quality. Therefore, the brightest and darker areas can be adjusted in the same scene, for example making dark areas brighter and bright areas darker in the image, so that the processed image can show more details in both dark and bright areas. One simple illustration is sketched below.
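  • As one hedged illustration of such an adjustment: a concave tone curve applied to normalized luminance lifts dark areas while compressing bright areas. The specific curve below is an assumption for illustration, not the patent's method.

```python
import numpy as np

def wdr_tone(y: np.ndarray, strength: float = 0.6) -> np.ndarray:
    """Blend the identity with a concave curve that maps 0 -> 0 and 1 -> 1,
    brightening shadows and compressing highlights on [0, 1] luminance."""
    y = np.clip(y, 0.0, 1.0)
    concave = 1.25 * y / (y + 0.25)
    return (1.0 - strength) * y + strength * concave

print(wdr_tone(np.array([0.05, 0.5, 0.95])))  # shadows rise the most
```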
  • the front-end processing may include one or more of the above-mentioned processing steps.
  • the front-end processing includes multiple processing steps, the order of the multiple processing steps may be adjusted as needed.
  • This embodiment of the present application does not impose any limitation on this.
  • the front-end processing may also include other steps, which may be added as needed. The embodiments of the present application do not impose any restrictions on this.
  • Since the first initial image is directly reconstructed from the channel signals of the original image without any other processing, the first initial image carries more texture details. Therefore, in order to ensure the richness of details, the first initial image can be used in the fusion processing, so that while the scene color is restored, the restored image carries more details.
  • The front-end processed image is obtained from the second initial image and the third initial image after a series of color processing; some details are missing, but good color information is retained. Therefore, in order to ensure color richness, the front-end processed image can be used in the fusion processing, so that when the scene color is restored, the restored image has good color.
  • the resolution of the first initial image is relatively high and can be called a high resolution (HR) image.
  • the resolution of the front-end processed image is relatively low and can be called a low resolution (LR) image.
  • When the first initial image and the front-end processed image are fused, the pixel values corresponding to the same position can be added or multiplied according to different weights, or a network model can be used for fusion; of course, fusion processing can also be performed using other methods, which can be specifically selected and set according to needs. The embodiments of this application do not impose any restrictions on this.
  • f is the pixel value of the target image
  • g is the pixel value of the first initial image
  • d is the pixel value of the front-end processed image
  • f_x and f_y are the gradient values of the target image in the x direction and the y direction
  • g_x and g_y are the gradient values of the first initial image in the x direction and the y direction
  • w_d, w_x and w_y are weight values
  • E(f) is the energy function of f.
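  • The energy function formula itself is not reproduced in this text; a plausible reconstruction consistent with the variables listed above, summing over all pixel positions p, is (this should be read as a hedged reconstruction, not the patent's verbatim formula):

```latex
E(f) = \sum_{p} \Big[ w_d \big( f(p) - d(p) \big)^2
     + w_x \big( f_x(p) - g_x(p) \big)^2
     + w_y \big( f_y(p) - g_y(p) \big)^2 \Big]
```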
  • the high-resolution first initial image is equivalent to a constraint map for the gradient values of the target image: since the first initial image has a higher degree of detail, the closer the gradient values of the target image are to the gradient values of the first initial image, the better. Here, the gradient value reflects the rate of change of the image data.
  • the low-resolution front-end processed image is equivalent to the constraint map of the pixel values of the target image. Since the color richness of the front-end processed image is higher, the closer the pixel value of the target image is to the pixel value of the front-end processed image, the better.
  • In this way, the target image is kept close to both the gradient values of the first initial image and the pixel values of the front-end processed image; therefore, the restored target image carries better detailed texture information and color information.
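  • An energy of this quadratic form can be minimized in closed form as a linear system; for brevity, the sketch below uses plain gradient descent instead. The differencing scheme, weights, step size and all names are illustrative assumptions.

```python
import numpy as np

def fuse(g: np.ndarray, d: np.ndarray, wd=1.0, wx=1.0, wy=1.0,
         iters=500, lr=0.05) -> np.ndarray:
    """Gradient descent on the assumed energy E(f): the data term pulls f
    toward the front-end processed image d, while the gradient terms pull
    the forward differences of f toward those of the first initial image g."""
    def dx(a):   # forward difference in x; last column is zero
        return np.diff(a, axis=1, append=a[:, -1:])
    def dy(a):   # forward difference in y; last row is zero
        return np.diff(a, axis=0, append=a[-1:, :])
    def dxT(r):  # adjoint of dx (a negative divergence)
        out = -r.copy(); out[:, 1:] += r[:, :-1]; return out
    def dyT(r):  # adjoint of dy
        out = -r.copy(); out[1:, :] += r[:-1, :]; return out

    gx, gy = dx(g), dy(g)
    f = d.astype(float).copy()
    for _ in range(iters):
        grad = 2.0 * (wd * (f - d)
                      + wx * dxT(dx(f) - gx)
                      + wy * dyT(dy(f) - gy))
        f -= lr * grad
    return f
```

  • In practice a conjugate-gradient or direct sparse solve of the same quadratic would converge far faster; plain gradient descent is used here only to keep the sketch short.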
  • FIG. 13 is a schematic diagram of the effect of the fusion process provided by the embodiment of the present application.
  • Since this fusion method combines the detail information of the first initial image and the color information of the front-end processed image, the resulting target image has better color than the first initial image, and richer texture information and higher resolution than the front-end processed image.
  • The target image may be displayed on the interface of the electronic device as a captured image, or may only be stored.
  • the specific selection can be made as needed, and the embodiments of the present application do not impose any restrictions on this.
  • The image processing method provided by the embodiment of the present application obtains an original image including channel signals of at least 4 colors, then merges and reorganizes the channel signals in the original image to generate a first initial image, a second initial image and a third initial image, and then performs front-end processing on the second initial image and the third initial image to obtain a front-end processed image. Since the first initial image has only undergone merging and reorganization, its richness of detail is higher, while the color accuracy of the front-end processed image is higher. Based on this, the first initial image and the front-end processed image are used to generate the target image, so that better restoration of image details and colors can be achieved.
  • Figure 14 provides a schematic flow chart of another image processing method. As shown in Figure 14, the above method may also include S17.
  • back-end processing only means that this step is located after the fusion, so it is called “back-end” processing and has no other meaning.
  • Back-end processing may also be called second processing, etc., and the embodiments of this application do not impose any restrictions on this.
  • backend processing can include: demosaicing.
  • demosaicing is used to complement the single-channel signal at each pixel into a multi-channel signal, i.e., a color image in the RGB domain is reconstructed from the image in the RAW domain.
  • For example, before demosaicing, a certain pixel in the image corresponds to only one color channel signal, such as only the red channel signal; after demosaicing, the pixel corresponds to three color channel signals, namely the red, green and blue channel signals; that is, the green and blue channel signals are supplemented. The supplementation of pixels of other colors is analogous and will not be described again here.
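  • As a hedged illustration of the idea (the patent does not mandate a specific demosaic algorithm), a minimal bilinear demosaic for an RGGB Bayer mosaic fills each pixel's missing channels with the average of the neighboring samples of that channel:

```python
import numpy as np

def _box_sum3(a: np.ndarray) -> np.ndarray:
    # Sum over each pixel's 3x3 neighborhood (zero padding at the borders).
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def demosaic_bilinear(raw: np.ndarray) -> np.ndarray:
    """Fill the two missing channels at every pixel of an RGGB mosaic by
    averaging the available neighboring samples of each channel."""
    h, w = raw.shape
    out = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3))
    masks[0::2, 0::2, 0] = 1           # R sample positions
    masks[0::2, 1::2, 1] = 1           # G samples on even rows
    masks[1::2, 0::2, 1] = 1           # G samples on odd rows
    masks[1::2, 1::2, 2] = 1           # B sample positions
    for c in range(3):
        samples = raw * masks[..., c]
        avg = _box_sum3(samples) / np.maximum(_box_sum3(masks[..., c]), 1.0)
        out[..., c] = np.where(masks[..., c] > 0, raw, avg)
    return out

rgb = demosaic_bilinear(np.arange(16.0).reshape(4, 4))  # shape (4, 4, 3)
```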
  • the back-end processing may also include at least one of gamma correction, style transformation (3-dimensional look-up table, 3DLUT), and RGB domain to YUV domain conversion.
  • gamma correction is used to adjust the brightness, contrast, dynamic range, etc. of the image by adjusting the gamma curve
  • style transformation indicates the style transformation of the color, that is, using a color filter to change the original image style into other image styles.
  • Common styles include movie style, Japanese style, spooky style, etc.
  • RGB domain to YUV domain conversion refers to converting an image in the RGB domain into an image in the YUV domain.
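  • The last two steps can be sketched together: a simple power-law gamma followed by an RGB-to-YUV matrix. The BT.601 coefficients below are a standard choice, but the patent does not name a specific matrix, so treat that choice as an assumption.

```python
import numpy as np

def gamma_correct(rgb: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Power-law gamma on normalized [0, 1] RGB data."""
    return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """RGB -> YUV with the analog BT.601 coefficients (an assumed choice)."""
    m = np.array([[ 0.299,    0.587,    0.114  ],
                  [-0.14713, -0.28886,  0.436  ],
                  [ 0.615,   -0.51499, -0.10001]])
    return rgb @ m.T

yuv = rgb_to_yuv(gamma_correct(np.array([[0.25, 0.50, 0.75]])))
```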
  • the back-end processing may include one or more of the above-mentioned processing steps.
  • the back-end processing includes multiple processing steps, the order of the multiple processing steps may be adjusted as needed.
  • the embodiments of the present application do not impose any limitation on this.
  • the back-end processing may also include other steps, which may be added as needed. The embodiments of this application do not impose any restrictions on this.
  • back-end processing can be processed in the same image signal processor as the pre-processing, front-end processing and/or fusion processing, or the back-end processing can also be processed separately in other image signal processors, as needed. Settings are made, and the embodiments of this application do not impose any restrictions on this.
  • Figure 15 shows a schematic diagram of back-end processing.
  • the back-end processing includes, in processing order: demosaicing, gamma correction, style transformation, and RGB domain to YUV domain conversion.
  • the target image is converted from the RAW domain to the YUV domain, which can reduce the amount of subsequent transmission data and save bandwidth.
  • The resulting color image is in the YUV domain.
  • the color image may be displayed on the interface of the electronic device 100 as a captured image, or may only be stored; this may be set as needed, and the embodiment of the present application does not impose any limitation on this.
  • a target image containing better detail information and better color information is generated based on the fusion of the first initial image and the front-end processed image, and then back-end processing is performed on the fused target image so that its color and details are further adjusted, to achieve better restoration of image details and colors.
  • FIG. 16 is a schematic diagram of the effect of the image processing method provided by the embodiment of the present application.
  • the image processing method provided by the embodiment of the present application is introduced in detail above. The following describes how the user activates the image processing method provided by the embodiment of the present application in conjunction with the display interface of the electronic device.
  • FIG. 17 is a schematic diagram of a display interface of an electronic device provided by an embodiment of the present application.
  • the electronic device 100 displays a shooting interface as shown in (a) of FIG. 17 .
  • the user can perform a sliding operation on the interface so that the shooting key 11 indicates the shooting option "more".
  • the electronic device 100 displays a shooting interface as shown in (b) of Figure 17 , on which multiple shooting mode options are displayed, such as: professional mode, panorama mode, HDR mode, time-lapse photography mode, watermark mode, detailed color restoration mode, etc.
  • shooting mode options are only examples, and can be set and modified as needed. The embodiments of the present application do not impose any restrictions on this.
  • the electronic device 100 can enable the program related to the image processing method provided by the embodiment of the present application during shooting.
  • FIG. 18 is a schematic diagram of a display interface of another electronic device provided by an embodiment of the present application.
  • the electronic device 100 displays a shooting interface as shown in (a) of Figure 18, with a "Settings" button displayed in the upper right corner of the shooting interface. The user can click the "Settings" button on this interface to enter the settings interface and set related functions.
  • the electronic device 100 displays a setting interface as shown in (b) of FIG. 18 .
  • Multiple functions are displayed on this interface.
  • the photo aspect ratio option is used to set the aspect ratio used when taking photos.
  • voice-activated photography is used to set whether to trigger by sound in the photo mode
  • video resolution is used to adjust the video resolution
  • video frame rate is used to adjust the video frame rate.
  • the electronic device 100 can activate the program related to the image processing method provided by the embodiment of the present application when shooting.
  • the above are only two examples in which the user enables the image processing method provided by the embodiment of the present application from the display interface of the electronic device.
  • the image processing method provided by the embodiment of the present application can also be enabled in other ways, or it can be used directly by default during shooting; the embodiment of the present application does not impose any restrictions on this.
  • FIG 19 shows a hardware system suitable for the electronic device of the present application.
  • the electronic device 100 may be used to implement the image processing method described in the above method embodiment.
  • the electronic device 100 may be a mobile phone, a smart screen, a tablet, a wearable electronic device, a vehicle-mounted electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, etc.; the embodiment of the present application does not place any restrictions on the specific type of the electronic device 100.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure shown in FIG. 19 does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than those shown in FIG. 19, or the electronic device 100 may include a combination of some of the components shown in FIG. 19, or the electronic device 100 may include sub-components of some of the components shown in FIG. 19.
  • the components shown in Figure 19 may be implemented in hardware, software, or a combination of software and hardware.
  • Processor 110 may include one or more processing units.
  • the processor 110 may include at least one of the following processing units: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU).
  • different processing units can be independent devices or integrated devices.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory, which may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
  • the processor 110 may display a first interface, the first interface including a first control; detect a first operation on the first control; in response to the first operation, obtain an original image, the original image including channel signals of at least 4 colors; preprocess the original image to obtain a first initial image, a second initial image, and a third initial image; then perform front-end processing on the second initial image and the third initial image to obtain a front-end processed image; and fuse the first initial image and the front-end processed image to obtain a target image.
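  • To make the flow above concrete, the following is a minimal sketch of the processing pipeline in Python; the function names, the channel merge/reorganization rules, and the fusion rule are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def preprocess(raw: np.ndarray):
    """Hypothetical split of a >=4-color raw image into three initial images.

    The channel merge/reorganization rules below are placeholders; the method
    described here does not fix them to this exact form.
    """
    first = raw.sum(axis=-1)       # detail-rich single-plane image
    second = raw[..., :3]          # channels carrying one set of color info
    third = raw[..., 1:4]          # channels carrying another set of color info
    return first, second, third

def front_end_process(second: np.ndarray, third: np.ndarray) -> np.ndarray:
    # Placeholder front-end step: average the two color-oriented images.
    return (second + third) / 2.0

def fuse(first: np.ndarray, front: np.ndarray) -> np.ndarray:
    # Placeholder fusion: modulate the color image by the normalized detail plane.
    detail = first / (first.max() + 1e-6)
    return front * detail[..., None]

raw = np.random.rand(8, 8, 4)      # stand-in for an original image with 4 color channels
first, second, third = preprocess(raw)
target = fuse(first, front_end_process(second, third))
print(target.shape)                # (8, 8, 3)
```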
  • the connection relationship between the modules shown in FIG. 19 is only a schematic illustration and does not constitute a limitation on the connection relationship between the modules of the electronic device 100.
  • each module of the electronic device 100 may also adopt a combination of various connection methods in the above embodiments.
  • the wireless communication function of the electronic device 100 can be implemented through antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, modem processor, baseband processor and other components.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single communication band or multiple communication bands. Different antennas can also be multiplexed to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna for a wireless LAN. In other embodiments, the antennas may be used in combination with tuning switches.
  • the electronic device 100 may implement display functions through a GPU, a display screen 194, and an application processor.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display 194 may be used to display images or videos.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened and light is transmitted through the lens to the camera's photosensitive sensor, which converts the optical signal into an electrical signal; the sensor then passes the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye.
  • the ISP can algorithmically optimize the noise, brightness, and color of the image; the ISP can also optimize parameters such as the exposure and color temperature of the shooting scene.
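  • As a rough, hedged illustration of that kind of optimization, the sketch below applies a crude denoise and a brightness (gamma) adjustment to a single-plane image; the 3x3 box filter and the gamma value are arbitrary assumptions, not parameters of the ISP described here.

```python
import numpy as np

def simple_isp_adjust(img: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Toy stand-ins for ISP noise and brightness optimization (illustrative only)."""
    # Crude denoise: 3x3 box filter built from a padded neighborhood average.
    padded = np.pad(img, 1, mode="edge")
    denoised = sum(
        padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    # Brightness adjustment: gamma correction on values normalized to [0, 1].
    return np.clip(denoised, 0.0, 1.0) ** (1.0 / gamma)

print(simple_isp_adjust(np.random.rand(4, 4)).shape)  # (4, 4)
```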
  • the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • light from the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • the DSP converts digital image signals into image signals in standard formats such as red green blue (RGB) and YUV.
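  • For example, a common RGB-to-YUV conversion uses the BT.601 matrix below; the coefficients are a standard illustration, and the DSP described here is not stated to use these exact values.

```python
import numpy as np

# BT.601 full-range RGB -> YUV conversion matrix (a standard, illustrative choice).
RGB_TO_YUV = np.array([
    [ 0.299,     0.587,     0.114   ],   # Y  (luma)
    [-0.168736, -0.331264,  0.5     ],   # U  (blue-difference chroma)
    [ 0.5,      -0.418688, -0.081312],   # V  (red-difference chroma)
])

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert an (..., 3) RGB image with values in [0, 1] to YUV (U, V centered on 0)."""
    return rgb @ RGB_TO_YUV.T

print(rgb_to_yuv(np.array([1.0, 1.0, 1.0])))  # white -> [1.0, 0.0, 0.0]
```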
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
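  • As a hedged illustration of that kind of operation, the snippet below computes the per-frequency energy of a sampled signal with a discrete Fourier transform; the test signal and sampling rate are made up for the example.

```python
import numpy as np

fs = 1000                                 # assumed sampling rate in Hz
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t)       # made-up 50 Hz test tone

spectrum = np.fft.rfft(signal)
energy = np.abs(spectrum) ** 2            # energy at each frequency bin
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
print(freqs[np.argmax(energy)])           # ~50.0 Hz
```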
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
  • the hardware system of the electronic device 100 is described in detail above, and the software system of the electronic device 100 is introduced below.
  • Figure 20 is a schematic diagram of a software system of an electronic device provided by an embodiment of the present application.
  • the system architecture may include an application layer 210, an application framework layer 220, a hardware abstraction layer 230, a driver layer 240 and a hardware layer 250.
  • the application layer 210 may include a camera application or other applications.
  • Other applications include, but are not limited to: camera, gallery, and other applications.
  • the application framework layer 220 can provide an application programming interface (API) and programming framework to applications in the application layer; the application framework layer can include some predefined functions.
  • the application framework layer 220 may include a camera access interface; the camera access interface may include camera management and camera devices, where camera management may be used to provide an access interface for managing cameras, and the camera device may be used to provide an interface for accessing cameras.
  • Hardware abstraction layer 230 is used to abstract hardware.
  • the hardware abstraction layer can include the camera abstraction layer and other hardware device abstraction layers; the camera hardware abstraction layer can call the camera algorithm in the camera algorithm library.
  • the hardware abstraction layer 230 includes a camera hardware abstraction layer 2301 and a camera algorithm library.
  • the camera algorithm library may include software algorithms; for example, Algorithm 1, Algorithm 2, etc. may be software algorithms for image processing.
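  • As one hedged illustration of how such a library could be organized (the registry pattern and all names here are invented, not the actual vendor implementation), algorithms can be registered and looked up by name:

```python
from typing import Callable, Dict
import numpy as np

# Hypothetical camera algorithm library: a registry of named image algorithms.
CAMERA_ALGORITHMS: Dict[str, Callable[[np.ndarray], np.ndarray]] = {}

def register(name: str):
    def wrap(fn: Callable[[np.ndarray], np.ndarray]):
        CAMERA_ALGORITHMS[name] = fn
        return fn
    return wrap

@register("algorithm_1")                  # stand-in for "Algorithm 1" above
def normalize(img: np.ndarray) -> np.ndarray:
    return img / max(float(img.max()), 1e-6)

# A camera hardware abstraction layer could then call algorithms by name.
out = CAMERA_ALGORITHMS["algorithm_1"](np.random.rand(4, 4))
print(out.max())                          # 1.0
```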
  • the driver layer 240 is used to provide drivers for different hardware devices.
  • the driver layer may include camera device drivers, digital signal processor drivers, and graphics processor drivers.
  • the hardware layer 250 may include multiple image sensors, multiple image signal processors, digital signal processors, graphics processors, and other hardware devices.
  • the hardware layer 250 includes a sensor and an image signal processor; the sensor may include sensor 1, sensor 2, depth sensor (time of flight, TOF), multispectral sensor, etc.
  • the image signal processor may include image signal processor 1, image signal processor 2, etc.
  • through the hardware abstraction layer 230, the connection between the application layer 210 and the application framework layer 220 above it and the driver layer 240 and the hardware layer 250 below it can be realized.
  • in the camera hardware interface layer of the hardware abstraction layer 230, manufacturers can customize functions according to their needs. Compared with the hardware abstraction layer interface, the camera hardware interface layer is more efficient, more flexible, and lower-latency, and can also make richer calls to the ISP and GPU to implement image processing.
  • the image input to the hardware abstraction layer 230 may come from an image sensor or a stored picture.
  • the scheduling layer in the hardware abstraction layer 230 includes general functional interfaces for implementing management and control.
  • the camera service layer in the hardware abstraction layer 230 is used to access interfaces of ISP and other hardware.
  • the following exemplifies the workflow of the software and hardware of the electronic device 100 in conjunction with a photo-capturing scene.
  • the camera application in the application layer may be displayed on the screen of the electronic device 100 in the form of an icon.
  • the electronic device 100 starts to run the camera application.
  • the camera application calls the interface corresponding to the camera application in the application framework layer 220, and then starts the camera driver by calling the hardware abstraction layer 230, turning on the camera containing the multispectral sensor on the electronic device 100.
  • raw images are collected through the multispectral sensor, which can collect according to a certain operating frequency; the collected images are processed inside the multispectral sensor or transmitted to one or more image signal processors, and the processed target image or color image is then saved and/or transmitted to the display screen for display.
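  • The call chain just described can be sketched as follows; the class and method names are invented for illustration and do not correspond to real Android or vendor APIs.

```python
class CameraDriver:                       # driver layer (hypothetical)
    def power_on_sensor(self) -> None:
        print("multispectral sensor powered on")

class CameraHAL:                          # hardware abstraction layer (hypothetical)
    def __init__(self, driver: CameraDriver):
        self.driver = driver
    def open_camera(self) -> None:
        self.driver.power_on_sensor()

class CameraAccessInterface:              # application framework layer (hypothetical)
    def __init__(self, hal: CameraHAL):
        self.hal = hal
    def start_preview(self) -> None:
        self.hal.open_camera()

# Application layer: tapping the camera icon triggers the chain top-down.
CameraAccessInterface(CameraHAL(CameraDriver())).start_preview()
```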
  • FIG. 21 is a schematic diagram of an image processing device 300 provided by an embodiment of the present application.
  • the image processing device 300 includes a display unit 310, an acquisition unit 320, and a processing unit 330.
  • the display unit 310 is used to display a first interface, and the first interface includes a first control.
  • the acquisition unit 320 is used to detect the first operation on the first control.
  • the processing unit 330 is configured to acquire an original image in response to the first operation, where the original image includes channel signals of at least 4 colors.
  • the processing unit 330 is also used to preprocess the original image to obtain a first initial image, a second initial image, and a third initial image; perform front-end processing on the second initial image and the third initial image to obtain a front-end processed image; and fuse the first initial image and the front-end processed image to obtain the target image.
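  • A minimal sketch of such a unit-based device follows; the three classes mirror the description above, but their method bodies are placeholder assumptions.

```python
import numpy as np

class DisplayUnit:                        # corresponds to display unit 310
    def show_first_interface(self) -> None:
        print("first interface with first control displayed")

class AcquisitionUnit:                    # corresponds to acquisition unit 320
    def detect_first_operation(self) -> bool:
        return True                       # stand-in for a detected tap

class ProcessingUnit:                     # corresponds to processing unit 330
    def run(self, raw: np.ndarray) -> np.ndarray:
        # Placeholder for preprocess -> front-end -> fusion (see sketch above).
        return raw.mean(axis=-1)

display, acquire, process = DisplayUnit(), AcquisitionUnit(), ProcessingUnit()
display.show_first_interface()
if acquire.detect_first_operation():
    print(process.run(np.random.rand(8, 8, 4)).shape)  # (8, 8)
```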
  • the image processing device 300 is embodied in the form of functional units.
  • the term "unit" here can be implemented in the form of software and/or hardware, and is not specifically limited.
  • a "unit” may be a software program, a hardware circuit, or a combination of both that implements the above functions.
  • the hardware circuit may include an application specific integrated circuit (ASIC), an electronic circuit, a processor for executing one or more software or firmware programs (such as a shared processor, a dedicated processor, or a group processor) and memory, merged logic circuitry, and/or other suitable components to support the described functionality.
  • the units of each example described in the embodiments of the present application can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each specific application, but such implementations should not be considered beyond the scope of this application.
  • Embodiments of the present application also provide a computer-readable storage medium in which computer instructions are stored; when the computer instructions run on the image processing device 300, the image processing device 300 is caused to execute the image processing method shown above.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave).
  • the computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that integrates one or more available media.
  • the available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media, or semiconductor media (e.g., solid state disk (SSD)), etc.
  • Embodiments of the present application also provide a computer program product containing computer instructions, which when run on the image processing device 300 enables the image processing device 300 to execute the image processing method shown above.
  • Figure 22 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • the chip shown in Figure 22 can be a general-purpose processor or a special-purpose processor.
  • the chip includes a processor 401.
  • the processor 401 is used to support the image processing device 300 in executing the technical solutions shown above.
  • the chip also includes a transceiver 402, which is used to accept the control of the processor 401 and to support the image processing device 300 in executing the technical solution shown above.
  • the chip shown in Figure 22 may also include: a storage medium 403.
  • the chip shown in Figure 22 can be implemented using the following circuits or devices: one or more field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gate logic, discrete hardware components, any other suitable circuits, or any combination of circuits capable of performing the various functions described throughout this application.
  • the electronic equipment, image processing device 300, computer storage media, computer program products, and chips provided by the embodiments of the present application are all used to execute the methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects corresponding to the methods provided above, which will not be repeated here.
  • "preset" and "predefined" can be realized by pre-saving, in a device (for example, including an electronic device), corresponding codes, tables, or other means that can be used to indicate relevant information; this application does not limit their specific implementation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Image Processing (AREA)

Abstract

The present application relates to the field of image processing, and provides an image processing method and a related device. The image processing method comprises: displaying a first interface, the first interface comprising a first control; detecting a first operation on the first control; in response to the first operation, obtaining an original image; preprocessing the original image to obtain a first initial image, a second initial image, and a third initial image; performing front-end processing on the second initial image and the third initial image to obtain a front-end processed image; and performing fusion processing on the first initial image and the front-end processed image to obtain a target image. In this way, better restoration of image details and colors can be achieved.
PCT/CN2023/087568 2022-05-31 2023-04-11 Image processing method and related device WO2023231583A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210606523.5A CN114693580B (zh) 2022-05-31 2022-05-31 Image processing method and related equipment
CN202210606523.5 2022-05-31

Publications (1)

Publication Number Publication Date
WO2023231583A1 (fr)

Family

ID=82131399

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/087568 2022-05-31 2023-04-11 Image processing method and related device WO2023231583A1 (fr)

Country Status (2)

Country Link
CN (1) CN114693580B (fr)
WO (1) WO2023231583A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114693580B (zh) * 2022-05-31 2022-10-18 Honor Device Co., Ltd. Image processing method and related equipment
CN115908221B (zh) * 2023-03-08 2023-12-08 Honor Device Co., Ltd. Image processing method, electronic device, and storage medium
CN116630204B (zh) * 2023-07-19 2023-09-26 Nanjing Jiage Gengyun Technology Co., Ltd. Online analysis and processing system for remote sensing images
CN117459836B (zh) * 2023-12-05 2024-05-10 Honor Device Co., Ltd. Image processing method, device, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110876027A (zh) * 2018-08-29 2020-03-10 Samsung Electronics Co., Ltd. Image sensor, electronic device including the same, and image scaling processing method
CN111131798A (zh) * 2019-10-18 2020-05-08 Huawei Technologies Co., Ltd. Image processing method, image processing apparatus, and camera apparatus
US20200304732A1 (en) * 2019-03-20 2020-09-24 Apple Inc. Multispectral image decorrelation method and system
CN112261391A (zh) * 2020-10-26 2021-01-22 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Image processing method, camera assembly, and mobile terminal
CN113411506A (zh) * 2020-03-16 2021-09-17 Sony Semiconductor Solutions Corporation Imaging element and electronic device
CN113676675A (zh) * 2021-08-16 2021-11-19 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Image generation method and apparatus, electronic device, and computer-readable storage medium
CN114331916A (zh) * 2022-03-07 2022-04-12 Honor Device Co., Ltd. Image processing method and electronic device
CN114693580A (zh) * 2022-05-31 2022-07-01 Honor Device Co., Ltd. Image processing method and related equipment

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101834974B (zh) * 2009-03-09 2013-12-18 Boly Media Communications (Shenzhen) Co., Ltd. Multispectral photosensitive device and sampling method thereof
US8638342B2 (en) * 2009-10-20 2014-01-28 Apple Inc. System and method for demosaicing image data using weighted gradients
WO2015084991A1 (fr) * 2013-12-04 2015-06-11 Rambus Inc. Image sensor having a high dynamic range
CN111988587B (zh) * 2017-02-10 2023-02-07 Hangzhou Hikvision Digital Technology Co., Ltd. Image fusion device and image fusion method
CN112767290B (zh) * 2019-11-01 2022-11-11 Realme Chongqing Mobile Telecommunications Co., Ltd. Image fusion method, image fusion apparatus, storage medium, and terminal device
CN110944160B (zh) * 2019-11-06 2022-11-04 Vivo Mobile Communication Co., Ltd. Image processing method and electronic device
CN113141475B (zh) * 2020-01-17 2024-02-02 SmartSens Technology (Shanghai) Co., Ltd. Imaging system and pixel binning method
CN111405204B (zh) * 2020-03-11 2022-07-26 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Image acquisition method, imaging apparatus, electronic device, and readable storage medium
CN113676708B (зh) * 2021-07-01 2023-11-14 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Image generation method and apparatus, electronic device, and computer-readable storage medium
CN113676635B (zh) * 2021-08-16 2023-05-05 Guangdong OPPO Mobile Telecommunications Corp., Ltd. High dynamic range image generation method and apparatus, electronic device, and storage medium
CN114125242A (zh) * 2021-12-01 2022-03-01 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Image sensor, camera module, electronic device, and image generation method and apparatus
CN114549383A (zh) * 2022-02-23 2022-05-27 Zhejiang Dahua Technology Co., Ltd. Deep learning-based image enhancement method, apparatus, device, and medium

Also Published As

Publication number Publication date
CN114693580A (zh) 2022-07-01
CN114693580B (zh) 2022-10-18

Similar Documents

Publication Publication Date Title
WO2023231583A1 (fr) Image processing method and related device
WO2023016039A1 (fr) Video processing method and apparatus, electronic device, and storage medium
WO2023036034A1 (fr) Image processing method and corresponding device
CN113850367B (zh) Network model training method, image processing method, and related equipment
WO2023016035A1 (fr) Video processing method and apparatus, electronic device, and storage medium
WO2023016037A1 (fr) Video processing method and apparatus, electronic device, and storage medium
WO2023124123A1 (fr) Image processing method and corresponding device
CN113824914A (zh) Video processing method and apparatus, electronic device, and storage medium
EP4195679A1 (fr) Image processing method and electronic device
US20090324127A1 (en) Method and System for Automatic Red-Eye Correction
WO2023016040A1 (fr) Video processing method and apparatus, electronic device, and storage medium
EP4175275A1 (fr) White balance processing method and electronic device
WO2023016044A1 (fr) Video processing method and apparatus, electronic device, and storage medium
CN109218604A (zh) Image capture device, image brightness modulation method, and image processing device
CN115550575B (zh) Image processing method and related equipment
CN103167183B (zh) Semi-transparent viewfinder frame processing method, system, and mobile terminal
WO2023016042A1 (fr) Video processing method and apparatus, electronic device, and storage medium
WO2023016041A1 (fr) Video processing method and apparatus, electronic device, and storage medium
US20230017498A1 (en) Flexible region of interest color processing for cameras
WO2023016043A1 (fr) Video processing method and apparatus, electronic device, and storage medium
CN117135293B (zh) Image processing method and electronic device
CN115955611B (zh) Image processing method and electronic device
CN114640834B (zh) Image processing method and related apparatus
CN116051368B (zh) Image processing method and related equipment
WO2023124165A1 (fr) Image processing method and related electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23814779

Country of ref document: EP

Kind code of ref document: A1