WO2022127174A1 - Image processing method and electronic device - Google Patents


Info

Publication number
WO2022127174A1
WO2022127174A1 · PCT/CN2021/114109 · CN2021114109W
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
weight
image
value
original
Prior art date
Application number
PCT/CN2021/114109
Other languages
English (en)
Chinese (zh)
Inventor
杨烨
鹿镇
Original Assignee
北京达佳互联信息技术有限公司
Priority date
Filing date
Publication date
Application filed by 北京达佳互联信息技术有限公司 filed Critical 北京达佳互联信息技术有限公司
Publication of WO2022127174A1

Links

Images

Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration › G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T5/00 Image enhancement or restoration › G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/00 Image analysis › G06T7/90 Determination of colour characteristics
    • G06T2207/00 Indexing scheme for image analysis or image enhancement › G06T2207/10 Image acquisition modality › G06T2207/10004 Still image; Photographic image
    • G06T2207/00 Indexing scheme for image analysis or image enhancement › G06T2207/20 Special algorithmic details › G06T2207/20212 Image combination › G06T2207/20221 Image fusion; Image merging

Definitions

  • the present disclosure relates to the field of computer technology, and in particular, to an image processing method and an electronic device.
  • an image processing method comprising:
  • weighted fusion is performed on the pixel value of the original pixel point and the pixel value of the first pixel point to obtain the pixel value of the second pixel point;
  • a second image is generated.
  • an image processing apparatus comprising:
  • an image acquisition module configured to acquire the original image
  • a brightness enhancement module configured to enhance the brightness of the original image to obtain a first image
  • a weight determination module configured to determine a first weight of a first pixel in the first image and an original weight of the original pixel based on the pixel value of the original pixel in the original image, wherein the sum of the original weight and the first weight is 1, and the first weight is negatively correlated with the pixel value of the original pixel;
  • a first fusion module configured to perform weighted fusion on the pixel value of the original pixel and the pixel value of the first pixel based on the original weight and the first weight to obtain the pixel value of the second pixel;
  • the image generation module is configured to generate a second image based on the pixel values of the second pixel points.
  • an electronic device comprising:
  • a memory for storing processor-executable instructions;
  • a processor configured to execute the instructions to implement the following steps:
  • weighted fusion is performed on the pixel value of the original pixel point and the pixel value of the first pixel point to obtain the pixel value of the second pixel point;
  • a second image is generated.
  • a computer-readable storage medium wherein instructions in the computer-readable storage medium are executed by a processor of an electronic device, so that the electronic device can perform the following steps:
  • weighted fusion is performed on the pixel value of the original pixel point and the pixel value of the first pixel point to obtain the pixel value of the second pixel point;
  • a second image is generated.
  • a computer program product comprising a computer program, the computer program being executed by a processor of an electronic device to implement the following steps:
  • weighted fusion is performed on the pixel value of the original pixel point and the pixel value of the first pixel point to obtain the pixel value of the second pixel point;
  • a second image is generated.
  • The technical solution of the present disclosure obtains the first image by brightening the original image and then performs weighted fusion of the original image and the first image. Since the first weight of the first image is negatively correlated with the pixel value of the original pixel, the dark areas of the original image are brightened in a targeted manner while overexposure in the highlights is prevented, improving image quality and making the final fill-light effect meet the needs of human visual perception.
  • Fig. 1 is a flowchart of an image processing method according to an exemplary embodiment.
  • Fig. 2 is a flow chart of a texture enhancement method according to an exemplary embodiment.
  • Fig. 3 is a flow chart of a color enhancement method according to an exemplary embodiment.
  • Fig. 4 is a flow chart of a method for illumination compensation according to an exemplary embodiment.
  • FIG. 5 is a comparison diagram of an original image and a second image according to an exemplary embodiment.
  • Fig. 6 is a structural block diagram of an image processing apparatus according to an exemplary embodiment.
  • Fig. 7 is a block diagram of an electronic device according to an exemplary embodiment.
  • Fig. 8 is a block diagram of an electronic device according to an exemplary embodiment.
  • Fig. 1 is a flowchart of an image processing method according to an exemplary embodiment. As shown in Fig. 1 , the method includes the following steps.
  • step S11 the original image is acquired.
  • The execution subject of the embodiments of the present disclosure is an electronic device such as a terminal.
  • the original image is an image stored by the terminal or a preview image captured by the terminal, or the like.
  • step S12 the brightness of the original image is enhanced to obtain a first image.
  • Step S12 is to enhance the overall brightness of the original image, that is, to enhance the brightness of each pixel in the original image to obtain a brightened first image. For example, by performing Gamma curve enhancement on the overall brightness of the original image, a brightened first image is obtained.
  • the Gamma curve is a special tone curve, and the brightness of the image can be adjusted by adjusting the Gamma value.
  • a Gamma value equal to 1 means that the input and output brightness are the same; a Gamma value higher than 1 will cause the output to be darkened, and a Gamma value below 1 will cause the output to be brighter. In practical applications, the Gamma value can be adjusted according to requirements.
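The gamma adjustment described above can be sketched as follows. The exact curve used by the disclosure is not reproduced in the text, so the exponent value below is illustrative only; the sketch follows the stated convention that a gamma above 1 darkens and a gamma below 1 brightens.

```python
def gamma_brighten(y, gamma=0.6):
    """Gamma-curve brightening for a normalized luminance value y in [0, 1].

    With output = y ** gamma: gamma == 1 leaves brightness unchanged,
    gamma > 1 darkens the output, and gamma < 1 (as here) brightens it.
    """
    return max(0.0, min(1.0, y)) ** gamma
```

Applying this to the Y channel of every pixel of the original image would yield the brightened first image.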
  • step S13 the first weight of the first pixel in the first image and the original weight of the original pixel are determined based on the pixel value of the original pixel in the original image, wherein the sum of the original weight and the first weight is 1, and the first weight is negatively correlated with the pixel value of the original pixel.
  • the original pixel is any pixel in the original image
  • the first pixel is the pixel in the first image corresponding to the position of the original pixel; that is, the position of the first pixel in the first image is the same as the position of the original pixel in the original image.
  • the pixel value of the pixel point is the brightness value or gray value of the pixel point.
  • step S13 includes: determining the first weight of the first pixel based on the first adjustment parameter and the pixel value of the original pixel, wherein the first adjustment parameter is used to adjust the intensity of the brightness enhancement of the original image, and the first adjustment parameter is positively correlated with the first weight.
  • the first adjustment parameter affects the brightness enhancement effect of the image by affecting the size of the first weight, so that the brightness enhancement effect can be flexibly adjusted by flexibly adjusting the first adjustment parameter.
  • In the formula for the first weight, x represents the Y-channel value (that is, the luminance value) of the original pixel in YUV space, while β1 and δ are parameters set flexibly according to actual needs; they determine the steepness (slope) of the curve.
  • The first adjustment parameter β1 is used to adjust the fusion strength between the original image and the first image. It may be input by the user, input by a machine, or generated automatically; for example, the user can adjust β1 on the terminal interface. Since the first weight depends on the first adjustment parameter, adjusting β1 adjusts the brightness enhancement effect, so the effect can be tuned flexibly as needed.
  • Weighted fusion of the original image and the first image based on the original weight and the first weight brightens the dark areas of the original image in a targeted manner while effectively suppressing overexposure in the highlights.
  • The first weight Alpha 1 of a first pixel is related only to the pixel value of the corresponding original pixel and is independent of the pixel values of other pixels. Therefore, the first weight of each first pixel can be computed independently, without computing whole-image information, which improves computational efficiency.
  • step S14 based on the original weight and the first weight, the pixel value of the original pixel point and the pixel value of the first pixel point are weighted and fused to obtain the pixel value of the second pixel point.
  • The pixel value S0 of the original pixel and the pixel value S1 of the first pixel can then be weighted and fused.
  • The process of weighted fusion includes: determining the first product of the pixel value of the original pixel and the original weight, and the second product of the pixel value of the first pixel and the first weight; the sum of the first product and the second product is determined as the pixel value of the second pixel.
  • The lower the brightness of the original pixel (the smaller x is), the larger the first weight Alpha 1 of the first pixel, and the closer the fused pixel value S2 of the second pixel is to the pixel value S1 of the first pixel in the first image; conversely, the higher the brightness of the original pixel (the larger x is), the smaller Alpha 1, and the closer S2 is to the pixel value S0 of the original pixel in the original image.
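The weight and fusion steps above can be sketched in code. The disclosure does not reproduce the exact weight formula, so the exponential-decay form below, and the roles given to `beta1` and `delta`, are assumptions chosen only to satisfy the stated properties (negative correlation with luminance, weights summing to 1).

```python
import math

def first_weight(x, beta1=1.0, delta=0.3):
    # Assumed form: Alpha 1 decays as the original luminance x (in [0, 1])
    # rises, giving the negative correlation required by step S13; beta1
    # scales the brightening strength and delta sets the curve steepness.
    return max(0.0, min(1.0, beta1 * math.exp(-x / delta)))

def fuse_first(s0, s1, alpha1):
    # Step S14: S2 = original_weight * S0 + first_weight * S1,
    # where original_weight = 1 - Alpha 1 so the two weights sum to 1.
    return (1.0 - alpha1) * s0 + alpha1 * s1
```

Dark pixels get a large Alpha 1 and land close to the brightened value S1; bright pixels keep a value close to their original S0, which is exactly the highlight-protection behaviour described above.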
  • FIG. 5 shows a comparison diagram of the original image and the second image. It can be seen from the comparison diagram that, compared with the original image, the dark areas of the second image are highlighted, and the highlights are not overexposed.
  • step S15 a second image is generated based on the pixel values of the second pixel points.
  • a second pixel corresponds to an original pixel, and the second pixel and the original pixel have the same position in the image.
  • After all original pixels are processed, the pixel value of the second pixel corresponding to each original pixel is obtained, and the pixel values of the second pixels are arranged according to their positions to form the second image.
  • the second image is output as the final image.
  • at least one of texture enhancement, color enhancement, and dehazing is also performed on the second image to obtain a final image. For example, performing texture enhancement on the second image to obtain a final image; or performing color enhancement on the second image to obtain a final image; or performing dehazing on the second image to obtain a final image; or performing texture on the second image enhancement and color enhancement to obtain a final image; or, performing texture enhancement and dehazing on the second image to obtain a final image; or, performing color enhancement and dehazing on the second image to obtain a final image; or, performing Texture enhancement, color enhancement, and dehazing to get the final image.
  • the order of performing various processing on the second image can be set flexibly.
  • the second image is further processed by texture enhancement, color enhancement, and dehazing in sequence to obtain a final image.
  • A first image is obtained by brightening the original image, and then weighted fusion is performed on the original image and the first image. Since the first weight of the first image is negatively correlated with the pixel value of the original pixel, the dark parts of the original image are brightened in a targeted manner while overexposure of the highlights is prevented, improving image quality so that the final fill-light effect meets the needs of human visual perception.
  • The image processing method provided by the embodiments of the present disclosure processes each pixel independently, without computing whole-image information, and the pixels can be processed in parallel. This improves image processing efficiency, so the original image can be processed and previewed in real time.
  • texture enhancement is also performed on the second image, that is, after step S15, the following steps are further included:
  • Step S16 Based on the pixel values of the second pixels in the second image, texture enhancement is performed on the second image to obtain a third image.
  • After brightening, the original texture details of the original image may be suppressed, making the texture appear blurred to the naked eye; texture enhancement improves the clarity of the image and thereby its visual effect.
  • step S16 includes: when the pixel value of the second pixel is less than or equal to a first threshold, reducing the pixel value of the second pixel to obtain the pixel value of the third pixel in the third image; when the pixel value of the second pixel is greater than a second threshold, increasing the pixel value of the second pixel to obtain the pixel value of the third pixel. That is, for dark parts where brightness enhancement was strong, the brightness is reduced slightly, and for bright parts where brightness enhancement was weak, the brightness is increased slightly. This makes the texture of the dark and bright parts of the image more distinct, highlights their texture details, gives the image texture a clearer structure and richer details, and improves the clarity of the image.
  • the first threshold is less than or equal to 0.5.
  • the second threshold is greater than or equal to 0.5.
  • the first threshold and the second threshold can be flexibly set as required.
  • the first threshold and the second threshold are the same, and both are 0.5. In some embodiments, the first threshold is different from the second threshold; e.g., the first threshold is a value less than 0.5 and the second threshold is 0.5.
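The thresholding rule of step S16 can be sketched as follows. Pixel values are assumed normalized to [0, 1]; the amount `step` by which dark pixels are darkened and bright pixels brightened is not specified in the text and is an illustrative assumption.

```python
def texture_enhance(s2, t1=0.5, t2=0.5, step=0.08):
    # Step S16 sketch: t1/t2 are the first/second thresholds from the text;
    # `step`, the adjustment magnitude, is an assumption.
    if s2 <= t1:
        return max(0.0, s2 - step)   # dark part: reduce brightness slightly
    if s2 > t2:
        return min(1.0, s2 + step)   # bright part: increase brightness slightly
    return s2                        # only reachable when t1 < t2
```

With both thresholds at 0.5 every pixel is either darkened or brightened, which is what pulls the dark and bright parts of the texture further apart.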
  • the above technical solution can enhance the texture of the second image, and make the texture details of the dark and bright parts of the image stand out, so as to ensure that the structure level of the image texture is more distinct and the details are more abundant.
  • After the third image is obtained in step S16, weighted fusion is performed on the third image and the second image to prevent the texture enhancement from offsetting the brightening effect, improving the layering of the image's texture details while maintaining its brightening effect. That is, referring to FIG. 2, the following steps are included after step S16:
  • Step S21 based on the pixel value of the original pixel, determine the second weight of the second pixel and the third weight of the third pixel in the third image, wherein the sum of the second weight and the third weight is 1, and the second weight varies with the pixel value of the original pixel according to a downward-convex function.
  • the third pixel is a pixel corresponding to the position of the second pixel in the third image, and the third pixel and the second pixel have the same position in the image.
  • the functional relationship in which the second weight changes with the pixel value of the original pixel point can be set according to actual needs, for example, the functional relationship is a downward convex quadratic functional relationship or an exponential functional relationship.
  • the above step of determining the second weight of the second pixel point based on the pixel value of the original pixel point includes: determining the second weight of the second pixel point based on the second adjustment parameter and the pixel value of the original pixel point , wherein the second adjustment parameter is used to adjust the intensity of texture enhancement on the second image, and the second adjustment parameter is positively correlated with the second weight.
  • the second adjustment parameter affects the texture enhancement effect of the image by affecting the size of the second weight, so that flexible adjustment of the texture enhancement effect can be achieved by flexibly adjusting the second adjustment parameter.
  • In the formula for the second weight, x represents the Y-channel value (that is, the luminance value) of the original pixel in YUV space, and β2 is a parameter set flexibly according to actual requirements; it determines the steepness (slope) of the curve.
  • The second adjustment parameter β2 is used to adjust the fusion strength between the second image and the third image. It may be input by the user, input by a machine, or generated automatically; for example, the user can adjust β2 on the terminal interface. Since the second weight depends on the second adjustment parameter, adjusting β2 adjusts the texture enhancement effect, so the effect can be tuned flexibly as needed.
  • the relationship between the second weight and the pixel value of the original pixel point is a downward convex function relationship.
  • When the brightness value of the original pixel is extreme (too bright or too dark), the second weight Alpha 2 of the second pixel is larger. When the second image and the third image are then weighted and fused based on the second weight and the third weight, the effect of texture enhancement on the brightening of the dark parts and the suppression of the highlights is weaker, so the dark parts stay brightened and the highlights are not overexposed, while the layering of the image's texture details is still improved.
  • The second weight of a second pixel is related only to the pixel value of the corresponding original pixel and is independent of the pixel values of other pixels, so the second weight of each second pixel can be computed independently, without whole-image information, which improves computational efficiency.
  • Step S22 based on the second weight and the third weight, weighted fusion is performed on the pixel value of the second pixel point and the pixel value of the third pixel point to obtain the pixel value of the fourth pixel point.
  • In the fusion formula S4 = Alpha 2 × S2 + (1 − Alpha 2) × S3, S4 is the pixel value of the fourth pixel, S2 and S3 are the pixel values of the second and third pixels, Alpha 2 is the second weight of the second pixel, and 1 − Alpha 2 is the third weight of the third pixel.
  • When the brightness of the original pixel is extreme, the second weight Alpha 2 of the second pixel is larger, so the fused pixel value S4 of the fourth pixel is closer to the pixel value S2 of the second pixel. In this way, the effect of texture enhancement on the brightening of dark parts and the suppression of highlights is reduced, while the layering of the image's texture details is still improved.
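The second fusion can be sketched like the first. The disclosure only requires the second weight to be a downward-convex function of luminance, so the quadratic form below, and the parameter name `beta2`, are assumptions.

```python
def second_weight(x, beta2=1.0):
    # Assumed downward-convex form: Alpha 2 is largest when the original
    # luminance x (in [0, 1]) is extreme (near 0 or 1) and smallest at
    # mid-tones; beta2 stands in for the second adjustment parameter.
    return max(0.0, min(1.0, beta2 * (2.0 * x - 1.0) ** 2))

def fuse_second(s2, s3, alpha2):
    # Step S22: S4 = Alpha 2 * S2 + (1 - Alpha 2) * S3, so extreme-luminance
    # pixels keep more of the pre-texture-enhancement value S2.
    return alpha2 * s2 + (1.0 - alpha2) * s3
```

At extreme luminances Alpha 2 approaches 1 and S4 stays close to S2, protecting the brightening result; at mid-tones Alpha 2 is small and the texture-enhanced S3 dominates.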
  • Step S23 generating a fourth image based on the pixel value of the fourth pixel point.
  • a fourth pixel corresponds to a second pixel, and the fourth pixel and the second pixel have the same position in the image.
  • After all pixels are processed, the pixel value of each fourth pixel is obtained, and the pixel values of the fourth pixels are arranged according to their positions to form the fourth image.
  • By performing weighted fusion of the third image and the second image, the above technical solution prevents the texture enhancement of step S16 from offsetting the brightening effect of the second image, so the brightening effect is largely preserved while the layering of the image's texture details is improved.
  • color enhancement is also performed on the fourth image, that is, after step S23, the following steps are further included:
  • Step S31 acquiring the red channel value, the blue channel value and the green channel value of the fourth pixel in the fourth image.
  • the fourth pixel is any pixel in the fourth image.
  • the above step S31 includes: acquiring RGB color information of the fourth pixel in the fourth image in the RGB space, where the RGB color information includes a red channel value, a blue channel value and a green channel value.
  • Step S32 based on the color lookup table, correct the red channel value, blue channel value and green channel value of the fourth pixel to obtain the red channel value, blue channel value and green channel value of the fifth pixel in the fifth image.
  • the RGB color information of the fourth pixel is corrected to obtain the RGB color information of the fifth pixel.
  • the color lookup table includes relationship data between first color information and second color information, where the first color information is the color information that needs to be corrected, the second color information is the corrected color information, and the relationship data represents the correspondence between the first color information and the second color information.
  • Correcting the RGB color information of the fourth pixel based on the color lookup table to obtain the RGB color information of the fifth pixel includes: querying, from the color lookup table, the relationship data corresponding to the RGB color information of the fourth pixel, and obtaining the RGB color information of the fifth pixel from the relationship data.
  • The RGB color information of the fourth pixel is corrected according to the formula S5' = LUT(S4') to obtain the RGB color information of the fifth pixel, where S4' represents the RGB color information of the fourth pixel, S5' represents the RGB color information of the fifth pixel, and LUT represents querying, from the color lookup table, the RGB color information of the fifth pixel corresponding to the RGB color information of the fourth pixel.
  • the color look-up table can be designed by users in different styles according to actual needs.
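The lookup step can be sketched as follows. Production color LUTs are often 3D tables over (R, G, B) with interpolation; a 1-D per-channel table is used here only to keep the sketch short, and the "warm" table is an illustrative style, not one from the disclosure.

```python
def apply_lut(rgb, lut):
    # Step S32 sketch: look up each 8-bit channel value in a per-channel
    # correction table (the relationship data of the color lookup table).
    r, g, b = rgb
    return (lut["r"][r], lut["g"][g], lut["b"][b])

# An illustrative "warm" lookup table: push red up and blue down.
identity = list(range(256))
warm_lut = {
    "r": [min(255, v + 10) for v in identity],
    "g": identity,
    "b": [max(0, v - 10) for v in identity],
}
```

Because each output pixel depends only on the corresponding input pixel, this step also parallelizes per pixel, like the fusion steps.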
  • Step S33 generating a fifth image based on the red channel value, the blue channel value and the green channel value of the fifth pixel.
  • a fifth pixel corresponds to a fourth pixel, and the fifth pixel and the fourth pixel have the same position in the image.
  • Based on the red channel value, blue channel value and green channel value of each fifth pixel, the pixel value of each fifth pixel is determined, and the pixel values of the fifth pixels are arranged according to their positions to form the fifth image.
  • the above technical solution further improves the image quality and improves the visual effect of the image by enhancing the color of the texture-enhanced image.
  • In some embodiments, weighted fusion is also performed on the fifth image and the fourth image to balance the color enhancement effect against the brightening effect, enhancing the richness of the image's colors while maintaining its brightening effect. That is, referring to FIG. 3, the following steps are included after step S33:
  • step S34 a third adjustment parameter is obtained, and the third adjustment parameter is used to adjust the intensity of color enhancement of the fourth image.
  • Step S35 based on the third adjustment parameter, determine the fourth weight of the fourth pixel point and the fifth weight of the fifth pixel point, wherein the sum of the fourth weight and the fifth weight is 1.
  • the fifth pixel is a pixel corresponding to the position of the fourth pixel in the fifth image, and the fifth pixel and the fourth pixel have the same position in the image.
  • Step S36 based on the fourth weight and the fifth weight, weighted fusion is performed on the pixel value of the fourth pixel point and the pixel value of the fifth pixel point to obtain the pixel value of the sixth pixel point.
  • The process of weighted fusion includes: determining the fifth product of the pixel value of the fourth pixel and the fourth weight, and the sixth product of the pixel value of the fifth pixel and the fifth weight; the sum of the fifth product and the sixth product is determined as the pixel value of the sixth pixel.
  • the third adjustment parameter Alpha 3 is input by the user, input by the machine or automatically generated, for example, the size of the third adjustment parameter can be adjusted by the user on the terminal interface. Since the fifth weight of the fifth pixel point and the fourth weight of the fourth pixel point are both determined by Alpha 3 , the intensity of color enhancement can be adjusted according to needs, which improves the flexibility of adjustment.
  • The pixel value of a sixth pixel is related only to the pixel values of the fourth pixel and fifth pixel at the corresponding position, and is independent of the pixel values of other pixels, so the pixel value of each sixth pixel can be computed independently, without computing whole-image information, which improves computational efficiency and enables real-time processing of the image.
  • Step S37 generating a sixth image based on the pixel value of the sixth pixel.
  • a sixth pixel corresponds to a fourth pixel, and the sixth pixel and the fourth pixel have the same position in the image.
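Steps S35–S36 can be sketched as one line. The text says only that the fourth and fifth weights sum to 1 and are determined by Alpha 3; assigning the fifth weight to Alpha 3 itself, so that a larger Alpha 3 means stronger color enhancement, is an assumption.

```python
def fuse_color(s4, s5, alpha3):
    # Steps S35-S36 sketch: fifth weight = Alpha 3 (assumed),
    # fourth weight = 1 - Alpha 3, so the weights sum to 1.
    # S6 = fourth_weight * S4 + fifth_weight * S5.
    return (1.0 - alpha3) * s4 + alpha3 * s5
```

With Alpha 3 = 0 the color enhancement is fully suppressed and the fourth image passes through; with Alpha 3 = 1 the color-enhanced fifth image is used as-is.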
  • the above embodiments describe the process of performing color enhancement on the fourth image after the fourth image is generated.
  • In other embodiments, color enhancement may also be performed on the generated second image or third image.
  • The process of performing color enhancement on any generated image is the same as steps S31 to S37 above and is not repeated here.
  • the sixth image is further dehazed to improve the clarity of the image, that is, the following steps are included after step S37:
  • Step S41 acquiring high-frequency information of the sixth pixel in the sixth image.
  • the sixth pixel is any pixel in the sixth image.
  • the high-frequency information indicates the sharpness of the pixel value change between the sixth pixel and its surrounding pixels.
  • High-frequency information is also called the high-frequency component. The larger the high-frequency component, the more sharply the pixel value of the sixth pixel changes relative to its surrounding pixels, and such pixels represent the edge details of the sixth image; the smaller the high-frequency component, the smoother the change of the pixel value of the sixth pixel relative to its surrounding pixels.
  • Step S42 determining the target blur radius of the sixth pixel based on the high frequency information of the sixth pixel.
  • The process of dehazing the sixth image is similar to the multi-scale retinex with color restoration (MSRCR) algorithm.
  • the target blur radius is determined according to the size of the high frequency component, and the high frequency component has a negative correlation with the target blur radius.
  • The larger the high-frequency component, the smaller the target blur radius, so that a weaker dehazing intensity is applied; this achieves dehazing while avoiding loss of the image's edge details. Conversely, the smaller the high-frequency component, the larger the target blur radius selected.
  • Step S43 based on the corresponding relationship between the blur radius and the ambient light illumination component, determine the target ambient light illumination component corresponding to the target blur radius of the sixth pixel point.
  • In some embodiments, different blur radii such as 3×3, 5×5, 7×7, 9×9 and 11×11 are used in advance to Gaussian-blur the sixth image, obtaining the corresponding ambient light illumination components L3×3, L5×5, L7×7, L9×9 and L11×11 and establishing the correspondence between blur radius and ambient light illumination component.
  • When the target ambient light illumination component corresponding to the target blur radius needs to be determined, it is looked up from the established correspondence.
  • Gaussian blur is performed on the sixth image based on the target blur radius to obtain the target ambient light illumination component.
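The radius selection of step S42 and the retinex-style compensation of step S44 can be sketched as follows. The equal-band mapping from high-frequency measure to radius is an assumption (the text only requires a negative correlation), and the single-scale log-difference stands in for the full MSRCR computation.

```python
import math

def target_blur_radius(high_freq, radii=(3, 5, 7, 9, 11)):
    # Step S42 sketch: map a normalized high-frequency measure in [0, 1]
    # to one of the precomputed radii, larger high-frequency component
    # giving a smaller radius. The equal-band split is an assumption.
    idx = min(int(high_freq * len(radii)), len(radii) - 1)
    return radii[len(radii) - 1 - idx]

def retinex_pixel(s6, ambient, eps=1e-6):
    # Step S44 sketch in the retinex spirit (MSRCR-like): subtract the log
    # of the ambient light illumination component from the log of the
    # sixth pixel's value to remove the haze-like illumination bias.
    return math.log(s6 + eps) - math.log(ambient + eps)
```

A strong-edge pixel (high-frequency measure near 1) gets the smallest radius and hence the gentlest blur, preserving edge detail, exactly as described above.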
  • Step S44 determining the pixel value of the seventh pixel point based on the pixel value of the sixth pixel point and the target ambient light illumination component.
  • the pixel value of the seventh pixel is only related to the pixel value of the corresponding sixth pixel and the target ambient light illumination component, and has nothing to do with other pixels, the pixel value of each seventh pixel can be calculated independently, without the need for full Calculation of image information, thereby improving computing efficiency and realizing real-time processing of images.
  • Step S45 generating a seventh image based on the pixel value of the seventh pixel.
  • a seventh pixel corresponds to a sixth pixel, and the seventh pixel and the sixth pixel have the same position in the image.
  • the part may show a uniform brightening effect, and the image will have a foggy visual effect when observed with the naked eye.
  • the above embodiments describe the process of dehazing the sixth image after the sixth image is generated.
  • the generated second image and third image may also be dehazed.
  • alternatively, the fourth image or the fifth image may also be dehazed.
  • the process of defogging any generated image is the same as steps S41 to S45 above, and will not be repeated here.
  • At least one of processing such as sharpening and denoising is further performed on the generated image.
  • the generated image is also sharpened; or the generated image is also denoised; or the generated image is also sharpened and denoised.
  • the Laplacian filtering method is used to sharpen the generated image to obtain the eighth image, which realizes the enhancement of high-frequency details.
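As a sketch of the Laplacian sharpening mentioned above, the following applies a 4-neighbour Laplacian to a grayscale image held as nested lists; the specific kernel, the `amount` knob, and the border handling are common choices assumed here, not mandated by the text.

```python
def laplacian_sharpen(img, amount=1.0):
    """Sharpen a grayscale image (list of lists, values in [0, 1]) by
    subtracting the 4-neighbour Laplacian response; border pixels are
    copied through unchanged and results are clamped to [0, 1]."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x] +
                   img[y][x - 1] + img[y][x + 1] - 4.0 * img[y][x])
            out[y][x] = min(1.0, max(0.0, img[y][x] - amount * lap))
    return out
```

A flat region has zero Laplacian response and is left untouched; isolated high-frequency detail is amplified, which is the edge-enhancement effect described above.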
  • the process of brightening dark areas also amplifies the color noise of the device, impairing the visual effect of the resulting image to the naked eye. Therefore, in some embodiments, in order to eliminate the influence of noise, the red channel value, blue channel value and green channel value of each pixel in the generated image are further obtained, and the red channel value, blue channel value and green channel value are denoised separately to obtain the final image.
  • the three RGB channels of each pixel in the generated image are denoised respectively to improve image quality.
  • Any suitable method can be used for denoising, such as NL-means, three-dimensional block matching algorithm (BM3D), and so on.
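Since the text leaves the denoiser open (NL-means, BM3D, and so on), the sketch below uses a simple 3x3 median filter as a lightweight stand-in, applied to each of the three RGB channels separately as described above.

```python
from statistics import median

def denoise_channel(ch):
    """3x3 median filter as a stand-in for NL-means / BM3D; the channel is
    a list of lists of floats, and border pixels are copied unchanged."""
    h, w = len(ch), len(ch[0])
    out = [row[:] for row in ch]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = median(ch[y + dy][x + dx]
                               for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return out

def denoise_rgb(r, g, b):
    """Per the text: each of the three RGB channels is denoised separately."""
    return denoise_channel(r), denoise_channel(g), denoise_channel(b)
```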
  • the image processing method provided by the embodiments of the present disclosure can effectively brighten dark parts while suppressing overexposure of highlights; at the same time, it can enhance the texture and color of the image, improve the layering of the image structure and the visual effect of the final imaging, and improve image quality. It can also achieve real-time processing, improving the efficiency of image processing and enabling real-time preview in the shooting window.
  • Fig. 6 is a block diagram of an image processing apparatus according to an exemplary embodiment. As shown in Fig. 6, the image processing apparatus includes:
  • an image acquisition module 61 configured to acquire an original image
  • the brightness enhancement module 62 is configured to enhance the brightness of the original image to obtain the first image
  • the weight determination module 63 is configured to determine, based on the pixel value of the original pixel in the original image, the first weight of the first pixel in the first image and the original weight of the original pixel, wherein the sum of the original weight and the first weight is 1, and the first weight is negatively correlated with the pixel value of the original pixel;
  • the first fusion module 64 is configured to perform weighted fusion on the pixel value of the original pixel point and the pixel value of the first pixel point based on the original weight and the first weight to obtain the pixel value of the second pixel point;
  • the image generation module 65 is configured to generate a second image based on the pixel values of the second pixel points.
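The pipeline of modules 62 to 65 can be sketched per pixel as follows. The gamma curve and the linear weight formula are illustrative assumptions: the text fixes only that the two weights sum to 1, that the first weight is negatively correlated with the original pixel value, and that the first adjustment parameter is positively correlated with the first weight.

```python
def brighten(p, gamma=0.5):
    """Hypothetical brightness enhancement: a gamma curve on a pixel in [0, 1]."""
    return p ** gamma

def fuse(orig_pixel, strength=1.0):
    """Blend the original pixel with its brightened counterpart.

    first_weight falls as the original pixel value rises, so dark pixels
    receive more of the brightened image, and orig_weight + first_weight
    equals 1. `strength` plays the role of the first adjustment
    parameter: larger values give a larger first weight, i.e. stronger
    brightening.
    """
    first_pixel = brighten(orig_pixel)
    first_weight = min(1.0, strength * (1.0 - orig_pixel))  # negative correlation
    orig_weight = 1.0 - first_weight                        # weights sum to 1
    return orig_weight * orig_pixel + first_weight * first_pixel

# dark pixels are lifted strongly while bright pixels stay almost
# untouched, which suppresses highlight overexposure
```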
  • the weight determination module 63 is configured to: determine the first weight of the first pixel point based on the first adjustment parameter and the pixel value of the original pixel point, where the first adjustment parameter is used to adjust the intensity of the brightness enhancement of the original image, and the first adjustment parameter is positively correlated with the first weight.
  • weight determination module 63 is configured to:
  • the image processing apparatus further includes:
  • the texture enhancement module is configured to perform texture enhancement on the second image based on the pixel value of the second pixel in the second image to obtain a third image.
  • the texture enhancement module is configured to:
  • if the pixel value of the second pixel point is greater than the second threshold, the pixel value of the second pixel point is increased to obtain the pixel value of the third pixel point.
  • the second threshold is greater than or equal to 0.5
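The threshold rule above can be sketched as follows; the exact lift curve and the `gain` knob are assumptions, since the text only requires that pixel values above a threshold of at least 0.5 are increased.

```python
def texture_enhance(p, threshold=0.5, gain=0.2):
    """If the pixel value exceeds the threshold, push it further up;
    the lift fades to zero at the extremes so values stay in [0, 1].
    Values at or below the threshold pass through unchanged."""
    if p > threshold:
        return min(1.0, p + gain * (p - threshold) * (1.0 - p))
    return p
```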
  • the image processing apparatus further includes:
  • the second fusion module configured as:
  • the second weight of the second pixel point and the third weight of the third pixel point in the third image are determined, wherein the sum of the second weight and the third weight is 1, and the second weight varies with the pixel value of the original pixel point according to a downward-convex function relationship;
  • weighted fusion is performed on the pixel value of the second pixel point and the pixel value of the third pixel point to obtain the pixel value of the fourth pixel point;
  • a fourth image is generated.
  • the second fusion module is configured to determine the second weight of the second pixel point based on the second adjustment parameter and the pixel value of the original pixel point, where the second adjustment parameter is used to adjust the intensity of the texture enhancement of the second image, and the second adjustment parameter is positively correlated with the second weight.
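One way to realize the second fusion is sketched below. The squared curve and the clamping are assumptions: the text fixes only the downward-convex shape of the weight function, the unit weight sum, and the positive correlation with the second adjustment parameter `s`.

```python
def second_weight(orig_pixel, s=1.0):
    """Downward-convex dependence of the second weight on the original
    pixel value; a squared curve is one simple convex choice."""
    return min(1.0, s * orig_pixel ** 2)

def fuse_texture(orig_pixel, second_pixel, third_pixel, s=1.0):
    """Weighted fusion of the second and third pixel values; the two
    weights sum to 1 as required."""
    w2 = second_weight(orig_pixel, s)
    w3 = 1.0 - w2
    return w2 * second_pixel + w3 * third_pixel

# dark original pixels take almost all of the texture-enhanced third
# image, while bright pixels fall back toward the second image
```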
  • the second fusion module is configured to:
  • the image processing apparatus further includes:
  • Color enhancement module configured as:
  • the red channel value, the blue channel value and the green channel value of the fourth pixel are respectively corrected to obtain the red channel value, the blue channel value and the green channel value of the fifth pixel;
  • a fifth image is generated based on the red channel value, the blue channel value and the green channel value of the fifth pixel.
  • the image processing apparatus further includes:
  • the third fusion module configured as:
  • the third adjustment parameter is used to adjust the intensity of color enhancement of the generated image
  • weighted fusion is performed on the pixel value of the fourth pixel point and the pixel value of the fifth pixel point to obtain the pixel value of the sixth pixel point;
  • a sixth image is generated.
  • the image processing apparatus further includes:
  • Light compensation module configured as:
  • a seventh image is generated.
  • the illumination compensation module is configured to:
  • FIG. 7 is a block diagram of an electronic device 700 shown in the present disclosure.
  • electronic device 700 is a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, fitness device, or personal digital assistant, among others.
  • an electronic device 700 includes one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and communication component 716 .
  • the processing component 702 generally controls the overall operation of the electronic device 700, such as operations associated with display, phone calls, data communications, camera operations, and recording operations.
  • the processing component 702 includes one or more processors 720 to execute instructions to perform all or part of the steps of the image processing method of any embodiment.
  • processing component 702 also includes one or more modules that facilitate interaction between processing component 702 and other components.
  • processing component 702 includes a multimedia module to facilitate interaction between multimedia component 708 and processing component 702.
  • Memory 704 is configured to store various types of data to support operation at device 700. Examples of such data include instructions for any application or method operating on electronic device 700, contact data, phonebook data, messages, pictures, videos, and the like. Memory 704 is implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • Power supply component 706 provides power to various components of electronic device 700 .
  • Power component 706 includes a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for electronic device 700.
  • Multimedia component 708 includes a screen that provides an output interface between electronic device 700 and the user.
  • the screen includes a liquid crystal display (LCD) and a touch panel (TP).
  • the screen is a touch screen to receive input signals from a user.
  • the touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel.
  • a touch sensor not only senses the boundaries of a touch or swipe action, but also detects the duration and pressure associated with the touch or swipe action.
  • multimedia component 708 includes a front-facing camera and/or a rear-facing camera. When the device 700 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera receive external multimedia data.
  • Each front or rear camera may be a fixed optical lens system or have focus and optical zoom capability.
  • Audio component 710 is configured to output and/or input audio signals.
  • audio component 710 includes a microphone (MIC) that is configured to receive external audio signals when electronic device 700 is in operating modes, such as calling mode, recording mode, and voice recognition mode. The received audio signal is further stored in memory 704 or transmitted via communication component 716 .
  • audio component 710 also includes a speaker for outputting audio signals.
  • the I/O interface 712 provides an interface between the processing component 702 and a peripheral interface module, such as a keyboard, a click wheel or a button, and the like. These buttons may include, but are not limited to: home button, volume buttons, start button, and lock button.
  • Sensor assembly 714 includes one or more sensors for providing status assessments of various aspects of electronic device 700.
  • the sensor assembly 714 detects the open/closed state of the device 700 and the relative positioning of components, such as the display and keypad of the electronic device 700; the sensor assembly 714 also detects a change in the position of the electronic device 700 or a component of the electronic device 700, the presence or absence of user contact with the electronic device 700, the orientation or acceleration/deceleration of the electronic device 700, and temperature changes of the electronic device 700.
  • Sensor assembly 714 includes a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact.
  • Sensor assembly 714 also includes a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor assembly 714 also includes an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 716 is configured to facilitate wired or wireless communication between electronic device 700 and other devices.
  • the electronic device 700 accesses a wireless network based on a communication standard, such as WiFi, a carrier network (eg, 2G, 3G, 4G, or 5G), or a combination thereof.
  • the communication component 716 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communication component 716 also includes a near field communication (NFC) module to facilitate short-range communication.
  • the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
  • electronic device 700 is implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the image processing method of any of the embodiments.
  • Embodiments of the present disclosure also provide an electronic device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to implement the following operations: acquiring an original image; enhancing the brightness of the original image to obtain a first image; determining, based on the pixel value of the original pixel in the original image, the first weight of the first pixel in the first image and the original weight of the original pixel, wherein the sum of the original weight and the first weight is 1, and the first weight is negatively correlated with the pixel value of the original pixel; performing weighted fusion on the pixel value of the original pixel and the pixel value of the first pixel based on the original weight and the first weight to obtain the pixel value of the second pixel; and generating a second image based on the pixel value of the second pixel.
  • the processor is further configured to execute the above-mentioned instructions, so as to implement the image processing methods provided by other embodiments of the above-mentioned method embodiments.
  • A non-transitory computer-readable storage medium including instructions is also provided, such as the memory 704 including instructions, where the instructions are executable by the processor 720 of the electronic device 700 to complete the image processing method of any of the embodiments.
  • the non-transitory computer-readable storage medium is ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
  • Embodiments of the present disclosure also provide a storage medium; when the instructions in the computer-readable storage medium are executed by a processor of an electronic device, the electronic device can perform the following steps: acquiring an original image; enhancing the brightness of the original image to obtain a first image; determining, based on the pixel value of the original pixel in the original image, the first weight of the first pixel in the first image and the original weight of the original pixel, wherein the sum of the original weight and the first weight is 1, and the first weight is negatively correlated with the pixel value of the original pixel; performing weighted fusion on the pixel value of the original pixel and the pixel value of the first pixel based on the original weight and the first weight to obtain the pixel value of the second pixel; and generating a second image based on the pixel value of the second pixel.
  • the instructions in the computer-readable storage medium are executed by the processor of the electronic device, so that the electronic device can execute the image processing methods provided in other embodiments of the foregoing method embodiments.
  • a computer program product includes a computer program, and the computer program can be executed by the processor 720 of the electronic device 700 to complete the image processing method of any one of the embodiments.
  • the computer program is stored in a storage medium of the electronic device 700; the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • Embodiments of the present disclosure also provide a computer program product, including a computer program, the computer program being executed by a processor to implement the following steps: acquiring an original image; enhancing the brightness of the original image to obtain a first image; determining, based on the pixel value of the original pixel in the original image, the first weight of the first pixel in the first image and the original weight of the original pixel, wherein the sum of the original weight and the first weight is 1, and the first weight is negatively correlated with the pixel value of the original pixel; performing weighted fusion on the pixel value of the original pixel and the pixel value of the first pixel based on the original weight and the first weight to obtain the pixel value of the second pixel; and generating a second image based on the pixel value of the second pixel.
  • the computer program is executed by the processor, so that the processor can execute the image processing methods provided by other embodiments of the above method embodiments.
  • FIG. 8 is a block diagram of an electronic device 800 shown in the present disclosure.
  • the electronic device 800 is provided as a server.
  • electronic device 800 includes processing component 822, which further includes one or more processors, and memory resources represented by memory 832 for storing instructions executable by processing component 822, such as application programs.
  • the application program stored in memory 832 includes one or more modules, each corresponding to a set of instructions.
  • the processing component 822 is configured to execute instructions to perform the image processing method of any of the embodiments.
  • the electronic device 800 also includes a power supply assembly 826 configured to perform power management of the electronic device 800, a wired or wireless network interface 850 configured to connect the electronic device 800 to a network, and an input/output (I/O) interface 858.
  • Electronic device 800 operates based on an operating system stored in memory 832, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an image processing method and an electronic device. The method comprises the steps of: acquiring an original image (S11); enhancing the brightness of the original image to obtain a first image (S12); determining, based on pixel values of original pixel points in the original image, a first weight of first pixel points in the first image and an original weight of the original pixel points (S13); performing, based on the original weight and the first weight, weighted fusion on the pixel values of the original pixel points and the pixel values of the first pixel points to obtain pixel values of second pixel points (S14); and generating a second image based on the pixel values of the second pixel points (S15).
PCT/CN2021/114109 2020-12-18 2021-08-23 Image processing method and electronic device WO2022127174A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011507495.9 2020-12-18
CN202011507495.9A CN112614064B (zh) Image processing method and apparatus, electronic device and storage medium

Publications (1)

Publication Number Publication Date
WO2022127174A1 true WO2022127174A1 (fr) 2022-06-23

Family

ID=75240691

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/114109 WO2022127174A1 (fr) 2020-12-18 2021-08-23 Procédé de traitement d'image et dispositif électronique

Country Status (2)

Country Link
CN (1) CN112614064B (fr)
WO (1) WO2022127174A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112614064B (zh) * 2020-12-18 2023-04-25 北京达佳互联信息技术有限公司 Image processing method and apparatus, electronic device and storage medium
CN113781370A (zh) * 2021-08-19 2021-12-10 北京旷视科技有限公司 Image enhancement method and apparatus, and electronic device
CN115115554B (zh) * 2022-08-30 2022-11-04 腾讯科技(深圳)有限公司 Image processing method and apparatus based on enhanced images, and computer device

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040136603A1 (en) * 2002-07-18 2004-07-15 Vitsnudel Iiia Enhanced wide dynamic range in imaging
US20100232726A1 (en) * 2009-03-13 2010-09-16 Liu Yung-Chiu Method for simulating dark-part exposure compensation of high dynamic range images using a single image and image processing device for use with the method
CN109829864A (zh) * 2019-01-30 2019-05-31 北京达佳互联信息技术有限公司 Image processing method, apparatus, device and storage medium
CN110619610A (zh) * 2019-09-12 2019-12-27 紫光展讯通信(惠州)有限公司 Image processing method and apparatus
CN110766621A (zh) * 2019-10-09 2020-02-07 Oppo广东移动通信有限公司 Image processing method and apparatus, storage medium and electronic device
CN111145114A (zh) * 2019-12-19 2020-05-12 腾讯科技(深圳)有限公司 Image enhancement method and apparatus, and computer-readable storage medium
CN111311532A (zh) * 2020-03-26 2020-06-19 深圳市商汤科技有限公司 Image processing method and apparatus, electronic device, and storage medium
CN111325680A (zh) * 2020-01-08 2020-06-23 深圳深知未来智能有限公司 Image brightening method with local overexposure suppression
CN112614064A (zh) * 2020-12-18 2021-04-06 北京达佳互联信息技术有限公司 Image processing method and apparatus, electronic device and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530848A (zh) * 2013-09-27 2014-01-22 中国人民解放军空军工程大学 Method for implementing secondary exposure of non-uniformly illuminated images
CN105046663B (zh) * 2015-07-10 2017-08-04 西南科技大学 Adaptive low-illumination image enhancement method simulating human visual perception
US9881364B2 (en) * 2015-08-10 2018-01-30 Fuji Xerox Co., Ltd. Image processing apparatus, image processing method and computer readable medium for image enhancement
CN109272459B (zh) * 2018-08-20 2020-12-01 Oppo广东移动通信有限公司 Image processing method and apparatus, storage medium and electronic device


Also Published As

Publication number Publication date
CN112614064B (zh) 2023-04-25
CN112614064A (zh) 2021-04-06

Similar Documents

Publication Publication Date Title
WO2022127174A1 (fr) Image processing method and electronic device
CN109345485B (zh) Image enhancement method and apparatus, electronic device and storage medium
TWI549106B (zh) Image display method and electronic device
CN110958401B (zh) Super night scene image color correction method, apparatus and electronic device
CN111709890B (zh) Training method and apparatus for an image enhancement model, and storage medium
JP7166171B2 (ja) Interface image display method, apparatus and program
CN108986053B (zh) Screen display method and apparatus
CN112950499B (zh) Image processing method and apparatus, electronic device and storage medium
CN113450713B (zh) Screen display method and apparatus, and grayscale mapping information generation method and apparatus
CN111625213A (zh) Picture display method, apparatus and storage medium
CN106803920B (zh) Image processing method and apparatus, and intelligent conference terminal
CN113674718B (zh) Display brightness adjustment method and apparatus, and storage medium
CN107563957B (zh) Eye image processing method and apparatus
CN115239570A (zh) Image processing method, image processing apparatus and storage medium
CN111901519B (zh) Screen light supplement method, apparatus and electronic device
CN110662115B (zh) Video processing method and apparatus, electronic device and storage medium
CN105472228B (zh) Image processing method, apparatus and terminal
CN108156381B (zh) Photographing method and apparatus
CN116866495A (zh) Image acquisition method, apparatus, terminal device and storage medium
WO2022226963A1 (fr) Image processing method and apparatus, electronic device and storage medium
CN112785537A (zh) Image processing method and apparatus, and storage medium
CN112019680A (zh) Screen brightness adjustment method and apparatus
US20160292825A1 (en) System and method to refine image data
WO2023240624A1 (fr) Brightness adjustment method, apparatus and recording medium
EP4304188A1 (fr) Photographing method and apparatus, medium and chip

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21905107

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02.10.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21905107

Country of ref document: EP

Kind code of ref document: A1