CN117616775A - Image processing method and device

Publication number: CN117616775A
Authority: CN (China)
Prior art keywords: image, brightness, images, frames, frame
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202280004418.2A
Other languages: Chinese (zh)
Inventors: 张超, 龚瑶
Current assignee: Beijing Xiaomi Mobile Software Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: Beijing Xiaomi Mobile Software Co Ltd
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Publication of CN117616775A

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 — Details of colour television systems
    • H04N9/64 — Circuits for processing colour signals
    • H04N9/73 — Colour balance circuits, e.g. white balance circuits or colour temperature control

Abstract

The disclosure relates to an image processing method and device, wherein the method comprises the following steps: acquiring multiple frames of first images, wherein the multiple frames of first images are acquired at different exposure levels and are reference images for performing white balance processing on a captured second image; for the multiple frames of first images, determining weights of the multiple frames of first images at the same image position according to image brightness; weighting the gray level of each frame of the first images according to the weights of the multiple frames of first images at the same image position, so as to determine a fusion gray level at at least one image position; and performing white balance processing on the second image according to the fusion gray level at the at least one image position.

Description

Image processing method and device

Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
With the development of the economy and the progress of technology, cameras with photographing functions appear increasingly in people's lives. When a camera is used for photographing, factors such as improper exposure, the influence of the color of the environment, and a mismatch between the color temperature of the camera and the color temperature of the illuminating light can cause color cast in the image collected by the camera, so that the original color of the object cannot be known and the display effect of the image is poor.
Currently, for an image with color cast, white balance processing is generally adopted to perform color correction on the image. The accuracy of the white balance correction is important for making the image approach its true colors and thereby improving the display effect of the image.
Disclosure of Invention
The disclosure provides an image processing method and device.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method, the method including: acquiring multiple frames of first images, wherein the multiple frames of first images are acquired at different exposure levels and are reference images for performing white balance processing on a captured second image; for the multiple frames of first images, determining weights of the multiple frames of first images at the same image position according to image brightness; weighting the gray level of each frame of the first images according to the weights of the multiple frames of first images at the same image position, so as to determine a fusion gray level at at least one image position; and performing white balance processing on the second image according to the fusion gray level at the at least one image position.
In one embodiment of the disclosure, the determining weights of the multiple frames of the first image at the same image position according to the image brightness includes: determining a brightness interval of the first image of each frame at the same image position according to the image brightness of the first image of each frame at the same image position; and determining the weight of the first image at the same image position of each frame according to the corresponding relation between the brightness interval and the weight.
In one embodiment of the present disclosure, the brightness intervals of the multiple frames of first images include a high brightness interval and a low brightness interval divided in brightness order; and when the brightness of the image is lower than a first threshold value, the weight corresponding to the low brightness interval is smaller than the weight corresponding to the high brightness interval.
In one embodiment of the present disclosure, the brightness intervals of the multiple frames of first images include a high brightness interval and a low brightness interval divided in brightness order; and when the brightness of the image is higher than a second threshold value, the weight corresponding to the high brightness interval is smaller than the weight corresponding to the low brightness interval.
In one embodiment of the disclosure, before determining, according to the image brightness of each frame of the first images at the same image position, the brightness interval to which each frame of the first images belongs at that image position, the method further includes: partitioning each frame of the first images in the same partitioning mode to obtain a plurality of sub-areas; and, for any frame of the first images, determining a brightness average value of the pixel points in the same sub-area and taking the brightness average value as the brightness of the corresponding sub-area.
In one embodiment of the present disclosure, the performing white balance processing on the captured second image according to the fusion gray level at the at least one image position includes: clustering the fusion gray levels at the respective image positions to determine a center gray level; determining a white balance gain value according to the difference between the center gray level and a reference gray level; and performing white balance processing on the second image based on the white balance gain value.
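The clustering-and-gain step above can be sketched in code. This is a minimal illustration under assumed conventions: gray points are represented as (R/G, B/G) chromaticity pairs, the "center gray" is taken as a simple centroid (the embodiment does not specify the clustering method, so the centroid stands in for it), and the reference gray is assumed neutral at (1.0, 1.0).

```python
# Hypothetical sketch: find a center gray from fused gray points and derive
# white-balance gains. The centroid-style "clustering" and the neutral
# reference gray are illustrative assumptions, not the patent's exact method.

def center_gray(fused_grays):
    """Centroid of fused (R/G, B/G) chromaticity points."""
    n = len(fused_grays)
    rg = sum(p[0] for p in fused_grays) / n
    bg = sum(p[1] for p in fused_grays) / n
    return (rg, bg)

def white_balance_gains(center, reference=(1.0, 1.0)):
    """Gains that map the center gray onto the reference (neutral) gray."""
    return (reference[0] / center[0], reference[1] / center[1])

# Fused gray points at several image positions (R/G, B/G), illustrative values
points = [(0.8, 1.2), (0.9, 1.1), (0.85, 1.15)]
c = center_gray(points)
r_gain, b_gain = white_balance_gains(c)
```

Applying `r_gain` and `b_gain` to the R and B channels of the second image would then pull its center gray toward neutral.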
In one embodiment of the present disclosure, the second image is an image acquired in response to a photographing operation, and the plurality of frames of the first image is an image acquired after the second image is acquired.
According to a second aspect of embodiments of the present disclosure, there is also provided an image processing apparatus including: an acquisition module, used for acquiring multiple frames of first images, wherein the multiple frames of first images are acquired at different exposure levels and are reference images for performing white balance processing on a captured second image; a first determining module, used for determining weights of the multiple frames of first images at the same image position according to image brightness; a second determining module, used for weighting the gray level of each frame of the first images according to the weights of the multiple frames of first images at the same image position, so as to determine a fusion gray level at at least one image position; and a processing module, used for performing white balance processing on the second image according to the fusion gray level at the at least one image position.
In one embodiment of the present disclosure, the first determining module includes: a first determining unit, configured to determine, according to image brightness of the first image at the same image position of each frame, a brightness interval to which the first image of each frame belongs at the same image position; and the second determining unit is used for determining the weight of the first image at the same image position of each frame according to the corresponding relation between the brightness interval and the weight.
In one embodiment of the present disclosure, the brightness intervals of the multiple frames of first images include a high brightness interval and a low brightness interval divided in brightness order; and when the brightness of the image is lower than a first threshold value, the weight corresponding to the low brightness interval is smaller than the weight corresponding to the high brightness interval.
In one embodiment of the present disclosure, the brightness intervals of the multiple frames of first images include a high brightness interval and a low brightness interval divided in brightness order; and when the brightness of the image is higher than a second threshold value, the weight corresponding to the high brightness interval is smaller than the weight corresponding to the low brightness interval.
In one embodiment of the disclosure, the first determining module further includes: a first processing unit, used for partitioning each frame of the first images in the same partitioning mode to obtain a plurality of sub-areas; and a third determining unit, used for determining, for any frame of the first images, a brightness average value of the pixel points in the same sub-area and taking the brightness average value as the brightness of the corresponding sub-area.
In one embodiment of the present disclosure, the processing module includes: a clustering unit, used for clustering the fusion gray levels at the respective image positions to determine a center gray level; a fourth determining unit, used for determining a white balance gain value according to the difference between the center gray level and a reference gray level; and a second processing unit, used for performing white balance processing on the second image based on the white balance gain value.
In one embodiment of the present disclosure, the second image is an image acquired in response to a photographing operation, and the plurality of frames of the first image is an image acquired after the second image is acquired.
According to a third aspect of embodiments of the present disclosure, there is also provided an electronic device, including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the image processing method as described above.
According to a fourth aspect of embodiments of the present disclosure, there is also provided a non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause the processor to perform the image processing method described above.
The method acquires multiple frames of first images at different exposure levels as reference images for performing white balance processing on a captured second image, determines weights of the multiple frames of first images at the same image position according to image brightness, weights the gray levels of the multiple frames of first images at the same image position according to those weights to determine a fusion gray level at at least one image position, and performs white balance processing on the second image according to the fusion gray level at the at least one image position. The accuracy of white balance correction is thereby improved, so that the captured image is closer to its true colors and the display effect of the image is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure and do not constitute an undue limitation on the disclosure.
FIG. 1 is a flow chart of an image processing method of one embodiment of the present disclosure;
FIG. 2 is a flow chart of an image processing method according to another embodiment of the present disclosure;
FIG. 3 is an exemplary diagram of a manner of acquiring multiple frames of a first image and a second image in accordance with one embodiment of the present disclosure;
FIG. 4 is an exemplary diagram of a multi-frame first image of one embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure;
FIG. 6 is a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present disclosure, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the disclosure described herein may be capable of operation in sequences other than those illustrated or described herein. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
It can be understood that, in a white balance processing method based on the gray world hypothesis, a frame of preview image can be taken at the moment of photographing and divided into 64×48 sub-areas, each sub-area corresponding to one data point, with the average brightness value of all pixel points in the sub-area used as the brightness corresponding to that data point. Whether the gray levels of the 64×48 data points lie in a specified gray area is analyzed, and a white balance gain value is then determined according to the gray levels of the data points located in the gray area, so as to perform white balance processing on the photographed image according to the white balance gain value. In a scene such as extremely dim light or a night scene, however, most of the preview image is extremely dark, so the number of data points located in the gray area is extremely small and the error of the white balance gain value determined from their gray levels is extremely large. As a result, the accuracy of white balance correction by this method is poor, the corrected image still has a color cast problem, and the display effect of the image is poor.
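The gray-world-style baseline described above can be sketched as follows. The grid size, the chromaticity tolerance, and the gray-area test below are illustrative assumptions rather than values taken from the method discussed.

```python
# Illustrative sketch of the gray-world baseline: divide a frame into
# sub-areas, keep the sub-areas whose chromaticity falls in an assumed "gray
# area", then a gain could be averaged from them. Grid size and tolerance are
# hypothetical.

def block_means(img, rows, cols):
    """Average (R, G, B) of each sub-area; img is a 2-D grid of RGB tuples."""
    h, w = len(img), len(img[0])
    bh, bw = h // rows, w // cols
    means = []
    for r in range(rows):
        for c in range(cols):
            px = [img[y][x] for y in range(r * bh, (r + 1) * bh)
                            for x in range(c * bw, (c + 1) * bw)]
            n = len(px)
            means.append(tuple(sum(p[i] for p in px) / n for i in range(3)))
    return means

def is_grayish(mean, tol=0.15):
    """A sub-area counts as gray if R/G and B/G are both close to 1."""
    r, g, b = mean
    return g > 0 and abs(r / g - 1.0) < tol and abs(b / g - 1.0) < tol

# 4x4 toy image: two gray-ish quadrants, two strongly colored ones
img = [[(100, 100, 100)] * 2 + [(200, 50, 50)] * 2 for _ in range(2)] + \
      [[(105, 100, 95)] * 2 + [(50, 50, 200)] * 2 for _ in range(2)]
gray_blocks = [m for m in block_means(img, 2, 2) if is_grayish(m)]
```

In a very dark frame almost no sub-area passes `is_grayish`, which is exactly the failure mode the paragraph above describes for dim-light and night scenes.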
In order to improve the accuracy of white balance correction, embodiments of the disclosure provide an image processing method, an image processing apparatus, an electronic device, and a storage medium. The image processing method includes: acquiring multiple frames of first images, wherein the multiple frames of first images are acquired at different exposure levels and are reference images for performing white balance processing on a captured second image; for the multiple frames of first images, determining weights of the multiple frames of first images at the same image position according to image brightness; weighting the gray level of each frame of the first images according to the weights of the multiple frames of first images at the same image position, so as to determine a fusion gray level at at least one image position; and performing white balance processing on the second image according to the fusion gray level at the at least one image position. In this way, the accuracy of white balance correction is improved, the captured image comes closer to its true colors, and the display effect of the image is improved.
The image processing method, apparatus, electronic device and storage medium provided by the present disclosure are described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present disclosure. It should be noted that, the image processing method of the present embodiment is performed by an image processing apparatus, and the image processing apparatus may be implemented by software and/or hardware, and the image processing apparatus may be configured in an electronic device, where the electronic device may include a terminal device with an image processing function such as a mobile terminal (e.g., a mobile phone), a tablet computer, or may be a camera device with image capturing and image processing functions, and the disclosure is not limited thereto. The following description will take an image processing apparatus as an example of an execution subject of the image processing method. It should be noted that, the image processing method provided by the embodiment of the present disclosure may be applied to a scene such as an extremely dim light or a night scene, and may also be applied to other scenes, which is not limited in this disclosure.
As shown in fig. 1, the method comprises the steps of:
step 101, acquiring a plurality of frames of first images, wherein the plurality of frames of first images are acquired according to different exposure degrees, and the plurality of frames of first images are reference images for performing white balance processing on a shot second image.
The exposure level can be understood as being determined by the intensity of the light sensed and the time for which it is sensed.
The second image is an image acquired in response to a shooting operation, that is, an image to be subjected to white balance processing. The multi-frame first image is a reference image for performing white balance processing on the photographed second image.
The exposure degrees corresponding to the first images of each frame may be different in the first images of the plurality of frames, or the exposure degrees corresponding to the first images of at least two frames may be the same in the first images of the plurality of frames, which is not limited in the disclosure.
In addition, the first image and the second image may be acquired by the image processing apparatus or may be acquired by the image processing apparatus from another device, which is not limited in the present disclosure.
Step 102, for multiple frames of first images, determining weights of multiple frames of first images at the same image positions according to the image brightness.
The image position may be a position where one pixel point in the first image is located, or a position where a plurality of adjacent pixel points in the first image are located, or a position where one sub-area is located after the first image is divided into a plurality of sub-areas, or a position where a plurality of adjacent sub-areas are located, etc., which is not limited in this disclosure. One image location corresponds to one data point.
Wherein the weights of the first images of each frame at each image position are values between 0 and 1, and the sum of the weights of the first images of the plurality of frames at the same image position is 1.
In an example embodiment, for any image location, the weight of each frame of the first image at that image location may be determined from the image brightness of the plurality of frames of the first image at that image location.
For example, taking the number of first images as 3 frames, each frame of first images is divided into 64×48=3072 sub-areas in the same block manner as an example, for the 3 frames of first images, the weight of the 3 frames of first images at 3072 sub-areas may be determined. Taking any sub-region a as an example, the sum of weights of the 3 frames of first images at the same sub-region a is 1, and the weight of each frame of first images at the sub-region a can be determined according to the image brightness of the 3 frames of first images at the sub-region a.
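The weight determination in step 102 can be sketched in code. The interval thresholds and the raw weight table below are illustrative assumptions (the correspondence between brightness intervals and weights is left to be preset); the sketch only preserves the stated constraints that each weight lies between 0 and 1 and that the weights of the frames at one sub-region sum to 1.

```python
# Sketch of step 102 under assumed thresholds and weights: each frame's
# brightness at a sub-region selects a brightness interval, the interval maps
# to a raw weight, and raw weights are normalized so the per-sub-region
# weights of the frames sum to 1. All numeric values are illustrative.

RAW_WEIGHT = {"low": 1.0, "standard": 4.0, "high": 1.0}  # standard favored

def interval(brightness, low_thr=60.0, high_thr=180.0):
    """Map a brightness value to an assumed brightness interval."""
    if brightness < low_thr:
        return "low"
    if brightness > high_thr:
        return "high"
    return "standard"

def frame_weights(brightness_per_frame):
    """Normalized per-frame weights at one sub-region."""
    raw = [RAW_WEIGHT[interval(b)] for b in brightness_per_frame]
    total = sum(raw)
    return [w / total for w in raw]

# three frames' brightness at the same sub-region (dark, mid, bright frame)
w = frame_weights([20.0, 120.0, 230.0])
```

Here the frame whose brightness falls in the standard interval at this sub-region receives the largest weight, as in the correspondence principle described later.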
Step 103, weighting the gray level of each frame of the first images according to the weights of the multiple frames of first images at the same image position, so as to determine a fusion gray level at at least one image position.
In an example embodiment, for any image location, the gray scale of each frame first image at that image location may be weighted according to the weight of that image location to determine the blended gray scale at that image location.
It should be noted that, in the embodiment of the present disclosure, the fusion gray level at at least one image position may include the fusion gray levels at all image positions, or may include the fusion gray levels at only a part of all image positions, which is not limited in the present disclosure.
In an example embodiment, when the fusion gray level at the at least one image position includes the fusion gray levels at only a part of all image positions, the fusion gray levels at that part of the image positions may be obtained as follows: judge whether the fusion gray level at each image position lies in a preset gray interval, and determine the fusion gray levels lying in the preset gray interval as the fusion gray levels at the part of the image positions. The preset gray interval can be set arbitrarily as required, which is not limited in this disclosure.
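The fusion of step 103 and the filtering just described can be sketched as follows; the per-position weights and the bounds of the preset gray interval are illustrative assumptions.

```python
# Sketch of gray-level fusion plus optional gray-interval filtering: the
# fused gray at a position is the weighted sum of the frames' grays there,
# and positions whose fused gray falls outside an assumed preset interval
# are discarded. Weights and interval bounds are hypothetical.

def fuse(grays, weights):
    """Weighted fusion of per-frame gray values at one image position."""
    return sum(g * w for g, w in zip(grays, weights))

def keep_in_interval(fused, lo=30.0, hi=220.0):
    """Keep only fused grays inside the assumed preset gray interval."""
    return [g for g in fused if lo <= g <= hi]

# (per-frame grays, per-frame weights) at two image positions
positions = [([10.0, 120.0, 240.0], [0.2, 0.6, 0.2]),
             ([5.0, 10.0, 20.0], [0.5, 0.3, 0.2])]
fused = [fuse(g, w) for g, w in positions]
kept = keep_in_interval(fused)
```

The second position's fused gray is far too dark to lie in the interval, so it is dropped, matching the "part of all image positions" case above.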
Step 104, performing white balance processing on the second image according to the fusion gray level at the at least one image position.
In an example embodiment, the white balance gain value may be determined based on the fusion gray level at the at least one image position, and the white balance processing may be performed on the second image according to the white balance gain value.
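Applying the determined white balance gain value can be sketched as a per-channel scaling of the second image; the gain values and the nested-list RGB representation below are illustrative assumptions.

```python
# Sketch of applying a white-balance gain in step 104: each pixel's R and B
# channels are scaled by the gain and clipped to the valid range. The gain
# values used here are hypothetical.

def apply_white_balance(img, r_gain, b_gain, max_val=255):
    """Scale the R and B channels of an RGB image (nested lists of tuples)."""
    out = []
    for row in img:
        out.append([(min(max_val, round(r * r_gain)), g,
                     min(max_val, round(b * b_gain))) for r, g, b in row])
    return out

second_image = [[(80, 100, 130), (200, 210, 250)]]
balanced = apply_white_balance(second_image, r_gain=1.25, b_gain=0.8)
```

A gain above 1 lifts a channel that the color cast suppressed; a gain below 1 tames a channel the cast exaggerated.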
Because the fusion gray level at each image position is determined by weighting the gray levels of the multiple frames of first images at that position with the corresponding weights, in the embodiment of the disclosure the white balance processing on the second image takes into account the gray levels of the multiple frames of first images at each image position. Since each image position corresponds to one data point, the image processing method provided by the disclosure performs white balance processing according to the gray levels of more data points, so the error of the determined white balance gain value is smaller and the accuracy of white balance correction is high; the corrected second image is thereby closer to its true colors, and the display effect of the second image is improved.
In summary, according to the image processing method of the embodiment of the disclosure, multiple frames of first images are acquired at different exposure levels as reference images for performing white balance processing on a captured second image; weights of the multiple frames of first images at the same image position are determined according to image brightness; the gray levels of the multiple frames of first images at the same image position are weighted according to those weights to determine a fusion gray level at at least one image position; and white balance processing is performed on the second image according to the fusion gray level at the at least one image position. The accuracy of white balance correction is thereby improved, so that the captured image is closer to its true colors and the display effect of the image is improved.
The image processing method provided by the embodiment of the present disclosure is further described below with reference to fig. 2. Fig. 2 is a flowchart of an image processing method according to another embodiment of the present disclosure. As shown in fig. 2, the image processing method may include the steps of:
step 201, acquiring a plurality of frames of first images, wherein the plurality of frames of first images are acquired according to different exposure degrees, and the plurality of frames of first images are reference images for performing white balance processing on a photographed second image.
In an example embodiment, the exposure levels may be divided into three levels, high exposure level, standard exposure level, low exposure level, in order of the exposure level from high to low. Wherein the high exposure is greater than the standard exposure and the standard exposure is greater than the low exposure.
The first images of the multiple frames can be images acquired according to each exposure level. Wherein the number of first images acquired according to each exposure level may be one or more frames, which is not limited in the present disclosure.
The second image is an image acquired in response to a photographing operation. The multiple frames of first images may be images acquired after the second image is acquired; alternatively, the multiple frames of first images may be images acquired before the second image is acquired. In addition, the difference between the acquisition time of the multiple frames of first images and the acquisition time of the second image may be smaller than a preset time threshold. The preset time threshold may be set as required, for example, 1 ms (millisecond), 0.5 ms, or 2 ms.
In the following, with reference to fig. 3, the process of acquiring the multiple frames of first images and the second image is illustrated by an example in which the first images and the second image are acquired by an image capturing device and then transmitted to the image processing apparatus, the multiple frames of first images are images acquired after the second image is acquired, and the number of first images acquired at each exposure level is one frame. The numbers in fig. 3 indicate which frame each image is.
Referring to fig. 3, in response to a photographing operation of a user, the photographing apparatus acquires a frame of second image a, which is the 43rd frame of image acquired by the photographing apparatus. After acquiring the second image a, the photographing apparatus may sequentially acquire a frame of first image b at the standard exposure level, a frame of first image c at the high exposure level, and a frame of first image d at the low exposure level, and then transmit the second image a and the first images b, c, and d to the image processing apparatus. In this way, the image processing apparatus can acquire the first images acquired at the standard, high, and low exposure levels, respectively, as well as the second image to be subjected to white balance processing.
It should be noted that the first image closest in capture time to the second image may be a frame adjacent to the second image, or may be separated from the second image by several frames; for example, the first image b and the second image a shown in fig. 3 are separated by 2 frames. This is not limited in the disclosure.
In addition, the order in which the first images are acquired according to different exposure degrees may be arbitrarily set as needed. For example, a first image may be acquired according to a standard exposure, a first image may be acquired according to a high exposure, and a first image may be acquired according to a low exposure, or a first image may be acquired according to a high exposure, a first image may be acquired according to a standard exposure, and a first image may be acquired according to a low exposure.
Step 202, determining the brightness interval of each frame of the first image at the same image position according to the image brightness of each frame of the first image at the same image position.
The brightness may be divided into a plurality of brightness intervals in order of brightness from high to low. Taking 3 brightness intervals as an example, the 3 brightness intervals may include: a high brightness interval, a standard brightness interval, and a low brightness interval divided in brightness order.
In an exemplary embodiment, one image position may be a position where one sub-region is located after the first image is divided into a plurality of sub-regions, and correspondingly, the image brightness at one image position may be the image brightness at the corresponding sub-region, where the image brightness at the sub-region may be the brightness average value of all the pixel points in the sub-region. Accordingly, before step 202, the method may further include:
The first image of each frame is partitioned in the same partitioning mode, so that a plurality of subareas are obtained;
and, for any frame of the first images, determining a brightness average value of the pixel points in the same sub-area and taking the brightness average value as the brightness of the corresponding sub-area.
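The partitioning and per-sub-area averaging described above can be sketched as follows, using a toy 4×4 luma frame and a 2×2 grid (both illustrative; the embodiments elsewhere mention grids such as 64×48 and 25 sub-areas).

```python
# Sketch of the pre-processing step: every frame is partitioned with the same
# grid, and the mean brightness of the pixels in a sub-area is used as that
# sub-area's brightness. Grid size and luma values are illustrative.

def subregion_brightness(frame, rows, cols):
    """Mean brightness per sub-area; frame is a 2-D grid of luma values."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // rows, w // cols
    means = []
    for r in range(rows):
        row_means = []
        for c in range(cols):
            vals = [frame[y][x] for y in range(r * bh, (r + 1) * bh)
                                for x in range(c * bw, (c + 1) * bw)]
            row_means.append(sum(vals) / len(vals))
        means.append(row_means)
    return means

frame = [[10, 20, 200, 200],
         [30, 40, 200, 200],
         [90, 90, 5, 15],
         [90, 90, 25, 35]]
b = subregion_brightness(frame, 2, 2)
```

Because every frame is partitioned with the same grid, the value at a given grid cell is directly comparable across frames, which is what makes the per-position weighting in the next step possible.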
Step 203, determining the weight of the first image of each frame at the same image position according to the corresponding relation between the brightness interval and the weight.
In an example embodiment, a correspondence between the brightness interval and the weight may be preset, and further, the weight of the first image of each frame at the same image position may be determined according to the brightness interval to which the first image of each frame belongs at the same image position and the correspondence.
The weight corresponding to the high brightness interval may be smaller than the weight corresponding to the standard brightness interval, and the weight corresponding to the low brightness interval may be smaller than the weight corresponding to the standard brightness interval. That is, the correspondence between brightness intervals and weights can follow the principle that the standard brightness interval has the largest weight, while the high and low brightness intervals have smaller weights relative to the standard brightness interval.
In an example embodiment, the brightness intervals to which the multiple frames of first images belong at the same image position may include the standard brightness interval and the low brightness interval divided according to the brightness order. Accordingly, the weight of each frame of the first images at the same image position can be determined according to the preset relationship that the weight corresponding to the low brightness interval is smaller than the weight corresponding to the standard brightness interval.
Referring to fig. 4, the first image b is acquired at a standard exposure, the first image c is acquired at a high exposure, and the first image d is acquired at a low exposure. Each frame of first image is partitioned according to the partitioning mode in fig. 4, so that each frame of first image is divided into 25 sub-areas. Sub-areas A, B, C, and D of each frame of first image are areas of lower brightness, and sub-areas E, F, G, and H are areas of higher brightness. For any frame of first image, the image brightness is the same at sub-areas A, B, C, and D, and the image brightness is the same at sub-areas E, F, G, and H.
Referring to fig. 4 and taking sub-area A as an example: according to the image brightness of the first image d at sub-area A, the brightness interval to which the first image d belongs at sub-area A is determined to be the low brightness interval; according to the image brightness of the first image b at sub-area A, the brightness interval to which the first image b belongs at sub-area A is determined to be the low brightness interval; and according to the image brightness of the first image c at sub-area A, the brightness interval to which the first image c belongs at sub-area A is determined to be the standard brightness interval. Since the weight corresponding to the low brightness interval is smaller than the weight corresponding to the standard brightness interval, the weight of the first image c at sub-area A may be determined to be greater than the weights of the first images b and d at sub-area A.
Similarly, it may be determined that the weight of the first image c at sub-area B is greater than the weights of the first images b and d at sub-area B, the weight of the first image c at sub-area C is greater than the weights of the first images b and d at sub-area C, and the weight of the first image c at sub-area D is greater than the weights of the first images b and d at sub-area D.
Since the weights of the frames of first images at the same image position sum to 1, it may be determined, for example, that the weight of the first image c at each of sub-areas A, B, C, and D is 0.6, and the weights of the first images b and d at each of these sub-areas are both 0.2.
It will be appreciated that, taking an extremely dark scene in a night setting (i.e., a night low-light scene) as an example, the image brightness at most image positions in the first images acquired at low or standard exposure may fall within the low brightness interval; that is, the image brightness at most image positions in the first images d and b of fig. 4 is similar to that at sub-areas A, B, C, and D. The gray scales at these image positions do not fall within the prescribed gray area, and white balance processing performed using these gray scales alone would be poorly accurate. In the first image acquired at high exposure, however, the brightness intervals to which these image positions belong may be the standard brightness interval, so the brightness intervals of the multiple frames of first images at the same image position include both the standard brightness interval and the low brightness interval. In the embodiment of the disclosure, performing white balance processing using the gray scales at image positions whose brightness belongs to the standard and low brightness intervals improves the accuracy of white balance correction; moreover, by setting the weight corresponding to the standard brightness interval to be greater than the weight corresponding to the low brightness interval, the gray scales at image positions belonging to the standard brightness interval contribute to a greater extent, further improving the accuracy of white balance correction.
In an example embodiment, the brightness intervals to which the multiple frames of first images belong at the same image position may include a high brightness interval and a standard brightness interval divided in brightness order. Correspondingly, the weight of each frame of first image at the same image position may be determined according to the preset relationship that the weight corresponding to the high brightness interval is smaller than the weight corresponding to the standard brightness interval.
Referring to fig. 4 and taking sub-area E as an example: according to the image brightness of the first image d at sub-area E, the brightness interval to which the first image d belongs at sub-area E is determined to be the standard brightness interval; according to the image brightness of the first image b at sub-area E, the brightness interval to which the first image b belongs at sub-area E is determined to be the standard brightness interval; and according to the image brightness of the first image c at sub-area E, the brightness interval to which the first image c belongs at sub-area E is determined to be the high brightness interval. Since the weight corresponding to the high brightness interval is smaller than the weight corresponding to the standard brightness interval, the weights of the first images b and d at sub-area E may be determined to be greater than the weight of the first image c at sub-area E.
Similarly, it may be determined that the weights of the first images b and d at sub-area F are greater than the weight of the first image c at sub-area F, the weights of the first images b and d at sub-area G are greater than the weight of the first image c at sub-area G, and the weights of the first images b and d at sub-area H are greater than the weight of the first image c at sub-area H.
Since the weights of the frames of first images at the same image position sum to 1, it may be determined, for example, that the weight of the first image b at each of sub-areas E, F, G, and H is 0.5, the weight of the first image d at each of these sub-areas is 0.4, and the weight of the first image c at each of these sub-areas is 0.1.
It should be noted that, for the multiple frames of first images, when at least two frames of first images belong to the same brightness interval at the same image position, their weights at that image position may be the same (for example, the weights of the first images b and d at sub-area A in the above example are both 0.2), or their weights may be different (for example, the weight of the first image b at sub-area E in the above example is 0.5 while the weight of the first image d at sub-area E is 0.4).
It will be appreciated that, taking a strong-light scene in a night setting (i.e., a night highlight scene) as an example, the image brightness at most image positions in the first image acquired at high exposure may fall within the high brightness interval; that is, the image brightness at most image positions in the first image c of fig. 4 is similar to that at sub-areas E, F, G, and H. The gray scales at these image positions do not fall within the prescribed gray area, and white balance processing performed using these gray scales alone would be poorly accurate. In the first images acquired at low or standard exposure, however, the brightness intervals to which these image positions belong are the standard brightness interval, so the brightness intervals of the multiple frames of first images at the same image position include both the standard brightness interval and the high brightness interval. In the embodiment of the disclosure, performing white balance processing using the gray scales at image positions whose brightness belongs to the standard and high brightness intervals improves the accuracy of white balance correction; moreover, by setting the weight corresponding to the standard brightness interval to be greater than the weight corresponding to the high brightness interval, the gray scales at image positions belonging to the standard brightness interval contribute to a greater extent, further improving the accuracy of white balance correction.
In an example embodiment, the brightness intervals to which the multiple frames of first images belong at the same image position may include a high brightness interval and a low brightness interval divided in brightness order. Correspondingly, the weight of each frame of first image at the same image position may be determined according to the preset relationship between the weight corresponding to the high brightness interval and the weight corresponding to the low brightness interval, and this relationship may be determined according to the shooting scene of the multiple frames of first images.
For example, when the shooting scene is a night low-light scene, the image brightness at each image position in the multiple frames of first images is generally low, and the weight corresponding to the low brightness interval may be set to be smaller than the weight corresponding to the high brightness interval, so as to improve the accuracy of white balance correction. That is, when the image brightness is lower than a first threshold, the weight corresponding to the low brightness interval is smaller than the weight corresponding to the high brightness interval. The first threshold may be set as needed, which is not limited in this disclosure.
When the shooting scene is a night highlight scene, the image brightness at each image position in the multiple frames of first images is generally high, and the weight corresponding to the high brightness interval may be set to be smaller than the weight corresponding to the low brightness interval, so as to improve the accuracy of white balance correction. That is, when the image brightness is higher than a second threshold, the weight corresponding to the high brightness interval is smaller than the weight corresponding to the low brightness interval. The second threshold may be set as needed, which is not limited in this disclosure.
In addition, in the embodiment of the present disclosure, each frame of first image may contain a plurality of brightness intervals, such as a high brightness interval and a standard brightness interval, a high brightness interval and a low brightness interval, or a low brightness interval and a standard brightness interval. For example, the first images d and b each include a low brightness interval and a standard brightness interval, and the first image c includes a standard brightness interval and a high brightness interval. Because different brightness intervals correspond to different weights, the weights of each frame of first image at its multiple image positions may accordingly differ. For example, where a frame of first image includes a high brightness interval and a standard brightness interval, the weight corresponding to the high brightness interval is smaller than the weight corresponding to the standard brightness interval. Where a frame of first image includes a low brightness interval and a standard brightness interval, the weight corresponding to the low brightness interval is smaller than the weight corresponding to the standard brightness interval. Where a frame of first image includes a high brightness interval and a low brightness interval, then in a night low-light scene (i.e., when the image brightness is lower than the first threshold) the weight corresponding to the low brightness interval is smaller than the weight corresponding to the high brightness interval, and in a night highlight scene (i.e., when the image brightness is higher than the second threshold) the weight corresponding to the high brightness interval is smaller than the weight corresponding to the low brightness interval.
Step 204, weighting the gray scale of the first image of each frame according to the weight of the first image of the plurality of frames at the same image position to determine the fusion gray scale at least one image position.
In an exemplary embodiment, taking the case where an image position is the position of one sub-area obtained after the first image is divided into a plurality of sub-areas as an example, the fusion gray scale at any sub-area may be determined by the following formula (1):
S(x,y) = α×S_Ev+(x,y) + β×S_Ev0(x,y) + γ×S_Ev-(x,y)    (1)
where S(x,y) is the fusion gray scale at a certain sub-area, x is the abscissa corresponding to the sub-area, and y is the ordinate corresponding to the sub-area; if each frame of first image is divided into 64×48 sub-areas in the same partitioning mode, then x∈[1,64] and y∈[1,48]. S_Ev+(x,y), S_Ev0(x,y), and S_Ev-(x,y) are, in order: the gray scale at the sub-area of the first image acquired at high exposure, the gray scale at the sub-area of the first image acquired at standard exposure, and the gray scale at the sub-area of the first image acquired at low exposure. α, β, and γ are fusion factors, namely weights, with α, β, γ ∈ [0,1] and α+β+γ = 1.
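Formula (1) translates directly into array arithmetic. The following sketch assumes per-sub-area weight maps α, β, γ that sum to 1 at every position; the concrete numeric weights in the usage example are merely illustrative.

```python
import numpy as np

def fuse_gray(s_high, s_std, s_low, alpha, beta, gamma):
    """Formula (1): S(x,y) = alpha*S_Ev+(x,y) + beta*S_Ev0(x,y) + gamma*S_Ev-(x,y).
    Each argument is a per-sub-area map; alpha + beta + gamma must equal 1
    at every position."""
    assert np.allclose(alpha + beta + gamma, 1.0)
    return alpha * s_high + beta * s_std + gamma * s_low

# One sub-area with illustrative weights summing to 1 (0.2 / 0.6 / 0.2):
s_high = np.array([[10.0]])   # gray of the high-exposure first image
s_std = np.array([[20.0]])    # gray of the standard-exposure first image
s_low = np.array([[30.0]])    # gray of the low-exposure first image
out = fuse_gray(s_high, s_std, s_low,
                np.array([[0.2]]), np.array([[0.6]]), np.array([[0.2]]))
print(float(out[0, 0]))  # 20.0
```

In practice the weight maps would be full 64×48 arrays, one weight per sub-area, produced by the interval-to-weight correspondence of step 203.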
Step 205, performing white balance processing on the second image according to the fusion gray scale at the at least one image position.
In an example embodiment, in the case where the number of image positions is 1, the fusion gray scale at the image position may be determined as the center gray scale, and further, the white balance gain value is determined according to the difference between the center gray scale and the reference gray scale, and the white balance processing is performed on the second image based on the white balance gain value.
In an example embodiment, when there are a plurality of image positions, the fusion gray scales at the image positions may be clustered to obtain a plurality of clusters; within the cluster containing the largest number of image positions, the average of the fusion gray scales at its image positions is determined as the center gray scale; further, a white balance gain value is determined according to the difference between the center gray scale and the reference gray scale, and white balance processing is performed on the second image based on the white balance gain value.
That is, step 205 may be implemented by: clustering the fusion gray scales at each image position to determine a center gray scale; determining a white balance gain value according to the difference between the center gray scale and the reference gray scale; and performing white balance processing on the second image based on the white balance gain value.
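The clustering step of step 205 can be sketched with a simple one-dimensional k-means. The disclosure does not name a clustering algorithm, so k-means with k=2, the thresholds, and the multiplicative form of the gain below are all our assumptions, not the patented method itself.

```python
import numpy as np

def center_gray(fused, k=2, iters=20, seed=0):
    """Cluster the per-position fusion gray scales with a simple 1-D k-means
    and return the mean of the most-populated cluster as the center gray."""
    vals = np.asarray(fused, dtype=np.float64).ravel()
    rng = np.random.default_rng(seed)
    centers = rng.choice(vals, size=k, replace=False)  # initial cluster centers
    for _ in range(iters):
        # Assign each value to its nearest center, then recompute the centers.
        labels = np.argmin(np.abs(vals[:, None] - centers[None, :]), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = vals[labels == c].mean()
    labels = np.argmin(np.abs(vals[:, None] - centers[None, :]), axis=1)
    largest = np.bincount(labels, minlength=k).argmax()
    return float(vals[labels == largest].mean())

def white_balance_gain(center, reference):
    """Hypothetical multiplicative gain; the disclosure only states that the
    gain is derived from the difference between center and reference gray."""
    return reference / center

# Five sub-areas agree near 0.5; the one outlier sub-area forms its own
# cluster and is excluded from the center gray.
fused = [0.48, 0.50, 0.52, 0.51, 0.49, 0.90]
c = center_gray(fused)
print(round(c, 2))  # 0.5
```

Taking the largest cluster's mean makes the center gray robust to a few sub-areas whose fusion gray deviates strongly, which matches the stated intent of the clustering.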
The reference gray scale is the gray scale corresponding to the ambient color temperature when the second image is shot.
In an example embodiment, reference gray scales corresponding to shooting modes in different shooting scenes may be preset, such as a night scene mode for night scenes and a normal mode for non-night scenes. Thus, when the current shooting mode is, for example, the night scene mode, the preset reference gray scale corresponding to the night scene mode may be determined as the reference gray scale used for determining the white balance gain value in the current white balance processing.
In an example embodiment, the reference gray scale may also be determined by means of multi-point sampling. That is, in a scene where the second image is photographed, one or more frames of images are collected, and gray scales of a plurality of points in the one or more frames of images are sampled through a nine-grid form or other strategies, and a reference gray scale is determined based on an average gray scale of the plurality of points. The plurality of points may be a plurality of pixel points or a plurality of sub-areas, which is not limited herein.
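The nine-grid sampling strategy might look like the following sketch. The sample coordinates (cell centers of a 3×3 layout) are an assumed interpretation of "nine-grid form"; the disclosure leaves the exact sampling points open.

```python
import numpy as np

def nine_grid_reference_gray(gray_map):
    """Sample the gray at the centers of a 3x3 ('nine-grid') layout over a
    per-position gray map and average the nine samples into a reference gray."""
    h, w = gray_map.shape
    ys = [h // 6, h // 2, 5 * h // 6]   # row centers of the three grid rows
    xs = [w // 6, w // 2, 5 * w // 6]   # column centers of the three grid columns
    samples = [gray_map[y, x] for y in ys for x in xs]
    return float(np.mean(samples))

# A uniform gray map returns that gray as the reference gray.
grid = np.full((60, 60), 0.5)
print(nine_grid_reference_gray(grid))  # 0.5
```

The sampled points could equally be sub-areas rather than single pixels, as the text notes; in that case `gray_map` would be the per-sub-area gray map.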
In summary, according to the image processing method of the embodiment of the present disclosure, multiple frames of first images are acquired at different exposure degrees, the multiple frames of first images being reference images for performing white balance processing on a captured second image. The brightness intervals to which the frames of first images belong at the same image position are determined according to their image brightness at that position, and the weights of the frames of first images at the same image position are determined according to the correspondence between brightness intervals and weights. The gray scales of the frames of first images are then weighted according to these weights to determine the fusion gray scale at at least one image position, and white balance processing is performed on the second image according to the fusion gray scale at the at least one image position. This improves the accuracy of white balance correction, so that the captured image is closer to its real colors and the display effect of the image is improved.
Fig. 5 is a schematic structural view of an image processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 5, the image processing apparatus 500 may include: an acquisition module 501, a first determination module 502, a second determination module 503, and a processing module 504;
the acquiring module 501 is configured to acquire a plurality of frames of first images, where the plurality of frames of first images are acquired according to different exposure degrees, and the plurality of frames of first images are reference images for performing white balance processing on a captured second image;
a first determining module 502, configured to determine, for a plurality of frames of first images, weights of the plurality of frames of first images at the same image positions according to image brightness;
a second determining module 503, configured to weight the gray scale of the first image of each frame according to the weight of the first image of the plurality of frames at the same image position, so as to determine a fusion gray scale at least one image position;
a processing module 504, configured to perform white balance processing on the second image according to the fusion gray scale at the at least one image position.
In one embodiment of the present disclosure, the first determining module 502 includes:
a first determining unit, configured to determine, according to the image brightness of each frame of the first image at the same image position, a brightness interval to which each frame of the first image belongs at the same image position;
And the second determining unit is used for determining the weight of the first image of each frame at the same image position according to the corresponding relation between the brightness interval and the weight.
In one embodiment of the present disclosure, the luminance interval of the multi-frame first image includes: a high brightness section and a low brightness section divided in brightness order;
when the brightness of the image is lower than a first threshold value, the weight corresponding to the low brightness interval is smaller than the weight corresponding to the high brightness interval.
In one embodiment of the present disclosure, the luminance interval of the multi-frame first image includes: a high brightness section and a low brightness section divided in brightness order;
when the brightness of the image is higher than the second threshold value, the weight corresponding to the high brightness interval is smaller than the weight corresponding to the low brightness interval.
In one embodiment of the present disclosure, the first determining module 502 further includes:
the first processing unit is used for partitioning the first image of each frame in the same partitioning mode to obtain a plurality of sub-areas;
and the third determining unit is used for determining a brightness average value of the pixel points in the same subarea aiming at the first image of any frame so as to take the brightness average value as the brightness of the corresponding subarea.
In one embodiment of the present disclosure, the processing module 504 includes:
The clustering unit is used for clustering the fusion gray scales at the positions of the images so as to determine the center gray scale;
a fourth determining unit for determining a white balance gain value according to a difference between the center gray and the reference gray;
and a second processing unit configured to perform white balance processing on the second image based on the white balance gain value.
In one embodiment of the present disclosure, the second image is an image acquired in response to a photographing operation, and the multi-frame first image is an image acquired after the second image is acquired.
According to the image processing device of the embodiment of the present disclosure, multiple frames of first images are acquired at different exposure degrees, the multiple frames of first images being reference images for performing white balance processing on a captured second image. The weights of the frames of first images at the same image position are determined according to the image brightness, the gray scales of the frames of first images are weighted according to these weights to determine the fusion gray scale at at least one image position, and white balance processing is performed on the second image according to the fusion gray scale at the at least one image position. This improves the accuracy of white balance correction, so that the captured image is closer to its real colors and the display effect of the image is improved.
According to a third aspect of embodiments of the present disclosure, there is also provided an electronic device, including: a processor; a memory for storing processor-executable instructions, wherein the processor is configured to execute the instructions to implement the image processing method as above.
In order to implement the above-described embodiments, the present disclosure also proposes a storage medium.
Wherein the instructions in the storage medium, when executed by a processor of the electronic device, enable the electronic device to perform the method as above.
To achieve the above embodiments, the present disclosure also provides a computer program product.
Wherein the computer program product, when executed by a processor of an electronic device, enables the electronic device to perform the method as above.
Fig. 6 is a block diagram of an electronic device, according to an example embodiment. The electronic device shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 6, the electronic device 1000 includes a processor 111 that can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 112 or a program loaded from a memory 116 into a Random Access Memory (RAM) 113. The RAM 113 also stores various programs and data required for the operation of the electronic device 1000. The processor 111, the ROM 112, and the RAM 113 are connected to each other through a bus 114. An Input/Output (I/O) interface 115 is also connected to the bus 114.
The following components are connected to the I/O interface 115: a memory 116 including a hard disk and the like; and a communication section 117 including a network interface card such as a Local Area Network (LAN) card, a modem, or the like, the communication section 117 performing communication processing via a network such as the Internet. A drive 118 is also connected to the I/O interface 115 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program embodied on a computer readable medium, the computer program containing program code for performing the methods shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from the network through the communication section 117. The above-described functions defined in the methods of the present disclosure are performed when the computer program is executed by the processor 111.
In an exemplary embodiment, a storage medium is also provided, such as a memory, comprising instructions executable by the processor 111 of the electronic device 1000 to perform the above-described method. Alternatively, the storage medium may be a non-transitory computer readable storage medium, for example, a ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
All embodiments of the disclosure may be performed alone or in combination with other embodiments and are considered to be within the scope of the disclosure as claimed.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

  1. An image processing method, wherein the method comprises:
    acquiring a plurality of frames of first images, wherein the plurality of frames of first images are acquired according to different exposure degrees, and the plurality of frames of first images are reference images for performing white balance processing on a shot second image;
    for the multiple frames of first images, determining weights of the multiple frames of first images at the same image position according to the image brightness;
    weighting the gray scale of the first image of each frame according to the weight of the first image of the plurality of frames at the same image position so as to determine the fusion gray scale at least one image position;
    and performing white balance processing on the second image according to the fusion gray level at the at least one image position.
  2. The method of claim 1, wherein the determining weights for the plurality of frames of the first image at the same image location based on image brightness comprises:
    Determining a brightness interval of the first image of each frame at the same image position according to the image brightness of the first image of each frame at the same image position;
    and determining the weight of the first image at the same image position of each frame according to the corresponding relation between the brightness interval and the weight.
  3. The method of claim 2, wherein the brightness interval of the plurality of frames of the first image comprises: a high brightness section and a low brightness section divided in brightness order;
    and when the brightness of the image is lower than a first threshold value, the weight corresponding to the low brightness interval is smaller than the weight corresponding to the high brightness interval.
  4. The method of claim 2, wherein the brightness interval of the plurality of frames of the first image comprises: a high brightness section and a low brightness section divided in brightness order;
    and when the brightness of the image is higher than a second threshold value, the weight corresponding to the high brightness interval is smaller than the weight corresponding to the low brightness interval.
  5. The method of claim 2, wherein determining, from the image brightness of the first image at the same image position for each frame, a brightness interval to which the first image at the same image position for each frame belongs, further comprises:
    The first image of each frame is partitioned in the same partitioning mode, so that a plurality of subareas are obtained;
    and determining a brightness average value of pixel points in the same subarea aiming at the first image of any frame, so as to take the brightness average value as the brightness of the corresponding subarea.
  6. The method according to any one of claims 1-5, wherein the performing white balance processing on the second image according to the fusion gray scale at the at least one image location comprises:
    clustering the fusion gray scales at the positions of the images to determine a center gray scale;
    determining a white balance gain value according to the difference between the center gray scale and the reference gray scale;
    and performing white balance processing on the second image based on the white balance gain value.
  7. The method of any of claims 1-5, wherein the second image is an image acquired in response to a capture operation and the multi-frame first image is an image acquired after acquisition of the second image.
  8. An image processing apparatus, comprising:
    an acquisition module configured to acquire a plurality of frames of first images, wherein the plurality of frames of first images are acquired at different exposure levels and serve as reference images for performing white balance processing on a captured second image;
    a first determining module configured to determine, according to image brightness, weights of the plurality of frames of first images at a same image position;
    a second determining module configured to weight the gray scale of the first image of each frame according to the weights of the plurality of frames of first images at the same image position, so as to determine a fused gray scale at at least one image position;
    and a processing module configured to perform white balance processing on the second image according to the fused gray scale at the at least one image position.
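The weighted-fusion step performed by the second determining module can be sketched as a per-position weighted average across the exposure frames. The array layout below (per-frame RGB grays plus a per-position weight map) is an assumption for illustration; the patent does not fix a data representation.

```python
import numpy as np

def fuse_grays(frames, weights):
    """Weighted fusion of per-position gray values across exposure frames.

    frames  : list of (H, W, 3) arrays, one per exposure (assumed layout)
    weights : list of (H, W) arrays, per-frame weight at each position
    Returns the fused gray at each position, normalised by the weight sum.
    """
    num = np.zeros_like(frames[0], dtype=float)
    den = np.zeros(frames[0].shape[:2], dtype=float)
    for frame, w in zip(frames, weights):
        num += frame * w[..., None]  # accumulate weighted grays
        den += w                     # accumulate weights per position
    return num / np.maximum(den, 1e-6)[..., None]
```

Positions where one exposure is better exposed receive a larger weight for that frame, so its gray dominates the fused value at that position.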
  9. The apparatus of claim 8, wherein the first determining module comprises:
    a first determining unit configured to determine, according to the image brightness of the first image of each frame at the same image position, the brightness interval to which the first image of each frame belongs at the same image position;
    and a second determining unit configured to determine the weight of the first image of each frame at the same image position according to a correspondence between brightness intervals and weights.
  10. The apparatus of claim 9, wherein the brightness intervals of the plurality of frames of first images comprise a high-brightness interval and a low-brightness interval divided in order of brightness;
    and when the image brightness is lower than a first threshold, the weight corresponding to the low-brightness interval is smaller than the weight corresponding to the high-brightness interval.
  11. The apparatus of claim 9, wherein the brightness intervals of the plurality of frames of first images comprise a high-brightness interval and a low-brightness interval divided in order of brightness;
    and when the image brightness is higher than a second threshold, the weight corresponding to the high-brightness interval is smaller than the weight corresponding to the low-brightness interval.
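The interval-to-weight correspondence of claims 10 and 11 amounts to a small lookup conditioned on the overall scene brightness. The thresholds, split point, and weight values below are purely illustrative; the patent claims only the ordering of the weights, not concrete numbers.

```python
def interval_weight(region_brightness, scene_brightness,
                    low_high_split=128, first_threshold=64,
                    second_threshold=192, small=0.2, large=1.0):
    """Hypothetical weight table for the scheme in claims 10-11.

    All numeric parameters are assumed values, not from the patent."""
    in_low = region_brightness < low_high_split  # low- vs high-brightness interval
    if scene_brightness < first_threshold:
        # dark scene: the low-brightness interval gets the smaller weight
        return small if in_low else large
    if scene_brightness > second_threshold:
        # bright scene: the high-brightness interval gets the smaller weight
        return large if in_low else small
    return large  # mid-brightness scenes: equal weight (assumption)
```

In effect, the fusion distrusts the poorly exposed end of the brightness range: shadows in a dark scene, highlights in a bright one.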
  12. The apparatus of claim 9, wherein the first determining module further comprises:
    a first processing unit configured to partition the first image of each frame in the same partitioning manner to obtain a plurality of sub-regions;
    and a third determining unit configured to, for the first image of any frame, determine an average brightness of the pixels within the same sub-region and take the average brightness as the brightness of the corresponding sub-region.
  13. The apparatus of any one of claims 8-12, wherein the processing module comprises:
    a clustering unit configured to cluster the fused gray scales at the respective image positions to determine a center gray scale;
    a fourth determining unit configured to determine a white balance gain value according to a difference between the center gray scale and a reference gray scale;
    and a second processing unit configured to perform white balance processing on the second image based on the white balance gain value.
  14. The apparatus of any one of claims 8-12, wherein the second image is an image acquired in response to a capture operation, and the plurality of frames of first images are images acquired after the acquisition of the second image.
  15. An electronic device, comprising:
    a processor;
    a memory for storing instructions executable by the processor;
    wherein the processor is configured to execute the instructions to implement the image processing method of any one of claims 1 to 7.
  16. A non-transitory computer-readable storage medium having stored thereon instructions that, when executed by a processor, cause the processor to perform the image processing method of any one of claims 1 to 7.
CN202280004418.2A 2022-06-17 2022-06-17 Image processing method and device Pending CN117616775A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/099624 WO2023240651A1 (en) 2022-06-17 2022-06-17 Image processing method and apparatus

Publications (1)

Publication Number Publication Date
CN117616775A true CN117616775A (en) 2024-02-27

Family

Family ID: 89192927

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280004418.2A Pending CN117616775A (en) 2022-06-17 2022-06-17 Image processing method and device

Country Status (2)

Country Link
CN (1) CN117616775A (en)
WO (1) WO2023240651A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002218478A (en) * 2001-01-19 2002-08-02 Fuji Photo Film Co Ltd Method for controlling auto white balance for digital camera and digital camera
JP6521776B2 (en) * 2015-07-13 2019-05-29 オリンパス株式会社 Image processing apparatus, image processing method
CN105120247B (en) * 2015-09-10 2017-12-26 联想(北京)有限公司 A kind of white balance adjustment method and electronic equipment
WO2017217135A1 (en) * 2016-06-15 2017-12-21 ソニー株式会社 Image processing device, image processing method, and program
CN107959839B (en) * 2017-11-27 2019-07-30 努比亚技术有限公司 A kind of method of blank level adjustment, terminal and computer readable storage medium
US11445127B2 (en) * 2020-06-15 2022-09-13 Samsung Electronics Co., Ltd. Leveraging HDR sensors for handling mixed illumination auto white balance
CN112818732B (en) * 2020-08-11 2023-12-12 腾讯科技(深圳)有限公司 Image processing method, device, computer equipment and storage medium
US11736804B2 (en) * 2020-09-07 2023-08-22 Mediatek Inc. Method and apparatus for generating high dynamic range frame through white balance compensation that uses white balance gain table generated from combining multiple sets of white balance gain settings

Also Published As

Publication number Publication date
WO2023240651A1 (en) 2023-12-21

Similar Documents

Publication Publication Date Title
CN109218628B (en) Image processing method, image processing device, electronic equipment and storage medium
CN110619593B (en) Double-exposure video imaging system based on dynamic scene
CN109218627B (en) Image processing method, image processing device, electronic equipment and storage medium
CN109862282B (en) Method and device for processing person image
CN113992861B (en) Image processing method and image processing device
CN110839129A (en) Image processing method and device and mobile terminal
US20160071289A1 (en) Image composition device, image composition method, and recording medium
CN107040726B (en) Double-camera synchronous exposure method and system
CN108322646A (en) Image processing method, device, storage medium and electronic equipment
CN111064904A (en) Dark light image enhancement method
WO2012170462A2 (en) Automatic exposure correction of images
WO2020171305A1 (en) Apparatus and method for capturing and blending multiple images for high-quality flash photography using mobile electronic device
CN109587407A (en) Exposure amount adjustment method, device and the computer equipment of image taking
CN110264473B (en) Image processing method and device based on multi-frame image and electronic equipment
CN113643214B (en) Image exposure correction method and system based on artificial intelligence
CN116744120B (en) Image processing method and electronic device
WO2021128593A1 (en) Facial image processing method, apparatus, and system
CN113177438A (en) Image processing method, apparatus and storage medium
CN113691724A (en) HDR scene detection method and device, terminal and readable storage medium
WO2018077156A1 (en) Systems and methods for exposure control
CN114007020B (en) Image processing method and device, intelligent terminal and computer readable storage medium
CN114429476A (en) Image processing method, image processing apparatus, computer device, and storage medium
CN110971833A (en) Image processing method and device, electronic equipment and storage medium
CN117616775A (en) Image processing method and device
CN110891145A (en) Method for obtaining image by photographing and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination