CN116934607A - Image white balance processing method and device, electronic equipment and storage medium - Google Patents
Image white balance processing method and device, electronic equipment and storage medium
- Publication number
- CN116934607A (application number CN202210364883.9A)
- Authority
- CN
- China
- Prior art keywords
- color class
- image
- color
- white balance
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
Abstract
The disclosure relates to an image white balance processing method, an image white balance processing device, an electronic device, and a storage medium. The method comprises the following steps: acquiring an image to be processed; determining the color classes of the image to be processed and the pixels included in each color class; determining a pixel retention factor corresponding to each color class according to the pixels included in each color class; determining a white balance gain parameter according to the effective pixels included in each color class; and performing white balance processing on the image to be processed by using the white balance gain parameter. In this way, the richness of the sampling points and the probability of finding white points can be increased, the effectiveness of the white balance processing can be improved, the color cast in the image can be reduced, and the image quality can be improved.
Description
Technical Field
The disclosure relates to the technical field of image processing, and in particular relates to an image white balance processing method, an image white balance processing device, electronic equipment and a storage medium.
Background
Color constancy refers to the property that, although the color of the light reflected by an object changes when light sources of different colors illuminate its surface, human perception of the object's original surface color remains unchanged. The key to the color constancy of the human visual system is that the human eye can separate the scene content from the illumination in the observed image, based on accumulated experience of scenes, and judge the colors of the scene according to prior knowledge in the human brain.
A digital camera, however, does not have this property when taking pictures. Different illumination environments cause the colors of an image acquired by a digital camera to deviate to some extent from the colors of the real scene. For example, an image captured under an ambient color temperature of about 2500 K appears warmer, while an image captured above 6500 K appears cooler. The digital camera therefore needs to restore the image colors to the original colors using an appropriate color restoration matrix. The computation that assists a digital camera in eliminating the influence of the illumination environment on the displayed colors is called automatic white balance (AWB, Automatic White Balance).
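As a brief illustration of this idea (a generic sketch under a simple diagonal, per-channel gain model, not the specific method of this disclosure), white balance gain parameters can be applied to each pixel's R, G and B components independently:

```python
import numpy as np

def apply_white_balance(image: np.ndarray, k_r: float, k_g: float, k_b: float) -> np.ndarray:
    """Apply per-channel white balance gains to an H x W x 3 RGB image.
    This is the diagonal (per-channel) correction model; values are clipped
    to the 8-bit range."""
    gains = np.array([k_r, k_g, k_b], dtype=np.float64)
    corrected = image.astype(np.float64) * gains
    return np.clip(corrected, 0, 255).astype(np.uint8)
```

The later sketches in the detailed description reuse this helper to apply the gains they estimate.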
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides an image white balance processing method, an apparatus, an electronic device, and a storage medium.
According to a first aspect of an embodiment of the present disclosure, there is provided an image white balance processing method, including:
acquiring an image to be processed;
determining the color class of the image to be processed and the pixels included in each color class;
determining a pixel retention factor corresponding to each color class according to the pixels included in each color class;
Determining a white balance gain parameter according to the effective pixels included in each color class;
and performing white balance processing on the image to be processed by using the white balance gain parameter.
Optionally, the determining the pixel retention factor corresponding to each color class according to the pixel included in each color class includes:
determining a ratio of the number of pixels included in each color class to the total number of pixels included in each color class according to the number of pixels included in each color class, wherein the total number of pixels is the sum of the numbers of pixels included in each color class;
and determining a pixel retention factor corresponding to each color class according to the ratio of the number of pixels included in each color class to the total number of pixels.
Optionally, before determining the pixel retention factor corresponding to each color class according to the ratio of the number of pixels included in each color class to the total number of pixels, the method further includes:
and determining that the color category with the ratio being greater than or equal to a preset threshold value exists according to the ratio of the number of pixels included in each color category to the total number of pixels.
Optionally, the determining the pixel retention factor corresponding to each color class according to the ratio of the number of pixels included in each color class to the total number of pixels includes:
The pixel retention factor corresponding to each color class is determined by the following formula:
where d_i denotes the pixel retention factor corresponding to the i-th color class, c_i denotes the ratio of the number of pixels included in the i-th color class to the total number of pixels, i takes values in [1, n], and n is the number of color classes.
Optionally, the pixel retention factor corresponding to each color class is inversely proportional to the number of pixels included in that color class.
Optionally, the determining the white balance gain parameter according to the pixel retention factor corresponding to each color class includes:
for each color class, determining effective pixels included in the color class according to pixels included in the color class and corresponding pixel retention factors;
and determining a white balance gain parameter according to a preset white balance algorithm and the effective pixels included in each color class.
Optionally, for each color class, determining, according to the amount of pixels included in the color class and the corresponding pixel retention factors, the effective pixels included in the color class includes:
for each color class, multiplying the number of pixels included in the color class by a corresponding pixel retention factor to obtain a target number, and determining the target number of pixels as effective pixels in the pixels included in the color class.
Optionally, the determining the color class of the image to be processed and the pixel included in each color class includes:
dividing the image to be processed into a plurality of image blocks, and respectively calculating the average pixel value corresponding to each image block;
and clustering the image to be processed by adopting a preset clustering algorithm according to the average pixel value corresponding to each image block to obtain the color class of the image to be processed and the pixels included in each color class.
According to a second aspect of the embodiments of the present disclosure, there is provided an image white balance processing apparatus including:
an acquisition module configured to acquire an image to be processed;
a first determining module configured to determine a color class of the image to be processed and pixels included in each color class;
a second determining module configured to determine a pixel retention factor corresponding to each color class according to pixels included in each color class, wherein the pixel retention factor corresponding to each color class is inversely proportional to the number of pixels included in the color class;
the third determining module is configured to determine a white balance gain parameter according to the pixel retention factor corresponding to each color class;
And the processing module is configured to perform white balance processing on the image to be processed by utilizing the white balance gain parameter.
Optionally, the second determining module includes:
a first determining submodule configured to determine a ratio of the number of pixels included in each color category to a total number of pixels included in each color category, wherein the total number of pixels is a sum of the numbers of pixels included in each color category;
and the second determining submodule is configured to determine a pixel retention factor corresponding to each color category according to the ratio of the number of pixels included in each color category to the total number of pixels.
Optionally, the second determining module further includes:
and a third determining sub-module configured to determine, according to a ratio of the number of pixels included in each color category to the total number of pixels, a color category having a ratio greater than or equal to a preset threshold.
Optionally, the second determination submodule is configured to: the pixel retention factor corresponding to each color class is determined by the following formula:
where d_i denotes the pixel retention factor corresponding to the i-th color class, c_i denotes the ratio of the number of pixels included in the i-th color class to the total number of pixels, i takes values in [1, n], and n is the number of color classes.
Optionally, the pixel retention factor corresponding to each color class is inversely proportional to the number of pixels included in that color class.
Optionally, the third determining module includes:
a fourth determination submodule configured to determine, for each color category, valid pixels included in the color category from pixels included in the color category and their corresponding pixel retention factors;
and a fifth determining sub-module configured to determine a white balance gain parameter according to a preset white balance algorithm and effective pixels included in each color class.
Optionally, the fourth determination submodule is configured to: for each color class, multiplying the number of pixels included in the color class by a corresponding pixel retention factor to obtain a target number, and determining the target number of pixels as effective pixels in the pixels included in the color class.
Optionally, the first determining module includes:
the segmentation submodule is configured to segment the image to be processed into a plurality of image blocks and respectively calculate average pixel values corresponding to each image block;
and the clustering sub-module is configured to perform clustering processing on the image to be processed by adopting a preset clustering algorithm according to the average pixel value corresponding to each image block to obtain the color class of the image to be processed and the pixels included in each color class.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring an image to be processed;
determining the color class of the image to be processed and the pixels included in each color class;
determining a pixel retention factor corresponding to each color class according to the pixels included in each color class;
determining a white balance gain parameter according to the pixel retention factor corresponding to each color class;
and performing white balance processing on the image to be processed by using the white balance gain parameter.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the method of any of the first aspects of the present disclosure.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
With this technical solution, the pixel retention factor corresponding to each color class can be determined according to the pixels included in that color class, and the white balance gain parameter can be determined according to the pixel retention factors. In this way, the richness of the sampling points and the probability of finding white points can be increased, the effectiveness of the white balance processing can be improved, the color cast in the image can be reduced, and the image quality can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating an image processing procedure according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating a method of image white balance processing according to an exemplary embodiment.
FIG. 3 is a schematic diagram illustrating one method of determining a color class of an image to be processed, according to an example embodiment.
Fig. 4 is a schematic diagram showing an R/G, B/G distribution of pixel sampling points in an image to be processed according to an exemplary embodiment.
Fig. 5 is a schematic diagram showing the R/G, B/G distribution of pixel sampling points in a new image stitched from effective pixels, according to an exemplary embodiment.
Fig. 6 is a flowchart illustrating a step S23 according to an exemplary embodiment.
Fig. 7 is a flowchart illustrating another image white balance processing method according to an exemplary embodiment.
Fig. 8 is a block diagram illustrating an image white balance processing apparatus according to an exemplary embodiment.
Fig. 9 is a block diagram of an electronic device, according to an example embodiment.
Fig. 10 is a block diagram of an electronic device, according to an example embodiment.
Detailed Description
In the related art, white balance processing methods generally fall into two categories: statistics-based methods and learning-based methods. The former make assumptions about the statistical properties of natural scenes and estimate the color of the light source from the deviations observed relative to these assumptions. The latter estimate the color of the light source using models learned from training data. A learning-based white balance method extracts features from an input image and trains a discriminative model; this category includes deep learning methods that learn features with deep convolutional neural networks. The main difficulty of deep-learning-based white balance is the lack of large-scale data sets labeled with real light sources to serve as references; stacking multiple data sets for training may degrade performance on the individual data sets, and the original model must be continually re-tuned to adapt to new data sets. Deep-learning-based white balance algorithms are therefore not yet mature and have seen little practical application.
The statistics-based white balance methods are widely used in industrial applications because they are computationally simple and highly interpretable. Common statistics-based white balance methods include the gray world (Gray World) method and the white patch (White Patch) method, which estimate the light source at the time of shooting through certain approximate assumptions. If more accurate ambient light source information can be estimated, the original image can be processed by computing pixel gain parameters (i.e., white balance gain parameters) so as to restore the image colors.
The gray world algorithm is based on the gray world assumption: for an image with sufficiently rich color variation, the averages of the three RGB color channel components tend toward the same gray value. At the physical level, the gray world algorithm assumes that the average reflectance of a natural scene is, on the whole, a constant value that is approximately "gray", i.e., achromatic, and the algorithm works by estimating this average reflectance of the object surfaces. Since the average color of a scene under a white light source is achromatic, any deviation of the average color from gray is attributed to the light source. Applying this assumption to the image to be processed, the influence of the ambient light can be eliminated from the image to obtain the original scene image, i.e., the scene image with the influence of the ambient light removed.
The white compensation algorithm no longer assumes that the whole scene is gray; it uses only the pixels that are themselves achromatic as the basis for the gain calculation. The algorithm looks for the "white points" in the image and treats these pixels as "mirrors". It is assumed that a "mirror" in the image totally reflects the light that the light source casts on the object, so that if a "mirror" exists in the image, its observed color under the current light source can be taken as the information of that light source. The method first detects the gray pixels in the picture; the averages of the gray pixel colors are equal in every channel component and can be regarded as the white point (gray at its highest brightness is white). On the assumption that this yields the current light source information, the pixel gain parameters are then calculated and the original image is processed accordingly, achieving the purpose of restoring the image colors.
Since the gray world algorithm averages the RGB values over all pixels, it is suitable for scenes with rich colors. In a scene dominated by a single color, however, the captured scene is prone to color cast, for example when shooting a large area of green plants or a loess field. That is, if the image contains a large-area monochromatic scene, the result of the white balance algorithm is distorted. The white compensation method, in turn, may fail because white points may be absent from a large-area monochromatic scene. Therefore, the related art cannot effectively perform white balance processing on images of large-area monochromatic scenes, so that the colors presented in the image differ greatly from the colors seen by the human eye and the image quality is poor.
In view of this, the present disclosure provides an image white balance processing method, apparatus, electronic device, and storage medium, so as to reduce chromatic aberration in an image and improve image quality.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
It should be noted that, all actions of acquiring signals, information or data in the present application are performed under the condition of conforming to the corresponding data protection rule policy of the country of the location and obtaining the authorization given by the owner of the corresponding device.
Before describing in detail the image white balance processing method provided by the present disclosure, first, an image processing procedure in the present disclosure will be described.
Fig. 1 is a flowchart illustrating an image processing procedure according to an exemplary embodiment. As shown in fig. 1, the image processing procedure is: (1) capture an image using a sensor with a color filter array; (2) color filtering, to filter out the infrared light in the image acquired by the sensor; (3) exposure; (4) demosaicing; (5) noise reduction; (6) white balance processing of the image; (7) image color processing; (8) rendering; (9) compression; (10) storage. The present disclosure mainly details the image white balance processing step of this image processing procedure.
Fig. 2 is a flowchart illustrating a method of image white balance processing according to an exemplary embodiment. As shown in fig. 2, the method includes the following steps.
In step S21, an image to be processed is acquired.
It should be understood at the outset that, in the present disclosure, the execution subject of the image white balance processing method may be a server, a terminal device, an image processing unit having an image processing function, or the like. The execution subject can acquire the image to be processed from an electronic device, or locally, by wired or wireless means. The image to be processed is an RGB image on which image white balance processing is to be performed. The image to be processed may be an image generated by shooting, or a frame image obtained by the camera scanning in real time in a shooting mode. The image to be processed may be an original image, for example a raw image file, a color image rendered from the original image, or an image obtained after processing such as an erosion operation, which is not specifically limited in this disclosure.
In step S22, the color class of the image to be processed and the pixels included in each color class are determined.
In the present disclosure, the color class of the image to be processed and the pixels included in each color class may be determined by a clustering algorithm. The specific implementation manner of step S22 may be:
firstly, dividing an image to be processed into a plurality of image blocks, and respectively calculating an average pixel value corresponding to each image block. For each image block, a red component average value is calculated from a red component of each pixel included in the image block on an R channel, a green component average value is calculated from a green component of each pixel included in the image block on a G channel, and a blue component average value is calculated from a blue component of each pixel included in the image block on a B channel, and then an average pixel value corresponding to the image block is determined from the red component average value, the green component average value, and the blue component average value. Thus, an average pixel value corresponding to each image block can be obtained.
And then, clustering the image to be processed by adopting a preset clustering algorithm according to the average pixel value corresponding to each image block to obtain the color class of the image to be processed and the pixels included in each color class.
For example, as shown in fig. 3, the image to be processed is the image on the left of the figure. The image to be processed is divided into 32×32 image blocks (as shown in the middle image in fig. 3), and the average pixel value corresponding to each image block is calculated in the above manner, so as to obtain 1024 average pixel values. Assume that the preset clustering algorithm is the K-means algorithm and that K is 4. The 1024 average pixel values are then clustered according to the K-means algorithm, and the clustering result is shown in the rightmost image in fig. 3, i.e., from left to right the first color class, the second color class, the third color class, and the fourth color class. The clustering result also indicates the pixels included in each color class.
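The clustering step can be sketched as follows; the use of scikit-learn's KMeans is an illustrative choice, since the disclosure only requires "a preset clustering algorithm":

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_blocks(block_means: np.ndarray, k: int = 4) -> np.ndarray:
    """Cluster the (num_blocks, 3) block-average colors into k color classes
    and return one class label per block."""
    kmeans = KMeans(n_clusters=k, n_init=10, random_state=0)
    return kmeans.fit_predict(block_means)
```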
In step S23, a pixel retention factor corresponding to each color class is determined according to the pixel included in each color class.
If the number of pixels included in a certain color class is large, the image to be processed is likely to contain a large-area monochromatic scene, which affects the effect of the white balance processing. To ensure the white balance processing effect, part of the pixels included in a color class with a large number of pixels needs to be removed. Therefore, in the present disclosure, a pixel retention factor corresponding to each color class is determined, where the retention factor characterizes the proportion of the number of effective pixels retained in the color class to the number of pixels included in the color class.
Illustratively, the pixel retention factor corresponding to a color class is inversely proportional to the number of pixels included in the color class: the more pixels a color class contains, the smaller its pixel retention factor, and the fewer pixels it contains, the larger its pixel retention factor.
In step S24, a white balance gain parameter is determined according to the pixel retention factor corresponding to each color class.
In step S25, white balance processing is performed on the image to be processed using the white balance gain parameter.
With this technical solution, the pixel retention factor corresponding to each color class can be determined according to the pixels included in that color class, and the white balance gain parameter can be determined according to the pixel retention factors. In this way, the richness of the sampling points and the probability of finding white points can be increased, the effectiveness of the white balance processing can be improved, the color cast in the image can be reduced, and the image quality can be improved.
In one possible manner, the specific implementation manner of step S24 is as follows:
first, for each color class, the valid pixels included in the color class are determined from the pixels included in the color class and their corresponding pixel retention factors.
For each color class, the number of pixels included in the color class is multiplied by the corresponding pixel retention factor to obtain a target number, and the target number of pixels is determined as effective pixels among the pixels included in the color class.
It should be appreciated that in the present disclosure, for each color class, a target number of pixels may be randomly selected among the pixels included in the color class as valid pixels, with the remaining number of pixels discarded to reduce the number of pixels included in the color class. For example, if the number of pixels included in the color class is 100 and the corresponding pixel retention factor is 50%, the target number is 50, that is, 50 pixels are randomly selected as effective pixels from the pixels included in the color class, and the other 50 pixels are discarded.
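A sketch of this selection, together with the stitching into a "new image" described below (assuming each class's pixels are available as arrays and that the stitching simply concatenates the retained pixels into one sample set):

```python
import numpy as np

def select_effective_pixels(class_pixels, retention_factors, rng=None):
    """class_pixels: list of (Ni, 3) arrays, one per color class.
    retention_factors: list of factors d_i in [0, 1].
    Randomly keeps int(Ni * d_i) pixels per class and concatenates the
    retained pixels of all classes into one array ("new image" samples)."""
    rng = rng or np.random.default_rng(0)
    kept = []
    for pixels, d in zip(class_pixels, retention_factors):
        target = int(len(pixels) * d)              # target number of effective pixels
        idx = rng.choice(len(pixels), size=target, replace=False)
        kept.append(pixels[idx])
    return np.concatenate(kept, axis=0)
```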
And then, determining a white balance gain parameter according to a preset white balance algorithm and the effective pixels included in each color class.
In the present disclosure, a new image may be stitched from the effective pixels included in each color class, and this new image is input to the white balance processing module. The white balance processing module determines the white balance gain parameter according to a preset white balance algorithm and the new image. The preset white balance algorithm may include a gray world algorithm and a white compensation algorithm. The calculation procedures of the gray world algorithm and the white compensation algorithm are described below in turn.
The gray world algorithm is calculated as follows:
1. Calculating average components of all pixels in the image on the red channel, the green channel and the blue channel respectively, for example, the average components on the red channel, the green channel and the blue channel can be obtained by the formula (1), and the average Gray value Gray of the image can be obtained by the formula (2):
where Nr, Ng and Nb denote the numbers of pixels on the red, green and blue channels, respectively, and Nr, Ng and Nb are equal; Ravg, Gavg and Bavg denote the average components of the pixels on the red, green and blue channels, respectively; and Rsensor(r), Rsensor(g) and Rsensor(b) denote the components of a pixel on the red, green and blue channels, respectively.
2. Assuming that any deviation of the average color from gray is caused by the light source, the white balance gain parameters k_r, k_g, k_b are calculated from the average gray value of the object surfaces, in order to eliminate the color cast in the image. The white balance gain parameters may be calculated according to formula (3):
3. Using the white balance gain parameters and formula (4), the components of each pixel on each channel after white balance processing are calculated to obtain a new R'G'B' image:
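A sketch of steps 1–3 in the standard gray world form; since formulas (1)–(4) are referenced above but not reproduced here, the gain definition k_c = Gray / Cavg for each channel c is an assumption:

```python
import numpy as np

def gray_world_gains(pixels: np.ndarray):
    """pixels: any array of RGB triples (H x W x 3 or N x 3).
    Returns gray-world gains (k_r, k_g, k_b) = Gray / (Ravg, Gavg, Bavg)."""
    r_avg, g_avg, b_avg = pixels.reshape(-1, 3).mean(axis=0)
    gray = (r_avg + g_avg + b_avg) / 3.0           # average gray value
    return gray / r_avg, gray / g_avg, gray / b_avg
```

The resulting gains can be applied with the apply_white_balance sketch shown in the background section to obtain the R'G'B' image.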
the white compensation algorithm is to assume that a mirror surface capable of completely reflecting a light source exists on an image when calibration is performed, and then under a standard light source, if a pure white pixel point with tristimulus values of [255, 255, 255] exists in the image (wherein, a plurality of definitions of white pixel points can exist), the light rays irradiated on the surface of an object by the current light source can be reflected according to the assumption. The specific algorithm is as follows:
1. In the image, the maximum gray values Rmax, gmax, bmax of the three channels R, G and B are calculated respectively, as shown in the formula (5):
where max() is the function taking the maximum value, and Rsensor(r), Rsensor(g) and Rsensor(b) denote the components of a pixel on the red, green and blue channels, respectively.
2. According to the "mirror" reflection assumption, the white balance gain parameters k_r, k_g, k_b are calculated according to equation (6):
3. Using the white balance gain parameters and formula (4), the components of each pixel on each channel after white balance processing are calculated to obtain a new R'G'B' image.
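A corresponding sketch of the white compensation computation, assuming the usual white patch form in which each channel's maximum is scaled to the white level (formulas (5) and (6) are not reproduced here):

```python
import numpy as np

def white_patch_gains(pixels: np.ndarray, white_level: float = 255.0):
    """pixels: any array of RGB triples. Returns white-patch gains
    (k_r, k_g, k_b) = white_level / (Rmax, Gmax, Bmax)."""
    r_max, g_max, b_max = pixels.reshape(-1, 3).max(axis=0).astype(np.float64)
    return white_level / r_max, white_level / g_max, white_level / b_max
```

As with the gray world gains, these can be applied with the apply_white_balance sketch above.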
Fig. 4 is a schematic diagram showing an R/G, B/G distribution of pixel sampling points in an image to be processed according to an exemplary embodiment. The R/G, B/G coordinates of the pixel sample points in the image to be processed in FIG. 4 are mostly concentrated at coordinates (1, 1), i.e., the average of the components on the R, G, B three channels tends to the same gray scale value.
The white balance processing module divides the new image into M×N image blocks, each of which is a color sampling block. The new image lowers the resolution while retaining the amount of RGB information, which strengthens the ability to capture the local characteristics of each color sampling block. Fig. 5 is a schematic diagram showing the R/G, B/G distribution of pixel sampling points in a new image stitched from effective pixels, according to an exemplary embodiment. As shown in fig. 5, compared with fig. 4, the R/G, B/G distribution of the pixel sampling points in the stitched new image is more dispersed and yields more scene color sampling points, so that the gray value estimated by the gray world algorithm is more accurate, the probability of finding more white points increases, and the effectiveness of the white compensation algorithm is ensured.
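For reference, the R/G and B/G coordinates plotted in figs. 4 and 5 can be computed per sampling block as below (an illustrative sketch, not a procedure stated in the text):

```python
import numpy as np

def rg_bg_coordinates(block_means: np.ndarray) -> np.ndarray:
    """block_means: (num_blocks, 3) per-block average (R, G, B).
    Returns a (num_blocks, 2) array of (R/G, B/G) coordinates."""
    r, g, b = block_means[:, 0], block_means[:, 1], block_means[:, 2]
    g = np.where(g == 0, 1e-6, g)                  # guard against division by zero
    return np.stack([r / g, b / g], axis=1)
```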
A specific embodiment of determining the pixel retention factor is described below. Fig. 6 is a flowchart illustrating a step S23 according to an exemplary embodiment. As shown in fig. 6, step S23 in fig. 2 may include step S231 and step S232.
In step S231, a ratio of the number of pixels included in each color class to the total number of pixels is determined according to the number of pixels included in each color class. Wherein the total number of pixels is the sum of the number of pixels included in each color class.
By way of example, following the example shown in fig. 3, the image to be processed is divided into 32×32 image blocks, 1024 average pixel values are obtained in the above manner, and the 1024 average pixel values are clustered by the K-means algorithm into four color classes. Let the number of pixels included in the first color class be s_1, so that the ratio of the first color class is c_1 = s_1/1024; the second color class includes s_2 pixels, with ratio c_2 = s_2/1024; the third color class includes s_3 pixels, with ratio c_3 = s_3/1024; and the fourth color class includes s_4 pixels, with ratio c_4 = s_4/1024, where s_1 + s_2 + s_3 + s_4 = 1024.
In step S232, a pixel retention factor corresponding to each color class is determined according to a ratio of the number of pixels included in each color class to the total number of pixels.
In one embodiment, a correspondence between the ratio of the number of pixels included in a color class to the total number of pixels (hereinafter, the ratio corresponding to the color class) and the pixel retention factor may be preset. The correspondence is inverse, i.e., the larger the ratio corresponding to the color class, the smaller the pixel retention factor. For example, when the ratio corresponding to the color class is 80% or more, the pixel retention factor lies in the range [0, 20%]; when the ratio corresponding to the color class is 60% or more and less than 80%, the pixel retention factor lies in the range (20%, 40%]; and so on, up to a pixel retention factor range of (80%, 100%].
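Such a preset correspondence could be sketched as a step table; the breakpoints below simply continue the pattern of the two ranges given above and are illustrative assumptions rather than values stated in the text:

```python
def retention_factor_from_ratio(ratio: float) -> float:
    """Map a color-class ratio to a retention factor via a preset,
    inversely related step table (illustrative breakpoints)."""
    table = [(0.8, 0.2), (0.6, 0.4), (0.4, 0.6), (0.2, 0.8)]
    for ratio_threshold, factor in table:
        if ratio >= ratio_threshold:
            return factor
    return 1.0
```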
Further, considering that over the range (0, n) the Tanh function (hyperbolic tangent function) suppresses small values while preserving larger ones, in another embodiment the pixel retention factor corresponding to each color class may be determined using the Tanh function. Illustratively, the pixel retention factor corresponding to each color class can be determined by equation (7):
where d_i denotes the pixel retention factor corresponding to the i-th color class, c_i denotes the ratio of the number of pixels included in the i-th color class to the total number of pixels, i takes values in [1, n], and n is the number of color classes.
For example, assume that the ratios corresponding to the color classes are c_1 = 65%, c_2 = 20%, c_3 = 10% and c_4 = 5%; the pixel retention factors determined according to the above formula (7) are then d_1 = 49.1%, d_2 = 99.9%, d_3 = 99.9% and d_4 = 99.9%. These values also illustrate that, if the product of a pixel retention factor and the number of pixels is not an integer, it may be rounded up or rounded down according to a preset rule.
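Equation (7) itself is not reproduced in this text. As a purely illustrative check (an assumption, not the formula stated in the filing), a Tanh-based mapping such as d_i = tanh((1 − c_i)/c_i) has the described behavior and approximately reproduces the example values above:

```python
import math

def retention_factor(ratio: float) -> float:
    """Conjectured Tanh-based retention factor d_i = tanh((1 - c_i) / c_i).
    NOTE: equation (7) is not reproduced in the text; this form is only an
    assumption that matches the stated example values up to rounding."""
    return math.tanh((1.0 - ratio) / ratio)

for c in (0.65, 0.20, 0.10, 0.05):
    print(f"c = {c:.2f}  ->  d = {retention_factor(c):.3f}")
# prints roughly 0.492, 0.999, 1.000, 1.000 (the text states 49.1%, 99.9%, 99.9%, 99.9%)
```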
It should be understood that the effective pixels included in each color class are obtained through the above scheme and are stitched into a new image. The relationship between the size of the new image, Size_new image, and the size of the image to be processed, Size_image to be processed, is shown in formula (8):
Size_new image = Size_image to be processed × (c_1·d_1 + c_2·d_2 + ... + c_n·d_n)   (8)
Furthermore, in one embodiment, the above-described manner of determining the pixel retention factor may be performed on any image. However, in view of the fact that the above-mentioned gray world algorithm and/or white compensation algorithm can be directly used for white balance processing for color rich scenes, in order to reduce the workload and simplify the complexity of the white balance processing, in another embodiment, the above-mentioned manner of determining the pixel retention factor and the subsequent manner of determining the white balance gain parameter according to the pixel retention factor are performed only on images containing large-area solid-color scenes.
Accordingly, as shown in fig. 6, before step S232 is performed, step S23 further includes step S233.
In step S233, it is determined that there is a color class having a ratio greater than or equal to a preset threshold according to the ratio of the number of pixels included in each color class to the total number of pixels.
In the present disclosure, if there is a color class with a ratio greater than or equal to the preset threshold, it is determined that the image to be processed contains a large-area monochromatic scene, and step S232 is then performed.
Fig. 7 is a flowchart illustrating another image white balance processing method according to an exemplary embodiment. As shown in fig. 7, first, an image to be processed is input, for example to an execution subject of the image white balance processing method. Then, whether the image to be processed contains a large-area monochromatic scene is identified. If no large-area monochromatic scene is contained, the white balance gain parameter is determined directly using a white balance algorithm. If a large-area monochromatic scene is contained, the pixel retention factor corresponding to each color class of the image to be processed is determined, the effective pixels are determined according to the pixel retention factors, and image stitching is performed. The white balance gain parameter is then determined using the white balance algorithm. Finally, white balance processing is performed on the image to be processed.
With this scheme, for an image to be processed that contains a large-area monochromatic scene, the pixel retention factor corresponding to each color class is determined in the manner described above, the effective pixels included in each color class are determined according to the pixel retention factors, and the white balance gain parameter is then determined according to the effective pixels. In this way, the effectiveness of the white balance processing is improved while the workload of the white balance processing is reduced.
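Putting the pieces of fig. 7 together, an end-to-end sketch might look as follows; the function names reuse the illustrative sketches above, the gray world algorithm is chosen as the preset white balance algorithm, and the threshold value is an assumption:

```python
import numpy as np

def white_balance(image: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Illustrative end-to-end flow of fig. 7 (block averages stand in for pixels)."""
    means = block_average_pixels(image)                    # step S22: per-block averages
    labels = cluster_blocks(means)                         # step S22: color classes
    ratios = np.bincount(labels) / len(labels)             # ratio c_i of each class
    if ratios.max() < threshold:                           # no large-area monochromatic scene
        sample = means
    else:                                                  # steps S23/S24: keep effective pixels
        factors = [retention_factor_from_ratio(c) for c in ratios]
        class_pixels = [means[labels == i] for i in range(len(ratios))]
        sample = select_effective_pixels(class_pixels, factors)
    k_r, k_g, k_b = gray_world_gains(sample)               # white balance gain parameters
    return apply_white_balance(image, k_r, k_g, k_b)       # step S25
```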
Based on the same inventive concept, the present disclosure provides an image white balance processing apparatus. Fig. 8 is a block diagram illustrating an image white balance processing apparatus according to an exemplary embodiment. As shown in fig. 8, the image white balance processing apparatus 700 includes:
an acquisition module 701 configured to acquire an image to be processed;
a first determining module 702 configured to determine a color class of the image to be processed and a pixel included in each color class;
a second determining module 703 configured to determine a pixel retention factor corresponding to each color class according to the pixels included in each color class;
a third determining module 704 configured to determine a white balance gain parameter according to the pixel retention factor corresponding to each color class;
a processing module 705, configured to perform white balance processing on the image to be processed by using the white balance gain parameter.
Optionally, the second determining module 703 includes:
a first determining submodule configured to determine a ratio of the number of pixels included in each color category to a total number of pixels included in each color category, wherein the total number of pixels is a sum of the numbers of pixels included in each color category;
And the second determining submodule is configured to determine a pixel retention factor corresponding to each color category according to the ratio of the number of pixels included in each color category to the total number of pixels.
Optionally, the second determining module 703 further includes:
and a third determining sub-module configured to determine, according to a ratio of the number of pixels included in each color category to the total number of pixels, a color category having a ratio greater than or equal to a preset threshold.
Optionally, the second determination submodule is configured to: the pixel retention factor corresponding to each color class is determined by the following formula:
where d_i denotes the pixel retention factor corresponding to the i-th color class, c_i denotes the ratio of the number of pixels included in the i-th color class to the total number of pixels, i takes values in [1, n], and n is the number of color classes.
Optionally, the pixel retention factor corresponding to each color class is inversely proportional to the number of pixels included in that color class.
Optionally, the third determining module 704 includes:
a fourth determination submodule configured to determine, for each color category, valid pixels included in the color category from pixels included in the color category and their corresponding pixel retention factors;
And a fifth determining sub-module configured to determine a white balance gain parameter according to a preset white balance algorithm and effective pixels included in each color class.
Optionally, the fourth determination submodule is configured to: for each color class, multiplying the number of pixels included in the color class by a corresponding pixel retention factor to obtain a target number, and determining the target number of pixels as effective pixels in the pixels included in the color class.
Optionally, the first determining module 702 includes:
the segmentation submodule is configured to segment the image to be processed into a plurality of image blocks and respectively calculate average pixel values corresponding to each image block;
and the clustering sub-module is configured to perform clustering processing on the image to be processed by adopting a preset clustering algorithm according to the average pixel value corresponding to each image block to obtain the color class of the image to be processed and the pixels included in each color class.
The specific manner in which the various modules perform the operations in the apparatus of the above embodiments have been described in detail in connection with the embodiments of the method, and will not be described in detail herein.
The present disclosure also provides a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image white balance processing method provided by the present disclosure.
Fig. 9 is a block diagram of an electronic device, according to an example embodiment. For example, apparatus 800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 9, apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the apparatus 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of an image white balance processing method. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on the device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 806 provides power to the various components of the device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen between the device 800 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 800 is in an operational mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of the apparatus 800. For example, the sensor assembly 814 may detect an on/off state of the device 800, a relative positioning of the components, such as a display and keypad of the device 800, the sensor assembly 814 may also detect a change in position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, an orientation or acceleration/deceleration of the device 800, and a change in temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communication between the apparatus 800 and other devices, either in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi,2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing an image white balance processing method.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 804 including instructions executable by processor 820 of apparatus 800 to perform an image white balance processing method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
In another exemplary embodiment, a computer program product is also provided, comprising a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-described image white balance processing method when executed by the programmable apparatus.
Fig. 10 is a block diagram of an electronic device, according to an example embodiment. For example, the apparatus 1900 may be provided as a server. Referring to fig. 10, the apparatus 1900 includes a processing component 1922 that further includes one or more processors and memory resources represented by memory 1932 for storing instructions, such as application programs, that are executable by the processing component 1922. The application programs stored in memory 1932 may include one or more modules each corresponding to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform an image white balance processing method.
The apparatus 1900 may further include a power component 1926 configured to perform power management of the apparatus 1900, a wired or wireless network interface 1950 configured to connect the apparatus 1900 to a network, and an input/output (I/O) interface 1958. The apparatus 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (11)
1. An image white balance processing method, characterized by comprising:
acquiring an image to be processed;
determining the color class of the image to be processed and the pixel included in each color class;
determining a pixel retention factor corresponding to each color class according to the pixels included in each color class;
determining a white balance gain parameter according to the pixel retention factor corresponding to each color class;
and performing white balance processing on the image to be processed by using the white balance gain parameter.
2. The method of claim 1, wherein determining the pixel retention factor corresponding to each color class based on the pixels included in each color class comprises:
determining, according to the number of pixels included in each color class, a ratio of the number of pixels included in each color class to a total number of pixels, wherein the total number of pixels is the sum of the numbers of pixels included in all color classes;
and determining a pixel retention factor corresponding to each color class according to the ratio of the number of pixels included in each color class to the total number of pixels.
3. The method of claim 2, further comprising, prior to said determining the pixel retention factor for each color class based on the ratio of the number of pixels included in each color class to the total number of pixels:
and determining, according to the ratio of the number of pixels included in each color class to the total number of pixels, that there exists a color class whose ratio is greater than or equal to a preset threshold value.
4. A method according to claim 2 or 3, wherein determining the pixel retention factor corresponding to each color class according to the ratio of the number of pixels included in each color class to the total number of pixels comprises:
determining the pixel retention factor corresponding to each color class by the following formula:
wherein d_i characterizes the pixel retention factor corresponding to the i-th color class, c_i characterizes the ratio of the number of pixels included in the i-th color class to the total number of pixels, i takes values in [1, n], and n is the number of color classes.
5. A method according to any one of claims 1-3, wherein the pixel retention factor corresponding to each color class is inversely proportional to the number of pixels included in that color class.
6. A method according to any one of claims 1-3, wherein said determining a white balance gain parameter according to the pixel retention factor corresponding to each color class comprises:
for each color class, determining effective pixels included in the color class according to the pixels included in the color class and the corresponding pixel retention factor;
and determining a white balance gain parameter according to a preset white balance algorithm and the effective pixels included in each color class.
7. The method of claim 6, wherein, for each color class, determining the effective pixels included in the color class according to the pixels included in the color class and the corresponding pixel retention factor comprises:
for each color class, multiplying the number of pixels included in the color class by the corresponding pixel retention factor to obtain a target number, and determining, among the pixels included in the color class, the target number of pixels as the effective pixels.
8. A method according to any one of claims 1-3, wherein said determining the color classes of the image to be processed and the pixels included in each color class comprises:
dividing the image to be processed into a plurality of image blocks, and respectively calculating the average pixel value corresponding to each image block;
and clustering the image to be processed by adopting a preset clustering algorithm according to the average pixel value corresponding to each image block, to obtain the color classes of the image to be processed and the pixels included in each color class.
9. An image white balance processing apparatus, comprising:
an acquisition module configured to acquire an image to be processed;
a first determining module configured to determine color classes of the image to be processed and the pixels included in each color class;
a second determining module configured to determine a pixel retention factor corresponding to each color class according to the pixels included in each color class;
a third determining module configured to determine a white balance gain parameter according to the pixel retention factor corresponding to each color class;
and a processing module configured to perform white balance processing on the image to be processed by using the white balance gain parameter.
10. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring an image to be processed;
determining color classes of the image to be processed and the pixels included in each color class;
determining a pixel retention factor corresponding to each color class according to the pixels included in each color class;
determining a white balance gain parameter according to the pixel retention factor corresponding to each color class;
and performing white balance processing on the image to be processed by using the white balance gain parameter.
11. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the method of any of claims 1 to 8.
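To make the claimed flow easier to follow, the sketch below strings claims 1, 2, 4, 6, 7 and 8 together in Python. Everything the claims leave open is an assumption made here for illustration only: k-means stands in for the "preset clustering algorithm", d_i = min(c)/c_i is one inverse-proportional retention factor consistent with claim 5 (the patent's own formula is not reproduced above), the effective pixels are drawn as a random subset, and gray-world stands in for the "preset white balance algorithm". The function and parameter names are likewise hypothetical.

```python
# Illustrative sketch only; not the patented implementation.
import numpy as np
from sklearn.cluster import KMeans


def block_stats(image: np.ndarray, block: int = 16):
    """Tile the image into block x block patches; return per-tile mean colors and per-tile pixels."""
    h, w, _ = image.shape
    h, w = h - h % block, w - w % block
    tiles = (image[:h, :w]
             .reshape(h // block, block, w // block, block, 3)
             .transpose(0, 2, 1, 3, 4)            # (tiles_y, tiles_x, block, block, 3)
             .reshape(-1, block * block, 3))      # (n_tiles, pixels_per_tile, 3)
    return tiles.mean(axis=1), tiles


def white_balance_sketch(image: np.ndarray, n_classes: int = 4, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    img = image.astype(np.float32)

    # Claim 8: average pixel value per image block, then cluster the block averages.
    means, tile_pixels = block_stats(img)
    labels = KMeans(n_clusters=n_classes, n_init=10, random_state=seed).fit_predict(means)

    # Claim 2: ratio c_i of each color class's pixel count to the total pixel count.
    pixels_per_tile = tile_pixels.shape[1]
    counts = np.array([(labels == i).sum() * pixels_per_tile for i in range(n_classes)],
                      dtype=np.float64)
    counts = np.maximum(counts, 1.0)              # guard against an empty cluster
    ratios = counts / counts.sum()

    # Claims 4/5: retention factor d_i, here an assumed inverse-proportional form in (0, 1].
    d = ratios.min() / ratios

    # Claim 7: keep round(N_i * d_i) pixels of class i as the effective pixels (random subset assumed).
    kept = []
    for i in range(n_classes):
        pix = tile_pixels[labels == i].reshape(-1, 3)
        n_keep = int(round(len(pix) * d[i]))
        if n_keep:
            kept.append(pix[rng.choice(len(pix), size=n_keep, replace=False)])
    effective = np.concatenate(kept, axis=0)

    # Claim 6: gains from the effective pixels via gray-world, normalized to the green channel.
    mean_rgb = effective.mean(axis=0)
    gains = mean_rgb[1] / np.maximum(mean_rgb, 1e-6)

    # Claim 1: apply the white balance gain parameter to the image to be processed.
    return np.clip(img * gains, 0.0, 255.0).astype(np.uint8)
```

For an 8-bit RGB array img of shape (H, W, 3), white_balance_sketch(img) returns the balanced image; the block size of 16 and the choice of four color classes are illustrative defaults rather than values taken from the disclosure.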
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210364883.9A CN116934607A (en) | 2022-04-07 | 2022-04-07 | Image white balance processing method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210364883.9A CN116934607A (en) | 2022-04-07 | 2022-04-07 | Image white balance processing method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116934607A (en) | 2023-10-24 |
Family
ID=88377901
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210364883.9A | Image white balance processing method and device, electronic equipment and storage medium | 2022-04-07 | 2022-04-07 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116934607A (en) |
- 2022-04-07: CN CN202210364883.9A patent/CN116934607A/en active Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109345485B (en) | Image enhancement method and device, electronic equipment and storage medium | |
CN104517268B (en) | Adjust the method and device of brightness of image | |
WO2016011747A1 (en) | Skin color adjustment method and device | |
CN110958401B (en) | Super night scene image color correction method and device and electronic equipment | |
CN106131441B (en) | Photographing method and device and electronic equipment | |
CN108932696B (en) | Signal lamp halo suppression method and device | |
CN107730448B (en) | Beautifying method and device based on image processing | |
US11989863B2 (en) | Method and device for processing image, and storage medium | |
CN105528765A (en) | Method and device for processing image | |
CN106982327B (en) | Image processing method and device | |
CN106210446A (en) | saturation enhancement method and device | |
US11348365B2 (en) | Skin color identification method, skin color identification apparatus and storage medium | |
CN113472997B (en) | Image processing method and device, mobile terminal and storage medium | |
US20220405896A1 (en) | Image processing method and apparatus, model training method and apparatus, and storage medium | |
EP3273437A1 (en) | Method and device for enhancing readability of a display | |
CN116934607A (en) | Image white balance processing method and device, electronic equipment and storage medium | |
CN116866495A (en) | Image acquisition method, device, terminal equipment and storage medium | |
CN111835977B (en) | Image sensor, image generation method and device, electronic device, and storage medium | |
CN107025638B (en) | Image processing method and device | |
CN112016595A (en) | Image classification method and device, electronic equipment and readable storage medium | |
US20220375037A1 (en) | Image processing method and storage medium | |
CN116805976A (en) | Video processing method, device and storage medium | |
CN118096535A (en) | Image processing method and device, electronic equipment and storage medium | |
CN118014873A (en) | Image processing method and device, image acquisition device and storage medium | |
CN118102080A (en) | Image shooting method, device, terminal and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||