CN108063934B - Image processing method and device, computer readable storage medium and computer device - Google Patents


Info

Publication number
CN108063934B
CN108063934B (application CN201711423795.7A)
Authority
CN
China
Prior art keywords
light source
image
color
light sources
richness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711423795.7A
Other languages
Chinese (zh)
Other versions
CN108063934A (en)
Inventor
王会朝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201711423795.7A
Publication of CN108063934A
Application granted
Publication of CN108063934B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/73 Colour balance circuits, e.g. white balance circuits or colour temperature control

Abstract

The application discloses an image processing method comprising: processing an image to detect the colors of its light sources, so as to obtain the color richness of the light sources and the pure chroma of the image; judging whether the color richness is greater than or equal to a predetermined richness and the pure chroma is less than or equal to a predetermined pure chroma; and performing white balance processing on the image using the gray world method when both conditions hold. The application also discloses an image processing apparatus, a computer-readable storage medium, and a computer device. By detecting the light source colors and judging the color richness of the light sources and the pure chroma of the scene, the method applies the gray world method, which gives a better white balance result when the color richness of the light sources is high and the pure chroma of the scene is low, thereby improving the effect of the white balance processing.

Description

Image processing method and device, computer readable storage medium and computer device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a computer-readable storage medium, and a computer device.
Background
The related art image processing method may detect the color of a light source by processing an image and perform white balance according to that color. When a scene contains several light sources of different colors, a main light source is typically selected by analyzing the weights of the light sources, and white balance is performed according to the main light source's color. However, the color of the main light source often deviates from the actual lighting of the whole scene, so the white balance result is poor, particularly when the scene's light sources are rich in color.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, a computer readable storage medium and computer equipment.
The image processing method of the embodiment of the application comprises the following steps:
processing an image to detect colors of light sources of the image to derive color richness of the light sources and pure chroma of the image;
judging whether the color richness is larger than or equal to a preset richness and the pure chroma is smaller than or equal to a preset pure chroma; and
and performing white balance processing on the image by using a gray world method when the color richness is greater than or equal to the preset richness and the pure chroma is less than or equal to the preset pure chroma.
An image processing apparatus according to an embodiment of the present application includes:
a first processing module for processing an image to detect colors of light sources of the image to obtain color richness of the light sources and pure chroma of the image;
the judging module is used for judging whether the color richness is larger than or equal to a preset richness and the pure chroma is smaller than or equal to a preset pure chroma; and
a second processing module for performing white balance processing on the image using a gray world method when the color richness is greater than or equal to the predetermined richness and the pure chroma is less than or equal to the predetermined pure chroma.
One or more non-transitory computer-readable storage media embodying computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image processing method.
A computer device according to an embodiment of the present application includes a memory and a processor, where the memory stores computer-readable instructions, and the instructions, when executed by the processor, cause the processor to execute the image processing method.
The image processing method and apparatus, the computer-readable storage medium, and the computer device of the embodiments of the present application detect the color of the light source; however, unlike the related art, which performs white balance directly from the light source color, the present application further considers the color richness of the light sources and the pure chroma of the scene. When the color richness of the light sources is high and the pure chroma of the scene is low, the gray world method is used, which in that situation gives a better result than balancing directly by the color temperature of the light source, thereby improving the effect of the white balance processing.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application.
FIG. 2 is a block diagram of an image processing apparatus according to some embodiments of the present application.
FIG. 3 is a schematic plan view of a computer device according to some embodiments of the present application.
FIG. 4 is a flow chart illustrating an image processing method according to some embodiments of the present application.
FIG. 5 is a block diagram of a first processing module of some embodiments of the present application.
Fig. 6 is a schematic view of a scene of an image processing method according to some embodiments of the present application.
Fig. 7 is a schematic view of a scene of an image processing method according to some embodiments of the present application.
FIG. 8 is a histogram of region formation for an image processing method according to some embodiments of the present application.
FIG. 9 is a flow chart illustrating an image processing method according to some embodiments of the present application.
FIG. 10 is a block diagram of a first processing module of certain embodiments of the present application.
FIG. 11 is a schematic view of a scene of an image processing method according to some embodiments of the present application.
FIG. 12 is a graphical representation of a color temperature curve for certain embodiments of the present application.
FIG. 13 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application.
FIG. 14 is a block diagram of a first processing module of some embodiments of the present application.
FIG. 15 is a schematic view of a scene of an image processing method according to some embodiments of the present application.
FIG. 16 is a flow chart illustrating an image processing method according to some embodiments of the present application.
FIG. 17 is a block diagram of a first processing module of certain embodiments of the present application.
FIG. 18 is a histogram of image formation of an image processing method according to some embodiments of the present application.
FIG. 19 is a flow chart illustrating an image processing method according to some embodiments of the present application.
FIG. 20 is a block diagram of an image processing apparatus according to some embodiments of the present application.
FIG. 21 is a flow chart illustrating an image processing method according to some embodiments of the present application.
FIG. 22 is a block diagram of a third processing module in accordance with certain implementations of the present application.
FIG. 23 is a schematic view of a scene of an image processing method according to some embodiments of the present application.
FIG. 24 is a schematic view of a scene of an image processing method according to some embodiments of the present application.
FIG. 25 is a block diagram of a computer device according to some embodiments of the present application.
FIG. 26 is a block diagram of an image processing circuit according to some embodiments of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another.
Referring to fig. 1, an image processing method according to an embodiment of the present application includes the following steps:
s12: processing the image to detect the color of the light source of the image to obtain the color richness of the light source and the pure chroma of the image;
s14: judging whether the color richness is greater than or equal to a preset richness and the pure chroma is less than or equal to a preset pure chroma; and
s16: and performing white balance processing on the image by using a gray world method when the color richness is greater than or equal to a preset richness and the pure chroma is less than or equal to a preset pure chroma.
Referring to fig. 2, an image processing apparatus 10 according to an embodiment of the present disclosure includes a first processing module 12, a determining module 14, and a second processing module 16. The first processing module 12 is used for processing the image to detect the color of the light source of the image to obtain the color richness of the light source and the pure chroma of the image. The determining module 14 is used for determining whether the color richness is greater than or equal to a predetermined richness and the pure chroma is less than or equal to a predetermined pure chroma. The second processing module 16 is configured to perform white balance processing on the image using the gray world method when the color richness is greater than or equal to the predetermined richness and the pure chroma is less than or equal to the predetermined pure chroma.
The image processing method according to the embodiment of the present application may be implemented by the image processing apparatus 10 according to the embodiment of the present application, wherein the step S12 may be implemented by the first processing module 12, the step S14 may be implemented by the determining module 14, and the step S16 may be implemented by the second processing module 16.
Referring to fig. 3, the image processing apparatus 10 according to the embodiment of the present application may be applied to the computer device 100 according to the embodiment of the present application, that is, the computer device 100 according to the embodiment of the present application may include the image processing apparatus 10 according to the embodiment of the present application.
In some embodiments, the computer device 100 includes a cell phone, a tablet, a laptop, a smart bracelet, a smart watch, a smart helmet, smart glasses, and the like.
The image processing method, the image processing apparatus 10, and the computer device 100 of the embodiments of the present application detect the color of the light source; however, unlike the related art, which performs white balance directly from the light source color, they further consider the color richness of the light sources and the pure chroma of the scene, and when the color richness is high and the pure chroma is low, use the gray world method, which gives a better result than balancing directly by the color temperature of the light source, thereby improving the effect of the white balance processing.
In some embodiments, the step of processing the image using the gray world method comprises the following steps:
counting the primary color channel values of all pixels of the image;
calculating the average value of each of the (R, G, B) primary color channels;
determining the white balance adjustment values for the R, G, and B channels from the averages; and
performing white balance adjustment on the image according to the white balance adjustment values.
Specifically, if the per-channel averages calculated from the primary color channel values of the image are (Ravg, Gavg, Bavg), the white balance adjustment values of the channels are K/Ravg, K/Gavg, and K/Bavg respectively, where K = (Ravg + Gavg + Bavg)/3. In this way, the entire image can be white-balanced based on these per-channel adjustment values.
Thus, white balance can be performed quickly, and the result is better in the case where the light source colors are rich and the pure chroma of the scene is low.
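The gray world steps above can be sketched as follows (a minimal sketch; the function name and the use of NumPy are illustrative, not from the patent):

```python
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Apply gray-world white balance to an H x W x 3 RGB image."""
    img = image.astype(np.float64)
    # Per-channel averages (Ravg, Gavg, Bavg) over all pixels.
    avg = img.reshape(-1, 3).mean(axis=0)
    # K = (Ravg + Gavg + Bavg) / 3; the per-channel adjustment
    # values are K/Ravg, K/Gavg, K/Bavg.
    k = avg.mean()
    gains = k / avg
    balanced = img * gains
    return np.clip(balanced, 0, 255).astype(image.dtype)
```

Applied to an image whose average is already neutral gray, the gains are all 1 and the image is unchanged; a color cast in the averages is divided out.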
Referring to fig. 4, in some embodiments, step S12 includes the following steps:
s122: dividing the image into a plurality of regions;
s124: judging whether the region is a target region comprising a light source according to the histogram of each region;
s126: judging whether a plurality of adjacent target areas exist when the area is a target area comprising a light source;
s128: splicing a plurality of adjacent target areas into a light source when the plurality of adjacent target areas exist;
s121: determining a target area as a light source when there are no adjacent target areas; and
s123: and counting the number of the light sources.
Referring to fig. 5, in some embodiments, the first processing module 12 includes a dividing unit 122, a first determining unit 124, a second determining unit 126, a splicing unit 128, a first determining unit 121, and a counting unit 123. The dividing unit 122 is used to divide the image into a plurality of regions. The first judging unit 124 is configured to judge whether the region is a target region including the light source according to the histogram of each region. The second determination unit 126 is configured to determine whether there are multiple adjacent target areas. The stitching unit 128 is configured to stitch the adjacent target areas as the light source when the adjacent target areas exist. The first determination unit 121 is configured to determine the target area as the light source when there are no adjacent plurality of target areas. The counting unit 123 is used for counting the number of the light sources.
That is, step S122 may be implemented by the dividing unit 122, step S124 may be implemented by the first judging unit 124, step S126 may be implemented by the second judging unit 126, step S128 may be implemented by the splicing unit 128, step S121 may be implemented by the first determining unit 121, and step S123 may be implemented by the counting unit 123.
In this way, the location and number of light sources in the image can be determined.
Specifically, referring to fig. 6-8, in one embodiment, the image processing method first divides the image into a plurality of regions, for example 4 × 5 regions. For each region, 4 histograms can be drawn from the R, Gr, Gb, and B channel values, and whether the region is a target region including a light source is judged from those histograms. In fig. 6 and 7, each image includes a plurality of target regions: the image in fig. 6 includes 3 target regions, and the image in fig. 7 includes 8. When a region is a target region including a light source, the method judges whether several adjacent target regions exist, that is, whether one light source covers several target regions at once; the coverage may be partial or complete. When adjacent target regions exist, they are stitched together into one light source; when none are adjacent, each target region is determined to be a light source by itself. Referring to fig. 6, the 3 mutually non-adjacent target regions are determined to be light source R, light source G, and light source B respectively. Referring to fig. 7, the 6 adjacent target regions are combined into one complete light source R, and the other two non-adjacent target regions are identified as light source G and light source B respectively.
Note that the method of drawing the histogram of the area in fig. 8 is merely an example, and the horizontal axis of the histogram in fig. 8 indicates the pixel value and the vertical axis indicates the number of pixels. In other embodiments, the horizontal axis of the histogram may also be the number of pixels, and the vertical axis is the pixel value; or the horizontal axis of the histogram is the proportion of the number of pixels, and the vertical axis is the pixel value; or the horizontal axis of the histogram is the pixel value, and the vertical axis of the histogram is the ratio of the number of pixels.
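The stitching of adjacent target regions described above is essentially connected-component grouping. A minimal sketch, assuming 4-adjacency over a boolean grid of per-region judgments (the function and data representation are illustrative):

```python
from collections import deque

def group_light_sources(target_grid):
    """Group 4-adjacent target regions into light sources via BFS.

    target_grid: 2D list of bools, True where a region was judged to
    contain (part of) a light source. Returns a list of light sources,
    each a list of (row, col) region coordinates.
    """
    rows, cols = len(target_grid), len(target_grid[0])
    seen = [[False] * cols for _ in range(rows)]
    sources = []
    for r in range(rows):
        for c in range(cols):
            if target_grid[r][c] and not seen[r][c]:
                # Adjacent target regions are stitched into one source.
                queue, source = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    source.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and target_grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                sources.append(source)
    return sources
```

The number of light sources (step S123) is then simply `len(group_light_sources(grid))`.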
In some embodiments, judging whether a region is a target region including a light source from its histogram may be implemented by judging whether the proportion of pixels whose value exceeds a predetermined value itself exceeds a predetermined ratio. For example, it may be judged whether more than 5% of the pixels have a value exceeding 239: when the proportion of pixels with a value above 239 exceeds 5%, the region is a target region including a light source; when that proportion does not exceed 5%, the region is not a target region including a light source.
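The 239 / 5% judgment above can be sketched as (the function name is illustrative, and the defaults follow the example values in the text):

```python
def is_target_region(pixels, value_threshold=239, ratio_threshold=0.05):
    """Judge whether a region contains a light source: True when the
    fraction of pixels brighter than value_threshold exceeds
    ratio_threshold (239 and 5% are the example values above)."""
    bright = sum(1 for p in pixels if p > value_threshold)
    return bright / len(pixels) > ratio_threshold
```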
Referring to fig. 9, in some embodiments, step S12 includes the following steps:
s125: determining a highlight region and a middle highlight region according to the radially outward brightness distribution of the center of the light source; and
s127: the average of the primary color channel pixels of the highlight region is subtracted from the average of the primary color channel pixels of the mid-highlight region to determine the color of the light source.
Referring to fig. 10, in some embodiments, the first processing module 12 includes a second determining unit 125 and a third determining unit 127. The second determination unit 125 is configured to determine the highlight region and the mid-highlight region according to the radially outward luminance distribution of the center of the light source. The third determination unit 127 is for subtracting the average value of the primary color channel pixels of the medium bright area from the average value of the primary color channel pixels of the high bright area to determine the color of the light source.
That is, step S125 may be implemented by the second determining unit 125, and step S127 may be implemented by the third determining unit 127.
Therefore, the color of the main light source can be determined from the high-brightness region H and the middle-brightness region M, and its color temperature can then be determined from that color, allowing the color temperature of the main light source to be estimated more accurately.
Referring to fig. 11, after the position of the light source in the image is determined, it can be understood that the central region O of the light source is an overexposed region, generally a large white spot that carries no information about the light source color. The light source color may instead be determined from the primary color channel pixel averages of the highlight region H and the middle-bright region M. The highlight region H may refer to the region formed by pixels, radially outward of the light source center, whose luminance values fall in a first luminance range L1, for example [200, 239]. The middle-bright region M may refer to the region formed by pixels, radially outward of the light source center, whose luminance values fall in a second luminance range L2, for example [150, 200). It should be noted that the specific values of L1 and L2 may be chosen according to the radially outward luminance distribution from the light source center O: if the luminance of the light source decays quickly, the ranges L1 and L2 may be raised; if it decays slowly, they may be lowered.
The average value of the primary color channel pixels of the highlight area is the average of the pixel values of all pixels in the highlight area, and the average for the middle-bright area is defined likewise. Assuming the highlight region contains C1 pixels and the middle-bright region contains C2 pixels, the per-channel averages of the highlight region are
(Ravg_H, Gavg_H, Bavg_H) = (ΣR/C1, ΣG/C1, ΣB/C1), summed over the highlight region,
and those of the middle-bright area are
(Ravg_M, Gavg_M, Bavg_M) = (ΣR/C2, ΣG/C2, ΣB/C2), summed over the middle-bright region.
Subtracting the middle-bright averages from the highlight averages,
(R, G, B) = (Ravg_H − Ravg_M, Gavg_H − Gavg_M, Bavg_H − Bavg_M),
determines the color of the light source. The color temperature of the light source may then be determined correspondingly from its color; in some embodiments, this may specifically be: determining the color temperature of the light source according to the correspondence between the light source color and the light source color temperature. The correspondence may be a mapping table and/or a color temperature curve (as shown in fig. 12). Specifically, in one embodiment, images may be acquired under standard light boxes with color temperatures of 3000K, 4000K, 5000K, 6000K, and so on, and the light source color (Ravg_H − Ravg_M, Gavg_H − Gavg_M, Bavg_H − Bavg_M) computed for each, thereby forming a mapping table or color temperature curve relating light source color to color temperature, which may be stored in a local database. In the embodiment of the application, once the light source color has been calculated, the corresponding color temperature can be obtained by querying the color temperature curve or the mapping table. Then the corresponding white balance parameters can be found according to the color temperature of the light source and the correspondence between color temperature and white balance parameters, so that white balance processing can be performed on the image according to those parameters.
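The query of the mapping table described above could be sketched as a nearest-neighbour search over the calibration entries (the distance metric and data layout are assumptions; the text only specifies a mapping table or color temperature curve):

```python
def estimate_color_temperature(light_color, calibration):
    """Return the color temperature whose calibrated light-source color
    is closest (squared Euclidean distance in RGB) to the measured one.

    calibration: list of ((R, G, B), kelvin) pairs measured under
    standard light boxes (e.g. 3000K, 4000K, 5000K, 6000K).
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(calibration, key=lambda entry: dist2(entry[0], light_color))[1]
```

A real implementation might interpolate along the color temperature curve instead of snapping to the nearest calibration point.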
In some embodiments, a primary color channel refers to a color channel, for example at least one of the R (red), Gr (green-red), Gb (green-blue), and B (blue) channels; in some embodiments, the pixel value of the G (green) channel may be obtained from the pixel values of the Gr and Gb channels. The pixel average may refer to the arithmetic average of the pixel values. In one example, the average (Ravg, Gavg, Bavg) of the primary color channel pixels in the highlight area is (200, 210, 220) and that of the middle-bright area is (160, 180, 190), so the (R, G, B) color of the light source is (200 − 160, 210 − 180, 220 − 190) = (40, 30, 30).
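The subtraction of the two region averages can be sketched as follows (an illustrative helper, assuming the pixels of each ring have already been collected as N x 3 lists):

```python
import numpy as np

def light_source_color(highlight_pixels, midbright_pixels):
    """Estimate the light-source color by subtracting the mid-bright
    region's per-channel pixel average from the highlight region's.

    Both arguments are N x 3 sequences of (R, G, B) pixel values taken
    from the rings around the overexposed light-source center.
    """
    avg_h = np.asarray(highlight_pixels, dtype=np.float64).mean(axis=0)
    avg_m = np.asarray(midbright_pixels, dtype=np.float64).mean(axis=0)
    return avg_h - avg_m
```

With the example averages above, the result is (40, 30, 30).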
Referring to fig. 13, in some embodiments, step S12 further includes the following steps:
s129: determining the color temperature of the light source according to the color of the light source;
s12 b: comparing the different light sources to obtain a color temperature difference value between the different light sources, and combining the different light sources into the same light source when the color temperature difference value is smaller than a preset color temperature difference value; and
s12 d: and determining the color richness according to the number of the combined light sources.
Referring to fig. 14, in some embodiments, the first processing module 12 further includes a fourth determining unit 129, a first comparing unit 12b, and a fifth determining unit 12 d. The fourth determination unit 129 is configured to determine the color temperature of the light source according to the color of the light source. The first comparing unit 12b is configured to compare the different light sources to obtain a color temperature difference value between the different light sources, and combine the different light sources into the same light source when the color temperature difference value is smaller than a predetermined color temperature difference value. The fifth determining unit 12d is configured to determine the color richness according to the number of the combined light sources.
That is, step S129 is implemented by the fourth determining unit 129, step S12b is implemented by the first comparing unit 12b, and step S12d is implemented by the fifth determining unit 12 d.
Therefore, whether different light sources share the same color temperature is judged from the color temperature difference between them: when the difference is smaller than the predetermined color temperature difference, the light sources are combined into one, and the color richness of the light sources is determined from the number of combined light sources, with high accuracy.
Specifically, as shown in fig. 15, there are 5 light sources in the image: light source A at 3000K, light source B at 3500K, light source C at 3800K, light source D at 4100K, and light source E at 5000K. Suppose the predetermined color temperature difference is 500K. Computing the pairwise differences, the difference between light source B and light source C is 300K, less than 500K, so when counting light sources, B and C are logically treated as one light source containing two different color temperatures. For a combined light source, a further candidate may join only if its difference from every light source already in the combination is less than 500K. For example, when the combination of B and C is compared with light source D, the difference between C and D is 300K, but the difference between B and D is 600K; since only one of the two conditions is satisfied, D is not merged in. The final number of light sources is therefore 4. Of course, the predetermined color temperature difference may take other suitable values. The color richness is thus measured by the number of light sources: the more light sources, i.e. the more distinct color temperature ranges, the higher the color richness.
The predetermined richness may be set to 3; that is, when the number of light sources (the color richness) is greater than 3, the light source colors of the image are rich enough to satisfy the condition for using the gray world method, and white balance processing is performed with the gray world method once the pure chroma of the image is also judged to be less than the predetermined pure chroma.
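The merging rule illustrated with light sources A-E can be sketched as a greedy grouping (the processing order and grouping strategy are assumptions; the text only states the every-member-within-500K condition):

```python
def count_merged_light_sources(color_temps, max_diff=500):
    """Count light sources after merging: a source joins a group only
    if its color temperature differs by less than max_diff from EVERY
    source already in that group (the A..E rule above)."""
    groups = []
    for t in sorted(color_temps):
        for group in groups:
            if all(abs(t - member) < max_diff for member in group):
                group.append(t)
                break
        else:
            groups.append([t])
    return len(groups)
```

On the example temperatures 3000/3500/3800/4100/5000K this yields 4: B (3500K) and C (3800K) merge, while D (4100K) is kept separate because it differs from B by 600K.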
Referring to fig. 16, in some embodiments, step S12 further includes the following steps:
s12 f: comparing the proportion of pixels falling within a predetermined pixel value range to the total pixels of the image to determine a maximum pixel proportion; and
s12 h: pure chroma is determined according to the maximum pixel proportion.
Referring to fig. 17, in some embodiments, the first processing module 12 further includes a second comparing unit 12f and a sixth determining unit 12h. The second comparing unit 12f is used to compare the proportion of pixels falling within a predetermined pixel value range to the total pixels of the image to determine the maximum pixel proportion. The sixth determining unit 12h is configured to determine pure chroma according to the maximum pixel proportion.
That is, step S12f is implemented by the second comparing unit 12f, and step S12h is implemented by the sixth determining unit 12h.
Thus, the proportion of pixels falling within each predetermined pixel value range to the total pixels of the image is compared to determine the maximum pixel proportion, and pure chroma is then determined according to the maximum pixel proportion, so that the pure chroma of the image can be accurately estimated.
Specifically, as shown in fig. 18, the pixel values of all pixels in the image are counted and divided into a plurality of pixel value ranges, for example, every 10 pixel values form one range, such as (0, 10), (10, 20), and so on. With the pixel value on the horizontal axis of the histogram, the pixels within each pixel value range are regarded as the same color.
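For illustration only, the histogram-based estimate of steps S12f and S12h can be sketched in Python. The function name is illustrative, grayscale pixel values are assumed for simplicity, and the bin width of 10 is taken from the example above:

```python
def max_pixel_proportion(pixels, bin_width=10):
    """Bin pixel values into ranges of bin_width (e.g. (0,10), (10,20), ...)
    and return the largest bin's share of the total pixel count.
    A high value means one color dominates the image (high pure chroma)."""
    counts = {}
    for value in pixels:
        counts[value // bin_width] = counts.get(value // bin_width, 0) + 1
    return max(counts.values()) / len(pixels)

# Eight of ten pixels fall in the (10, 20) range, so the proportion is 0.8.
pixels = [12, 14, 15, 11, 19, 13, 16, 18, 42, 77]
print(max_pixel_proportion(pixels))  # 0.8
```

The maximum proportion returned here would then be compared against the predetermined pure chroma to decide whether the gray world method is applicable.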
Referring to fig. 19, in some embodiments, the image processing method further includes the following steps:
S18: performing white balance processing on the image according to the color temperature of the light source when the color richness is less than the predetermined richness or the pure chroma is greater than the predetermined pure chroma.
Referring to fig. 20, in some embodiments, the image processing apparatus further includes a third processing module 18. The third processing module 18 is configured to perform white balance processing on the image according to the color temperature of the light source when the color richness is less than the predetermined richness or the pure chroma is greater than the predetermined pure chroma.
That is, step S18 is implemented by the third processing module 18.
Therefore, when the color richness is less than the predetermined richness or the pure chroma is greater than the predetermined pure chroma, the gray world method is not adopted, which ensures the stability of the white balance.
Referring to fig. 21, in some embodiments, step S18 includes the following steps:
S182: judging whether the number of the light sources is greater than or equal to 1;
S184: when the number of the light sources is equal to 1, performing white balance processing on the image according to the color temperature of the light source; and
S186: when the number of the light sources is greater than 1, determining a main light source according to at least one of scene parameters, corresponding areas, and brightness parameters of the light sources, and performing white balance processing on the image according to the color temperature of the main light source, wherein the scene parameters comprise the time at which the image was shot and the GPS signal strength, the brightness parameters comprise the brightness of the light sources, and the light sources comprise the main light source.
Referring to fig. 22, in some embodiments, the third processing module 18 includes a third determining unit 182, a first processing unit 184 and a second processing unit 186. The third determination unit 182 is used to determine whether the number of light sources is greater than or equal to 1. The first processing unit 184 performs white balance processing on an image according to the color temperature of the light sources when the number of light sources is equal to 1. The second processing unit 186 is configured to determine a main light source according to at least one of a scene parameter, a corresponding area, and a brightness parameter of the light source when the number of the light sources is greater than 1, and perform white balance processing on the image according to a color temperature of the main light source, where the scene parameter includes a time for capturing the image and a signal intensity of the GPS, the brightness parameter includes brightness of the plurality of light sources, and the light sources include the main light source.
Therefore, when the light-source color richness of the image is low or the pure chroma of the image is high, it is judged whether the number of light sources is greater than 1. When the number is not greater than 1, that light source is taken as the main light source; when the number is greater than 1, the main light source is determined according to at least one of the scene parameters, the corresponding areas, and the brightness parameters of the light sources, and white balance processing is performed on the image according to the color temperature of the main light source.
Specifically, the period of day can be determined from the time at which the image was shot, and the place where the user is likely to be shooting during that period can be inferred from the user's schedule and habits stored in a local database. For example, at 12 noon the user typically has lunch at a restaurant; after 8 pm the user typically reads in the living room. In this way, whether the user is in an indoor or outdoor environment, or in a particular scene, can be roughly distinguished from the time at which the image was shot. In addition, the GPS signal is generally stronger outdoors than indoors, so the GPS signal strength can also roughly distinguish an indoor environment from an outdoor one. It is understood that the color temperature of an indoor light source is generally below 5000K: for example, the color temperature of a tungsten lamp is 2760-2900K, and that of a flash lamp is 3800K. The color temperature of an outdoor light source is generally above 5000K: for example, the color temperature of noon sunlight is 5000K, and that of a blue sky is 10000K. Thus, whether the current color temperature should be above or below 5000K can be roughly determined from whether the user is indoors or outdoors. As shown in fig. 23, for example, the color temperature of light source R is 4500K, that of light source G is 3500K, and that of light source B is 7000K; if the current color temperature is determined to be 5000K according to the scene parameters, light source R is chosen as the main light source because it is closest to the current color temperature of the scene.
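For illustration only, selecting the light source closest to the color temperature suggested by the scene parameters can be sketched in Python; the function name and the dictionary layout of a light source are assumptions of this sketch:

```python
def main_light_source_by_scene(light_sources, scene_color_temp):
    """Pick the light source whose color temperature (in Kelvin) is closest
    to the color temperature estimated from the scene parameters
    (time of shooting, GPS signal strength)."""
    return min(light_sources, key=lambda s: abs(s["temp"] - scene_color_temp))

# Example from the description: scene estimate 5000K selects light source R (4500K).
sources = [{"name": "R", "temp": 4500},
           {"name": "G", "temp": 3500},
           {"name": "B", "temp": 7000}]
print(main_light_source_by_scene(sources, 5000)["name"])  # R
```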
Thus, the scene parameters can be used to determine the main light source.
When the main light source is determined according to the areas corresponding to the plurality of light sources, the light source with the largest area can be selected as the main light source by comparing the areas of the light sources. For example, in fig. 23, the area of light source R is larger than that of light source G and larger than that of light source B, so light source R is determined as the main light source.
When the main light source is determined according to the brightness corresponding to the plurality of light sources, it can be understood that the higher the brightness of a light source, the greater its influence on the whole image generally is. As shown in fig. 24, when the light source areas are the same and the luminance of the light source R is 150, the luminance of the light source G is 100, and the luminance of the light source B is 200, the light source B is determined to be the main light source. Thus, when the light source areas are the same, the light source with the maximum brightness is determined as the main light source.
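For illustration only, the equal-area brightness rule reduces to picking the brightest source; the function name and data layout are assumptions of this sketch:

```python
def main_light_source_by_brightness(light_sources):
    """When the light source areas are the same, the source with the
    highest brightness is taken as the main light source."""
    return max(light_sources, key=lambda s: s["brightness"])

# Example of fig. 24: equal areas, so light source B (brightness 200) wins.
sources = [{"name": "R", "brightness": 150},
           {"name": "G", "brightness": 100},
           {"name": "B", "brightness": 200}]
print(main_light_source_by_brightness(sources)["name"])  # B
```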
The image processing method of the embodiment of the application may determine the main light source according to any one of the following cues: the time at which the image was shot together with the GPS signal strength (the scene parameters); the areas corresponding to the plurality of light sources; or the brightness corresponding to the plurality of light sources together with the average brightness of the image. Alternatively, any two of these cues, or all three in combination, may be used to determine the main light source.
Preferably, the image processing method determines the main light source according to the combination of all three cues: the time at which the image was shot and the GPS signal strength, the areas corresponding to the plurality of light sources, and the brightness corresponding to the plurality of light sources together with the average brightness of the image. Different weights may be assigned to each of these cues. In this way, the selected main light source is accurate, and the white balance processing of the image better matches the effect expected by the user.
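For illustration only, the weighted combination of the three cues can be sketched in Python. The 0.4/0.3/0.3 weights, the per-cue scoring formulas, and the data layout are all assumptions of this sketch; the patent only states that different weights may be assigned to each cue:

```python
def main_light_source_weighted(light_sources, scene_color_temp,
                               image_avg_brightness,
                               weights=(0.4, 0.3, 0.3)):
    """Score each light source by a weighted combination of three cues:
    closeness of its color temperature to the scene estimate, its relative
    area, and its brightness relative to the image's average brightness.
    Returns the highest-scoring light source as the main light source."""
    w_scene, w_area, w_bright = weights
    max_area = max(s["area"] for s in light_sources)

    def score(s):
        # Each sub-score lies in (0, 1]; larger is better.
        scene_score = 1.0 / (1.0 + abs(s["temp"] - scene_color_temp))
        area_score = s["area"] / max_area
        bright_score = s["brightness"] / (s["brightness"] + image_avg_brightness)
        return w_scene * scene_score + w_area * area_score + w_bright * bright_score

    return max(light_sources, key=score)

sources = [{"name": "R", "temp": 4500, "area": 30, "brightness": 150},
           {"name": "G", "temp": 3500, "area": 20, "brightness": 100},
           {"name": "B", "temp": 7000, "area": 10, "brightness": 200}]
print(main_light_source_weighted(sources, 5000, 120)["name"])  # R
```

Here light source R wins because it is both closest to the 5000K scene estimate and largest in area, outweighing B's higher brightness under the assumed weights.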
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of:
s12: processing the image to detect the color of the light source of the image to obtain the color richness of the light source and the pure chroma of the image;
s14: judging whether the color richness is greater than or equal to a preset richness and the pure chroma is less than or equal to a preset pure chroma; and
s16: and performing white balance processing on the image by using a gray world method when the color richness is greater than or equal to a preset richness and the pure chroma is less than or equal to a preset pure chroma.
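For illustration only, the gray world method referenced in step S16 can be sketched in Python in its standard textbook form (this is not code from the patent): assume the average scene color is gray, and scale each color channel so its mean equals the mean of the three channel means.

```python
def gray_world_white_balance(image):
    """Apply gray world white balance to an image given as a list of
    (R, G, B) pixels: compute per-channel means, then scale each channel
    so its mean equals the overall gray mean of the three channels."""
    n = len(image)
    means = [sum(p[c] for p in image) / n for c in range(3)]
    gray = sum(means) / 3
    gains = [gray / m for m in means]
    # Clamp to the usual 8-bit range after applying the per-channel gains.
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3)) for p in image]

# A reddish image: after balancing, the three channel means are equal.
balanced = gray_world_white_balance([(120, 100, 80), (130, 110, 90)])
print(balanced)
```

After the call, each channel of the two-pixel example averages to the same gray value, which is exactly the gray world assumption the method enforces.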
FIG. 25 is a diagram showing an internal configuration of the computer apparatus 100 according to an embodiment. As shown in fig. 25, the computer apparatus 100 includes a processor 52, a memory 53 (e.g., a nonvolatile storage medium), an internal memory 54, a display 55, and an input device 56, which are connected via a system bus 51. The memory 53 of the computer device 100 stores an operating system and computer readable instructions. The computer readable instructions can be executed by the processor 52 to implement the image processing method of the embodiments of the present application. The processor 52 provides the computing and control capabilities that support the operation of the overall computer device 100. The internal memory 54 of the computer device 100 provides an environment for the execution of the computer readable instructions in the memory 53. The display 55 of the computer device 100 may be a liquid crystal display or an electronic ink display, and the input device 56 may be a touch layer covering the display 55, a button, a trackball, or a touch pad arranged on a housing of the computer device 100, or an external keyboard, touch pad, or mouse. The computer device 100 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (e.g., a smart bracelet, a smart watch, a smart helmet, or smart glasses). It will be understood by those skilled in the art that the configuration shown in fig. 25 is only a schematic diagram of the part of the configuration related to the present application and does not limit the computer device 100 to which the present application is applied; a specific computer device 100 may include more or fewer components than shown in the drawings, combine some components, or arrange the components differently.
Referring to fig. 26, the computer device 100 according to the embodiment of the present disclosure includes an Image Processing circuit 80, and the Image Processing circuit 80 may be implemented by hardware and/or software components and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 26 is a diagram of image processing circuitry 80 in one embodiment. As shown in fig. 26, for convenience of explanation, only aspects of the image processing technique related to the embodiment of the present application are shown.
As shown in fig. 26, the image processing circuit 80 includes an ISP processor 81 (the ISP processor 81 may be the processor 52 or a part of the processor 52) and control logic 82. The image data captured by the camera 83 is first processed by the ISP processor 81, which analyzes the image data to capture image statistics that may be used to determine one or more control parameters of the camera 83. The camera 83 may include one or more lenses 832 and an image sensor 834. The image sensor 834 may comprise an array of color filters (e.g., Bayer filters), may acquire the light intensity and wavelength information captured by each imaging pixel, and may provide a set of raw image data that can be processed by the ISP processor 81. The sensor 84 (e.g., a gyroscope) may provide image processing parameters (e.g., anti-shake parameters) to the ISP processor 81 based on the sensor 84 interface type. The sensor 84 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above.
In addition, the image sensor 834 may also send raw image data to the sensor 84, the sensor 84 may provide raw image data to the ISP processor 81 based on the sensor 84 interface type, or the sensor 84 may store raw image data in the image memory 85.
The ISP processor 81 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 81 may perform one or more image processing operations on the raw image data, gathering statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
The ISP processor 81 may also receive image data from an image memory 85. For example, the sensor 84 interface sends raw image data to the image memory 85, and the raw image data in the image memory 85 is then provided to the ISP processor 81 for processing. The image Memory 85 may be the Memory 53, a portion of the Memory 53, a storage device, or a separate dedicated Memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the image sensor 834 interface, the sensor 84 interface, or the image memory 85, the ISP processor 81 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 85 for additional processing before being displayed. The ISP processor 81 receives the processed data from the image memory 85 and performs image data processing on it in the raw domain and in the RGB and YCbCr color spaces. The image data processed by the ISP processor 81 may be output to the display 87 (the display 87 may include the display screen 55) for viewing by a user and/or further processed by a graphics processing unit (GPU). Further, the output of the ISP processor 81 may also be sent to the image memory 85, and the display 87 may read image data from the image memory 85. In one embodiment, the image memory 85 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 81 may be sent to an encoder/decoder 86 for encoding/decoding the image data. The encoded image data may be saved and then decompressed before being displayed on the display 87. The encoder/decoder 86 may be implemented by a CPU, a GPU, or a coprocessor.
The statistical data determined by ISP processor 81 may be sent to control logic 82 unit. For example, the statistical data may include image sensor 834 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 832 shading correction, and the like. Control logic 82 may include a processing element and/or microcontroller that executes one or more routines (e.g., firmware) that determine control parameters for camera 83 and ISP processor 81 based on the received statistical data. For example, the control parameters of camera 83 may include sensor 84 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 832 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 832 shading correction parameters.
The following steps are performed to implement the image processing method using the image processing technique of fig. 26:
s12: processing the image to detect the color of the light source of the image to obtain the color richness of the light source and the pure chroma of the image;
s14: judging whether the color richness is greater than or equal to a preset richness and the pure chroma is less than or equal to a preset pure chroma; and
s16: and performing white balance processing on the image by using a gray world method when the color richness is greater than or equal to a preset richness and the pure chroma is less than or equal to a preset pure chroma.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware. The program can be stored in a non-volatile computer readable storage medium and, when executed, can include the processes of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
The above examples express only several embodiments of the present application, and although their description is relatively specific and detailed, they are not to be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image processing method, characterized by comprising the steps of:
processing an image to detect colors of light sources of the image to derive color richness of the light sources and pure chroma of the image;
judging whether the color richness is larger than or equal to a preset richness and the pure chroma is smaller than or equal to a preset pure chroma;
performing white balance processing on the image by using a gray world method when the color richness is greater than or equal to the predetermined richness and the pure chroma is less than or equal to the predetermined pure chroma; and
when the color richness is smaller than the preset richness or the pure chroma is larger than the preset pure chroma, carrying out white balance processing on the image according to the color temperature of the light source;
the step of processing an image to detect the colors of the light sources of the image to obtain the color richness of the light sources and the pure chroma of the image comprises the steps of:
dividing the image into a plurality of regions;
judging whether each area is a target area comprising the light source according to whether a proportion of pixels whose pixel values exceed a predetermined value in a histogram of the area exceeds a predetermined proportion;
judging whether a plurality of adjacent target areas exist when the area is the target area comprising the light source;
splicing a plurality of adjacent target areas into the light source when the plurality of adjacent target areas exist;
determining the target area as the light source when there are no adjacent plurality of the target areas; and
counting the number of the light sources;
determining a high-brightness region and a medium-brightness region surrounding the central region of the light source according to the radially outward brightness distribution of the center of the light source, wherein the central region of the light source is an overexposure region; and
subtracting the average of the primary color channel pixels of the medium bright area from the average of the primary color channel pixels of the high bright area to determine the color of the light source.
2. The image processing method of claim 1, wherein the step of processing the image to detect the color of the light source of the image to obtain the color richness of the light source and the pure chroma of the image further comprises the steps of:
determining the color temperature of the light source according to the color of the light source;
comparing different light sources to obtain a color temperature difference value between different light sources, and combining different light sources into the same light source when the color temperature difference value is smaller than a preset color temperature difference value; and
determining the color richness according to the number of the combined light sources.
3. The image processing method of claim 1, wherein the step of processing the image to detect the color of the light source of the image to obtain the color richness of the light source and the pure chroma of the image further comprises the steps of:
comparing the proportion of pixels falling within a predetermined pixel value range to the total pixels of the image to determine a maximum pixel proportion; and
and determining the pure chroma according to the maximum pixel proportion.
4. The image processing method according to claim 2, wherein the step of white-balancing the image in accordance with the color temperature of the light source when the color richness is smaller than the predetermined richness or the pure chroma is larger than the predetermined pure chroma comprises the steps of:
judging whether the number of the light sources is more than or equal to 1;
when the number of the light sources is equal to 1, carrying out white balance processing on the image according to the color temperature of the light sources; and
when the number of the light sources is larger than 1, determining a main light source according to at least one of scene parameters, corresponding areas and brightness parameters of the light sources, and performing white balance processing on the image according to the color temperature of the main light source, wherein the scene parameters comprise the time for shooting the image and the signal intensity of a GPS, the brightness parameters comprise the brightness of the light sources, and the light sources comprise the main light source.
5. An image processing apparatus characterized by comprising:
a first processing module for processing an image to detect colors of light sources of the image to obtain color richness of the light sources and pure chroma of the image;
the judging module is used for judging whether the color richness is larger than or equal to a preset richness and the pure chroma is smaller than or equal to a preset pure chroma;
a second processing module for performing white balance processing on the image using a gray world method when the color richness is greater than or equal to the predetermined richness and the pure chroma is less than or equal to the predetermined pure chroma; and
a third processing module, configured to perform white balance processing on the image according to the color temperature of the light source when the color richness is smaller than the predetermined richness or the pure chroma is larger than the predetermined pure chroma;
the first processing module comprises:
a dividing unit for dividing the image into a plurality of regions;
a first judging unit configured to judge whether the region is a target region including the light source according to whether a proportion of pixels whose pixel values exceed a predetermined value in a histogram of each of the regions exceeds a predetermined proportion;
a second determination unit configured to determine whether there are a plurality of adjacent target regions when the region is the target region including the light source;
a splicing unit for splicing a plurality of adjacent target regions into the light source when the plurality of adjacent target regions exist;
a first determination unit for determining the target area as the light source when there are no adjacent plural target areas; and
the counting unit is used for counting the number of the light sources;
a second determination unit, configured to determine a highlight region and a middle-highlight region surrounding a central region of the light source according to a radially outward brightness distribution of a center of the light source, where the central region of the light source is an overexposure region; and
a third determination unit to subtract the primary color channel pixel average value of the medium bright region from the primary color channel pixel average value of the high bright region to determine the color of the light source.
6. The image processing apparatus of claim 5, wherein the first processing module further comprises:
a fourth determination unit for determining a color temperature of the light source according to the color of the light source;
the first comparison unit is used for comparing different light sources to obtain a color temperature difference value between the different light sources and combining the different light sources into the same light source when the color temperature difference value is smaller than a preset color temperature difference value; and
a fifth determining unit for determining the color richness according to the number of the combined light sources.
7. The image processing apparatus of claim 5, wherein the first processing module further comprises:
a second comparison unit for comparing a proportion of pixels falling within a predetermined pixel value range to the total pixels of the image to determine a maximum pixel proportion; and
a sixth determining unit for determining the pure chroma according to the maximum pixel proportion.
8. The image processing apparatus of claim 6, wherein the third processing module comprises:
a third judging unit configured to judge whether the number of the light sources is greater than or equal to 1;
a first processing unit, configured to perform white balance processing on the image according to the color temperature of the light sources when the number of the light sources is equal to 1; and
and the second processing unit is used for determining a main light source according to at least one of scene parameters, corresponding areas and brightness parameters of the light sources and performing white balance processing on the image according to the color temperature of the main light source when the number of the light sources is greater than 1, wherein the scene parameters comprise the time for shooting the image and the signal intensity of a GPS, the brightness parameters comprise the brightness of the light sources, and the light sources comprise the main light source.
9. A non-transitory computer-readable storage medium containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image processing method of any one of claims 1 to 4.
10. A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions that, when executed by the processor, cause the processor to perform the image processing method of any of claims 1 to 4.
CN201711423795.7A 2017-12-25 2017-12-25 Image processing method and device, computer readable storage medium and computer device Active CN108063934B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711423795.7A CN108063934B (en) 2017-12-25 2017-12-25 Image processing method and device, computer readable storage medium and computer device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711423795.7A CN108063934B (en) 2017-12-25 2017-12-25 Image processing method and device, computer readable storage medium and computer device

Publications (2)

Publication Number Publication Date
CN108063934A CN108063934A (en) 2018-05-22
CN108063934B true CN108063934B (en) 2020-01-10

Family

ID=62140135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711423795.7A Active CN108063934B (en) 2017-12-25 2017-12-25 Image processing method and device, computer readable storage medium and computer device

Country Status (1)

Country Link
CN (1) CN108063934B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109327691B (en) * 2018-10-23 2021-05-04 Oppo广东移动通信有限公司 Image shooting method and device, storage medium and mobile terminal
CN110505459B (en) * 2019-08-16 2020-12-11 域鑫科技(惠州)有限公司 Image color correction method, device and storage medium suitable for endoscope
CN112788324A (en) * 2021-02-26 2021-05-11 广东以诺通讯有限公司 White balance method, system and terminal for pure color scene
CN113177886B (en) * 2021-04-14 2023-05-05 RealMe重庆移动通信有限公司 Image processing method, device, computer equipment and readable storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101370151B (en) * 2008-09-18 2011-12-28 北京中星微电子有限公司 Automatic white balance adjustment method
CN102883168A (en) * 2012-07-05 2013-01-16 上海大学 White balance processing method directed towards atypical-feature image
CN103929632B (en) * 2014-04-15 2016-02-03 浙江宇视科技有限公司 A kind of method for correcting automatic white balance and device
CN106791758B (en) * 2016-12-07 2018-07-06 浙江大华技术股份有限公司 The judgment method and device of natural light mixing colour temperature in a kind of image
CN106534835B (en) * 2016-11-30 2018-08-07 珠海市魅族科技有限公司 A kind of image processing method and device
CN106851121B (en) * 2017-01-05 2019-07-05 Oppo广东移动通信有限公司 Control method and control device
CN107483906B (en) * 2017-07-25 2019-03-19 Oppo广东移动通信有限公司 White balancing treatment method, device and the terminal device of image
CN107483908A (en) * 2017-09-27 2017-12-15 歌尔科技有限公司 White balance calibration method and electronic equipment

Also Published As

Publication number Publication date
CN108063934A (en) 2018-05-22

Similar Documents

Publication Publication Date Title
US10397486B2 (en) Image capture apparatus and method executed by image capture apparatus
CN107959851B (en) Colour temperature detection method and device, computer readable storage medium and computer equipment
CN107872663B (en) Image processing method and device, computer readable storage medium and computer equipment
CN108063934B (en) Image processing method and device, computer readable storage medium and computer device
JP5497151B2 (en) Automatic backlight detection
US20070047803A1 (en) Image processing device with automatic white balance
CN108174172B (en) Image pickup method and device, computer readable storage medium and computer equipment
US8786729B2 (en) White balance method and apparatus thereof
US9307213B2 (en) Robust selection and weighting for gray patch automatic white balancing
JP2016019080A (en) Image processing system, control method thereof, and control program
CN108063926B (en) Image processing method and device, computer readable storage medium and computer device
CN108174173B (en) Photographing method and apparatus, computer-readable storage medium, and computer device
CN108259754B (en) Image processing method and device, computer readable storage medium and computer device
CN108012135B (en) Image processing method and device, computer readable storage medium and computer equipment
JP2013013134A (en) Imaging apparatus
CN107959843B (en) Image processing method and device, computer readable storage medium and computer equipment
CN108156434B (en) Image processing method and device, computer readable storage medium and computer equipment
CN108063933B (en) Image processing method and device, computer readable storage medium and computer device
CN107959842B (en) Image processing method and device, computer readable storage medium and computer equipment
CN108111831B (en) Photographing method, imaging apparatus, computer-readable storage medium, and computer device
JP2013132065A (en) Imaging apparatus and flash control method
JP3907654B2 (en) Imaging apparatus and signal processing apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Haibin Road, Wusha, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.

Address before: No. 18 Haibin Road, Wusha, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: Guangdong Oppo Mobile Telecommunications Corp., Ltd.

GR01 Patent grant