CN117714659A - Image processing method, device, equipment and medium

Publication number: CN117714659A
Application number: CN202311810649.5A
Applicant / Current assignee: Lenovo Beijing Ltd
Inventors: 李振华 (Li Zhenhua), 张雁 (Zhang Yan)
Legal status: Pending
Other languages: Chinese (zh)
Prior art keywords: white balance, parameter, color, image, target

Abstract

The present disclosure provides an image processing method, apparatus, device, and medium, which can be applied to the technical field of image processing. The method includes: acquiring an image to be processed; determining at least one color connected region from the image to be processed, wherein each color connected region is formed of connected pixels satisfying a first condition; determining, from the at least one color connected region, a target connected region satisfying a second condition; and adjusting a white balance parameter of the image to be processed according to the target connected region.

Description

Image processing method, device, equipment and medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, apparatus, device, and medium.
Background
When images of an object are acquired under different light sources, the object appears in different colors because the light reflected from its surface differs. When the image is acquired under a white light source, the color of the object in the image is closest to its true color. Therefore, for an image acquired under an arbitrary light source, white balance processing is needed to bring the colors of objects in the image closer to their true colors.
However, in practical applications, when an image contains many mixed colors, white balance processing tends to introduce a color cast.
Disclosure of Invention
The present disclosure provides an image processing method, apparatus, device, and medium.
According to a first aspect of the present disclosure, there is provided an image processing method including: acquiring an image to be processed; determining at least one color connected region from the image to be processed, wherein each color connected region is formed of connected pixels satisfying a first condition; determining, from the at least one color connected region, a target connected region satisfying a second condition; and adjusting a white balance parameter of the image to be processed according to the target connected region.
According to an embodiment of the present disclosure, determining a target connected region satisfying a second condition from the at least one color connected region includes: acquiring parameter information of each color connected region; sorting the at least one color connected region according to the parameter information to obtain an ordered sequence; and determining the target connected region based on the ordered sequence, wherein the ordered sequence characterizes how close the white balance parameter of each color connected region is to the white balance parameter of the image to be processed.
According to an embodiment of the present disclosure, determining at least one color connected region from the image to be processed includes: dividing the pixels of each similar color in the image to be processed into a set to obtain at least one set to be processed; and dividing the pixels of similar color in each set into at least one color connected region according to the position information of those pixels, wherein the pixels of similar color in each color connected region are connected pixels.
According to an embodiment of the present disclosure, satisfying the first condition includes: determining a target pixel from the image to be processed; acquiring a first parameter of the target pixel and second parameters of M candidate pixels adjacent to the target pixel, wherein the first parameter represents color mode information of the target pixel and the second parameters represent color mode information of the candidate pixels; determining a candidate pixel as a connected pixel in a case where the difference between its second parameter and the first parameter is smaller than a first threshold, thereby determining N connected pixels from the M candidate pixels, where M and N are integers greater than or equal to zero and M is greater than or equal to N; and merging the target pixel with the N connected pixels to obtain a color connected region.
According to an embodiment of the present disclosure, determining a candidate pixel as a connected pixel in a case where the difference between the second parameter and the first parameter is smaller than the first threshold, and determining N connected pixels from the M candidate pixels, includes: in a case where the m-th candidate pixel is determined to be a connected pixel, taking the second parameter of the m-th candidate pixel as the updated first parameter, where m ranges from 1 to M; and determining the (m+1)-th candidate pixel as a connected pixel in a case where the difference between the second parameter of the (m+1)-th candidate pixel and the updated first parameter is smaller than the first threshold.
According to an embodiment of the present disclosure, adjusting the white balance parameter of the image to be processed according to the target connected region includes:
calculating the white balance parameter of the image to be processed according to the average value of all pixel color components in the image to be processed to obtain an initial white balance parameter; calculating the white balance parameter of the target connected region according to the average value of all pixel color components in the target connected region to obtain a reference white balance parameter; and adjusting the white balance parameter of the image to be processed according to the initial white balance parameter and the reference white balance parameter.
According to an embodiment of the present disclosure, adjusting a white balance parameter of an image to be processed according to an initial white balance parameter and a reference white balance parameter includes: under the condition that the difference value between the reference white balance parameter and the initial white balance parameter is smaller than a second threshold value, taking the reference white balance parameter as a target white balance parameter; and adjusting the initial white balance parameter to the target white balance parameter according to the target white balance parameter.
According to an embodiment of the present disclosure, adjusting a white balance parameter of an image to be processed according to an initial white balance parameter and a reference white balance parameter includes: and under the condition that the difference value between the reference white balance parameter and the initial white balance parameter is larger than or equal to a second threshold value, calculating a target white balance parameter according to the reference white balance parameter, and adjusting the initial white balance parameter to the target white balance parameter according to the target white balance parameter.
A second aspect of the present disclosure provides an image processing apparatus including: an acquisition module for acquiring an image to be processed; a first determination module for determining at least one color connected region from the image to be processed, wherein each color connected region is formed of connected pixels satisfying a first condition; a second determination module for determining, from the at least one color connected region, a target connected region satisfying a second condition; and an adjustment module for adjusting the white balance parameter of the image to be processed according to the target connected region.
A third aspect of the present disclosure provides an electronic device, comprising: one or more processors; and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the image processing method described above.
A fourth aspect of the present disclosure provides a computer-readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to perform the above-described image processing method.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 schematically illustrates an application scenario diagram of an image processing method, apparatus, device, and medium according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow chart of an image processing method according to an embodiment of the disclosure;
FIG. 3 schematically illustrates a flow chart of a method of determining color connected regions in accordance with an embodiment of the present disclosure;
FIG. 4A schematically illustrates a decision flow diagram that satisfies a first condition in accordance with an embodiment of the disclosure;
FIG. 4B schematically illustrates a flowchart of a first parameter updating method according to an embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow chart of a method of determining a target connected region in accordance with an embodiment of the disclosure;
FIG. 6 schematically illustrates a schematic diagram of a method of adjusting white balance parameters according to an embodiment of the disclosure;
fig. 7 schematically illustrates a decision flow chart of a method of adjusting white balance parameters of an image to be processed according to an initial white balance parameter and a reference white balance parameter in accordance with an embodiment of the disclosure;
fig. 8 schematically shows a block diagram of the structure of an image processing apparatus according to an embodiment of the present disclosure; and
fig. 9 schematically illustrates a block diagram of an electronic device adapted to implement an image processing method according to an embodiment of the disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is only exemplary and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. In addition, in the following description, descriptions of well-known structures and techniques are omitted so as not to unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and/or the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It should be noted that the terms used herein should be construed to have meanings consistent with the context of the present specification and should not be construed in an idealized or overly formal manner.
Where an expression such as "at least one of A, B and C" is used, it should generally be interpreted in accordance with its commonly understood meaning (e.g., "a system having at least one of A, B and C" shall include, but not be limited to, a system having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
The embodiment of the disclosure provides an image processing method, an image processing device and an image processing medium, and prior to introducing the technical scheme provided by the embodiment of the disclosure, related technologies related to the disclosure are described.
In the related art, when images of an object are acquired under different light sources, the object appears in different colors because the light reflected from its surface differs. When the image is acquired under a white light source, the color of the object in the image is closest to its true color. Therefore, for an image acquired under an arbitrary light source, white balance processing is needed to bring the colors of objects in the image closer to their true colors.
Currently, white balance processing is generally performed on an image by using the gray world method. The premise of the gray world method is that the average value of each color component over all pixels in the image tends toward the same gray value; the gain coefficient of each color component is calculated from this gray value and the average value of that color component. Finally, each color component of every pixel in the image is adjusted by the calculated gain coefficient to obtain the white-balanced image.
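For illustration only, a minimal sketch of the gray world method as described above could look as follows, assuming an 8-bit RGB image stored as a NumPy array; the function name and the clipping to the 0-255 range are illustrative choices, not part of the disclosure.

```python
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Apply the gray world method to an 8-bit RGB image (H x W x 3)."""
    img = image.astype(np.float64)
    # Average of each color component over all pixels.
    avg_rgb = img.reshape(-1, 3).mean(axis=0)   # [avg_R, avg_G, avg_B]
    gray = avg_rgb.mean()                       # the common gray value K
    gains = gray / avg_rgb                      # gain coefficient per channel
    # Adjust every pixel's color components by the gain coefficients.
    balanced = img * gains
    return np.clip(balanced, 0, 255).astype(np.uint8)
```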
In practice, the gray world method works well when the captured scene is itself close to gray. However, it is easily disturbed by many mixed colors and adapts poorly to scenes that are either very colorful or dominated by a single non-gray color, which reduces the accuracy of the white balance processing.
In order to facilitate an understanding of the embodiments of the present disclosure, related concepts involved in the embodiments of the present disclosure will be briefly described first.
White balance processing accurately estimates the color of the light source and then removes its influence from the image, so that the image looks as if it had been captured under white light.
The color mode here refers to the HSL representation, which maps points of the RGB color model into a cylindrical coordinate system with three components: hue, saturation and lightness. Hue (H) is the basic attribute of a color, i.e., the commonly used color name such as red or yellow. Saturation (S) is the purity of a color: the higher the saturation, the more vivid the color, and the lower the saturation, the closer the color is to gray; it takes values from 0 to 100%. Lightness (L) likewise takes values from 0 to 100%.
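For illustration only, Python's standard colorsys module converts between RGB and this hue/saturation/lightness representation; note that colorsys returns the components in the order (h, l, s), all scaled to the 0-1 range, so the hue in degrees used above corresponds to h * 360. The sample pixel values are arbitrary.

```python
import colorsys

r, g, b = 200 / 255, 30 / 255, 30 / 255          # a reddish pixel, components scaled to 0-1
h, l, s = colorsys.rgb_to_hls(r, g, b)           # note the H, L, S ordering
print(round(h * 360), round(s, 2), round(l, 2))  # hue in degrees, saturation, lightness
```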
The color components refer to red (R), green (G) and blue (B) components of pixels of an image. In the RGB color model, the range of values for each color component is 0-255. By adjusting the intensities of the three components of red, green and blue, countless colors can be mixed, and when the intensities of the three components are all 0, black is obtained; when the intensities of the three components are 255, white is obtained, and when the intensities of the respective components are between 0 and 255, various colors are obtained.
The gray world algorithm is based on the gray world assumption that the average of the R, G, B three components tends to be the same gray K for an image with a large number of color changes.
In a physical sense, the gray world method assumes that, on average, the light reflected by a natural scene is a constant value that is approximately "gray". The algorithm enforces this assumption on the image to be processed in order to eliminate the influence of ambient light.
In view of this, an embodiment of the present disclosure provides an image processing method including: acquiring an image to be processed; determining at least one color connected region from the image to be processed, wherein each color connected region is formed of connected pixels satisfying a first condition; determining, from the at least one color connected region, a target connected region satisfying a second condition; and adjusting the white balance parameter of the image to be processed according to the target connected region. The method first divides the image according to the colors of its pixels to form a plurality of color connected regions of different colors. It then selects, from the information of each color connected region, the connected region most likely to be gray, which effectively eliminates the influence of non-gray regions and thereby improves the accuracy of the white balance adjustment of the image.
Fig. 1 schematically illustrates an application scenario diagram of an image processing method, apparatus, device, and medium according to an embodiment of the present disclosure.
As shown in fig. 1, an application scenario 100 according to this embodiment may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as shopping class applications, web browser applications, search class applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only) may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be a variety of electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablets, laptop and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (by way of example only) providing support for websites browsed by users using the terminal devices 101, 102, 103. The background management server may analyze and process the received data such as the user request, and feed back the processing result (e.g., the web page, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that, the image processing method provided by the embodiment of the present disclosure may be generally executed by the server 105. Accordingly, the image processing apparatus provided by the embodiments of the present disclosure may be generally provided in the server 105. The image processing method provided by the embodiments of the present disclosure may also be performed by a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Accordingly, the image processing apparatus provided by the embodiments of the present disclosure may also be provided in a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
The image processing method of the disclosed embodiment will be described in detail below with reference to fig. 2 to 7 based on the scene described in fig. 1.
Fig. 2 schematically shows a flowchart of an image processing method according to an embodiment of the present disclosure.
As shown in fig. 2, the image processing method of this embodiment includes operations S210 to S240.
In operation S210, a to-be-processed image is acquired.
The image to be processed may be an image acquired by the image acquisition device in real time, or may be a pre-acquired image, for example.
In operation S220, at least one color connected region is determined from the image to be processed, wherein each color connected region is formed of connected pixels satisfying the first condition.
A color connected region is a region formed by a group of adjacent pixels whose color modes are similar.
For example, the first condition may be that the difference between the color mode parameters of two adjacent pixels is smaller than a preset first threshold.
In operation S230, a target connected region satisfying the second condition is determined from the at least one color connected region.
The target connected region is the region closest to gray among the one or more color connected regions.
Illustratively, satisfying the second condition may mean being the region, among the one or more color connected regions, whose white balance parameter is closest to the white balance parameter of the image to be processed.
In operation S240, the white balance parameter of the image to be processed is adjusted according to the target connected region.
In some embodiments, connected regions of similar color are detected according to a preset color connectivity range, so that one or more color connected regions are obtained over the whole image. The region most likely to be gray is then selected using the color information, position information, area information and so on of each color connected region, and a more accurate white balance result is calculated from the parameters of that region using the gray world method.
It can be understood that the image is divided according to the colors of its pixels to form a plurality of color connected regions of different colors. Selecting the connected region most likely to be gray according to the information of each color connected region effectively eliminates the influence of non-gray regions, thereby improving the accuracy of the white balance adjustment of the image. Meanwhile, for regions formed by large solid-color areas of the scene, the method reduces their contribution to the white balance adjustment, so that the white balance is calibrated more accurately.
Fig. 3 schematically illustrates a flowchart of a method of determining color connected regions according to an embodiment of the disclosure.
As shown in fig. 3, operation S220 of this embodiment, determining at least one color connected region from the image to be processed, includes operations S221 to S222.
In operation S221, the pixels of each similar color in the image to be processed are divided into a set, so as to obtain at least one set to be processed.
In operation S222, the pixels of similar color in each set are divided into at least one color connected region according to their position information; the pixels of similar color within each color connected region are connected pixels.
In some embodiments, all pixels of each similar color in the image are gathered into one set, forming one or more sets to be processed. For each set to be processed, the pixels whose positions are connected are grouped together according to the position information of the pixels in the set, forming a color connected region; one set may yield one or more color connected regions.
Illustratively, a two-dimensional coordinate system is established on the image to be processed, with one pixel as the unit, so that the coordinates of every pixel in the image can be obtained. For example, if the coordinates of the red pixels in the image are (2, 1), (2, 2), (3, 1), (3, 2), (5, 5), (5, 6) and (7, 2), then (2, 1), (2, 2), (3, 1) and (3, 2) form one red connected region, (5, 5) and (5, 6) form another red connected region, and (7, 2) forms a red connected region on its own. Three red color connected regions are thus formed in the image.
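For illustration only, the grouping of same-color pixel coordinates into connected regions described in the example above could be sketched as follows; the 4-neighborhood connectivity and the function name are assumptions made for the sketch.

```python
from collections import deque

def connected_regions(coords):
    """Group pixel coordinates of one color set into 4-connected regions."""
    remaining = set(coords)
    regions = []
    while remaining:
        seed = remaining.pop()
        region, queue = {seed}, deque([seed])
        while queue:
            x, y = queue.popleft()
            for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nb in remaining:   # neighbor has the same color and is unvisited
                    remaining.remove(nb)
                    region.add(nb)
                    queue.append(nb)
        regions.append(region)
    return regions

# The red pixels from the example split into three connected regions.
red = [(2, 1), (2, 2), (3, 1), (3, 2), (5, 5), (5, 6), (7, 2)]
print(len(connected_regions(red)))  # 3
```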
It can be understood that the image is divided into different color connected regions according to the principle of color similarity and the positions of the pixels. This allows the color connected regions of the image to be divided quickly and improves the efficiency of the white balance correction.
Fig. 4A schematically illustrates a determination flowchart that satisfies a first condition according to an embodiment of the present disclosure.
In one implementation, as shown in fig. 4A, satisfying the first condition in operation S220 includes operations S410 to S450.
In operation S410, a target pixel is determined from an image to be processed.
In some embodiments, the target pixel may be any pixel selected in the coordinate system of the image to be processed.
In operation S420, a first parameter of the target pixel and second parameters of M candidate pixels adjacent to the target pixel are obtained, wherein the first parameter represents color mode information of the target pixel and the second parameter represents color mode information of the candidate pixels.
For example, the first parameter and the second parameter each include the hue (H), saturation (S) and lightness (L) information of the pixel.
In operation S430, it is determined whether the difference between the second parameter and the first parameter is smaller than the first threshold.
In the case where the above operation S430 is judged yes:
in operation S440, in a case where it is determined that the difference between the second parameter and the first parameter is smaller than the first threshold, the candidate pixel is determined to be a connected pixel, and N connected pixels are thus determined from the M candidate pixels, where M and N are integers greater than or equal to zero and M is greater than or equal to N.
In the case where the above operation S430 is judged no:
in operation S440', in a case where it is determined that the difference between the second parameter and the first parameter is greater than or equal to the first threshold, the candidate pixel is determined not to be a connected pixel.
In operation S450, the target pixel is combined with the N connected pixels to obtain a color connected region.
Illustratively, in the HSL color mode, limits on H, S and L (i.e., the preset first threshold) are set as the criteria for judging that two pixel colors are close. When the difference between the color mode parameters of two adjacent pixels exceeds the first threshold, the color difference between the two pixels is large and they cannot be judged to be similar in color. Suppose, for example, that H of two adjacent pixels may differ by at most 2 degrees, S by at most 0.1 and L by at most 0.1. Starting from the first pixel of the picture, each pixel is compared one by one with its surrounding neighbors: if a neighbor satisfies the limits, it belongs to the same color connected region and the check continues from the next pixel; if none of the neighbors satisfies the limits, the connected region ends there. Detection of the next color connected region then starts from a pixel not belonging to this region, and the image is finally divided into one or more color connected regions according to the principle of color similarity.
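A possible sketch of this threshold-based region growing is shown below; it assumes the image has already been converted to an H x W x 3 HSL array, reuses the 2-degree / 0.1 / 0.1 limits from the example, and treats hue as circular, which is an illustrative choice.

```python
import numpy as np
from collections import deque

H_LIMIT, S_LIMIT, L_LIMIT = 2.0, 0.1, 0.1  # example first threshold from the text

def hue_diff(h1, h2):
    d = abs(h1 - h2) % 360.0
    return min(d, 360.0 - d)

def grow_region(hsl, seed, visited):
    """Grow one color connected region from a seed pixel on an H x W x 3 HSL array."""
    h, w, _ = hsl.shape
    region, queue = [seed], deque([seed])
    visited[seed] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= ny < h and 0 <= nx < w and not visited[ny, nx]:
                dh = hue_diff(hsl[y, x, 0], hsl[ny, nx, 0])
                ds = abs(hsl[y, x, 1] - hsl[ny, nx, 1])
                dl = abs(hsl[y, x, 2] - hsl[ny, nx, 2])
                if dh <= H_LIMIT and ds <= S_LIMIT and dl <= L_LIMIT:
                    visited[ny, nx] = True
                    region.append((ny, nx))
                    queue.append((ny, nx))
    return region

def color_connected_regions(hsl):
    """Divide the whole image into color connected regions."""
    h, w, _ = hsl.shape
    visited = np.zeros((h, w), dtype=bool)
    regions = []
    for y in range(h):
        for x in range(w):
            if not visited[y, x]:
                regions.append(grow_region(hsl, (y, x), visited))
    return regions
```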
It should be noted that, in the embodiment of the present disclosure, the setting of the first threshold is not specifically limited, and may be set according to an actual application scenario.
It should be noted that satisfying the first condition is equally applicable to the division of the pixels having similar colors in operation S221.
Fig. 4B schematically illustrates a flowchart of a first parameter updating method according to an embodiment of the present disclosure.
In some embodiments, as shown in fig. 4B, in operation S440, in the case where the difference between the second parameter and the first parameter is smaller than the first threshold, the candidate pixel is determined as a connected pixel, and N connected pixels are determined from M candidate pixels, including operations S441 to S442.
In operation S441, in a case where it is determined that the m-th candidate pixel is a connected pixel, the second parameter of the m-th candidate pixel is taken as the updated first parameter, where m ranges from 1 to M.
In operation S442, in case that the difference value between the second parameter of the (m+1) -th candidate pixel and the updated first parameter is smaller than the first threshold value, the (m+1) -th candidate pixel is determined as the connected pixel.
In some embodiments, the m+1th candidate pixel is determined to be a connected pixel if the difference between the second parameter of the m+1th candidate pixel and the updated first parameter is less than a first threshold and if the difference between the second parameter of the m+1th candidate pixel and the first parameter of the target pixel is determined to be less than a third threshold.
For example, suppose the H value of the target pixel is 5 and the first threshold on H is 0.1. To judge whether another pixel is a connected pixel of the target pixel, it is first determined whether the difference between the H value of the adjacent candidate pixel and 5 is less than 0.1. If the H value of the first adjacent candidate pixel is 5.2, it cannot be merged with the target pixel as a connected pixel, and the color connected region is cut off at that candidate pixel. If the H value of the second adjacent candidate pixel is 5.05, it can be merged with the target pixel as a connected pixel to form a color connected region. When the third candidate pixel is judged, its H value is compared with 5.05 (the updated first parameter), and so on for the remaining candidate pixels; for example, if the H value of the third candidate pixel is 5.12, the difference between 5.12 and 5.05 is calculated and compared with 0.1. Now suppose the third threshold is 0.3 and the H value of the fourth candidate pixel is 5.2. When it is judged, the difference between 5.2 and 5.12 is calculated and compared with 0.1, and at the same time the difference between 5.2 and 5 (the first parameter of the original target pixel) is compared with 0.3. Only when both the first threshold and the third threshold are satisfied can the fourth candidate pixel be merged, together with the target pixel, into the color connected region.
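For illustration only, the chained comparison with an updated first parameter and an overall drift limit (the third threshold) could be sketched for a single H channel as follows; the threshold values repeat those of the example and the function name is an assumption.

```python
def connected_along_chain(h_values, first_threshold=0.1, third_threshold=0.3):
    """Walk a chain of candidate H values starting at the target pixel's H value.

    A candidate is a connected pixel if it differs from the previously accepted
    pixel by less than first_threshold and from the original target pixel by
    less than third_threshold.
    """
    target_h = h_values[0]
    reference = target_h      # the first parameter, updated as pixels are accepted
    connected = []
    for h in h_values[1:]:
        if abs(h - reference) < first_threshold and abs(h - target_h) < third_threshold:
            connected.append(h)
            reference = h     # update the first parameter
        else:
            break             # the connected region is cut off here
    return connected

# Values from the example: 5.05 and 5.12 are accepted; 5.2 is accepted because
# |5.2 - 5.12| < 0.1 and |5.2 - 5| < 0.3.
print(connected_along_chain([5, 5.05, 5.12, 5.2]))
```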
It can be understood that continuously updating the first parameter prevents a gradual color change from failing to be detected as a single color connected region, while the third threshold keeps the accumulated drift of such a gradient from growing too large. This eliminates, as far as possible, errors in judging which pixels belong to a color connected region and improves the accuracy of the white balance adjustment.
Fig. 5 schematically illustrates a flowchart of a method of determining a target connected region according to an embodiment of the present disclosure.
In some embodiments, as shown in fig. 5, operation S230, determining a target connected region satisfying the second condition from the at least one color connected region, includes operations S231 to S233.
In operation S231, the parameter information of each color connected region is acquired.
In some embodiments, the parameter information includes: position information, area information, chromaticity information, and color span range.
In operation S232, the at least one color connected region is sorted according to the parameter information to obtain an ordered sequence.
In operation S233, the target connected region is determined based on the ordered sequence, wherein the ordered sequence characterizes how close the white balance parameter of each color connected region is to the white balance parameter of the image to be processed.
In some embodiments, operation S232, sorting the at least one color connected region according to the parameter information to obtain the ordered sequence, includes:
calculating a contribution value of each color connected region according to the parameter information; and
sorting the color connected regions corresponding to the at least one contribution value by the size of their contribution values to obtain the ordered sequence.
The at least one color connected region corresponding to the largest contribution value is then determined from the ordered sequence as the target connected region.
Illustratively, a weight is calculated for each region in four dimensions (i.e., four parameters) from the parameter information. Area: the larger the area, the higher the weight. Chromaticity: the duller the color, the higher the weight; the more vivid the color, the lower the weight. Position: the position weight is slightly more complex and is related to the brightness of the environment; if the brightness is high and the scene is outdoors, the upper part of the picture gets a low weight and the middle and lower parts get a high weight, while if the brightness is normal and the scene is indoors, the middle of the picture gets a high weight and the surroundings get low weights. Color span range: the larger the color span range, the lower the weight; the more concentrated the colors, the higher the weight, because gray has no color gradient, and a region with a large chroma span is mostly not a gray object. The weights of the four parameters of each region are averaged to obtain the contribution value of that region. The contribution values of the color connected regions are then ranked to obtain the ordered sequence (the ranking criterion being that the larger the contribution value, the more likely the region is gray, and the smaller the contribution value, the less likely it is gray), and the target connected region is selected from the first few entries of the ordered sequence.
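A minimal sketch of how such per-region weights could be averaged into a contribution value and ranked is given below; the weight values and region names are placeholders standing in for the heuristics described above, not the disclosed rules themselves.

```python
from dataclasses import dataclass

@dataclass
class RegionInfo:
    area_weight: float      # larger area -> higher weight
    chroma_weight: float    # duller color -> higher weight
    position_weight: float  # depends on scene brightness and region position
    span_weight: float      # smaller color span -> higher weight

def contribution(info: RegionInfo) -> float:
    """Average the four per-dimension weights into a single contribution value."""
    return (info.area_weight + info.chroma_weight +
            info.position_weight + info.span_weight) / 4.0

def rank_regions(regions: dict[str, RegionInfo]) -> list[tuple[str, float]]:
    """Sort regions so that the region most likely to be gray comes first."""
    scored = {name: contribution(info) for name, info in regions.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical regions: the highest-ranked one is taken as the target connected region.
ranking = rank_regions({
    "wall":  RegionInfo(0.9, 0.8, 0.7, 0.9),
    "shirt": RegionInfo(0.4, 0.2, 0.5, 0.6),
})
target_name = ranking[0][0]   # "wall"
```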
It will be appreciated that, on the one hand, judging from multiple dimensions whether a color connected region is a gray region makes the evaluation of how close the region is to gray more objective, and improves the accuracy of selecting the target connected region in the image. On the other hand, the color connected region method effectively divides the picture and assigns different contribution values to different regions according to the above rules: a region with a higher contribution value is more likely to be gray, a region with a lower contribution value is less likely to be gray, and the final white balance calculation is driven by the contributions of these regions, so the influence of non-gray regions is effectively eliminated. Furthermore, because of the chromaticity of a pure color, the method lowers the contribution value of a pure-color connected region and raises the contribution of the rest of the scene, so that the white balance can be corrected appropriately.
Fig. 6 schematically illustrates a schematic diagram of a method of adjusting white balance parameters according to an embodiment of the present disclosure.
In some embodiments, as shown in fig. 6, operation S240, adjusting the white balance parameter of the image to be processed according to the target connected region, includes operations S241 to S243.
In operation S241, the white balance parameter of the image to be processed is calculated according to the average value of all pixel color components in the image to be processed, so as to obtain the initial white balance parameter.
In operation S242, the white balance parameter of the target connected region is calculated according to the average value of all pixel color components in the target connected region, to obtain the reference white balance parameter.
In operation S243, the white balance parameter of the image to be processed is adjusted according to the initial white balance parameter and the reference white balance parameter.
In some embodiments, the gray world algorithm is used to calculate the average values of the R, G and B channels of the image to be processed and the gain coefficients of the three channels, giving the initial white balance parameter of the image to be processed. Similarly, the gray world algorithm is used to calculate the average values of the R, G and B channels within each target connected region and the corresponding gain coefficients, giving the reference white balance parameter of each target connected region.
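For illustration only, the initial and reference white balance parameters (channel gains) could be computed as in the sketch below, assuming the image is an 8-bit RGB NumPy array and the target connected region is given as a boolean mask; both function names are assumptions.

```python
import numpy as np

def channel_gains(pixels: np.ndarray) -> np.ndarray:
    """Gray-world gain coefficients [gR, gG, gB] for an (N, 3) array of RGB pixels."""
    avg = pixels.astype(np.float64).mean(axis=0)
    return avg.mean() / avg

def white_balance_parameters(image: np.ndarray, region_mask: np.ndarray):
    """Initial gains over the whole image and reference gains over the target region."""
    initial = channel_gains(image.reshape(-1, 3))
    reference = channel_gains(image[region_mask])   # pixels inside the target connected region
    return initial, reference
```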
It can be understood that the picture is divided by the color connected region method and the initial white balance parameter is corrected according to a white balance parameter measured inside the image (the reference white balance parameter). This fits the actual shooting scene of the image more closely and improves the accuracy of the white balance adjustment.
Fig. 7 schematically illustrates a decision flow chart of a method of adjusting white balance parameters of an image to be processed according to an initial white balance parameter and a reference white balance parameter according to an embodiment of the disclosure.
In some embodiments, as shown in fig. 7, operation S243, adjusting the white balance parameter of the image to be processed according to the initial white balance parameter and the reference white balance parameter, includes operations S2431 to S2433.
In operation S2431, it is determined whether the difference between the reference white balance parameter and the initial white balance parameter is less than a second threshold.
For the case where the determination in operation S2431 is yes:
in operation S2432, in case it is determined that the difference between the reference white balance parameter and the initial white balance parameter is less than the second threshold, the reference white balance parameter is taken as the target white balance parameter.
In operation S2433, the initial white balance parameter is adjusted to the target white balance parameter according to the target white balance parameter.
For the case where no is determined in operation S2431:
in operation S2432', in case it is determined that the difference between the reference white balance parameter and the initial white balance parameter is greater than or equal to the second threshold, the target white balance parameter is calculated from the reference white balance parameter.
In operation S2433', the initial white balance parameter is adjusted to the target white balance parameter according to the target white balance parameter.
Specifically, after the reference white balance parameter and the initial white balance parameter are obtained, the difference between them is calculated. If the difference is smaller than the second threshold, the reference white balance parameter is relatively close to the initial white balance parameter, and the reference white balance parameter can be used directly as the target white balance parameter, i.e., as the target to which the white balance of the image to be processed is adjusted. If the differences between the initial white balance parameter and the reference white balance parameters of the color connected regions corresponding to the leading contribution values in the ordered sequence are all greater than or equal to the second threshold, which indicates a large gap between the reference and initial white balance parameters, the target white balance parameter is obtained by further processing the reference white balance parameters, and the white balance of the image to be processed is adjusted toward that target.
The following exemplifies how the target white balance parameter may be calculated in the case where the determination in operation S2431 is no. For example, take the reference white balance parameters of the color connected regions corresponding to the first three contribution values in the ordered sequence (A1, A2 and A3, respectively), whose contribution values are B1, B2 and B3, and calculate the target white balance parameter T according to the following relation.
T = (A1*B1 + A2*B2 + A3*B3) / (B1 + B2 + B3).
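For illustration only, the weighted combination above could be computed as follows; handling each channel's gain coefficient independently is an assumption of the sketch, and the numeric values are hypothetical.

```python
import numpy as np

def target_white_balance(reference_params, contributions):
    """Contribution-weighted average of the top-ranked reference white balance parameters.

    reference_params: list of per-channel gain arrays [A1, A2, A3, ...]
    contributions:    matching contribution values [B1, B2, B3, ...]
    """
    a = np.asarray(reference_params, dtype=np.float64)   # shape (k, 3)
    b = np.asarray(contributions, dtype=np.float64)      # shape (k,)
    return (a * b[:, None]).sum(axis=0) / b.sum()         # T = sum(Ai*Bi) / sum(Bi)

# Example with three hypothetical regions:
T = target_white_balance(
    [np.array([1.10, 1.00, 0.95]),
     np.array([1.05, 1.00, 0.98]),
     np.array([1.20, 1.00, 0.90])],
    [0.8, 0.7, 0.5],
)
```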
It will be appreciated that, by obtaining the target white balance parameter of the image from a combination of its representative color connected regions and adjusting the white balance parameter accordingly, the calculation is based on the actual scene information of the image (the parameter information reflected by its pixels). The adjustment of the white balance is therefore more concrete and grounded in practice, and its accuracy is improved.
Based on the image processing method, the disclosure also provides an image processing device. The device will be described in detail below in connection with fig. 8.
Fig. 8 schematically shows a block diagram of the structure of an image processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 8, the image processing apparatus 800 of this embodiment includes an acquisition module 810, a first determination module 820, a second determination module 830 and an adjustment module 840.
The acquisition module 810 is configured to acquire the image to be processed. In an embodiment, the acquisition module 810 may be configured to perform the operation S210 described above, which is not repeated here.
The first determination module 820 is configured to determine at least one color connected region from the image to be processed, where each color connected region is formed of connected pixels satisfying a first condition. In an embodiment, the first determination module 820 may be configured to perform the operation S220 described above, which is not repeated here.
The second determination module 830 is configured to determine, from the at least one color connected region, a target connected region satisfying a second condition. In an embodiment, the second determination module 830 may be configured to perform the operation S230 described above, which is not repeated here.
The adjustment module 840 is configured to adjust the white balance parameter of the image to be processed according to the target connected region. In an embodiment, the adjustment module 840 may be configured to perform the operation S240 described above, which is not repeated here.
Any of the acquisition module 810, the first determination module 820, the second determination module 830, and the adjustment module 840 may be combined in one module to be implemented, or any of them may be split into a plurality of modules, according to an embodiment of the present disclosure. Alternatively, at least some of the functionality of one or more of the modules may be combined with at least some of the functionality of other modules and implemented in one module. According to embodiments of the present disclosure, at least one of the acquisition module 810, the first determination module 820, the second determination module 830, and the adjustment module 840 may be implemented at least in part as hardware circuitry, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable way of integrating or packaging the circuitry, or in any one of or a suitable combination of three of software, hardware, and firmware. Alternatively, at least one of the acquisition module 810, the first determination module 820, the second determination module 830, and the adjustment module 840 may be at least partially implemented as computer program modules, which when executed, may perform the respective functions.
Fig. 9 schematically illustrates a block diagram of an electronic device adapted to implement an image processing method according to an embodiment of the disclosure.
As shown in fig. 9, an electronic device 900 according to an embodiment of the present disclosure includes a processor 901 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage portion 908 into a Random Access Memory (RAM) 903. The processor 901 may include, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or an associated chipset and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), or the like. Processor 901 may also include on-board memory for caching purposes. Processor 901 may include a single processing unit or multiple processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
In the RAM 903, various programs and data necessary for the operation of the electronic device 900 are stored. The processor 901, the ROM 902, and the RAM 903 are connected to each other by a bus 904. The processor 901 performs various operations of the method flow according to the embodiments of the present disclosure by executing programs in the ROM 902 and/or the RAM 903. Note that the program may be stored in one or more memories other than the ROM 902 and the RAM 903. The processor 901 may also perform various operations of the method flow according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the disclosure, the electronic device 900 may also include an input/output (I/O) interface 905, the input/output (I/O) interface 905 also being connected to the bus 904. The electronic device 900 may also include one or more of the following components connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, and the like; an output portion 907 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 908 including a hard disk or the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the internet. The drive 910 is also connected to the I/O interface 905 as needed. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is installed as needed on the drive 910 so that a computer program read out therefrom is installed into the storage section 908 as needed.
The present disclosure also provides a computer-readable storage medium that may be embodied in the apparatus/device/system described in the above embodiments; or may exist alone without being assembled into the apparatus/device/system. The computer-readable storage medium carries one or more programs which, when executed, implement methods in accordance with embodiments of the present disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example, but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the present disclosure, the computer-readable storage medium may include ROM 902 and/or RAM 903 and/or one or more memories other than ROM 902 and RAM 903 described above.
Embodiments of the present disclosure also include a computer program product comprising a computer program containing program code for performing the methods shown in the flowcharts. The program code, when executed in a computer system, causes the computer system to implement the image processing method provided by the embodiments of the present disclosure.
The above-described functions defined in the system/apparatus of the embodiments of the present disclosure are performed when the computer program is executed by the processor 901. The systems, apparatus, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the disclosure.
In one embodiment, the computer program may be based on a tangible storage medium such as an optical storage device, a magnetic storage device, or the like. In another embodiment, the computer program may also be transmitted, distributed, and downloaded and installed in the form of a signal on a network medium, via communication portion 909, and/or installed from removable medium 911. The computer program may include program code that may be transmitted using any appropriate network medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
According to embodiments of the present disclosure, program code for carrying out the computer programs provided by the embodiments of the present disclosure may be written in any combination of one or more programming languages; in particular, such computer programs may be implemented in high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. Programming languages include, but are not limited to, Java, C++, Python, "C" or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., via the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments of the disclosure and/or in the claims may be provided in a variety of combinations and/or combinations, even if such combinations or combinations are not explicitly recited in the disclosure. In particular, the features recited in the various embodiments of the present disclosure and/or the claims may be variously combined and/or combined without departing from the spirit and teachings of the present disclosure. All such combinations and/or combinations fall within the scope of the present disclosure.
The embodiments of the present disclosure are described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described above separately, this does not mean that the measures in the embodiments cannot be used advantageously in combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be made by those skilled in the art without departing from the scope of the disclosure, and such alternatives and modifications are intended to fall within the scope of the disclosure.

Claims (11)

1. An image processing method, comprising:
acquiring an image to be processed;
determining at least one color connected region from the image to be processed, wherein each color connected region is formed of connected pixels satisfying a first condition;
determining, from the at least one color connected region, a target connected region satisfying a second condition; and
adjusting a white balance parameter of the image to be processed according to the target connected region.
2. The method of claim 1, wherein determining a target connected region satisfying a second condition from the at least one color connected region comprises:
acquiring parameter information of each color connected region;
sorting the at least one color connected region according to the parameter information to obtain an ordered sequence; and
determining the target connected region based on the ordered sequence, wherein the ordered sequence characterizes how close the white balance parameter of each color connected region is to the white balance parameter of the image to be processed.
3. The method of claim 1, wherein determining at least one color connected region from the image to be processed comprises:
grouping pixels of similar color in the image to be processed into a set to obtain at least one set to be processed; and
dividing the similar-color pixels in each set into at least one color connected region according to the position information of the similar-color pixels, wherein the similar-color pixels in each color connected region are the connected pixels.
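A rough sketch of one possible reading of claim 3, under the assumption that "similar color" means falling into the same coarse color-quantization bin; the bin size, function name, and use of scipy's connected-component labeling are illustrative choices, not part of the claim.

```python
import numpy as np
from scipy import ndimage

def color_connected_regions(image, bin_size=32):
    """image: H x W x 3 uint8 RGB array. Groups pixels into coarse color bins,
    then splits each bin into spatially connected regions by pixel position."""
    bins = (image // bin_size).astype(np.int32)                  # coarse "similar color" quantization
    keys = bins[..., 0] * 64 + bins[..., 1] * 8 + bins[..., 2]   # one id per color bin (8 bins per channel)
    regions = []
    for key in np.unique(keys):
        mask = keys == key                                       # all pixels of this similar color
        labeled, count = ndimage.label(mask)                     # 4-connected components by position
        for label in range(1, count + 1):
            regions.append(np.argwhere(labeled == label))        # pixel coordinates of one region
    return regions
```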
4. The method according to claim 1 or 3, wherein satisfying the first condition comprises:
determining a target pixel from the image to be processed;
acquiring a first parameter of the target pixel and second parameters of M candidate pixels adjacent to the target pixel, wherein the first parameter represents color mode information of the target pixel and the second parameters represent color mode information of the candidate pixels;
determining a candidate pixel as a connected pixel when the difference between its second parameter and the first parameter is smaller than a first threshold, so as to determine N connected pixels from the M candidate pixels, wherein M and N are integers greater than or equal to zero and M is greater than or equal to N; and
merging the target pixel with the N connected pixels to obtain a color connected region.
5. The method of claim 4, wherein determining a candidate pixel as a connected pixel when the difference between its second parameter and the first parameter is smaller than the first threshold, so as to determine N connected pixels from the M candidate pixels, comprises:
taking the second parameter of the m-th candidate pixel as an updated first parameter when the m-th candidate pixel is a connected pixel, wherein m ranges from 1 to M; and
determining the (m+1)-th candidate pixel as a connected pixel when the difference between the second parameter of the (m+1)-th candidate pixel and the updated first parameter is smaller than the first threshold.
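A hedged sketch of the running comparison described in claims 4 and 5, where the reference parameter is updated to the last admitted candidate's parameter; treating the "color mode information" as a single scalar per pixel is an assumption made only to keep the example short.

```python
def grow_connected_pixels(first_param, candidate_params, first_threshold):
    """candidate_params: ordered color parameters of the M neighbouring candidate pixels.
    Returns indices of the N candidates accepted as connected pixels."""
    connected = []
    reference = first_param
    for m, second_param in enumerate(candidate_params):
        if abs(second_param - reference) < first_threshold:
            connected.append(m)           # candidate m becomes a connected pixel (claim 4)
            reference = second_param      # updated first parameter for the next candidate (claim 5)
    return connected
```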
6. The method of claim 1, wherein adjusting the white balance parameter of the image to be processed according to the target connected region comprises:
calculating a white balance parameter of the image to be processed from the average of the color components of all pixels in the image to be processed to obtain an initial white balance parameter;
calculating a white balance parameter of the target connected region from the average of the color components of all pixels in the target connected region to obtain a reference white balance parameter; and
adjusting the white balance parameter of the image to be processed according to the initial white balance parameter and the reference white balance parameter.
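A minimal sketch of the per-image and per-region calculation in claim 6, assuming the white balance parameter takes the common gray-world form of an (R gain, B gain) pair derived from channel means; that specific form is an assumption, not stated in the claim.

```python
import numpy as np

def white_balance_parameter(pixels, eps=1e-6):
    """pixels: any array of RGB triples (whole image or one connected region).
    Returns (r_gain, b_gain) derived from the mean of each color component."""
    r_mean, g_mean, b_mean = np.asarray(pixels, dtype=np.float64).reshape(-1, 3).mean(axis=0)
    return g_mean / (r_mean + eps), g_mean / (b_mean + eps)   # eps guards against empty channels

# initial_wb = white_balance_parameter(image)                # claim 6: all pixels of the image
# reference_wb = white_balance_parameter(region_pixels)      # claim 6: pixels of the target region
```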
7. The method of claim 6, wherein adjusting the white balance parameter of the image to be processed according to the initial white balance parameter and the reference white balance parameter comprises:
taking the reference white balance parameter as a target white balance parameter when the difference between the reference white balance parameter and the initial white balance parameter is smaller than a second threshold; and
adjusting the initial white balance parameter to the target white balance parameter.
8. The method of claim 6, wherein adjusting the white balance parameter of the image to be processed according to the initial white balance parameter and the reference white balance parameter comprises:
calculating a target white balance parameter according to the reference white balance parameter when the difference between the reference white balance parameter and the initial white balance parameter is greater than or equal to a second threshold; and
adjusting the initial white balance parameter to the target white balance parameter.
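Claims 7 and 8 together describe a threshold branch; the sketch below illustrates it, but the blend used when the difference is large is purely an assumption, since claim 8 only states that the target parameter is calculated according to the reference parameter.

```python
def target_white_balance(initial_wb, reference_wb, second_threshold, weight=0.5):
    """Choose the target white balance parameter from the two estimates."""
    diff = max(abs(r - i) for r, i in zip(reference_wb, initial_wb))
    if diff < second_threshold:
        return reference_wb               # claim 7: adopt the reference parameter directly
    # claim 8: derive the target parameter from the reference parameter;
    # a blend toward the initial parameter is one illustrative choice, not the claimed formula
    return tuple(i + weight * (r - i) for i, r in zip(initial_wb, reference_wb))
```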
9. An image processing apparatus, comprising:
an acquisition module configured to acquire an image to be processed;
a first determining module configured to determine at least one color connected region from the image to be processed, wherein each color connected region is formed by connected pixels satisfying a first condition;
a second determining module configured to determine a target connected region satisfying a second condition from the at least one color connected region; and
an adjusting module configured to adjust a white balance parameter of the image to be processed according to the target connected region.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-8.
11. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method according to any of claims 1-8.
CN202311810649.5A 2023-12-26 2023-12-26 Image processing method, device, equipment and medium Pending CN117714659A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311810649.5A CN117714659A (en) 2023-12-26 2023-12-26 Image processing method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311810649.5A CN117714659A (en) 2023-12-26 2023-12-26 Image processing method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN117714659A true CN117714659A (en) 2024-03-15

Family

ID=90147877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311810649.5A Pending CN117714659A (en) 2023-12-26 2023-12-26 Image processing method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117714659A (en)

Similar Documents

Publication Publication Date Title
EP3391651B1 (en) Dynamic video overlays
US9554109B2 (en) Identifying gray regions for auto white balancing
US7706606B1 (en) Fast, adaptive color to grayscale conversion
US10812767B2 (en) White balance processing method, electronic device and computer readable storage medium
US20180098043A1 (en) Assisted Auto White Balance
US20150304525A1 (en) Color correction based on multiple images
WO2020151590A1 (en) Display current determination and compensation method and device, and display device and storage medium
US20220139013A1 (en) Image and Text Typesetting Method and Related Apparatus Thereof
US9036047B2 (en) Apparatus and techniques for image processing
US20200193578A1 (en) Method and system for image enhancement
WO2023046112A1 (en) Document image enhancement method and apparatus, and electronic device
CN111312141B (en) Color gamut adjusting method and device
US8837829B2 (en) Image processing apparatus, storage medium storing image processing program, and image processing method
CN109272526B (en) Image processing method and system and electronic equipment
US10510281B2 (en) Image processing apparatus and method, and electronic device
US8805063B2 (en) Method and apparatus for detecting and compensating for backlight frame
US20140314317A1 (en) Method and apparatus for converting gray level of color image
CN116708756A (en) Sensor accuracy detection method, detection device, electronic device, and storage medium
CN117714659A (en) Image processing method, device, equipment and medium
CN115775215A (en) Image processing method, image processing device, electronic equipment and storage medium
CN113763258A (en) Beautifying method and device for video image
US11837193B2 (en) Off-axis color correction in dynamic image capture of video wall displays
US20240135899A1 (en) Off-Axis Color Correction in Dynamic Image Capture of Video Wall Displays
CN117292158A (en) Image processing method and device, electronic equipment and storage medium
US20200160773A1 (en) Display adjustment method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination