CN116614716A - Image processing method, image processing device, storage medium, and electronic apparatus - Google Patents

Image processing method, image processing device, storage medium, and electronic apparatus

Info

Publication number
CN116614716A
Authority
CN
China
Prior art keywords
processed
information
block
areas
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310562470.6A
Other languages
Chinese (zh)
Inventor
李林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202310562470.6A
Publication of CN116614716A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88 Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a computer-readable storage medium, and an electronic device. The method includes: acquiring block information of each block of an original image, wherein the block information comprises one or more of depth information, brightness information, and color temperature information; dividing the original image into regions based on the block information of each block to obtain one or more regions to be processed; and performing white balance processing on each region to be processed separately to obtain a target image. By dividing the image into regions and performing white balance processing separately and independently on the different color temperature regions of the original image, the method and device can improve the accuracy of the white balance.

Description

Image processing method, image processing device, storage medium, and electronic apparatus
Technical Field
The present disclosure relates to the field of image processing technology, and in particular, to an image processing method, an image processing apparatus, a computer-readable storage medium, and an electronic device.
Background
When the human eye observes an object, it adapts to the properties of ambient light of different color temperatures, so the perceived color of the object stays consistent; an object photographed by a camera under ambient light of different color temperatures, however, shows color deviations. White balance restores the colors of the image so that the colors affected by the ambient light remain consistent with the true colors of the object.
In the related white balance technology, the average color temperature of the whole image is used as the final result for color restoration. However, in a multi-color-temperature light source scene, different areas of the same image may have different color temperatures, and the average color temperature cannot accurately represent the actual color temperature of each area; as a result, the accuracy of the white balance processing is low, which affects the accuracy of color restoration.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an image processing method, an image processing apparatus, a computer-readable storage medium, and an electronic device, thereby improving the accuracy of white balance color reproduction at least to some extent.
According to a first aspect of the present disclosure, there is provided an image processing method including: acquiring block information of each block of an original image, wherein the block information comprises one or more of depth information, brightness information, and color temperature information; dividing the original image into regions based on the block information of each block to obtain one or more regions to be processed; and performing white balance processing on each region to be processed separately to obtain a target image.
According to a second aspect of the present disclosure, there is provided an image processing apparatus including: a block information acquisition module configured to acquire block information of each block of an original image, the block information including one or more of depth information, luminance information, and color temperature information; a region division module configured to divide the original image into regions based on the block information of each block to obtain one or more regions to be processed; and a regional white balance processing module configured to perform white balance processing on each region to be processed separately to obtain a target image.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image processing method of the first aspect described above.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising: one or more processors; and a memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image processing method according to the first aspect described above.
According to a fifth aspect of the present disclosure there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the steps of the image processing method as described in the first aspect.
The technical scheme of the present disclosure has the following beneficial effects:
In the present disclosure, on one hand, by dividing the original image into blocks and then dividing it into regions according to the block information to obtain different color temperature regions, the accuracy of the color temperature region division can be improved; on the other hand, the original image is divided into different color temperature regions, and white balance processing is then performed separately and independently on each region, so that the actual color of each color temperature region can be restored more faithfully, improving the accuracy of the white balance.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
FIG. 1 illustrates a schematic diagram of a multi-color-temperature scene in an exemplary embodiment of the present disclosure;
FIG. 2 shows a schematic diagram of a system architecture to which exemplary embodiments of the present disclosure may be applied;
FIG. 3 illustrates a flow diagram of an image processing method in an exemplary embodiment of the present disclosure;
FIG. 4 illustrates a flow diagram of a method of region segmentation of an original image in an exemplary embodiment of the present disclosure;
fig. 5 illustrates a result diagram after region division according to depth information in an exemplary embodiment of the present disclosure;
fig. 6 is a diagram showing a result of area division according to luminance information in an exemplary embodiment of the present disclosure;
fig. 7 is a diagram showing a result after performing region division according to color temperature information in an exemplary embodiment of the present disclosure;
FIG. 8 illustrates a schematic view of the divided regions to be processed in an exemplary embodiment of the present disclosure;
FIG. 9 illustrates a flow diagram of a method of determining a region to be processed in an exemplary embodiment of the present disclosure;
fig. 10A illustrates another result diagram after region division according to depth information in an exemplary embodiment of the present disclosure;
fig. 10B is a diagram showing a result after area division according to luminance information in another exemplary embodiment of the present disclosure;
Fig. 10C shows a result diagram after region division according to color temperature information in another exemplary embodiment of the present disclosure;
FIG. 10D illustrates a schematic diagram of a partitioned candidate region in an exemplary embodiment of the present disclosure;
FIG. 10E illustrates a schematic diagram of a candidate region being merged to obtain a region to be processed in one exemplary embodiment of the present disclosure;
FIG. 11 illustrates a flow diagram of another method of determining a region to be processed in an exemplary embodiment of the present disclosure;
FIG. 12 is a flow chart of a method for determining a region to be processed based on candidate regions in an exemplary embodiment of the disclosure;
fig. 13 shows a composition schematic diagram of an image processing apparatus in an exemplary embodiment of the present disclosure;
fig. 14 shows a schematic diagram of an electronic device to which exemplary embodiments of the present disclosure may be applied.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
When the human eye observes an object, it adapts to the properties of ambient light of different color temperatures, so the perceived color of the object stays consistent; an object photographed by a camera under ambient light of different color temperatures, however, shows color deviations. White balance restores the colors of the image so that the colors affected by the ambient light remain consistent with the true colors of the object.
The white balance method in the related art determines the mapping between (R/G, B/G) coordinates and color temperature from pre-experimental data and empirical data. It then computes the distribution of the R (red channel), G (green channel), and B (blue channel) components of the original image, calculates the image's (R/G average, B/G average), and obtains the corresponding color temperature from the mapping; this color temperature is taken as the final color temperature of the original image, based on which white balance processing is performed.
In other words, the related art treats the whole image as if it were lit by a single light source. In actual scenes, however, multi-light-source scenes are widespread. When an image contains many different actual color temperatures with large differences among them, the average value cannot accurately represent the actual condition of each area; for an area whose actual color temperature differs greatly from the average color temperature, the color restoration effect is poor, which reduces the color restoration accuracy of the white balance.
As shown in fig. 1, there are 2 light sources: an indoor incandescent light source 11 and an outdoor solar light source 12. Assume that the color temperature of the incandescent light source 11 is 3000K and that of the solar light source 12 is 5000K, with their irradiation ranges as shown in fig. 1: the left 1/2 region and the lower-right 1/4 region are affected by both the indoor and outdoor light sources, while the upper-right 1/4 region is affected only by outdoor sunlight. Assume further that the actual color temperature of the left 1/2 region is 3000K, that of the upper-right 1/4 region is 5000K, and that of the lower-right 1/4 region is 4000K. Taking the most commonly used white balance algorithm in the related art, the gray world algorithm, as an example, the color temperature of the whole image is obtained as a weighted average of the actual color temperatures of the regions, i.e., some value between 3000K and 5000K. Whatever that value is, it differs from the actual color temperature of at least two of the 3 regions of fig. 1 (the left 1/2 region, the upper-right 1/4 region, and the lower-right 1/4 region); the color restoration accuracy of the regions whose actual color temperature differs most from the weighted average is low, and the color restoration effect of the whole image is poor.
In view of the above, exemplary embodiments of the present disclosure provide an image processing method.
Next, a system architecture of the operating environment of the present exemplary embodiment will be described with reference to fig. 2.
Fig. 2 shows a schematic diagram of a system architecture 200, which may include a terminal 210 and a server 220. The terminal 210 may be a terminal device such as a smartphone, a tablet computer, a desktop computer, a notebook computer, or a smart wearable electronic device (e.g., a smart watch), and the server 220 generally refers to a background system that provides the relevant services of the image processing method in the present exemplary embodiment; the server 220 may be a cloud server. The terminal 210 and the server 220 may be connected through a wired or wireless communication link for data interaction.
In an exemplary embodiment, the image processing method in the present disclosure may be performed by the terminal 210. For example, the camera module in the terminal may capture an original image; the terminal may then divide the original image into M x N image blocks, determine the depth information of each image block through an auto-focus algorithm, determine the brightness information of each image block through an auto-exposure algorithm, and determine the color temperature information of each image block through an auto white balance algorithm. Then, according to the depth information, brightness information, and color temperature information of each image block, the original image is divided into one or more color temperature regions (i.e., regions to be processed). The white balance color gains of the different color temperature regions are calculated separately, and the pixel values of the pixels in each color temperature region are processed based on that region's own white balance color gain, so as to obtain a white balance image corresponding to each color temperature region. The white balance image of the original image is obtained based on the white balance images corresponding to the regions and displayed in the display interface of the terminal.
In another exemplary embodiment, the image processing method in the present disclosure may also be performed by the server 220. For example, the server 220 may divide an original image to be white-balanced into M x N image blocks, determine the depth information of each image block by an auto-focus algorithm, determine the brightness information of each image block by an auto-exposure algorithm, and determine the color temperature information of each image block by an auto white balance algorithm. Then, according to the depth information, brightness information, and color temperature information of each image block, the original image is divided into one or more color temperature regions (i.e., regions to be processed); the white balance color gains of the different color temperature regions are calculated separately, the pixel values of the pixels in each color temperature region are processed based on that region's own white balance color gain to obtain a white balance image corresponding to each color temperature region, and the white balance image of the original image is obtained based on the white balance images corresponding to the regions.
In still another exemplary embodiment, the image processing method in the present exemplary embodiment may also be performed by the terminal 210 and the server 220 together. For example, the terminal 210 transmits the photographed original image to the server 220, the server 220 blocks the original image, and then divides the original image into regions based on the block information, so that a white balance process is separately performed on each region to obtain a white balance image of the original image, and then the obtained white balance image is transmitted to the terminal 210, and the terminal 210 displays the received white balance image in its display interface.
As is clear from the above, the execution subject of the image processing method in the present exemplary embodiment may be the terminal 210, the server 220, or the terminal 210 and the server 220 together as the execution subject, which is not limited in the present disclosure.
An image processing method in the present exemplary embodiment will be described below with reference to fig. 3, and fig. 3 shows an exemplary flow of the image processing method, which may include:
step S310, obtaining block information of each block of the original image, wherein the block information comprises one or more of depth information, brightness information and color temperature information;
step S320, performing region division on the original image based on the block information of each block, so as to obtain one or more regions to be processed;
step S330, performing white balance processing on each of the areas to be processed, so as to obtain a target image.
Based on this method, on one hand, the original image is divided into blocks and then divided into regions according to the block information, so that different color temperature regions are obtained and the accuracy of the color temperature region division can be improved; on the other hand, the original image is divided into different color temperature regions, and white balance processing is then performed separately and independently on each of them, so that the actual color of each color temperature region can be restored more faithfully and the accuracy of the white balance is improved.
The steps shown in fig. 3 are specifically described below.
In step S310, block information of each block of the original image is acquired.
In an exemplary embodiment, the original image may be understood as an image in a RAW format file, i.e., a RAW image, which is in an unprocessed, natural state: the RAW file recorded directly by the camera is the raw data of the optical signal captured by the sensor after conversion into a digital signal. In other words, the original image records raw data of the environment at the time the photograph was taken.
The original image may include a RAW image obtained by direct real-time shooting by a camera, or may include a locally stored RAW image. That is, the present disclosure may perform white balance processing on an original image photographed in real time by a camera, or may perform white balance processing on a RAW image stored in a terminal or a server, which is not particularly limited in this exemplary embodiment.
In an exemplary embodiment, the original image may be divided into M x N blocks, where M and N are both positive integers. M and N may be predetermined; e.g., with M = 10 and N = 10, any original image is divided into 100 image blocks. The values of M and N should be neither too large, which increases the amount of computation, nor too small, which may make the subsequent region division inaccurate; they may be determined empirically, and this exemplary embodiment does not particularly limit them.
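As a minimal illustration outside the patent text, dividing an image into M x N blocks and computing a per-block statistic could look like the following Python sketch, assuming a single-channel numpy image; the function name and grid indexing are illustrative only.

    import numpy as np

    def block_means(image: np.ndarray, m: int = 10, n: int = 10) -> np.ndarray:
        """Divide a 2-D image into m x n blocks and return each block's mean."""
        h, w = image.shape
        means = np.empty((m, n), dtype=np.float64)
        for i in range(m):
            for j in range(n):
                # Integer block boundaries; the last block absorbs any remainder.
                r0, r1 = h * i // m, h * (i + 1) // m
                c0, c1 = w * j // n, w * (j + 1) // n
                means[i, j] = image[r0:r1, c0:c1].mean()
        return means

The same M x N grid can then carry per-block depth, luminance, or color temperature values in the steps that follow.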
In an exemplary embodiment, the block information may include one or more of depth information, luminance information, and color temperature information. Here, depth information is understood as the distance from the camera lens to the object in the block, i.e., object distance information.
In another exemplary embodiment, the block information always includes the color temperature information, together with either or both of the depth information and the luminance information. That is, the block information includes the color temperature information plus one or more of the depth information and the luminance information.
For the depth information of the blocks, the depth information of each pixel of the original image may be determined based on PD (Phase Detection) by an AF (Automatic Focus) algorithm, and the depth information of the block may be determined according to the average value of the depth information of the pixels included in each block. The depth information of each pixel point of the original image can be determined by a deep learning-based method, and the depth information of each block is determined according to the average value of the depth information of the pixel points included in the block. The depth information of each block may be determined by an infrared ranging method, or may be determined by a Time of flight (TOF) method. Of course, the depth information of each block may be determined by other methods for estimating the depth information, which is not particularly limited in the present exemplary embodiment.
For the luminance information of the block, the luminance of each pixel point in the original image may be determined by an AE (Auto Exposure) algorithm, and the luminance information of the block may be determined according to the luminance average value of the pixel point included in each block. Of course, the luminance information of each block may be obtained by other ways of determining the luminance information, which is not particularly limited in the present exemplary embodiment.
For the color temperature information of the blocks, the color temperature information of each block may be determined by an automatic white balance (AWB) algorithm, such as the gray world algorithm, the perfect reflection method, the color gamut limit method, or the dynamic threshold method. Taking the gray world algorithm as an example: it assumes that, for an image with complex color variations, the averages of the R (red channel), G (green channel), and B (blue channel) components tend toward the same gray value. Specifically, the gray world algorithm may determine the color temperature value of each block from the R, G, and B component values and a color temperature curve (i.e., the Planckian curve), which reflects the mapping between (R/G, B/G) coordinate points and color temperature.
For example, for each block, the R/G value and B/G value of each pixel in the block may be calculated, i.e., the ratio of the R component value to the G component value and the ratio of the B component value to the G component value, so as to obtain (R/G, B/G) coordinate points; the color temperature value corresponding to each coordinate point (R/G, B/G) is then looked up in the Planckian curve to obtain the color temperature value of each pixel. The number of pixels with each color temperature value in the block is then counted, and the color temperature value with the largest number of pixels is taken as the color temperature value of the block. For instance, if a block has 100 pixels and 80 of them map to a color temperature of 4500K in the Planckian curve, the color temperature value of the block is 4500K. Alternatively, the color temperature value of the block may be determined from the average of the color temperature values of the pixels included in the block.
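As an illustration, a sketch of the block-mean variant just described follows, assuming a tabulated Planckian curve; the table entries are placeholders standing in for the pre-experimental calibration data, not real values.

    import numpy as np

    # Placeholder calibration table: (R/G, B/G) -> color temperature in kelvin.
    PLANCKIAN_SAMPLES = [
        (1.30, 0.55, 3000),
        (1.05, 0.75, 4000),
        (0.85, 0.95, 5000),
        (0.70, 1.15, 6000),
    ]

    def block_color_temperature(block_rgb: np.ndarray) -> int:
        """Estimate a block's color temperature from its mean (R/G, B/G) point."""
        r, g, b = block_rgb.reshape(-1, 3).mean(axis=0)
        g = max(g, 1e-6)  # guard against division by zero
        rg, bg = r / g, b / g
        # Nearest tabulated point on the Planckian curve.
        return min(PLANCKIAN_SAMPLES,
                   key=lambda s: (s[0] - rg) ** 2 + (s[1] - bg) ** 2)[2]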
In step S320, the original image is divided into areas based on the block information of each block, so as to obtain one or more areas to be processed.
In an exemplary embodiment, the number of pixels included in a block is less than or equal to the number of pixels included in a region to be processed.
Based on this, in an exemplary embodiment: in a case where the block information includes the depth information, blocks whose depth information belongs to the same depth value interval are determined as the same region, so as to divide the original image and obtain one or more regions to be processed; in a case where the block information includes the luminance information, blocks whose luminance information belongs to the same luminance value interval are determined as the same region, so as to divide the original image and obtain one or more regions to be processed; and in a case where the block information includes the color temperature information, blocks whose color temperature information belongs to the same color temperature value interval are determined as the same region, so as to divide the original image and obtain one or more regions to be processed.
The value ranges for each kind of block information may be custom-configured as required; this exemplary embodiment does not particularly limit them.
For example, since the estimation of the depth information, the luminance information, and the color temperature information may contain errors, blocks whose depth information falls within the same depth value range may be merged into the same region; likewise, blocks whose luminance information falls within the same luminance value range, or whose color temperature information falls within the same color temperature range, may be merged into the same region.
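A minimal sketch of this interval-based grouping follows; it is not from the patent, and the interval width is a hypothetical tuning parameter.

    import numpy as np

    def group_blocks_by_interval(values: np.ndarray, interval: float) -> np.ndarray:
        """Label each block by the value interval it falls into.

        Blocks whose values land in the same interval (e.g. depths of
        3-4 m) receive the same region label.
        """
        return np.floor_divide(values, interval).astype(np.int64)

    # Usage sketch: depths in meters for a 2 x 3 block grid.
    depths = np.array([[3.2, 3.8, 10.5],
                       [3.5, 3.9, 10.1]])
    labels = group_blocks_by_interval(depths, interval=1.0)  # 3-4 m blocks share a label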
In other words, blocks whose depth information differs by less than a first preset value may be merged, blocks whose luminance information differs by less than a second preset value may be merged, and blocks whose color temperature information differs by less than a third preset value may be merged to obtain the regions to be processed. Illustratively, fig. 4 shows a flow diagram of a method for region division of an original image in an exemplary embodiment of the present disclosure. Referring to fig. 4, the method may include steps S410 to S420. Wherein:
in step S410, in the case where the block information includes at least two of the depth information, the luminance information, and the color temperature information, the original image is divided into regions according to the at least two kinds of block information, respectively, to obtain a region division result of the at least two kinds of block information.
For example, in a case where the block information includes at least two of the depth information, the luminance information, and the color temperature information, the division algorithms for those kinds of block information may be executed in parallel, so that the region division result of each kind of block information is obtained at the same time.
Taking the example that the block information includes depth information, luminance information and color temperature information, an AF algorithm, an AE algorithm and an AWB algorithm can be simultaneously executed to simultaneously perform region division on the original image based on the depth information, the luminance information and the color temperature information, thereby respectively obtaining a region division result of each block information.
Taking the image in fig. 1 as the original image, suppose the depth (distance) of the outdoor area is 10 meters and that of the indoor area is 3 meters. Based on the AF algorithm and the PD information, the depth information of the blocks in the upper-right 1/4 area is determined to be in the range of 10 to 11 meters, while that of the blocks in the other areas is in the range of 3 to 4 meters; the original image in fig. 1 can then be divided into 2 areas according to the depth information of the blocks, as shown in fig. 5, namely af_block1 and af_block2.
It is determined based on the AE algorithm that the luminance values luma1 of the blocks in the upper-right 1/4 area in fig. 1 are all in the range of 8 luma to 9 luma (luma being the luminance unit), the luminance values luma2 of the blocks in the left 1/2 area are all in the range of 1 luma to 2 luma, and the luminance values luma3 of the blocks in the lower-right 1/4 area are all in the range of 2 luma to 3 luma; the original image in fig. 1 can then be divided into 3 areas by the luminance information, as shown in fig. 6, namely ae_block1, ae_block2, and ae_block3.
The color temperature values of the blocks in the upper-right 1/4 area in fig. 1 are determined based on the AWB algorithm to be in the range of 5000K to 6000K, those of the left 1/2 area in the range of 3000K to 4000K, and those of the lower-right 1/4 area in the range of 4000K to 5000K; the original image in fig. 1 can then be divided into 3 areas by the color temperature information, as shown in fig. 7, namely awb_block1, awb_block2, and awb_block3.
In step S420, the one or more areas to be processed are determined based on the intersection of the area division results of at least two kinds of block information.
For example, an intersection of the region division results of each of the at least two pieces of block information may be determined, and each region corresponding to the intersection thereof may be determined as the region to be processed. That is, each area to be processed simultaneously belongs to a certain area divided by the area division result of each piece of block information.
As shown in fig. 8, from the intersection of the region division results of each kind of block information in fig. 5, 6, and 7, it can be determined that the original image is finally divided into 3 regions, i.e., block1, block2, and block3 in fig. 8. Here block1 is the intersection of af_block1, ae_block1, and awb_block1; block2 is the intersection of af_block2, ae_block2, and awb_block2; and block3 is the intersection of af_block2, ae_block3, and awb_block3.
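As an illustration outside the patent text, the intersection of several per-block label maps can be computed by treating each distinct tuple of labels as one region; the function name is hypothetical.

    import numpy as np

    def intersect_labels(*label_maps: np.ndarray) -> np.ndarray:
        """Combine label maps so each unique label tuple becomes one region.

        A block belongs to a region exactly when its (AF, AE, AWB) labels all
        match, i.e. the regions are intersections of the individual results.
        """
        stacked = np.stack(label_maps, axis=-1)      # shape (M, N, num_maps)
        flat = stacked.reshape(-1, stacked.shape[-1])
        _, region_ids = np.unique(flat, axis=0, return_inverse=True)
        return region_ids.reshape(label_maps[0].shape)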
Through steps S410 to S420 described above, the division algorithm for each kind of block information can be executed simultaneously to divide the original image in parallel, and the division results of all kinds of block information are finally considered together to obtain the final region division result. This not only improves the efficiency of region division but also, through the combination of multiple kinds of information, improves its accuracy, thereby helping to improve the efficiency and accuracy of the white balance processing.
In another exemplary embodiment, in a case where the block information includes the depth information, the luminance information, and the color temperature information, the original image may first be divided into regions according to the luminance information to obtain one or more first regions to be processed; each first region to be processed may then be divided according to the depth information of the blocks within it to obtain one or more second regions to be processed; and each second region to be processed may be divided according to the color temperature information of the blocks within it to obtain the one or more regions to be processed.
For example, the region division may be performed according to at least two of the depth information, the brightness information and the color temperature information based on a preset sequence, so as to obtain the region to be processed. The preset sequence may be determined according to the partition reliability of each block information, for example, the partition reliability of the brightness information is the largest, the partition reliability of the depth information is the next largest, and finally the partition reliability of the color temperature information is the smallest. The preset sequence may be to partition based on the luminance information, then partition based on the depth information, and finally partition based on the color temperature information.
For example, in the case that the block information includes depth information and color temperature information, the original image may be partitioned according to the depth information to obtain one or more first partitions, then it is determined whether each first partition needs to be partitioned again according to the color temperature information, so as to obtain one or more second partitions, and finally, a final area to be processed is determined according to the second partitions.
For another example, in the case that the block information includes luminance information and color temperature information, the original image may be partitioned according to the luminance information to obtain one or more first partitions, then it is determined whether each first partition needs to be partitioned again according to the color temperature information, so as to obtain one or more second partitions, and finally, a final area to be processed is determined according to the second partitions.
For another example, in a case where the block information includes the depth information and the luminance information, the original image may be partitioned according to the luminance information to obtain one or more first partitions, each first partition is then partitioned according to the depth information to obtain one or more second partitions, and the final regions to be processed are determined according to the second partitions.
For another example, in a case where the block information includes the depth information, the luminance information, and the color temperature information, the original image may first be partitioned according to the luminance information to obtain one or more first partitions. On the basis of the luminance division result, it is then determined, according to the depth information, whether each first partition needs to be partitioned again, so as to obtain one or more second partitions. On the basis of the partitions obtained from the luminance and depth information, it is further determined, according to the color temperature information, whether each second partition needs to be partitioned again, so as to obtain one or more third partitions. Finally, the one or more regions to be processed are obtained on the basis of the third partitions.
By partitioning sequentially according to each kind of block information, one or more regions to be processed can be obtained. Compared with partitioning according to all kinds of block information simultaneously, sequential partitioning requires no intersection operation, and the one or more regions to be processed can be determined directly from the final result after sequential execution. The one or more regions to be processed can be understood as different color temperature regions. Meanwhile, because the reliability of each kind of block information is taken into account during sequential execution, the accuracy of the divided color temperature regions is ensured.
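A sketch of this sequential, refine-only-when-needed division follows; the spread thresholds and interval widths are assumptions of the illustration, not values from the patent.

    import numpy as np

    def refine(regions, values, spread, width):
        """Subdivide each region by `values` only when it is not yet uniform.

        `regions` is a list of boolean block masks; a region is split further
        only when max - min of `values` inside it exceeds `spread`, grouping
        blocks into intervals of size `width`.
        """
        bins = np.floor_divide(values, width)
        out = []
        for mask in regions:
            v = values[mask]
            if v.max() - v.min() <= spread:
                out.append(mask)  # uniform enough: keep the region whole
                continue
            for b in np.unique(bins[mask]):
                out.append(mask & (bins == b))
        return out

    # Luminance first, then depth, then color temperature (per-block grids).
    luma = np.array([[1.5, 1.8, 8.2], [2.1, 2.4, 8.7]])
    depth = np.array([[3.2, 3.5, 10.4], [3.8, 3.9, 10.1]])
    temp = np.array([[3200.0, 3400.0, 5300.0], [4100.0, 4300.0, 5600.0]])
    regions = refine([np.ones_like(luma, dtype=bool)], luma, spread=1.0, width=1.0)
    regions = refine(regions, depth, spread=1.0, width=1.0)
    regions = refine(regions, temp, spread=500.0, width=1000.0)

Because every later split only refines earlier regions, no intersection operation is needed afterwards, matching the observation above.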
In an exemplary embodiment, in the case that the number of areas divided according to the block information is too large, they may be combined to reduce the number of color temperature areas finally determined, thereby reducing the amount of calculation of white balance and improving the image processing efficiency.
Based on this, fig. 9 illustratively shows a flow diagram of a method for determining regions to be processed in an exemplary embodiment of the present disclosure. Referring to fig. 9, the method may include steps S910 to S920.
In step S910, in the case where the block information includes the color temperature information, a kind of color temperature included in the original image is determined according to the color temperature information of each block.
For example, in the case where the block information includes color temperature information, the color temperature information of each block may be determined first, and then the color temperature information belonging to the same color temperature value interval may be determined as the same color temperature type, thereby determining the color temperature type included in the original image.
The specific embodiment of determining the color temperature information of each block is already described above, and will not be described herein.
In step S920, the target number of the regions to be processed is determined based on the number of the color temperature categories.
Since the purpose of white balance processing is to make images at different color temperatures display their original, normal colors without color casts, that is, to perform color reproduction on images at different color temperatures, the target number of regions to be processed that require white balance processing can be determined according to the number of color temperature categories. For example, if the original image contains 3 color temperature categories, it is determined that 3 regions to be processed are finally divided. If the original image contains only 1 color temperature category, the original image can be used directly as the region to be processed without division; in this case it can be considered that only 1 region to be processed is divided, namely the original image itself.
For example, after determining the target number, one embodiment of step S320 may include: dividing the original image into areas based on the block information of each block to obtain one or more candidate areas; and combining the candidate areas based on an image segmentation algorithm under the condition that the number of the candidate areas is larger than the target number, so as to obtain the target number of the areas to be processed.
For example, in a case where the number of partitions directly obtained by region division according to the block information is greater than the number of color temperature categories, the candidate regions may be merged based on erosion and/or dilation in an image segmentation algorithm until the number of merged partitions equals the number of color temperature categories, at which point merging stops, so as to obtain the target number of regions to be processed.
Next, a specific embodiment of determining the regions to be processed will be further described with reference to fig. 10A, 10B, 10C, 10D, and 10E. In fig. 10A, the original image is divided into two regions af1 and af2 based on the depth information; in fig. 10B, it is divided into two regions ae1 and ae2 based on the luminance information; and in fig. 10C, it is divided into 4 regions based on the color temperature information (i.e., there are 4 color temperature categories), namely awb1, awb2, awb3, and awb4. Taking the intersection of the partitions of fig. 10A, 10B, and 10C, the original image can be divided into 8 candidate regions, such as a1, a2, a3, a4, a5, a6, a7, and a8 in fig. 10D.
Here a1 is the intersection of af2, ae1, and awb1; a2 is the intersection of af1, ae1, and awb1; a3 is the intersection of af2, ae2, and awb1; a4 is the intersection of af1, ae2, and awb1; a5 is the intersection of af2, ae2, and awb2; a6 is the intersection of af2, ae2, and awb3; a7 is the intersection of af2, ae2, and awb4; and a8 is the intersection of af1, ae2, and awb4.
Obviously, the number of candidate regions, 8, is greater than the number of color temperature categories, 4. The candidate regions may therefore be eroded and/or dilated based on an image segmentation algorithm and merged to yield 4 regions to be processed. For example, if erosion and dilation are applied to the 8 candidate regions, a3 is eroded while a5 is dilated to cover a3, so a5 and a3 are merged into the region b3 shown in fig. 10E; likewise, a2 can be dilated to cover a4, so a4 and a2 are merged into a new region b2; and a6 and a7 are eroded by the dilated a8, so a6, a7, and a8 are merged into b4. The finally determined regions therefore comprise the 4 regions a1, b2, b3, and b4 shown in fig. 10E.
When dilation and erosion are carried out, any region may be dilated or eroded. After each dilation or erosion, it is judged whether the number of merged regions is still greater than the number of color temperature categories; if so, it is further judged whether any region can still be dilated or eroded; if not, dilation and erosion stop, and the final regions to be processed are determined according to the current result.
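The passage describes the erosion and dilation only loosely; the sketch below substitutes an explicit absorption rule, merging the smallest region into the adjacent region with the closest color temperature until the target count is reached, which is one plausible reading rather than the patent's definitive procedure.

    import numpy as np

    def merge_to_target_count(labels: np.ndarray, temps: dict,
                              target: int) -> np.ndarray:
        """Merge candidate regions until only `target` regions remain.

        `labels` is an M x N map of region ids and `temps` maps each id to
        its color temperature. Each step dissolves the smallest region into
        the 4-adjacent region with the closest color temperature.
        """
        labels = labels.copy()
        while len(np.unique(labels)) > target:
            ids, counts = np.unique(labels, return_counts=True)
            small = ids[np.argmin(counts)]
            mask = labels == small
            # Cells 4-adjacent to the small region.
            grown = np.zeros_like(mask)
            grown[:-1] |= mask[1:]
            grown[1:] |= mask[:-1]
            grown[:, :-1] |= mask[:, 1:]
            grown[:, 1:] |= mask[:, :-1]
            neighbours = set(np.unique(labels[grown & ~mask])) - {small}
            merged = min(neighbours, key=lambda i: abs(temps[i] - temps[small]))
            labels[mask] = merged
        return labels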
For example, fig. 11 shows a flowchart of another method of determining a region to be processed in an exemplary embodiment of the present disclosure, and referring to fig. 11, the method may include steps S1110 to S1120. Wherein:
in step S1110, the original image is divided into regions based on the block information of each block to obtain one or more candidate regions.
As described above, the blocks whose block information belongs to the same range may be combined to obtain one or more candidate regions.
In step S1120, the one or more candidate areas are combined according to the number of blocks included in each candidate area, so as to obtain one or more areas to be processed.
Illustratively, fig. 12 is a flowchart of a method for determining a region to be processed according to the number of blocks included in a candidate region in an exemplary embodiment of the present disclosure. Referring to fig. 12, the method may include steps S1210 to S1250.
In step S1210, a first candidate region and a second candidate region are determined according to the number of blocks included in each candidate region.
In an exemplary embodiment, the first candidate region is the candidate region having the largest number of tiles and the second candidate region is the candidate region having the smallest number of tiles.
For example, the candidate region occupying the largest proportion of the original image may be determined from among the candidate regions as the first candidate region, and the candidate region occupying the smallest proportion of the original image may be determined as the second candidate region. In a case where the sizes of the blocks are the same, the candidate region including the largest number of blocks occupies the largest proportion of the original image, and the candidate region including the smallest number of blocks occupies the smallest proportion.
In step S1220, it is determined whether the difference between the first block number and the second block number is greater than a preset value; if yes, go to step S1230, otherwise go to step S1250.
For example, the first block number may be understood as the number of blocks included in the first candidate region, and the second block number as the number of blocks included in the second candidate region. The difference between the first block number and the second block number can be calculated. If the difference is greater than a preset value, the second candidate region can be considered to occupy a small proportion of the original image and may be merged with another candidate region instead of being processed independently, thereby reducing the number of candidate regions and improving processing efficiency. If the difference is less than or equal to the preset value, the proportions of the candidate regions in the original image are considered comparable, i.e., each candidate region matters to the original image, and the regions to be processed can be determined according to the current candidate regions, so that white balance processing is performed on each of them separately in the subsequent steps, improving the accuracy of the white balance.
In step S1230, a target candidate region to be merged with the second candidate region is determined from among the other candidate regions.
For example, the second candidate region may be eroded by an image segmentation algorithm, and the other candidate region that absorbs the eroded second candidate region is determined as the target candidate region. Alternatively, among the other candidate regions in the neighborhood of the second candidate region, the candidate region whose color temperature value differs least from that of the second candidate region, i.e., the candidate region adjacent to the second candidate region with the smallest color temperature difference, may be determined as the target candidate region; this exemplary embodiment is not particularly limited in this respect.
In step S1240, the second candidate region and the target candidate region are combined to update the candidate region according to the combination result, and the process proceeds to step S1210.
For example, if the candidate region divided according to the block information includes a candidate region 1, a candidate region 2, a candidate region 3, a candidate region 4, and a candidate region 5, where the candidate region 1 includes the largest number of blocks, the candidate region 4 includes the smallest number of blocks, and the difference between the number of blocks of the candidate region 1 and the number of blocks of the candidate region 4 is larger than a preset value, the target candidate region of the candidate region 4 may be determined. If it is determined that the target candidate region of the candidate region 4 is the candidate region 3, the candidate region 4 and the candidate region 3 may be combined into the candidate region 34, so that the updated candidate region includes 4 candidate regions, namely, the candidate region 1, the candidate region 2, the candidate region 34 and the candidate region 5. And determining the candidate region with the largest number of blocks and the candidate region with the smallest number of blocks in the 4 candidate regions to redetermine the first candidate region and the second candidate region, and continuously judging whether the difference value between the first number of blocks included in the new first candidate region and the second number of blocks included in the new second candidate region is larger than a preset value or not so as to execute different steps according to different relations between the difference value and the preset value.
In step S1250, the one or more regions to be processed are obtained based on the current candidate region.
As described above, when the difference between the first block number of the first candidate region and the second block number of the second candidate region is less than or equal to the preset value, the proportions of the currently updated candidate regions in the original image are comparable, meaning their importance to the original image is comparable; merging may then be stopped, and the regions to be processed that require separate white balance processing are determined according to the currently updated candidate regions.
For example, taking the example that the candidate area after the current update includes 4 candidate areas, namely, candidate area 1, candidate area 2, candidate area 34 and candidate area 5, if the new first candidate area is candidate area 1, the new second candidate area is candidate area 5, and the difference between the number of blocks included between candidate area 1 and candidate area 5 is smaller than the preset value, the area to be processed may include candidate area 1, candidate area 2, candidate area 34 and candidate area 5.
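The loop of steps S1210 to S1250 can be sketched as follows; the region names, the color-temperature-based choice of merge target, and the omission of an adjacency check are assumptions of this illustration.

    def merge_by_block_count(regions: dict, temps: dict, preset: int) -> dict:
        """Merge candidate regions whose block counts are too unbalanced.

        `regions` maps a region name to its set of block indices, `temps`
        maps it to a color temperature, and `preset` is the block-count
        difference threshold (a tuning parameter).
        """
        regions = {k: set(v) for k, v in regions.items()}
        temps = dict(temps)
        while len(regions) > 1:
            first = max(regions, key=lambda k: len(regions[k]))   # S1210: most blocks
            second = min(regions, key=lambda k: len(regions[k]))  # S1210: fewest blocks
            if len(regions[first]) - len(regions[second]) <= preset:
                break  # S1250: proportions are comparable, stop merging
            # S1230: choose the merge target with the closest color temperature.
            target = min((k for k in regions if k != second),
                         key=lambda k: abs(temps[k] - temps[second]))
            regions[target] |= regions.pop(second)                # S1240: merge
            temps.pop(second)
        return regions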
Through steps S1110 to S1120 described above, the candidate regions may be merged according to their proportions in the original image, so as to determine the final regions to be processed. In this way, the number of finally determined color temperature regions can be reduced while still dividing the original image into color temperature regions, which improves the accuracy of the white balance while avoiding the drop in processing efficiency caused by an excessive number of color temperature regions, thereby striking a better balance between the accuracy and the efficiency of the white balance.
In an exemplary embodiment, where the entire image is in a single light source, there may be only one partitioned area to be processed. Therefore, based on the image processing method in the present disclosure, the original image can be divided into one or more areas to be processed that require white balance processing.
With continued reference to fig. 3, in step S330, a white balance process is performed on each of the areas to be processed, respectively, to obtain a target image.
For example, one specific embodiment of step S330 may include: and respectively determining the white balance color gain of each to-be-processed area, and respectively carrying out white balance processing on the pixels of each to-be-processed area based on the white balance color gain of each to-be-processed area so as to obtain a target image.
For example, determining the white balance color gain of each region to be processed and performing white balance processing on the pixels of each region based on its own gain to obtain the target image may include: for each region to be processed, determining a first white balance color gain for its red channel according to a first ratio average between the red channel values and the green channel values of the pixels in the region, and determining a second white balance color gain for its blue channel according to a second ratio average between the blue channel values and the green channel values of those pixels; for each region to be processed, processing each of its pixels according to the first white balance color gain of the red channel and the second white balance color gain of the blue channel to obtain a white balance image of the region; and obtaining the target image according to the white balance images of all regions to be processed.
For example, for each area to be processed, a first ratio average between the red channel values and green channel values of the pixel points in the area may be determined, where the first ratio average may be computed as the average of the per-pixel ratios of red to green channel values, or as the ratio of the mean red channel value to the mean green channel value over all pixel points in the area. Similarly, the second ratio average may be computed as the average of the per-pixel ratios of blue to green channel values, or as the ratio of the mean blue channel value to the mean green channel value over all pixel points in the area.
Then, the first ratio average is used to determine the white balance gain of the red channel for each pixel in the area, the second ratio average is used to determine the white balance gain of the blue channel, and the white balance gain of the green channel is 1. For each pixel point in each area to be processed, the green channel value is left unchanged, the red channel value is adjusted by the red-channel white balance gain, and the blue channel value is adjusted by the blue-channel white balance gain; the original green channel value and the adjusted red and blue channel values are then combined to give the color value of the pixel point after white balance processing, so that the white balance sub-image corresponding to the area to be processed is determined from the white-balanced color values of all its pixel points.
The white balance sub-images corresponding to the areas to be processed are then combined to obtain the target image, that is, the original image after white balance processing. The red, green, and blue channels here are the three color channels of an RGB image.
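As a hedged illustration of the per-region gain computation above, the following Python sketch uses the ratio-of-means variant of the ratio averages. The patent only says the gains are determined according to the R/G and B/G ratio averages; the sketch assumes the common gray-world convention of taking the reciprocal of each ratio average, so that the region's mean color becomes neutral.

```python
import numpy as np

def white_balance_region(region_rgb):
    """White-balance an (N, 3) float RGB pixel array for one area to be processed.

    A minimal sketch: the gains here are the reciprocals of the ratio
    averages (gray-world assumption), so the region's mean color becomes
    neutral; the patent only says the gains are determined according to
    the ratio averages.
    """
    r, g, b = region_rgb[:, 0], region_rgb[:, 1], region_rgb[:, 2]
    ratio_rg = r.mean() / g.mean()   # first ratio average (ratio-of-means variant)
    ratio_bg = b.mean() / g.mean()   # second ratio average
    gain_r = 1.0 / ratio_rg          # first white balance color gain (red channel)
    gain_b = 1.0 / ratio_bg          # second white balance color gain (blue channel)
    out = region_rgb.copy()
    out[:, 0] *= gain_r              # adjust red channel
    out[:, 2] *= gain_b              # adjust blue channel; green gain is 1
    return out                       # pixels of the white balance sub-image for this area
```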
In the above method, the original image is first divided into blocks, and then divided into regions according to the depth information, luminance information, and color temperature information of each block, so as to determine areas to be processed with different color temperatures; white balance processing is then performed separately on each area to be processed. This ensures that each area to be processed carries color temperature information close to reality and that its actual colors are accurately restored, avoids the inability of a traditional mean color temperature algorithm to adapt to multi-color-temperature scenes, and improves the accuracy of white balance.
Furthermore, performing color temperature region division in combination with depth information and luminance information can improve the accuracy of region division compared with dividing regions according to color temperature information alone, further improving the accuracy of the white balance processing.
It is noted that the above-described figures are merely schematic illustrations of processes involved in a method according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Further, referring to fig. 13, an image processing apparatus 1300 is also provided in this exemplary embodiment. The image processing apparatus 1300 may include: a block information acquisition module 1310, a region dividing module 1320, and a regional white balance processing module 1330. Wherein: the block information acquisition module 1310 is configured to acquire block information of each block of the original image, the block information including one or more of depth information, luminance information, and color temperature information; the region dividing module 1320 is configured to divide the original image into regions based on the block information of each block, so as to obtain one or more areas to be processed; and the regional white balance processing module 1330 is configured to perform white balance processing on each area to be processed, respectively, so as to obtain a target image.
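Purely as a structural sketch, the apparatus 1300 could be wired as below in Python; the three callables are hypothetical stand-ins for the per-module logic described above, not the patent's implementation.

```python
class ImageProcessingApparatus:
    """Structural sketch of apparatus 1300; the three callables are
    hypothetical stand-ins for the per-module logic, not the patent's code."""

    def __init__(self, acquire_block_info, divide_regions, white_balance):
        self.acquire_block_info = acquire_block_info  # block information acquisition module 1310
        self.divide_regions = divide_regions          # region dividing module 1320
        self.white_balance = white_balance            # regional white balance processing module 1330

    def process(self, original_image):
        block_info = self.acquire_block_info(original_image)     # per-block depth/luminance/color temperature
        areas = self.divide_regions(original_image, block_info)  # one or more areas to be processed
        # each area is white-balanced separately; together the results form the target image
        return [self.white_balance(area) for area in areas]
```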
In an exemplary implementation, based on the foregoing embodiment, the area dividing module 1320 may be specifically configured to: under the condition that the block information comprises at least two of the depth information, the brightness information and the color temperature information, respectively carrying out region division on the original image according to the at least two block information so as to respectively obtain region division results of the at least two block information; one or more regions to be processed are determined based on an intersection of the region division results of the at least two pieces of block information.
In an exemplary implementation, based on the foregoing embodiment, the area dividing module 1320 may also be specifically configured to: in the case where the block information includes the depth information, the luminance information, and the color temperature information, divide the original image into regions according to the luminance information to obtain one or more first areas to be processed; for each first area to be processed, divide the first area to be processed according to the depth information of each block in the first area to be processed, so as to obtain one or more second areas to be processed; and for each second area to be processed, divide the second area to be processed according to the color temperature information of each block in the second area to be processed, so as to obtain the one or more areas to be processed.
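A minimal sketch of this three-level luminance-depth-color-temperature division follows, assuming blocks are represented as dicts with numeric "luminance", "depth", and "color_temp" entries and that value intervals have fixed widths; the patent fixes neither choice.

```python
def group_by_interval(blocks, key, width):
    """Group blocks whose value for `key` falls in the same interval of size `width`."""
    groups = {}
    for block in blocks:
        bucket = int(block[key] // width)  # blocks in the same value interval share a bucket
        groups.setdefault(bucket, []).append(block)
    return list(groups.values())

def divide_hierarchically(blocks, lum_width, depth_width, ct_width):
    """Divide by luminance first, then depth, then color temperature."""
    areas_to_process = []
    for first_area in group_by_interval(blocks, "luminance", lum_width):          # first areas to be processed
        for second_area in group_by_interval(first_area, "depth", depth_width):   # second areas to be processed
            areas_to_process.extend(group_by_interval(second_area, "color_temp", ct_width))
    return areas_to_process  # final areas to be processed
```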
In an exemplary implementation, based on the foregoing embodiment, the area dividing module 1320 may also be specifically configured to: determining the color temperature type included in the original image according to the color temperature information of each block under the condition that the block information includes the color temperature information; and determining the target number of the to-be-processed areas based on the number of the color temperature types.
In an exemplary implementation, based on the foregoing embodiment, the area dividing module 1320 may also be specifically configured to: dividing the original image into areas based on the block information of each block to obtain one or more candidate areas; and combining the candidate areas based on an image segmentation algorithm under the condition that the number of the candidate areas is larger than the target number, so as to obtain the target number of the areas to be processed.
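The patent does not name the image segmentation algorithm used for this merging; as one plausible sketch, a region-merging pass that repeatedly combines the two candidate regions with the closest mean color temperature until the target number is reached could look as follows. The similarity criterion and the `mean_ct` helper are purely assumptions.

```python
def merge_to_target(candidates, target_number, mean_ct):
    """Merge candidate regions until exactly `target_number` remain.

    candidates: list of candidate regions (each a list of blocks);
    mean_ct(region) -> mean color temperature of a region (assumed helper).
    """
    regions = list(candidates)
    while len(regions) > target_number:
        # find the pair of candidate regions with the most similar mean color temperature
        i, j = min(
            ((a, b) for a in range(len(regions)) for b in range(a + 1, len(regions))),
            key=lambda ab: abs(mean_ct(regions[ab[0]]) - mean_ct(regions[ab[1]])),
        )
        regions[i] = regions[i] + regions.pop(j)  # merge the later region into the earlier one
    return regions  # exactly target_number areas to be processed
```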
In an exemplary implementation, based on the foregoing embodiment, the area dividing module 1320 may also be specifically configured to: dividing the original image into areas based on the block information of each block to obtain one or more candidate areas; and merging the one or more candidate areas according to the number of the blocks included in each candidate area to obtain one or more areas to be processed.
In an exemplary implementation manner, based on the foregoing embodiment, the merging, according to the number of blocks included in each candidate area, the one or more candidate areas to obtain one or more areas to be processed includes: determining a first candidate region and a second candidate region according to the number of the blocks included in each candidate region, wherein the first candidate region is the candidate region with the largest number of the blocks, and the second candidate region is the candidate region with the smallest number of the blocks; determining a target candidate region merged with the second candidate region from other candidate regions under the condition that the difference value between the first block number included in the first candidate region and the second block number included in the second candidate region is larger than a preset value; merging the second candidate region and the target candidate region to update the candidate region according to a merging result, and re-determining the first candidate region and the second candidate region based on the updated candidate region; and obtaining the one or more areas to be processed based on the updated candidate areas under the condition that the difference value between the first block number included in the first candidate areas and the second block number included in the second candidate areas is equal to or smaller than a preset value.
In an exemplary embodiment, based on the foregoing embodiment, the number of pixel points included in a block is less than or equal to the number of pixel points included in the area to be processed.
In an exemplary implementation, based on the foregoing embodiment, the area dividing module 1320 may also be specifically configured to: in the case where the block information includes the depth information, determine blocks whose depth information belongs to the same depth value interval as the same region, so as to divide the original image into regions and obtain one or more areas to be processed; in the case where the block information includes the luminance information, determine blocks whose luminance information belongs to the same luminance value interval as the same region, so as to divide the original image into regions and obtain one or more areas to be processed; and in the case where the block information includes the color temperature information, determine blocks whose color temperature information belongs to the same color temperature value interval as the same region, so as to divide the original image into regions and obtain one or more areas to be processed.
In an exemplary implementation, based on the foregoing embodiment, the regional white balance processing module 1330 may be specifically configured to: determine the white balance color gain of each area to be processed respectively, and perform white balance processing on the pixels of each area to be processed based on the white balance color gain of that area, so as to obtain the target image.
In an exemplary embodiment, based on the foregoing embodiment, determining the white balance color gain of each area to be processed, and performing white balance processing on the pixels of each area to be processed based on the white balance color gain of each area, to obtain the target image, includes: for each area to be processed, determining a first white balance color gain of the red channel of the area according to a first ratio average between the red channel values and green channel values of the pixel points in the area, and determining a second white balance color gain of the blue channel of the area according to a second ratio average between the blue channel values and green channel values of the pixel points in the area; for each area to be processed, processing each pixel of the area according to the first white balance color gain of the red channel and the second white balance color gain of the blue channel, so as to obtain a white balance image of the area; and obtaining the target image according to the white balance image of each area to be processed.
The specific details of each module in the above apparatus are already described in the method section embodiments, and the details that are not disclosed can be found in the method section embodiments, so that they will not be described in detail.
The exemplary embodiments of the present disclosure also provide an electronic device for performing the above-described image processing method, which may be the above-described terminal 210. In general, the electronic device may include a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the image processing method described above via execution of the executable instructions.
The configuration of the electronic device will be exemplarily described below with reference to the mobile terminal 1400 in fig. 14. It will be appreciated by those skilled in the art that, apart from the components specifically intended for mobile purposes, the configuration of fig. 14 can also be applied to fixed-type devices.
As shown in fig. 14, the mobile terminal 1400 may specifically include: processor 1401, memory 1402, bus 1403, mobile communication module 1404, antenna 1, wireless communication module 1405, antenna 2, display screen 1406, camera module 1407, audio module 1408, power source module 1409, and sensor module 1410.
The processor 1401 may include one or more processing units, such as: an AP (Application Processor), a modem processor, a GPU (Graphics Processing Unit), an ISP (Image Signal Processor), a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband processor, and/or an NPU (Neural-network Processing Unit), etc. For example, the GPU may be used to process the original image based on the image processing method in the present disclosure, thereby obtaining the target image after white balance processing.
The processor 1401 may form a connection with the memory 1402 or other components through the bus 1403.
Memory 1402 may be used to store computer-executable program code that includes instructions. The processor 1401 performs various functional applications of the mobile terminal 1400 and data processing by executing instructions stored in the memory 1402. The memory 1402 may also store application data such as files for storing images, videos, etc., and the memory 1402 may also store an original image and a target image obtained by white balancing the original image using the image processing method in the present disclosure.
The communication functions of the mobile terminal 1400 may be implemented by the mobile communication module 1404, the antenna 1, the wireless communication module 1405, the antenna 2, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 1404 may provide a 2G, 3G, 4G, 5G, etc. mobile communication solution for application on the mobile terminal 1400. The wireless communication module 1405 may provide wireless communication solutions for wireless local area networks, bluetooth, near field communications, etc. that are implemented on the mobile terminal 1400.
The display screen 1406 is used to implement display functions, such as displaying a user interface or displaying the target image obtained after white balance processing according to the image processing method of the present disclosure. The camera module 1407 is used to implement capture functions, such as capturing an image or a video to be subjected to white balance processing. The audio module 1408 is used to implement audio functions, such as playing audio and capturing voice. The power module 1409 is used to implement power management functions, such as charging the battery, powering the device, and monitoring the battery status. The sensor module 1410 may include a depth sensor 14101, a speed sensor 14102, a gyro sensor 14103, a barometric pressure sensor 14104, etc., to implement the corresponding sensing functions.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein generally as a "circuit," "module," or "system."
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable storage medium, the computer program comprising program code for performing the method shown in the flowcharts. In such embodiments, the computer program may be downloaded and installed from a network, and/or installed from a removable medium. The computer program, when executed by a Central Processing Unit (CPU), performs the various functions defined in the method and apparatus of the present disclosure.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, e.g. any one or more of the steps of fig. 3, when the program product is run on the terminal device.
The computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, the program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. An image processing method, comprising:
acquiring block information of each block of an original image, wherein the block information comprises one or more of depth information, brightness information and color temperature information;
dividing the original image into areas based on the block information of each block to obtain one or more areas to be processed;
and respectively carrying out white balance treatment on each region to be treated so as to obtain a target image.
2. The image processing method according to claim 1, wherein the dividing the original image into areas based on the block information of each block to obtain one or more areas to be processed includes:
under the condition that the block information comprises at least two of the depth information, the brightness information and the color temperature information, respectively carrying out region division on the original image according to the at least two block information so as to respectively obtain region division results of the at least two block information;
And determining the one or more areas to be processed based on the intersection of the area division results of at least two pieces of block information.
3. The image processing method according to claim 1, wherein the performing region division on the original image based on the block information of each block to obtain one or more areas to be processed includes:
under the condition that the block information comprises the depth information, the brightness information and the color temperature information, carrying out region division on the original image according to the brightness information to obtain one or more first regions to be processed;
dividing the first to-be-processed area according to the depth information of each partition in the first to-be-processed area aiming at each first to-be-processed area so as to obtain one or more second to-be-processed areas;
and dividing the second to-be-processed areas according to the color temperature information of each partition in the second to-be-processed areas aiming at each second to-be-processed area so as to obtain one or more to-be-processed areas.
4. The image processing method according to claim 1, wherein the performing region division on the original image based on the block information of each block to obtain one or more areas to be processed includes:
Determining the color temperature type included in the original image according to the color temperature information of each block under the condition that the block information includes the color temperature information;
and determining the target number of the to-be-processed areas based on the number of the color temperature types.
5. The image processing method according to claim 4, wherein the performing region division on the original image based on the block information of each block to obtain one or more areas to be processed includes:
dividing the original image into areas based on the block information of each block to obtain one or more candidate areas;
and combining the candidate areas based on an image segmentation algorithm under the condition that the number of the candidate areas is larger than the target number, so as to obtain the target number of the areas to be processed.
6. The image processing method according to claim 1, wherein the performing region division on the original image based on the block information of each block to obtain one or more areas to be processed includes:
dividing the original image into areas based on the block information of each block to obtain one or more candidate areas;
And merging the one or more candidate areas according to the number of the blocks included in each candidate area to obtain one or more areas to be processed.
7. The image processing method according to claim 6, wherein merging the one or more candidate areas according to the number of blocks included in each candidate area to obtain one or more areas to be processed, comprises:
determining a first candidate region and a second candidate region according to the number of the blocks included in each candidate region, wherein the first candidate region is the candidate region with the largest number of the blocks, and the second candidate region is the candidate region with the smallest number of the blocks;
determining a target candidate region merged with the second candidate region from other candidate regions under the condition that the difference value between the first block number included in the first candidate region and the second block number included in the second candidate region is larger than a preset value;
merging the second candidate region and the target candidate region to update the candidate region according to a merging result, and re-determining the first candidate region and the second candidate region based on the updated candidate region;
And obtaining the one or more areas to be processed based on the current candidate area under the condition that the difference value between the first block number included in the first candidate area and the second block number included in the second candidate area is equal to or smaller than a preset value.
8. The image processing method according to any one of claims 1 to 7, wherein the number of pixels included in a block is smaller than or equal to the number of pixels included in the region to be processed.
9. The image processing method according to claim 1, wherein the dividing the original image into areas based on the block information of each block to obtain one or more areas to be processed includes:
determining the partitions of the depth information belonging to the same depth value interval as the same region under the condition that the partition information comprises the depth information, so as to divide the region of the original image and obtain one or more regions to be processed;
determining the blocks of the brightness information belonging to the same brightness value interval as the same region under the condition that the block information comprises the brightness information, so as to divide the region of the original image and obtain one or more regions to be processed;
And determining the blocks of the color temperature information belonging to the same color temperature value interval as the same region under the condition that the block information comprises the color temperature information, so as to divide the region of the original image and obtain one or more regions to be processed.
10. The image processing method according to claim 1, wherein the performing white balance processing on each of the areas to be processed to obtain the target image includes:
and respectively determining the white balance color gain of each to-be-processed area, and respectively carrying out white balance processing on the pixels of each to-be-processed area based on the white balance color gain of each to-be-processed area so as to obtain a target image.
11. The image processing method according to claim 10, wherein the determining the white balance color gain of each of the to-be-processed areas, respectively, and performing white balance processing on the pixels of each of the to-be-processed areas based on the white balance color gain of each of the to-be-processed areas, respectively, to obtain the target image, comprises:
for each to-be-processed area, determining a first white balance color gain of a red channel of the to-be-processed area according to a first ratio average value between a red channel value and a green channel value of a pixel point in the to-be-processed area, and determining a second white balance color gain of a blue channel of the to-be-processed area according to a second ratio average value between a blue channel value and a green channel value of the pixel point in the to-be-processed area;
For each to-be-processed area, processing each pixel of the to-be-processed area according to a first white balance color gain of a red channel and a second white balance color gain of a blue channel of the to-be-processed area so as to obtain a white balance image of the to-be-processed area;
and obtaining the target image according to the white balance image of each area to be processed.
12. An image processing apparatus, comprising:
a block information acquisition module configured to acquire block information of each block of an original image, the block information including one or more of depth information, luminance information, color temperature information;
the region dividing module is configured to divide the original image into regions based on the block information of each block so as to obtain one or more regions to be processed;
the regional white balance processing module is configured to perform white balance processing on each region to be processed respectively so as to obtain a target image.
13. A computer readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the method according to any one of claims 1 to 11.
14. An electronic device, comprising:
one or more processors; and
a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-11.
CN202310562470.6A 2023-05-17 2023-05-17 Image processing method, image processing device, storage medium, and electronic apparatus Pending CN116614716A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310562470.6A CN116614716A (en) 2023-05-17 2023-05-17 Image processing method, image processing device, storage medium, and electronic apparatus


Publications (1)

Publication Number Publication Date
CN116614716A true CN116614716A (en) 2023-08-18


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117596489A (en) * 2024-01-18 2024-02-23 荣耀终端有限公司 Image processing method, image processing apparatus, electronic device, and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination