CN114945087B - Image processing method, device, equipment and storage medium based on face characteristics


Info

Publication number
CN114945087B
CN114945087B (application CN202210630448.6A)
Authority
CN
China
Prior art keywords
awb
image
weight
light source
coordinate point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210630448.6A
Other languages
Chinese (zh)
Other versions
CN114945087A (en
Inventor
李富生
毛凡禹
Current Assignee
ARM Technology China Co Ltd
Original Assignee
ARM Technology China Co Ltd
Priority date
Filing date
Publication date
Application filed by ARM Technology China Co Ltd filed Critical ARM Technology China Co Ltd
Priority to CN202210630448.6A priority Critical patent/CN114945087B/en
Publication of CN114945087A publication Critical patent/CN114945087A/en
Application granted granted Critical
Publication of CN114945087B publication Critical patent/CN114945087B/en
Legal status: Active


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The application relates to the technical field of image processing, in particular to an image processing method, device, equipment and storage medium based on face characteristics, wherein the method comprises the following steps: acquiring an image to be processed, and detecting a face area in the image to be processed; calculating RGB values of the face region, and predicting a first AWB coordinate point corresponding to the face region in a target coordinate system based on the RGB values of the face region; determining a first AWB parameter of the image to be processed based on the first AWB coordinate point; acquiring a second AWB parameter of the image to be processed and a second AWB coordinate point corresponding to the second AWB parameter; determining a first weight corresponding to the first AWB parameter and a second weight corresponding to the second AWB parameter based on the first AWB coordinate point and the second AWB coordinate point respectively; a target AWB gain value for the image to be processed is calculated based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight. The method can improve the accuracy of determining the AWB gain value.

Description

Image processing method, device, equipment and storage medium based on face characteristics
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, apparatus, device, and storage medium based on facial features.
Background
Color temperature (Color Temperature) is a scale that describes the light color of a light source, measured in kelvin (K). The human eye perceives the brightest object in a scene as white at any color temperature, but photographs taken by a camera under different color temperatures take on different color casts; for example, a photograph under a D65 light source appears bluish, while one under an A light source appears yellowish. Indoor light sources tend to be complex, and neither incandescent nor fluorescent lamps have a standard color temperature, so portraits shot indoors often show abnormal skin tones, with the person appearing yellowish or bluish.
With the development of image processing technology, expectations for image quality keep rising, and images are usually post-processed to obtain a better visual effect. As a common image optimization technique, automatic white balance (Automatic White Balance, AWB) is widely applied to pictures containing human faces. The essence of white balance is that a white object should be rendered as white under a light source of any color; through color correction, white balance restores a captured image to the normal colors seen by the human eye.
However, existing white balance algorithms compute the white balance gain value used to compensate the image with low accuracy, so a color cast often remains after white balance processing. In particular, in scenes with a solid-color background or a large-area non-neutral background, such as a large blue or yellow backdrop, existing white balance algorithms fail: the computed gain value is inaccurate, the white balance result is poor, the face colors in the processed image are not restored faithfully, and the user experience suffers.
Disclosure of Invention
In view of the above, the embodiments of the present application provide an image processing method, apparatus, device, and storage medium based on face features, which can improve the accuracy of the AWB gain value calculated for face-image white balance scenes, thereby helping to restore the colors of the image's face region more faithfully and improving the user experience.
In a first aspect, an embodiment of the present application provides an image processing method based on a face feature, which is applied to an electronic device, and includes:
Acquiring an image to be processed, and detecting a face area in the image to be processed;
calculating RGB values of the face region, and predicting a first AWB coordinate point corresponding to the face region in a target coordinate system based on the RGB values of the face region; the target coordinate system is used for determining a preset relation between RGB values and AWB parameters;
determining a first AWB parameter of the image to be processed based on the first AWB coordinate point;
calculating a second AWB parameter of the image to be processed by using a preset white balance algorithm, and acquiring a corresponding second AWB coordinate point in the target coordinate system according to the second AWB parameter;
determining a first weight corresponding to the first AWB parameter and a second weight corresponding to the second AWB parameter based on the first AWB coordinate point and the second AWB coordinate point respectively;
a target AWB gain value for the image to be processed is calculated based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight.
It can be understood that, for an image captured by the electronic device, a first AWB coordinate point for the current color temperature can be estimated from feature data of the face region in the image to be processed, yielding an AWB parameter for white-balancing the face region. A preset white balance algorithm applied to the whole image yields the AWB parameter for white-balancing the whole image and a corresponding second AWB coordinate point. The degree to which the face region should influence the white balance gain of the image is then determined from the first and second AWB coordinate points, and the target AWB gain value is calculated by combining the AWB parameters of the whole image and of the face region. This improves the accuracy of the calculated target AWB gain value, restores the colors of the face region more faithfully, and improves the user experience.
In a possible implementation manner of the first aspect, the calculating an RGB value of the face area includes:
determining a central area of the face area, and calculating RGB values of the central area;
determining a plurality of image blocks corresponding to the face area, and calculating RGB values of the image blocks;
according to the RGB value of the central area and the RGB value of each image block, one or more target image blocks are selected from a plurality of image blocks corresponding to the face area;
and calculating RGB values of an area formed by the one or more target image blocks as the RGB values of the face area.
It can be understood that, when calculating the RGB value of the face region, the central region of the face can serve as a reference: image blocks that differ little from the central region, i.e., the skin-color region, are screened out of the face region, and the RGB value of that skin-color region is used as the RGB value of the face region. This excludes the influence of hair and background regions on the calculation, and thus improves the accuracy of estimating the AWB coordinate point for the current color temperature of the face region.
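As an illustration of the screening step above, a minimal sketch in Python, assuming per-block mean RGB values are already available; the Euclidean distance metric and the `max_dist` threshold are hypothetical tuning choices, not values from the patent:

```python
import numpy as np

def select_skin_blocks(blocks_rgb, center_rgb, max_dist=30.0):
    """Select face-region image blocks whose mean RGB is close to the
    central (skin) region, filtering out hair/background blocks.

    blocks_rgb: (N, 3) array of per-block mean RGB values
    center_rgb: (3,) mean RGB of the central face region
    max_dist:   RGB distance threshold (hypothetical tuning value)
    """
    blocks_rgb = np.asarray(blocks_rgb, dtype=float)
    center_rgb = np.asarray(center_rgb, dtype=float)
    dists = np.linalg.norm(blocks_rgb - center_rgb, axis=1)
    selected = blocks_rgb[dists <= max_dist]
    # Fall back to the central region itself if nothing passes the filter.
    if selected.size == 0:
        return center_rgb
    return selected.mean(axis=0)  # RGB value used for the face region
```

The returned mean over the surviving blocks plays the role of "the RGB values of the face region" in the steps above.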
In a possible implementation manner of the first aspect, predicting, based on RGB values of the face region, a first AWB coordinate point corresponding to the face region in a target coordinate system includes:
Determining a skin color coordinate point corresponding to the face region in a target coordinate system according to the RGB value of the face region;
acquiring skin color correction data corresponding to a plurality of light sources of different types corresponding to the face region;
calculating correction weight values corresponding to various types of light sources according to the skin color coordinate points and the skin color correction data;
respectively acquiring neutral color correction data corresponding to each type of light source;
and calculating a neutral color prediction coordinate point corresponding to the face region in a target coordinate system according to the neutral color correction data and the correction weight value corresponding to each type of light source, and taking the neutral color prediction coordinate point as the first AWB coordinate point.
It can be understood that there is a consistent relationship between the coordinates at which skin color falls and those at which neutral color falls at each color temperature: in a coordinate system built on R/G and B/G, the skin-color point is offset down and to the right of the neutral point, and this regularity also holds in the Log Domain multi-color-temperature frame coordinate system. Therefore, a correction weight value for each color temperature can be determined from the skin-color coordinate point and the skin-color correction data of that color temperature, and the neutral color correction data of each color temperature, weighted by these values, can then be used to predict the neutral-color prediction coordinate point corresponding to the face region.
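The prediction step can be sketched as follows, assuming calibrated skin-color and neutral-color reference points per light-source type in the log-domain coordinate system; the inverse-distance form of the correction weights is an assumption, since the text only requires the weights to be derived from the skin point and the skin-color correction data:

```python
import numpy as np

def predict_neutral_point(skin_pt, skin_ref_pts, neutral_ref_pts, eps=1e-6):
    """Predict the first AWB coordinate point (neutral color) for a face
    region from its skin-color coordinate point.

    skin_pt:         (2,) face skin point in the target (log R/G, log B/G) system
    skin_ref_pts:    (K, 2) calibrated skin points, one per light-source type
    neutral_ref_pts: (K, 2) calibrated neutral points for the same light sources
    """
    skin_pt = np.asarray(skin_pt, float)
    skin_ref = np.asarray(skin_ref_pts, float)
    neutral_ref = np.asarray(neutral_ref_pts, float)
    d = np.linalg.norm(skin_ref - skin_pt, axis=1)
    w = 1.0 / (d + eps)      # closer reference light source -> larger weight
    w /= w.sum()             # normalized correction weight values
    return w @ neutral_ref   # neutral-color prediction coordinate point
```

If the skin point coincides with one light source's calibrated skin point, the prediction collapses (up to `eps`) onto that light source's neutral point, matching the intuition above.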
In a possible implementation manner of the first aspect, the acquiring skin color correction data corresponding to the plurality of light sources corresponding to the face region includes:
determining a first skin color reference RGB value and a second skin color reference RGB value corresponding to each type of light source respectively;
acquiring a brightness value of the face area;
and calculating skin color correction data corresponding to the face region based on the brightness value of the face region, the first skin color reference RGB value and the second skin color reference RGB value.
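One plausible reading of these three steps is a brightness-driven interpolation between the two reference RGB values; the linear form and the luma endpoints below are illustrative assumptions, as the text only states that the brightness value and both references are combined:

```python
def skin_correction_rgb(luma, rgb_low, rgb_high, luma_low=0.0, luma_high=255.0):
    """Derive a skin-color correction RGB value for one light-source type
    from two calibrated skin-color references, according to the brightness
    of the face region. Linear interpolation between the references and the
    [luma_low, luma_high] range are assumptions for illustration."""
    t = (luma - luma_low) / (luma_high - luma_low)
    t = min(max(t, 0.0), 1.0)  # clamp to the calibrated brightness range
    return tuple(a + t * (b - a) for a, b in zip(rgb_low, rgb_high))
```

The resulting per-light-source RGB value can then be mapped into the target coordinate system to serve as a skin-color correction coordinate point.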
In a possible implementation of the first aspect, the skin color correction data is a skin color correction coordinate point;
the calculating the correction weight value corresponding to each type of light source comprises the following steps:
and calculating correction weight values corresponding to the light sources of all types according to the distances between the skin color coordinate points and the skin color correction coordinate points.
In a possible implementation of the first aspect, the second AWB parameter includes a plurality of second AWB gain values corresponding to different types of light sources, each of the second AWB gain values corresponding to one of the second AWB coordinate points;
the step of obtaining the second AWB parameter of the image to be processed by calculating the image to be processed by adopting a preset white balance algorithm comprises the following steps:
Dividing the image to be processed into a plurality of image blocks, and calculating RGB values of the image blocks;
determining the type of a light source to which each image block belongs based on the RGB value of each image block;
determining, for each type of light source, an image block belonging to said type of light source, respectively; determining a second AWB coordinate point corresponding to the type of light source in a target coordinate system according to the image block of the type of light source; and calculating a second AWB gain value corresponding to the type of light source according to the second AWB coordinate point.
In a possible implementation of the first aspect, the acquiring, according to the second AWB parameter, a corresponding second AWB coordinate point in the target coordinate system includes:
determining the type of the light source corresponding to each second AWB gain value in the second AWB parameters respectively;
and acquiring, in the target coordinate system, the second AWB coordinate point corresponding to that type of light source, and taking it as the second AWB coordinate point corresponding to the second AWB gain value.
It can be understood that, for the image to be processed, the type of light source to which each image block belongs can be determined from the RGB value of each block obtained by dividing the image. From the blocks under each type of light source, the neutral-color coordinate point, the second AWB gain value, and the degree of influence of that light source on the white balance gain are then determined; this degree of influence may be the prediction weight value corresponding to each type of light source. By combining the color temperatures of all light-source types according to their degrees of influence on the white balance gain, the AWB gain value of the image to be processed is calculated with improved accuracy.
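The per-light-source computation described above might be sketched as follows; nearest-reference classification of blocks, the log-inverse gain formula, and block-count prediction weights are assumptions for illustration, not details fixed by the text:

```python
import numpy as np

def second_awb_params(block_pts, light_ref_pts):
    """Group the whole image's blocks by nearest light-source type in the
    (log R/G, log B/G) coordinate system, then derive one second AWB
    coordinate point, one second AWB gain value, and one prediction weight
    per light-source type.

    block_pts:     (N, 2) log-domain points of the image blocks
    light_ref_pts: (K, 2) calibrated points, one per light-source type
    """
    block_pts = np.asarray(block_pts, float)
    refs = np.asarray(light_ref_pts, float)
    labels = np.argmin(
        np.linalg.norm(block_pts[:, None, :] - refs[None, :, :], axis=2), axis=1)
    coords, gains, pred_w = {}, {}, {}
    for k in range(len(refs)):
        members = block_pts[labels == k]
        if members.size == 0:
            continue  # no blocks under this light-source type
        pt = members.mean(axis=0)           # second AWB coordinate point
        coords[k] = pt
        # Neutralizing gains: R/G = exp(x) -> gain_r = exp(-x); likewise blue.
        gains[k] = (np.exp(-pt[0]), 1.0, np.exp(-pt[1]))
        pred_w[k] = len(members) / len(block_pts)  # prediction weight value
    return coords, gains, pred_w
```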
In a possible implementation of the first aspect, the first AWB parameter is a first AWB gain value, the first weight includes a plurality of first weight values corresponding to different types of light sources, and the second weight includes a plurality of second weight values corresponding to different types of light sources;
the determining, based on the first AWB coordinate point and the second AWB coordinate point, a first weight corresponding to the first AWB parameter, and a second weight corresponding to the second AWB parameter includes:
calculating a first weight value corresponding to each type of light source according to the distance between the first AWB coordinate point and a second AWB coordinate point corresponding to each second AWB gain value;
obtaining a predicted weight value corresponding to each type of light source, wherein the predicted weight value is obtained by calculating the image to be processed by adopting a preset white balance algorithm;
and calculating a second weight value corresponding to each type of light source according to the first weight value and the predicted weight value corresponding to each type of light source.
It can be understood that, by calculating the distance between the first AWB coordinate point and the second AWB coordinate point corresponding to each second AWB gain value, the accuracy of each light-source type's color temperature can be judged against the color temperature derived from the face region, so as to set the weight values of the first AWB gain value and of each type's second AWB gain value. Specifically, the larger the distance between a second AWB coordinate point and the first AWB coordinate point, the less accurate the color temperature of the corresponding light-source type, so the weight of the first AWB gain value may be increased and that of the second AWB gain value decreased; the smaller the distance, the more accurate that color temperature, so the weight of the second AWB gain value may be increased and that of the first AWB gain value decreased.
It can be understood that, by adjusting in real time how much the color temperature of each light-source type and the color temperature estimated from the face region each influence the white balance gain of the image to be processed, the target AWB gain value is obtained by combining both estimates, which further improves the accuracy of the calculated AWB gain value and helps restore the image colors more faithfully.
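A sketch of this distance-driven weight split, under the assumption of a simple monotonic mapping `d / (d + scale)`; the text fixes only the direction of the relationship (farther second point, more face weight), not the exact function or the `scale` constant:

```python
import math

def split_weights(first_pt, second_pts, pred_weights, scale=1.0):
    """Per light-source type, split influence between the face-derived
    first AWB gain and the scene-derived second AWB gain: the farther a
    type's second AWB point lies from the face's first AWB point, the more
    weight the face estimate receives for that type.

    first_pt:     (x, y) first AWB coordinate point
    second_pts:   {type: (x, y)} second AWB coordinate points
    pred_weights: {type: w} prediction weight values of the scene algorithm
    """
    first_w, second_w = {}, {}
    for k, pt in second_pts.items():
        d = math.dist(first_pt, pt)
        f = d / (d + scale)  # in [0, 1), grows with distance
        first_w[k] = pred_weights[k] * f
        second_w[k] = pred_weights[k] * (1.0 - f)
    return first_w, second_w
```

By construction, each type's first and second weight values sum to its prediction weight value, so the overall influence budget per light source is preserved.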
In a possible implementation of the first aspect, the calculating the target AWB gain value of the image to be processed based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight includes:
for each type of light source, adding the product of the first AWB gain value and the first weight value corresponding to the type of light source to the product of the second AWB gain value and the second weight value corresponding to the type of light source, to obtain the AWB gain value corresponding to the type of light source;
and adding the AWB gain values corresponding to the light sources of all types to obtain the target AWB gain value of the image to be processed.
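These two summation steps can be written directly; the first AWB gain and the per-type second AWB gains are taken here as per-channel (R, G, B) triples, which is an assumption about their representation:

```python
def target_awb_gain(first_gain, first_w, second_gains, second_w):
    """Accumulate the target AWB gain over all light-source types, per
    channel: sum_k ( first_gain * w1[k] + second_gain[k] * w2[k] ),
    following the two summation steps described above.

    first_gain:   (gr, gg, gb) face-derived first AWB gain value
    first_w:      {type: w1} first weight values
    second_gains: {type: (gr, gg, gb)} second AWB gain values
    second_w:     {type: w2} second weight values
    """
    total = [0.0, 0.0, 0.0]
    for k in second_gains:
        for c in range(3):
            total[c] += first_gain[c] * first_w[k] + second_gains[k][c] * second_w[k]
    return tuple(total)
```

When the weights over all types sum to one per channel, the result is a convex blend of the face-based and scene-based gains.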
In a possible implementation of the first aspect, the method further includes:
Acquiring the size of the face area;
judging whether the size of the face area is smaller than a preset size threshold value or not;
and when the size of the face area is smaller than a preset size threshold, setting the first weight to be zero.
It can be appreciated that, when the size of the face region is smaller than the preset size threshold, the first weight may be set to zero, thereby avoiding color jumps caused by unstable RGB statistics over a too-small face region.
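A minimal sketch of this gating rule; the `min_size` threshold in pixels is hypothetical:

```python
def gate_first_weight(face_w, face_h, first_weight, min_size=64):
    """Zero the face-based first weight when the detected face region is
    smaller than a preset size threshold, since RGB statistics over a tiny
    face region are too unstable. `min_size` (pixels per side) is a
    hypothetical tuning value."""
    if face_w < min_size or face_h < min_size:
        return 0.0
    return first_weight
```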
In a possible implementation of the first aspect, the method further includes:
when the size of the face area is smaller than a preset size threshold value, accumulating the number of continuous images meeting preset conditions;
and determining the first weight according to the number of the continuous images.
It can be understood that, when the size of the face region is smaller than the preset size threshold, the first weight can be set according to the number of consecutive frames in which no face is detected, so that the first weight decreases as that frame count grows. Once no face has been detected for a certain number of frames, the face is judged to have disappeared and the first weight is set to zero, avoiding a color jump at the moment the face is lost.
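One way to realize this fade, assuming a linear decay over a hypothetical `max_miss` frame budget (the text fixes only the inverse relationship and the eventual cutoff):

```python
def faded_first_weight(base_weight, miss_frames, max_miss=10):
    """Fade the face-based first weight as the count of consecutive frames
    without a usable face grows, reaching zero at `max_miss` frames so the
    color does not jump when the face disappears. Linear fade and
    `max_miss` are illustrative assumptions."""
    if miss_frames >= max_miss:
        return 0.0
    return base_weight * (1.0 - miss_frames / max_miss)
```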
In a second aspect, an embodiment of the present application provides an image processing apparatus based on face features, provided in an electronic device, including:
the image acquisition unit is used for acquiring an image to be processed and detecting a face area in the image to be processed;
the coordinate point prediction unit is used for calculating RGB values of the face area and predicting a first AWB coordinate point corresponding to the face area in a target coordinate system based on the RGB values of the face area; the target coordinate system is used for determining a preset relation between RGB values and AWB parameters;
a parameter determining unit, configured to determine a first AWB parameter of the image to be processed based on the first AWB coordinate point;
the parameter acquisition unit is used for calculating a second AWB parameter of the image to be processed by using a preset white balance algorithm, and acquiring a corresponding second AWB coordinate point in the target coordinate system according to the second AWB parameter;
the weight determining unit is used for respectively determining a first weight corresponding to the first AWB parameter and a second weight corresponding to the second AWB parameter based on the first AWB coordinate point and the second AWB coordinate point;
and the image processing unit is used for calculating a target AWB gain value of the image to be processed based on the first AWB parameter and the first weight and the second AWB parameter and the second weight.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores at least one instruction or at least one program that is loaded and executed by the processor to implement the image processing method based on face features described above.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing at least one instruction or at least one program that is loaded and executed by a processor to implement the image processing method based on face features described above.
Drawings
To illustrate the technical solution of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Evidently, the drawings in the following description show only some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic view of a scene of capturing and processing an image to generate a corresponding photo according to an embodiment of the present application.
Fig. 2a is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 2b is a schematic structural diagram of an image processing device based on facial features according to an embodiment of the present application.
Fig. 3 is a flowchart of an implementation of an image processing method based on face features according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a face region and a central region according to an embodiment of the present application.
Fig. 5 is a flowchart of predicting a first AWB coordinate point according to RGB values of a face area according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a Log Domain multi-color temperature frame coordinate system according to an embodiment of the present application.
Fig. 7 is a schematic diagram of a ColorChecker24 color card according to an embodiment of the present application.
Fig. 8 is a schematic diagram of a skin color correction data according to an embodiment of the present application.
Fig. 9 is a schematic diagram of a skin color drop point coordinate and a neutral color drop point coordinate under a standard light source according to an embodiment of the present application.
Fig. 10 is a flowchart of calculating a second AWB parameter by a preset white balance algorithm according to an embodiment of the present application.
Fig. 11 is a schematic block diagram of a video coding system according to an embodiment of the present application.
Fig. 12 is a schematic block diagram of a system on chip (SoC) according to an embodiment of the present application.
Detailed Description
In order to facilitate understanding of the solution of the present application, some terms related to the embodiments of the present application will be described first.
Color temperature of a light source: when a standard black body (an ideal absorber and emitter) is heated, its temperature rises gradually and its luminosity changes accordingly; in this process the color displayed by the black body changes through red, orange-red, yellow-white, and blue-white. The temperature at which the standard black body shows the same, or a close, light color as a given light source is defined as the correlated color temperature of that light source, called the color temperature for short. The bluer the light color of the standard black body, the higher the color temperature of the light source; the redder, the lower the color temperature.
Some common standard light sources and their corresponding color temperatures include:
D65, color temperature 6500K, is commonly used as the international standard daylight source.
CWF, color temperature 4150K, is a common light source in markets and offices.
TL84, color temperature 4100K, is a fluorescent lamp commonly used commercially in Europe.
A, color temperature 2856K, is a tungsten halogen lamp (i.e., an incandescent lamp).
H, short for Hor, color temperature 2700K, is also a tungsten halogen lamp (i.e., an incandescent lamp).
DF, color temperature 6400K, is an artificial light source (a xenon lamp) that simulates daylight.
D75, color temperature 7500K, is a standard daylight source.
The D65 light source, as a standard illuminant, contains the ultraviolet component of the daylight spectrum and can be used to simulate a daylight environment; colors seen under it represent true colors and are unlikely to show color casts.
The ColorChecker24 color chart is a scientifically designed array of 24 color patches covering natural, chromatic, primary, and gray colors. The 24 patches, labeled #1 through #24 from left to right and top to bottom, mostly reproduce the actual colors of natural objects such as human skin, foliage, and blue sky. Because they represent the colors of their real-world counterparts and reflect visible light in the same way, the patches match the colors of the objects they represent under any light source, which also makes them useful in color-reproduction workflows. The ColorChecker24 chart can therefore be used to calibrate a digital camera's white balance, ensuring accurate and uniform neutral white under any illumination condition.
The daylight trajectory line is generally obtained as follows: a ColorChecker24 color chart is photographed under each standard light source; from the pixel values of the gray patches, the per-channel brightness levels expressed in the RGB color mode (RGB values for short) are read; R/G and B/G are computed; and each observation point with coordinates (R/G, B/G) is plotted in a corresponding coordinate system and joined into a curve. The distribution of the standard light sources' color temperatures along this curve corresponds to the illumination color temperatures of the sun's path over different times of day, hence the name daylight trajectory line. For clarity of description, the line drawn from observation points with coordinates (R/G, B/G) is referred to as the standard daylight trajectory line in the embodiments of the present application. It can be appreciated that a straight line can be fitted to the standard daylight trajectory line and used to calculate the angle θ between the fitted line and the horizontal direction, which in turn enters the calculation of the white balance gain value of a captured image.
In addition, log(R/G) and log(B/G) can be computed from the RGB values of the gray patches in images of the ColorChecker24 chart captured under each standard light source, and a log color temperature curve can be drawn through the observation points with coordinates (log(R/G), log(B/G)). This curve serves as another daylight trajectory line, hereinafter called the log daylight trajectory line. It can be understood that a more accurate fitted straight line can be obtained from the log daylight trajectory line, and the angle θ between it and the horizontal direction can be calculated and used for coordinate transformations of the pixels or image blocks of a captured image, so that the white balance gain value of the image can be determined quickly and accurately.
It can be understood that, in the image processing method based on the face feature provided in the embodiment of the present application, the included angle θ may be calculated by any one of the above methods, which is not limited herein.
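For illustration, fitting the log daylight trajectory line and extracting the angle θ might look like this: a least-squares straight-line fit over the gray-patch points, one (R, G, B) triple per standard light source. The fitting method is an assumption; the text only requires a fitted line and its angle to the horizontal:

```python
import math

def fit_daylight_angle(gray_rgbs):
    """Fit a straight line to the log-domain daylight trajectory built from
    the gray patches of a ColorChecker24 chart shot under several standard
    light sources, and return the angle theta (radians) between that line
    and the horizontal axis.

    gray_rgbs: list of (R, G, B) tuples, one per standard light source
    """
    xs = [math.log(r / g) for r, g, b in gray_rgbs]
    ys = [math.log(b / g) for r, g, b in gray_rgbs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Least-squares slope through the locus points, then its angle.
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return math.atan2(num, den)
```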
To address the failure of existing white balance algorithms in scenes such as face images shot against a solid-color or large-area non-neutral background, where the white balance gain value of the captured image is determined inaccurately and face colors are restored poorly, an embodiment of the present application provides an image processing method based on face features, applied to an electronic device with an image processing function.
In the method, a first AWB coordinate point of the current color temperature is estimated from the feature data of the face region in the image to be processed, and the AWB parameter for white-balancing the face region is obtained from it; the AWB parameter for white-balancing the whole image, together with the corresponding second AWB coordinate point, is obtained by running a preset white balance algorithm on the image to be processed; the degree to which the face region influences the white balance gain of the image to be processed is determined according to the distance between the first AWB coordinate point and the second AWB coordinate point; and the AWB gain value of the image to be processed is then calculated by combining the AWB parameters corresponding to the face region and the whole image.
Based on this method, in white balance scenes such as face images photographed against a solid-color background or a large-area non-neutral-color background, the target AWB gain value of the image to be processed can be calculated from the color temperatures corresponding to the face region and the whole image according to their respective degrees of influence on the white balance gain. This improves the accuracy of the calculated AWB gain value, helps restore image colors more faithfully, and enables the face color to be restored accurately.
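As a rough sketch of the idea (not the exact weighting rule of the embodiments, which is developed step by step below), the face-based and whole-image AWB gains could be blended with a distance-dependent weight; the linear decay and the `max_dist` constant here are assumptions for illustration.

```python
def blend_awb_gains(face_gain, global_gain, face_pt, global_pt, max_dist=1.0):
    """Blend per-channel AWB gains from the face region and the whole image.

    The weight given to the face estimate depends on the distance between the
    face-predicted AWB coordinate point and the point produced by the
    conventional white-balance algorithm; `max_dist` and the linear decay
    are assumed for this sketch.
    """
    dx = face_pt[0] - global_pt[0]
    dy = face_pt[1] - global_pt[1]
    dist = (dx * dx + dy * dy) ** 0.5
    # Face weight decays linearly with distance, clamped to [0, 1].
    w_face = max(0.0, 1.0 - dist / max_dist)
    w_global = 1.0 - w_face
    return tuple(w_face * f + w_global * g
                 for f, g in zip(face_gain, global_gain))
```

When the two coordinate points coincide the face-derived gain is used as-is; beyond `max_dist` the method falls back entirely to the conventional algorithm's gain.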
Embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 illustrates a schematic view of a scene in which an electronic device 100 captures and processes images to generate corresponding photographs, in accordance with an embodiment of the application.
As shown in fig. 1, a lens (Lens) 101 of an electronic device 100 with an image processing function may collect a person/scene image within its viewing angle range, generate an optical signal, and transmit the optical signal to the photosensitive area on the surface of an image sensor 102. The image sensor 102 then performs photoelectric conversion to form raw (RAW) image data, which may, for example, be in Bayer RAW format obtained through a Bayer array. The image sensor 102 transmits the converted raw image data to the image signal processor (Image Signal Processor, ISP) 103 for image processing. The ISP 103 processes the raw image data using various preset image processing algorithms and outputs images in BMP or YUV format to the image acquisition unit at the back end, obtaining the images that form the photographing result. The image sensor 102 may be, for example, a CMOS sensor, and the white balance algorithm adopted when the ISP 103 performs image processing may be, for example, the image processing method based on face features provided in the embodiment of the present application.
Fig. 2a shows a schematic structural diagram of an electronic device 100 according to an embodiment of the application.
As shown in fig. 2a, the electronic device 100 may include a processor 110, a wireless communication module 120, a mobile communication module 130, a power module 140, an audio module 150, an interface module 160, a camera 170, a memory 180, a sensor module 190, keys 201, a display 202, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The components shown in fig. 2a may be implemented in hardware, software or a combination of software and hardware.
The processor 110 may include one or more processing units, for example, processing modules or processing circuits that may include a central processing unit CPU (Central Processing Unit), a graphics processing unit GPU (Graphics Processing Unit), an image signal processor (ISP), a microcontroller unit (MCU), an AI (Artificial Intelligence) processor, a programmable logic device FPGA (Field Programmable Gate Array), and the like. The different processing units may be separate devices or may be integrated in one or more processors. A memory unit, such as the memory 180, may be provided in the processor 110 for storing instructions and data.
The wireless communication module 120 may include an antenna, and transmit and receive electromagnetic waves via the antenna.
The mobile communication module 130 may include, but is not limited to, an antenna, a power amplifier, a filter, a low noise amplifier (Low Noise Amplifier, LNA), and the like. The mobile communication module 130 may provide solutions for wireless communication, including 2G/3G/4G/5G, applied on the electronic device 100. The mobile communication module 130 may receive electromagnetic waves through the antenna, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 130 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves radiated through the antenna.
In some embodiments, the mobile communication module 130 and the wireless communication module 120 of the electronic device 100 may also be located in the same module.
The power module 140 may include a power source, a power management component, and the like. The power source may be a battery, and the power management component is configured to manage charging of the power source and powering of other modules by the power source.
The audio module 150 is used to convert digital audio information into an analog audio signal output, or to convert an analog audio input into a digital audio signal. In some embodiments, the audio module 150 may include a speaker, an earpiece, a microphone, and an earphone interface.
The interface module 160 includes an external memory interface, a universal serial bus (universal serial bus, USB) interface, a subscriber identity module (subscriber identification module, SIM) card interface, and the like.
The camera 170 is used to capture still images or video. The object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The electronic device 100 may implement its photographing and image processing functions through an ISP, the camera 170, a video codec, a GPU (Graphics Processing Unit), the display screen 202, an application processor, and the like.
The sensor module 190 may include a proximity light sensor, a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
The display 202 is used to display human-machine interaction interfaces, images, videos, and the like. The display 202 includes a display panel.
In some embodiments, the electronic device 100 further includes keys 201, motors, indicators, and the like. The keys 201 may include a volume key, an on/off key, and the like. The motor is used to cause the electronic device 100 to generate a vibration effect, such as when the user's electronic device 100 is called, to prompt the user to answer the call from the electronic device 100. The indicators may include laser indicators, radio frequency indicators, LED indicators, and the like.
Fig. 2b shows a block schematic diagram of an image processing apparatus 200 based on facial features according to an embodiment of the application. The image processing apparatus 200 based on the face features may be applied to the electronic device 100 shown in fig. 2a to implement the image processing method based on the face features according to the embodiment of the present application.
As shown in fig. 2b, the image processing apparatus 200 based on the face features may specifically include:
an image acquisition unit 201, configured to acquire an image to be processed, and detect a face area in the image to be processed.
A coordinate point prediction unit 202, configured to calculate an RGB value of the face area, and predict a first AWB coordinate point corresponding to the face area in a target coordinate system based on the RGB value of the face area; the target coordinate system is used for determining a preset relation between RGB values and AWB parameters.
A parameter determining unit 203, configured to determine a first AWB parameter of the image to be processed based on the first AWB coordinate point.
The parameter obtaining unit 204 is configured to obtain a second AWB parameter of the image to be processed by calculating the image to be processed by using a preset white balance algorithm, and obtain a corresponding second AWB coordinate point in the target coordinate system according to the second AWB parameter.
The weight determining unit 205 is configured to determine a first weight corresponding to the first AWB parameter and a second weight corresponding to the second AWB parameter based on the first AWB coordinate point and the second AWB coordinate point, respectively.
An image processing unit 206, configured to calculate a target AWB gain value of the image to be processed based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight.
In some possible embodiments, the image processing unit 206 is further configured to perform white balance processing on the image to be processed based on the target AWB gain value, to obtain a white balance processed image.
In some possible embodiments, the image processing apparatus 200 based on facial features may further include:
a size obtaining unit, configured to obtain a size of the face area;
the weight determining unit 205 may be further configured to determine whether the size of the face area is smaller than a preset size threshold; and when the size of the face area is smaller than a preset size threshold, setting the first weight to be zero.
In some possible embodiments, the weight determining unit 205 may be further configured to obtain, when the size of the face area is smaller than a preset size threshold, a number of consecutive images in which the size of the currently detected face area is smaller than the preset size threshold; and determining the first weight according to the number of the continuous images.
It is to be understood that the configuration illustrated in the embodiment of the present application does not constitute a specific limitation of the image processing apparatus 200 based on the face features. In other embodiments of the present application, the image processing apparatus 200 based on facial features may include more or less units or modules than illustrated, or may combine some units, or split some units, or may have different structural arrangements of units.
It can be appreciated that the electronic devices to which the image processing method based on face features provided by the embodiments of the present application is applicable include, but are not limited to, digital cameras, video cameras, tablet computers, mobile phones, wearable devices such as virtual reality (VR) devices and smart watches, smart televisions, and other electronic devices with image processing functions. The following describes a specific implementation procedure of the image processing method based on face features provided in the embodiment of the present application, taking a mobile phone as an example of the electronic device 100.
Fig. 3 is a schematic flow chart of an implementation of an image processing method based on face features according to an embodiment of the present application. It will be appreciated that the execution subject of each step in the flow shown in fig. 3 may be the mobile phone 100, or may be a processor of the mobile phone 100, for example, the ISP 103 described above. In order to avoid repetition of the description, in the following description of each step, the execution subject of each step will not be described.
As shown in fig. 3, the flow includes the following steps.
S301: and acquiring an image to be processed, and detecting a face area in the image to be processed.
The face region detection can be performed on the acquired image to be processed by adopting various detection technologies in the prior art, for example, the face region detection can be performed by utilizing an artificial intelligence (Artificial Intelligence, AI) detection technology to obtain the face region in the image to be processed, and the face region detection method is not limited in the embodiment of the application.
S302: and calculating the RGB value of the face region, and predicting a first AWB coordinate point corresponding to the face region in a target coordinate system based on the RGB value of the face region.
The RGB values of the face region may be an average value of RGB values of each pixel point that forms the face region, in other embodiments, the RGB values of the face region may also be determined based on other calculation manners, for example, may be a weighted average value of RGB values of each pixel point that forms the face region, and the embodiment of the application is not limited herein.
In some embodiments, the calculating the RGB values of the face region may include:
Determining a central area of the face area, and calculating RGB values of the central area;
determining a plurality of image blocks corresponding to the face area, and calculating RGB values of the image blocks;
according to the RGB value of the central area and the RGB value of each image block, one or more target image blocks are selected from a plurality of image blocks corresponding to the face area;
and calculating RGB values of an area formed by the one or more target image blocks as the RGB values of the face area.
Here, an area within a preset range of the center of the face area may be taken as a center area, and, for example, as shown in fig. 4, an area of 50% of the center of the face area may be taken as a center area. The face region may be divided into a plurality of image blocks, the center region/image block may include a plurality of pixels, the RGB values of the center region/image block may be an average of RGB values of respective pixels constituting the center region/image block, and in other embodiments, the RGB values of the center region/image block may be determined based on other calculation manners, for example, may be a weighted average of RGB values of respective pixels constituting the center region/image block, etc., which is not limited herein.
For example, when the mobile phone 100 acquires a certain frame of image while taking a photograph or recording a video, the ISP 103 may take the area covering 50% of the center of the face region of the frame as the center region and, based on the RGB values (e.g., the R_pG_pB_p values) of the pixel points it contains, calculate the RGB value of the center region (denoted as the R_0G_0B_0 value). The ISP 103 may divide the face region of the frame into a plurality of image blocks according to a preset division rule, for example into 6×3 image blocks, and may then calculate the RGB value of each image block based on the RGB values of the pixel points it contains. For example, the ISP 103 may calculate the R_bG_bB_b values of the 6×3 image blocks from the R_pG_pB_p values of the pixel points in each image block, which is not described in detail herein in the embodiments of the present application.
Specifically, R/G and B/G can be calculated from the RGB value of the center region as center reference ratios, denoted RGRatio_ref and BGRatio_ref, and R/G and B/G of each image block can be calculated from the RGB value of that image block, denoted RGRatio[k] and BGRatio[k] (k is the index of the image block). The difference between each image block and the center region is determined from RGRatio[k] and BGRatio[k] and the center region's RGRatio_ref and BGRatio_ref. Image blocks whose difference is smaller than or equal to a preset difference threshold are taken as target image blocks, and the area formed by the target image blocks is the skin color area of the face; image blocks whose difference is larger than the preset difference threshold are excluded, since such blocks may belong to regions such as the hair or the background rather than to the skin.
The preset difference threshold may be set according to actual situations, which is not limited in the embodiment of the present application; the RGB values of the skin tone region may be an average of RGB values of each pixel constituting the skin tone region, and in other embodiments, the RGB values of the skin tone region may be determined based on other calculation manners, for example, may be a weighted average of RGB values of each pixel constituting the skin tone region, etc., which are not limited herein.
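The block-selection rule just described can be sketched as follows; the `diff_threshold` value and the use of the larger of the two ratio differences as the difference measure are assumptions for illustration.

```python
def select_skin_blocks(center_rgb, block_rgbs, diff_threshold=0.1):
    """Select the face image blocks whose R/G and B/G ratios are close to the
    center region's reference ratios.

    `diff_threshold` and the max-of-both-ratios difference measure are
    assumed; the embodiment leaves the exact threshold to implementation.
    """
    r0, g0, b0 = center_rgb
    rg_ref, bg_ref = r0 / g0, b0 / g0  # RGRatio_ref and BGRatio_ref
    targets = []
    for k, (r, g, b) in enumerate(block_rgbs):
        # Difference between block k and the center region.
        diff = max(abs(r / g - rg_ref), abs(b / g - bg_ref))
        if diff <= diff_threshold:
            targets.append(k)
    return targets
```

Blocks whose ratios deviate strongly from the center reference (hair, background) fall outside the threshold and are excluded from the skin color area.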
Specifically, fig. 5 illustrates a flowchart of predicting a first AWB coordinate point of a face region corresponding to a target coordinate system based on RGB values of the face region, according to some embodiments of the application.
As shown in fig. 5, the process includes the steps of:
s3021: and determining a skin color coordinate point corresponding to the face region in a target coordinate system according to the RGB value of the face region.
The target coordinate system is used to determine a preset relationship between the RGB values and the AWB parameters, and the target coordinate system may be, for example, a preset Log Domain multi-color temperature frame coordinate system, where the Log Domain multi-color temperature frame coordinate system refers to a coordinate system preset in the mobile phone 100 and including a color temperature area (i.e., a color temperature frame) calibrated based on a standard light source, and a forming process of the Log Domain multi-color temperature frame coordinate system will be described in detail below.
For example, the ISP 103 in the mobile phone 100 may calculate the coordinate values on the log color temperature curve from the RGB values of the face region, denoted (log(R/G), log(B/G)), and combine them with a preset coordinate rotation matrix, which may take the form

matrix = [K*cosθ, K*sinθ; −K*sinθ, K*cosθ]

where K is a data type conversion parameter for converting the floating point data (float) types involved in the coordinate conversion process into integer data (int) types, so as to increase the computation rate of the coordinate conversion.

Based on the above coordinate rotation matrix conversion, the ISP 103 in the mobile phone 100 can predict the skin color coordinate point of integer data type corresponding to the face region, denoted (X_r, Y_r); the calculation may refer to the following formulas (1) and (2):

X_r = K*(cosθ*log(R/G) + sinθ*log(B/G))  (1)
Y_r = K*(−sinθ*log(R/G) + cosθ*log(B/G))  (2)
in the above-mentioned preset coordinate rotation matrix, θ is an included angle between a Log sunlight trajectory line and a horizontal direction determined based on a Log color temperature curve calibrated by a standard light source under Log domain, and in some embodiments, θ may also be an included angle between a standard sunlight trajectory line and a horizontal direction determined based on a curve calibrated by a standard light source under R/G-B/G. The specific process of determining the included angle may refer to the prior art, and the embodiment of the present application is not described herein.
Finally, based on the converted skin color coordinate point (X_r, Y_r) corresponding to the face region, the ISP 103 may further calculate the standard coordinate data of the skin color coordinate point of the face region in the preset Log Domain multi-color temperature frame coordinate system. Specifically, the conversion of the skin color coordinate point (X_r, Y_r) into the standard coordinate data of the preset Log Domain multi-color temperature frame coordinate system may refer to the following formula (3):

(X, Y)ᵀ = matrixᵀ * (X_r, Y_r)ᵀ / inv_matrix_k  (3)

where (X, Y) is the standard coordinate data of the skin color coordinate point of the face region in the preset Log Domain multi-color temperature frame coordinate system, and inv_matrix_k is another data type conversion parameter for reconverting the skin color coordinate point (X_r, Y_r) into a floating point data type, i.e., for removing the K introduced when calculating (X_r, Y_r). Illustratively, inv_matrix_k may be calculated based on the following formula (4):

inv_matrix_k = (K*cosθ)² + (K*sinθ)²  (4)
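Under this reading of formulas (1)-(4), the forward (float-to-int) and inverse (int-to-float) conversions can be sketched as follows; the values of `K` and `THETA` are invented placeholders, and a real ISP's fixed-point format would differ.

```python
import math

K = 256        # assumed fixed-point scale factor for integer arithmetic
THETA = -0.9   # assumed angle (radians) of the fitted log daylight line

def to_rotated_int(r, g, b, k=K, theta=THETA):
    """Forward transform, formulas (1)-(2): rotate the log-domain point by
    theta and scale by k so later steps can run on integer data types."""
    x, y = math.log(r / g), math.log(b / g)
    xr = int(round(k * (math.cos(theta) * x + math.sin(theta) * y)))
    yr = int(round(k * (-math.sin(theta) * x + math.cos(theta) * y)))
    return xr, yr

def to_standard_float(xr, yr, k=K, theta=THETA):
    """Inverse transform, formulas (3)-(4): apply the transposed matrix and
    divide by inv_matrix_k = (k*cos)^2 + (k*sin)^2 to remove k again."""
    inv_matrix_k = (k * math.cos(theta)) ** 2 + (k * math.sin(theta)) ** 2
    x = (k * math.cos(theta) * xr - k * math.sin(theta) * yr) / inv_matrix_k
    y = (k * math.sin(theta) * xr + k * math.cos(theta) * yr) / inv_matrix_k
    return x, y
```

A round trip through both transforms recovers the original log coordinates up to the rounding error introduced by the integer step, which is the point of the K / inv_matrix_k pair.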
fig. 6 shows a schematic diagram of a preset Log Domain multi-color temperature frame coordinate system according to an embodiment of the present application.
As shown in fig. 6, the preset Log Domain multi-color temperature frame coordinate system may include color temperature frames whose boundary coordinates are determined based on 7 standard light sources, for example H, A, TL, CWF, D65, DF, and D75. Log(R/G) and log(B/G) are calculated from the RGB values of the face region, coordinate conversion is performed with the preset coordinate rotation matrix, and finally, after the inverse matrix conversion is completed by combining formulas (3) and (4), the skin color coordinate point represented by the coordinate values (X, Y) corresponding to the face region can be determined on the Log Domain multi-color temperature frame coordinate system shown in fig. 6; this skin color coordinate point may be denoted P_FAvg.
It can be understood that the Log Domain multi-color temperature frame coordinate system shown in fig. 6 can be formed by performing coordinate transformation, creating a coordinate system, and the like on the image data collected by the gray card based on standard light sources such as H, A, TL, D65, CWF, DF, D75, and the like. The specific process of forming the Log Domain multi-color temperature frame coordinate system shown in fig. 6 may refer to related descriptions in the prior art, and the embodiments of the present application are not described herein.
S3022: and acquiring a plurality of skin color correction data corresponding to different types of light sources corresponding to the face region.
The skin color reference data corresponding to the different types of light sources can be predetermined and includes the reference data of the Dark skin and Light skin color patches; the corresponding skin color correction data is then calculated based on the feature information of the face region together with the Dark skin and Light skin reference data.
Illustratively, when the mobile phone 100 acquires a certain frame of image while taking a photograph or recording a video, the ISP 103 may determine, based on the brightness of the face region in the image, the skin color correction data of each type of light source between the Light skin and Dark skin reference data of that light source. The types of light sources may, for example, be set to the 5 types D65, CWF, TL84, A, and H.
Specifically, the acquiring the skin color correction data corresponding to the plurality of light sources of different types corresponding to the face region may include:
determining a first skin color reference RGB value and a second skin color reference RGB value corresponding to each type of light source respectively;
acquiring a brightness value of the face area;
and calculating skin color correction data corresponding to the face region based on the brightness value of the face region, the first skin color reference RGB value and the second skin color reference RGB value.
Illustratively, the mobile phone 100 may photograph the ColorChecker24 color chart shown in fig. 7 under the standard light boxes D65, CWF, TL84, A, and H light, respectively, select the average RGB value of block #1 (Dark skin) in the 24-color chart as the corresponding first skin color reference RGB value, and select the average RGB value of block #2 (Light skin) as the corresponding second skin color reference RGB value.
The brightness value of the face region may be determined according to the RGB value of the face region. For example, the luminance value of the face region may be taken as the value of the G channel among the RGB values of the face region.
When skin color correction data is actually calculated, skin color correction RGB values of the light sources of all types can be interpolated between a first skin color reference RGB value and a second skin color reference RGB value corresponding to the light sources of all types according to the brightness value of the face area to serve as the skin color correction data.
Illustratively, the channel values (R1, G1, B1) of the RGB value of the Dark skin and the channel values (R2, G2, B2) of the RGB value of the Light skin may be determined first. Then, taking the luminance value of the face region as the value of its G channel and denoting it L, the R, G, and B channel values of the skin color correction RGB value of each type of light source may be calculated with reference to the following formula:

(R, G, B) = (R1, G1, B1), if L < G1
(R, G, B) = (R2, G2, B2), if L > G2
(R, G, B) = (R1, G1, B1) + (L − G1)/(G2 − G1) * ((R2, G2, B2) − (R1, G1, B1)), otherwise
That is, as shown in fig. 8, when the luminance value of the face region is smaller than the G-channel value of the Dark skin, the RGB value of the Dark skin is taken as the skin tone correction RGB value; when the luminance value of the face region is larger than the G-channel value of the Light skin, the RGB value of the Light skin is taken as the skin tone correction RGB value; and when the luminance value lies between the two, an RGB value interpolated between the RGB values of the Dark skin and the Light skin is taken as the skin tone correction RGB value.
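The piecewise rule can be sketched directly; linear interpolation between the two references is assumed for the middle branch, consistent with the description above.

```python
def skin_correction_rgb(face_luma, dark_rgb, light_rgb):
    """Interpolate a skin tone correction RGB value between the Dark skin and
    Light skin references according to the face region's luminance (taken,
    per the text, as the G-channel value of the face region)."""
    r1, g1, b1 = dark_rgb
    r2, g2, b2 = light_rgb
    if face_luma < g1:           # darker than the Dark skin reference
        return dark_rgb
    if face_luma > g2:           # brighter than the Light skin reference
        return light_rgb
    # In between: linear interpolation channel by channel (assumed).
    t = (face_luma - g1) / (g2 - g1)
    return tuple(c1 + t * (c2 - c1) for c1, c2 in zip(dark_rgb, light_rgb))
```

A luminance halfway between the two G-channel references yields the channel-wise midpoint of the two reference RGB values.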
In some embodiments, based on the interpolated skin color correction RGB value of each type of light source, the skin color correction coordinate point of that light source on the Log Domain multi-color temperature frame coordinate system can be calculated by using formulas (1)-(4) and used as the corresponding skin color correction data.
Illustratively, the mobile phone 100 may calculate the corresponding skin color correction coordinate points P_cali[i] from the skin color correction RGB values of the standard light sources D65, CWF, TL84, A, and H light, where i = 0, 1, …, 4 is the serial number of the light source, as indicated by the coordinate points marked by boxes in fig. 6.
S3023: and calculating correction weight values corresponding to the light sources of all types according to the skin color coordinate points and all the skin color correction data.
Wherein, skin color correction coordinate points on a Log Domain multi-color temperature frame coordinate system can be determined according to skin color correction data of each type of light source, and correction weight values corresponding to each type of light source are determined based on the distance between the skin color correction coordinate points and the skin color coordinate points. The smaller the distance between the skin color correction coordinate point of a certain type of light source and the skin color coordinate point is, the larger the correction weight value corresponding to the type of light source can be correspondingly.
Specifically, when the skin tone correction data is a skin tone correction coordinate point, the calculating correction weight values corresponding to the light sources of the respective types may include: and calculating correction weight values corresponding to the light sources of all types according to the distances between the skin color coordinate points and the skin color correction coordinate points.
Specifically, when the skin color correction data is a skin color correction RGB value, the skin color correction coordinate points of the light sources of each type on the Log Domain multi-color temperature frame coordinate system can be calculated by using the formulas (1) - (4) based on the skin color correction RGB value of the light source of each type, and the correction weight values corresponding to the light sources of each type are calculated according to the distance between the skin color coordinate points and each skin color correction coordinate point.
Illustratively, the mobile phone 100 may calculate the Euclidean distance dist[i] (i = 0, 1, …, 4) between the skin color coordinate point P_FAvg and the skin color correction coordinate point P_cali[i] of each type of light source, take the minimum distance minDist among the Euclidean distances dist[i], and then calculate the correction weight value corresponding to each type of light source, denoted faceTargetWeight[i], based on each dist[i] and minDist. For the specific calculation of the correction weight values, reference may be made to the following formulas (5) to (11):
distNear = minDist*2  (5)

weight0[i] = minDist/dist[i]  (7)

faceTargetWeightPre[i] = weight0[i]*weight1[i]  (9)
That is, the smaller the Euclidean distance dist[i] between the skin color coordinate point P_FAvg and the skin color correction coordinate point P_cali[i] of a certain type of light source, the larger the corresponding correction weight value of that type of light source can be; the larger the distance dist[i], the smaller the corresponding correction weight value. When the Euclidean distance dist[i] between P_FAvg and P_cali[i] is larger than a certain multiple of the minimum distance minDist, the correction weight value corresponding to that light source can be set to 0.
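Only formulas (5), (7), and (9) survive in the text above, so in the sketch below the cutoff term weight1 and the final normalization are assumptions consistent with the surrounding description, not the patent's exact formulas (6), (8), (10), and (11).

```python
def correction_weights(dists):
    """Correction weights per light source from Euclidean distances.

    All distances are assumed positive. Formulas (5), (7), (9) are taken
    from the text; the weight1 cutoff and the final normalization are
    assumed stand-ins for the formulas lost from the source.
    """
    min_dist = min(dists)
    dist_near = min_dist * 2                            # formula (5)
    weight0 = [min_dist / d for d in dists]             # formula (7)
    # Assumed weight1: zero out light sources farther than distNear,
    # matching "set to 0 beyond a certain multiple of minDist".
    weight1 = [1.0 if d <= dist_near else 0.0 for d in dists]
    pre = [w0 * w1 for w0, w1 in zip(weight0, weight1)]  # formula (9)
    total = sum(pre)
    # Assumed final step: normalize the weights to sum to 1.
    return [p / total for p in pre] if total > 0 else pre
```

The nearest light sources dominate the weights, and any source more than twice the minimum distance away contributes nothing.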
S3024: and respectively acquiring neutral color correction data corresponding to each type of light source.
Wherein neutral color correction data corresponding to a plurality of different types of light sources may be predetermined, and the neutral color correction data may be neutral color correction RGB values. Illustratively, the mobile phone 100 may photograph the ColorChecker24 color card as shown in fig. 7 under the standard light boxes D65, CWF, TL84, a and H light, respectively, and select the average RGB value of any one of the blocks #20, #21, #22 in the 24 color card as the neutral color correction RGB value.
S3025: and calculating a neutral color prediction coordinate point corresponding to the face region in a target coordinate system according to the neutral color correction data and the correction weight value corresponding to each type of light source, and taking the neutral color prediction coordinate point as the first AWB coordinate point.
There is a certain regularity between the landing coordinates of skin color and the landing coordinates of neutral color at each color temperature: as shown in fig. 9, in the corresponding coordinate system established based on R/G-B/G, the landing coordinates of skin color are shifted to the right and downward, and this regularity still holds in the Log Domain multi-color temperature frame coordinate system. Therefore, the correction weight value of each type of light source can be determined based on the skin color coordinate point and the skin color correction data of each type of light source, and the neutral color prediction coordinate point corresponding to the face region can then be predicted using the neutral color correction data and the correction weight values of the various types of light sources.
Specifically, the neutral color correction RGB values corresponding to the various types of light sources can be weighted by their correction weight values to obtain the neutral color prediction RGB value corresponding to the face region, and the neutral color prediction coordinate point corresponding to the face region in the Log Domain multi-color temperature frame coordinate system can then be calculated by combining formulas (1)-(4). This point is denoted P_face, i.e., the first AWB coordinate point.
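The weighted prediction of S3025 can be sketched as below. Since formulas (1)-(4) are not reproduced in this excerpt, the coordinate mapping X = log2(G/R), Y = log2(G/B) is an assumption chosen to be consistent with the gain relation described later in step S303:

```python
import math

def neutral_prediction_point(neutral_rgb, weights):
    """Weight the neutral color correction RGB values of the light source
    types by their correction weight values, then map the resulting
    neutral color prediction RGB value to the coordinate point P_face."""
    r = sum(w * rgb[0] for rgb, w in zip(neutral_rgb, weights))
    g = sum(w * rgb[1] for rgb, w in zip(neutral_rgb, weights))
    b = sum(w * rgb[2] for rgb, w in zip(neutral_rgb, weights))
    # Assumed Log Domain mapping: X = log2(G/R), Y = log2(G/B).
    return (math.log2(g / r), math.log2(g / b))
```

A neutral gray input maps to the origin of the assumed coordinate system, which makes the subsequent gain calculation degenerate to unity gains.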
S303: and determining a first AWB parameter of the image to be processed based on the first AWB coordinate point.
The gain values (R_Gain, G_Gain, B_Gain) corresponding to the first AWB coordinate point constitute the first AWB parameter, denoted FaceAWBGain. It will be appreciated that the first AWB parameter is a first AWB gain value comprising the gain values of the respective color channels; determining the gain values of the color channels of the image to be processed mainly means determining gain values for the gain-control circuits provided in the R channel and the B channel, while the gain of the G channel may be set to a constant value.
For example, taking RGB values represented by 8-bit binary numbers, the gain value for the R channel may be calculated as 2^X, the gain value for the B channel as 2^Y, and the gain value of the G channel may be set to 1. In other embodiments, the gain value of each color channel may be calculated using a variation of the R and B gain formulas, and the G-channel gain may accordingly be set to another constant value. The gain values of the respective color channels may be determined, for example, with reference to the following formula (12), without limitation:

R_Gain = 2^X, G_Gain = 1, B_Gain = 2^Y (12)
Wherein, X and Y respectively represent the abscissa value and the ordinate value of the first AWB coordinate point.
Alternatively, in other embodiments, the calculation formula for calculating the first AWB parameter may be another formula different from the above formula (12), or may be another form of formula obtained by transforming based on the above formula (12).
It will be appreciated that when the image to be processed uses, for example, 16-bit binary numbers to represent RGB values, the manner of calculating the gain values of the color channels based on the first AWB coordinate point (X, Y) may correspondingly take other forms, and embodiments of the present application are not limited herein.
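Under the same assumed coordinate mapping (X = log2(G/R), Y = log2(G/B)), the gain relation described above, with R and B gains of the form 2^X and 2^Y and a constant G gain, might be sketched as:

```python
def awb_gains_from_point(point):
    """Convert a first AWB coordinate point (X, Y) into per-channel gain
    values (R_Gain, G_Gain, B_Gain), i.e. FaceAWBGain, with G fixed to 1."""
    x, y = point
    return (2.0 ** x, 1.0, 2.0 ** y)
```

A point at the origin yields unity gains, matching the intuition that a neutral estimate requires no channel correction.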
S304: and acquiring a second AWB parameter of the image to be processed by adopting a preset white balance algorithm to calculate the image to be processed, and acquiring a corresponding second AWB coordinate point in the target coordinate system according to the second AWB parameter.
That is, a white balance algorithm of the prior art may be applied to all areas of the image to be processed, to calculate the second AWB parameter of the image to be processed and the corresponding second AWB coordinate point in the Log Domain multi-color temperature frame coordinate system.
For example, the calculation may be performed using an automatic white balance method based on Log Domain multi-color temperature frames. In particular, the second AWB parameter may include a plurality of second AWB gain values corresponding to different types of light sources, each second AWB gain value corresponding to one second AWB coordinate point. The number of light source types may be set to 7, for example H, A, TL84, D65, CWF, DF, and D75.
Specifically, fig. 10 illustrates a flowchart of acquiring a second AWB parameter of the image to be processed by computing the image to be processed using a preset white balance algorithm, according to some embodiments of the application.
As shown in fig. 10, the process includes the steps of:
s3041: dividing the image to be processed into a plurality of image blocks, and calculating RGB values of the image blocks.
Wherein each image block may include a plurality of pixels, the RGB value of each image block may be an average of RGB values of respective pixels constituting the image block, and in other embodiments, the RGB values of each image block may be determined based on other calculation manners, for example, may be a weighted average of RGB values of respective pixels constituting the image block, etc., which is not limited herein.
For example, when the mobile phone 100 acquires a frame image during photographing or video capture, the ISP 103 may divide the frame image into a plurality of image blocks, for example 15×15 image blocks, according to a preset division rule. Further, the ISP 103 may determine the RGB value of each pixel point (e.g., the R_p G_p B_p value) and calculate the RGB value of each image block from these. For example, the ISP 103 may calculate the R_b G_b B_b values of the 15×15 image blocks based on the R_p G_p B_p values of the pixel points in each image block, which is not described in detail herein in the embodiments of the present application.
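The block statistics of S3041 can be sketched as follows. The 15×15 grid matches the example above, while the plain-Python representation of the image as nested lists of (R, G, B) tuples is an illustrative assumption (an ISP would do this in hardware):

```python
def block_rgb_means(image, grid=15):
    """Divide an H x W x 3 image into grid x grid image blocks and return
    the average RGB value (R_b, G_b, B_b) of each block."""
    h, w = len(image), len(image[0])
    means = []
    for by in range(grid):
        row = []
        for bx in range(grid):
            y0, y1 = by * h // grid, (by + 1) * h // grid
            x0, x1 = bx * w // grid, (bx + 1) * w // grid
            acc = [0.0, 0.0, 0.0]
            count = (y1 - y0) * (x1 - x0)
            for y in range(y0, y1):
                for x in range(x0, x1):
                    for c in range(3):
                        acc[c] += image[y][x][c]
            row.append(tuple(a / count for a in acc))
        means.append(row)
    return means
```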
S3042: and determining the type of the light source to which each image block belongs based on the RGB value of each image block.
The method comprises the steps of determining the coordinates of each predicted falling point of each image block in a Log Domain multi-color temperature frame coordinate system based on RGB values of each image block, and determining the type of a light source according to the coordinate positions of the predicted falling points of each image block. The specific calculation method of the coordinates of each predicted drop point may refer to the method of calculating the skin color coordinate point in step S3021, which is not described herein in detail in the embodiment of the present application.
It is understood that the color temperature frame in which the predicted drop point coordinates of an image block fall determines the type of light source to which that image block belongs. For example, if the predicted drop point coordinates fall in the color temperature frame corresponding to D65 shown in fig. 6, the corresponding image block is regarded as data collected under light source D65.
S3043: determining, for each type of light source, an image block belonging to said type of light source, respectively; determining a second AWB coordinate point corresponding to the type of light source in a target coordinate system according to the image block of the type of light source; and calculating a second AWB gain value corresponding to the type of light source according to the second AWB coordinate point.
The average value of the predicted drop point coordinates distributed in each color temperature frame may be calculated from the predicted drop point coordinates of the image blocks belonging to each type of light source, to obtain the neutral color coordinate point corresponding to each type of light source in the Log Domain multi-color temperature frame coordinate system, i.e., the second AWB coordinate point, denoted P_neu[j], where j=0, 1, …, 6 is the number of the light source. Then, according to the second AWB coordinate point corresponding to each type of light source, the second AWB gain value corresponding to that type of light source can be obtained using the gain calculation method of step S303, denoted NormAwbGain[j].
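The per-light-source averaging of S3043 can be sketched as below; the `drop_points`/`labels` representation (one predicted drop point coordinate and one light-source index per image block) is an assumption for illustration:

```python
def second_awb_points(drop_points, labels, num_types=7):
    """Average the predicted drop point coordinates of the image blocks
    belonging to each light source type j to obtain P_neu[j]; types with
    no blocks yield None."""
    sums = [[0.0, 0.0] for _ in range(num_types)]
    counts = [0] * num_types
    for (x, y), j in zip(drop_points, labels):
        sums[j][0] += x
        sums[j][1] += y
        counts[j] += 1
    return [
        (sums[j][0] / counts[j], sums[j][1] / counts[j]) if counts[j] else None
        for j in range(num_types)
    ]
```

Each non-None P_neu[j] would then be converted to a gain value NormAwbGain[j] with the same gain calculation used in step S303.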
Correspondingly, the obtaining the corresponding second AWB coordinate point in the target coordinate system according to the second AWB parameter may include:
determining the type of the light source corresponding to each second AWB gain value in the second AWB parameters respectively;
and acquiring, in the target coordinate system, the second AWB coordinate point corresponding to that type of light source, and taking it as the second AWB coordinate point corresponding to the second AWB gain value.
In some embodiments, the prediction weight value corresponding to each type of light source (denoted W_neu[j], j=0, 1, …, 6) may be determined based on the number of image blocks belonging to that type of light source. In other embodiments, the number of predicted drop points distributed within each color temperature frame in the coordinate system shown in fig. 6 may be counted to determine the prediction weight value corresponding to each color temperature frame. The more predicted drop points fall within a certain color temperature frame, the larger the prediction weight value corresponding to that frame may be. It can be understood that the number of predicted drop points distributed within each color temperature frame equals the number of image blocks belonging to the light source corresponding to that frame, so the prediction weight value determined for each color temperature frame is the prediction weight value of the corresponding type of light source.
For example, the prediction weight value corresponding to each type of light source may be a ratio of the number of image blocks belonging to each type of light source to the total number of image blocks, or may be a ratio of the number of predicted drop points distributed within each color temperature frame to the total number of predicted drop points.
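The count-ratio form of the prediction weight described above can be sketched as:

```python
def prediction_weights(labels, num_types=7):
    """Prediction weight W_neu[j]: the ratio of the number of image blocks
    belonging to light source type j to the total number of image blocks."""
    counts = [0] * num_types
    for j in labels:
        counts[j] += 1
    total = len(labels)
    return [c / total for c in counts]
```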
S305: and respectively determining a first weight corresponding to the first AWB parameter and a second weight corresponding to the second AWB parameter based on the first AWB coordinate point and the second AWB coordinate point.
Wherein, since the first AWB parameter is a first AWB gain value, the second AWB parameter includes a plurality of second AWB gain values corresponding to different types of light sources, the first weight may include a plurality of first weight values corresponding to different types of light sources, and the second weight may also include a plurality of second weight values corresponding to different types of light sources. The accuracy of the second AWB gain value may be measured based on a distance between the first AWB coordinate point and a second AWB coordinate point corresponding to the second AWB gain value, thereby determining a first weight value corresponding to the first AWB gain value and a second weight value corresponding to the second AWB gain value for the corresponding type of light source.
As shown in fig. 4, the neutral color prediction coordinate point estimated for the current color temperature from the face region is very close to the neutral color coordinate point corresponding to each type of light source, which indicates that the first AWB gain value calculated based on the RGB values of the face region is accurate.
Specifically, the determining, based on the first AWB coordinate point and the second AWB coordinate point, a first weight corresponding to the first AWB parameter and a second weight corresponding to the second AWB parameter may include:
calculating a first weight value corresponding to each type of light source according to the distance between the first AWB coordinate point and a second AWB coordinate point corresponding to each second AWB gain value;
obtaining a predicted weight value corresponding to each type of light source, wherein the predicted weight value is obtained by calculating the image to be processed by adopting a preset white balance algorithm;
and calculating a second weight value corresponding to each type of light source according to the first weight value and the predicted weight value corresponding to each type of light source.
Illustratively, the mobile phone 100 may calculate the Euclidean distance colorDist[j] between the second AWB coordinate point P_neu[j] corresponding to each type of light source and the first AWB coordinate point P_face, and obtain the first weight value W_face[j] corresponding to each type of light source by interpolation according to preset distance parameters (distThrLow, distThrHi) and a preset maximum weight faceAwbGainWeight of the first AWB parameter. As an example, the first weight value W_face[j] corresponding to each type of light source can be calculated with reference to the following formula (13):
W_face[j] = 0, if colorDist[j] < distThrLow; W_face[j] = faceAwbGainWeight, if colorDist[j] > distThrHi; otherwise, W_face[j] = faceAwbGainWeight × (colorDist[j] - distThrLow) / (distThrHi - distThrLow) (13)

That is, if colorDist[j] is less than distThrLow, W_face[j] is 0; if colorDist[j] is greater than distThrHi, W_face[j] is faceAwbGainWeight. The larger colorDist[j] is, the less accurate the calculated second AWB gain value is, so the weight of the first AWB gain value can be increased to reduce the weight of the second AWB gain value; the smaller colorDist[j] is, the more accurate the calculated second AWB gain value is, so the weight of the first AWB gain value can be reduced to increase the weight of the second AWB gain value.
Then, according to the first weight value W_face[j] and the prediction weight value W_neu[j] corresponding to each type of light source, the second weight value W_neu_new[j] corresponding to each type of light source is calculated. As an example, the second weight value W_neu_new[j] can be calculated with reference to the following formula (14):
W_neu_new[j] = W_neu[j] × (1 - W_face[j]) (14)
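Formulas (13) and (14) above can be sketched together as follows; the linear form of the interpolation between the two thresholds is an assumption, since the excerpt states only that the value is obtained "by interpolation":

```python
def first_weight(color_dist, dist_thr_low, dist_thr_hi, face_awb_gain_weight):
    """Formula (13): W_face[j] from the distance colorDist[j] between
    P_neu[j] and P_face, clamped at the two thresholds (linear
    interpolation assumed in between)."""
    if color_dist < dist_thr_low:
        return 0.0
    if color_dist > dist_thr_hi:
        return face_awb_gain_weight
    t = (color_dist - dist_thr_low) / (dist_thr_hi - dist_thr_low)
    return face_awb_gain_weight * t

def second_weight(w_neu, w_face):
    """Formula (14): W_neu_new[j] = W_neu[j] * (1 - W_face[j])."""
    return w_neu * (1.0 - w_face)
```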
In some embodiments, the process may further include the steps of: acquiring the size of the face area;
judging whether the size of the face area is smaller than a preset size threshold value or not;
and when the size of the face area is smaller than a preset size threshold, setting the first weight to be zero.
For example, the mobile phone 100 may preset a size threshold, compare the size of the face area with the size threshold after determining the face area of the image to be processed, and directly set the first weight value corresponding to each type of light source to 0 when the size of the face area is smaller than the size threshold, so as to avoid the problem that the color jump is caused by unstable RGB statistic of the face. The size of the preset size threshold may be set according to practical situations, and embodiments of the present application are not limited herein.
In some embodiments, the process may further include the steps of: when the size of the face area is smaller than a preset size threshold value, accumulating the number of continuous images meeting preset conditions; and determining the first weight according to the number of the continuous images.
For example, when the mobile phone 100 captures multiple frames while recording a video, upon determining that the size of the face area in the current frame is smaller than the size threshold, it may accumulate the number of consecutive images satisfying a preset condition, determine an attenuation coefficient according to that number, and take the product of the first weight calculated in step S305 and the attenuation coefficient as the final first weight. Images satisfying the preset condition may include images in which the detected face area is smaller than the size threshold and images in which no face area is detected. The more consecutive images accumulate that satisfy the preset condition, the smaller the attenuation coefficient and hence the smaller the calculated first weight. Specifically, the attenuation coefficient may be set to a value between 0 and 1, inclusive.
For example, when determining that the size of the face region in the current frame image is smaller than the size threshold, if each of the N-1 consecutive frames before the current frame is an image in which the size of the face region is smaller than the size threshold or an image in which no face region is detected, the mobile phone 100 may determine that the number of accumulated consecutive images satisfying the preset condition is N.
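The attenuation behavior described above can be sketched as below. The linear decay and the default of 5 frames (matching the example number threshold mentioned later) are assumptions; the patent requires only a coefficient in [0, 1] that shrinks as more qualifying frames accumulate:

```python
def attenuated_first_weight(w_face, num_consecutive, max_frames=5):
    """Scale the first weight by an attenuation coefficient in [0, 1] that
    shrinks as more consecutive small-face/no-face frames accumulate,
    reaching 0 at max_frames (linear decay assumed)."""
    coeff = max(0.0, 1.0 - num_consecutive / max_frames)
    return w_face * coeff
```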
In some embodiments, when the mobile phone 100 captures multiple frames while recording a video and determines that no face region is detected in the image to be processed, it may use a historical first AWB parameter and a historical first weight as the first AWB parameter and first weight of the image to be processed. In this case, the number of consecutive images satisfying the preset condition may likewise be accumulated, an attenuation coefficient determined from that number, and the product of the obtained first weight and the attenuation coefficient taken as the final first weight; the second weight corresponding to the second AWB gain can then be calculated directly from this first weight value. Images satisfying the preset condition may include images in which the detected face area is smaller than the size threshold and images in which no face area is detected. The more consecutive images accumulate that satisfy the preset condition, the smaller the attenuation coefficient and hence the smaller the calculated first weight. As above, the attenuation coefficient may be set to a value between 0 and 1, inclusive.
Alternatively, the historical first AWB parameter and the historical first weight may be first target AWB parameters and first target weights corresponding to a target image acquired before the image to be processed, where the target image may be an image in which a size of a face area before the image to be processed is greater than or equal to the size threshold, or may be an image in which a face area is detected before the image to be processed.
In some embodiments, when the number of consecutive images is greater than or equal to the preset number threshold, the corresponding attenuation coefficient may be set to 0, that is, the first weight value corresponding to each type of light source is set to 0 at this time. The magnitude of the preset number threshold may be set according to practical situations, for example, may be set to 5, which is not limited in the embodiment of the present application.
S306: a target AWB gain value for the image to be processed is calculated based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight.
Specifically, the calculating the target AWB gain value of the image to be processed based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight may include:
for each type of light source, adding the product of the first AWB gain value and the first weight value corresponding to that type of light source to the product of the second AWB gain value and the second weight value corresponding to that type of light source, to obtain the AWB gain value corresponding to that type of light source;
and adding the AWB gain values corresponding to the light sources of all types to obtain the target AWB gain value of the image to be processed.
For example, the mobile phone 100 may multiply the first weight value corresponding to each type of light source by the first AWB gain value, multiply the second weight value corresponding to each type of light source by the second AWB gain value, add the two products to obtain the AWB gain value corresponding to that type of light source, and then sum the AWB gain values of all light source types to obtain the target AWB gain value of the image to be processed. As an example, the target AWB gain value FinalGain may be calculated with reference to the following formula (15):

FinalGain = Σ_j (FaceAWBGain × W_face[j] + NormAwbGain[j] × W_neu_new[j]) (15)
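The blending step of S306 can be sketched as follows, treating each gain value as an (R, G, B) triple and summing the weighted contributions over all light source types, as the prose above describes:

```python
def final_awb_gain(face_gain, w_face, norm_gain, w_neu_new):
    """Blend FaceAWBGain with NormAwbGain[j] per light source type and sum
    over all types to obtain the target AWB gain value FinalGain."""
    out = [0.0, 0.0, 0.0]
    for j in range(len(norm_gain)):
        for c in range(3):
            out[c] += face_gain[c] * w_face[j] + norm_gain[j][c] * w_neu_new[j]
    return tuple(out)
```

With a single light source whose weights sum to 1, the result is simply the convex combination of the face gain and the global gain.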
in some embodiments, the process may further include: and performing white balance processing on the image to be processed based on the target AWB gain value to obtain a white balance processed image.
Based on the implementation flow of steps S301 to S306, it can be seen that, in determining the white balance gain value of the image to be processed, the face-feature-based image processing method provided by the embodiment of the application estimates the first AWB coordinate point of the current color temperature from the feature data of the face region in the image to be processed, thereby obtaining the AWB parameter for the white balance of the face region. It also calculates the AWB parameter for the white balance of the whole image, and the corresponding second AWB coordinate point, by applying a preset white balance algorithm to the image to be processed. The degree of influence of the face region on the white balance gain of the image to be processed is then determined according to the distance between the first AWB coordinate point and the second AWB coordinate point, and the AWB gain value of the image to be processed is calculated by integrating the AWB parameters corresponding to the face region and the whole image. The accuracy of the finally calculated white balance gain value is therefore higher, so that more accurate white balance processing can be performed on the image to be processed, achieving a higher degree of restoration of its true colors.
The image processing method based on the face features provided by the embodiment of the application can be suitable for various different white balance scenes, including single light source shooting scenes, multi-light source shooting scenes and the like, and has strong scene adaptability.
It can be understood that before implementing the image processing method based on the face features provided by the embodiment of the present application, the electronic device such as the mobile phone 100 may preset a Log Domain multi-color temperature frame coordinate system based on the method, a calculation formula adopted for performing the correlation calculation in the steps S301 to S306, and some correlation parameters corresponding to the white balance scene. In addition, in the process of implementing the image processing method based on the face feature provided by the embodiment of the present application, relevant parameters of a white balance scene in electronic devices such as the mobile phone 100 may be correspondingly updated, or a new white balance scene may be added, which is not limited herein.
Fig. 11 shows a schematic block diagram of a system 700 according to an embodiment of the application.
As shown in fig. 11, system 700 may include one or more processors 704, system control logic 708 coupled to at least one of the processors 704, a system memory 712 coupled to the system control logic 708, a non-volatile memory (NVM) 716 coupled to the system control logic 708, and a network interface 720 coupled to the system control logic 708.
In some embodiments, processor 704 may include one or more single-core or multi-core processors. In some embodiments, the processor 704 may include any combination of general-purpose and special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). The processor 704 may be configured to perform various conforming embodiments, such as the embodiments described with reference to fig. 2-10 above.
In some embodiments, system control logic 708 may include any suitable interface controller to provide any suitable interface to at least one of processors 704 and/or any suitable device or component in communication with system control logic 708.
In some embodiments, system control logic 708 may include one or more memory controllers to provide an interface to system memory 712. System memory 712 may be used to load and store data and/or instructions. In some embodiments, system memory 712 may comprise any suitable volatile memory, such as a suitable dynamic random access memory (DRAM).
NVM/storage 716 may include one or more tangible, non-transitory computer-readable storage media for storing data and/or instructions. In some embodiments, NVM/storage 716 may include any suitable non-volatile memory, such as flash memory, and/or any suitable non-volatile storage device, such as at least one of an HDD (Hard Disk Drive), a CD (Compact Disc) drive, and a DVD (Digital Versatile Disc) drive. NVM/storage 716 may include a portion of a storage resource on the apparatus on which system 700 is installed, or it may be accessible by, but not necessarily part of, the apparatus. For example, NVM/storage 716 may be accessed over a network via network interface 720.
In particular, system memory 712 and NVM/storage 716 may each include a temporary copy and a permanent copy of instructions 724. The instructions 724 may include instructions that, when executed by at least one of the processors 704, cause the system 700 to implement the functions of the embodiments described above with respect to fig. 2-10. In some embodiments, the instructions 724, hardware, firmware, and/or software components thereof may additionally or alternatively be disposed in system control logic 708, network interface 720, and/or processor 704.
Network interface 720 may include a transceiver to provide a radio interface for system 700 to communicate with any other suitable device (e.g., a front-end module, an antenna, etc.) over one or more networks. In some embodiments, network interface 720 may be integrated with other components of system 700. For example, network interface 720 may be integrated with at least one of processor 704, system memory 712, NVM/storage 716, and a firmware device (not shown) having instructions which, when executed by at least one of processors 704, implement the functions of the embodiments described in fig. 2-10.
Network interface 720 may further include any suitable hardware and/or firmware to provide a multiple-input multiple-output radio interface. For example, network interface 720 may be a network adapter, a wireless network adapter, a telephone modem, and/or a wireless modem.
In one embodiment, at least one of the processors 704 may be packaged together with logic for one or more controllers of the system control logic 708 to form a System In Package (SiP). In one embodiment, at least one of the processors 704 may be integrated on the same die with logic for one or more controllers of the system control logic 708 to form a system on a chip (SoC).
The system 700 may further include: input/output (I/O) devices 732. The I/O device 732 may include a user interface to enable a user to interact with the system 700; the design of the peripheral component interface enables the peripheral components to also interact with the system 700. In some embodiments, the system 700 further comprises a sensor for determining at least one of environmental conditions and location information associated with the system 700.
Fig. 12 shows a block diagram of a System on Chip (SoC) 800 in accordance with an embodiment of the present application. In fig. 12, similar parts have the same reference numerals. In addition, the dashed boxes represent optional features of more advanced SoCs. In fig. 12, the SoC 800 includes: an interconnect unit 850 coupled to the application processor 810; a system agent unit 870; a bus controller unit 880; an integrated memory controller unit 840; a set of one or more coprocessors 820, which may include integrated graphics logic, an image processor, an audio processor, and a video processor; a static random access memory (SRAM) unit 830; and a direct memory access (DMA) unit 860. In one embodiment, coprocessor 820 includes a special-purpose processor, such as, for example, a network or communication processor, a compression engine, a GPGPU, a high-throughput MIC processor, or an embedded processor.
Embodiments of the disclosed mechanisms may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as a computer program or program code that is executed on a programmable system comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For the purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor. The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope by any particular programming language. In either case, the language may be a compiled or interpreted language.
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one example implementation or technique disclosed in accordance with embodiments of the application. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
The disclosure of the embodiments of the present application also relates to an operating device for performing the operations herein. The apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer-readable medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application-specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each of which may be coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processors for increased computing power.
Additionally, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the disclosed subject matter. Accordingly, the present disclosure of embodiments is intended to be illustrative, but not limiting, of the scope of the concepts discussed herein.

Claims (10)

1. An image processing method based on face features, applied to electronic equipment, is characterized by comprising the following steps:
acquiring an image to be processed, and detecting a face area in the image to be processed;
calculating RGB values of the face region, and predicting a first AWB coordinate point corresponding to the face region in a target coordinate system based on the RGB values of the face region; the target coordinate system is used for determining a preset relation between RGB values and AWB parameters;
determining a first AWB parameter of the image to be processed based on the first AWB coordinate point, wherein the first AWB parameter is a first AWB gain value;
calculating the image to be processed by using a preset white balance algorithm to obtain a second AWB parameter of the image to be processed, and obtaining a corresponding second AWB coordinate point in the target coordinate system according to the second AWB parameter; wherein the second AWB parameter comprises a plurality of second AWB gain values corresponding to different types of light sources, each second AWB gain value corresponding to a second AWB coordinate point;
determining a first weight corresponding to the first AWB parameter and a second weight corresponding to the second AWB parameter based on the first AWB coordinate point and the second AWB coordinate point respectively;
calculating a target AWB gain value of the image to be processed based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight;
wherein the calculating the RGB values of the face region includes:
determining a central area of the face area, and calculating RGB values of the central area;
determining a plurality of image blocks corresponding to the face area, and calculating RGB values of the image blocks;
selecting, according to the RGB value of the central area and the RGB value of each image block, one or more target image blocks from the plurality of image blocks corresponding to the face region;
calculating RGB values of an area formed by the one or more target image blocks, and taking the RGB values as the RGB values of the face area;
the predicting a first AWB coordinate point corresponding to the face region in the target coordinate system based on the RGB value of the face region includes:
determining a skin color coordinate point corresponding to the face region in a target coordinate system according to the RGB value of the face region;
acquiring skin color correction data corresponding to a plurality of different types of light sources corresponding to the face region;
calculating correction weight values corresponding to various types of light sources according to the skin color coordinate points and the skin color correction data;
respectively acquiring neutral color correction data corresponding to each type of light source;
according to neutral color correction data and correction weight values corresponding to the light sources of all types, calculating a neutral color prediction coordinate point corresponding to the face region in a target coordinate system, and taking the neutral color prediction coordinate point as the first AWB coordinate point;
the calculating the image to be processed by using a preset white balance algorithm to obtain the second AWB parameter of the image to be processed comprises:
dividing the image to be processed into a plurality of image blocks, and calculating RGB values of the image blocks;
determining the type of a light source to which each image block belongs based on the RGB value of each image block;
for each type of light source, respectively determining the image blocks belonging to the type of light source; determining a second AWB coordinate point corresponding to the type of light source in the target coordinate system according to the image blocks of the type of light source; and calculating a second AWB gain value corresponding to the type of light source according to the second AWB coordinate point;
wherein the calculating a target AWB gain value of the image to be processed based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight comprises:
for each type of light source, adding the product of the first AWB gain value and the first weight value corresponding to the type of light source to the product of the second AWB gain value corresponding to the type of light source and the second weight value corresponding to the type of light source, so as to obtain the AWB gain value corresponding to the type of light source;
and adding the AWB gain values corresponding to all types of light sources to obtain the target AWB gain value of the image to be processed.
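The weighted fusion that closes claim 1 can be illustrated with a minimal sketch: for each light-source type, the face-based gain and the scene-based gain are blended by their respective weights, and the per-source results are summed. All names below are illustrative assumptions, not the patented implementation:

```python
def fuse_awb_gains(first_gain, first_weights, second_gains, second_weights):
    """Blend a face-based AWB gain with per-light-source AWB gains.

    first_gain: (r, g, b) gain predicted from the face region.
    second_gains: per-light-source (r, g, b) gains from the preset algorithm.
    first_weights / second_weights: per-light-source weight mappings.
    Per the claim, each light source contributes
    w1 * first_gain + w2 * second_gain, and the target gain is the sum of
    those contributions over all light sources.
    """
    target = [0.0, 0.0, 0.0]
    for source, second_gain in second_gains.items():
        w1 = first_weights[source]
        w2 = second_weights[source]
        for channel in range(3):
            target[channel] += (w1 * first_gain[channel]
                                + w2 * second_gain[channel])
    return tuple(target)
```

If the weights sum to 1 over all light sources, the target gain is a convex combination of the face-based estimate and the per-source scene estimates, so it stays within the range they span.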
2. The method of claim 1, wherein the obtaining skin color correction data corresponding to a plurality of different types of light sources corresponding to the face region comprises:
determining a first skin color reference RGB value and a second skin color reference RGB value corresponding to each type of light source respectively;
acquiring a brightness value of the face area;
and calculating skin color correction data corresponding to the face region based on the brightness value of the face region, the first skin color reference RGB value and the second skin color reference RGB value.
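Claim 2 does not specify how the brightness value combines the two skin color reference RGB values; one plausible reading is a linear interpolation by luminance, sketched below. The function name, the luminance range, and the interpolation itself are assumptions:

```python
def skin_color_correction(luma, ref_low, ref_high,
                          luma_low=0.2, luma_high=0.8):
    """Interpolate between two skin color reference RGB values by brightness.

    ref_low / ref_high: reference RGB tuples for a dark and a bright face.
    luma is clamped into [luma_low, luma_high] before interpolating.
    """
    t = (luma - luma_low) / (luma_high - luma_low)
    t = min(1.0, max(0.0, t))  # clamp so extreme exposures reuse the endpoints
    return tuple((1.0 - t) * lo + t * hi for lo, hi in zip(ref_low, ref_high))
```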
3. The method of claim 1, wherein the skin color correction data are skin color correction coordinate points;
the calculating the correction weight values corresponding to each type of light source comprises:
calculating the correction weight value corresponding to each type of light source according to the distance between the skin color coordinate point and the skin color correction coordinate point of the type of light source.
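A common way to realize claim 3's distance-based weighting is to weight each light source inversely to the distance between the skin color coordinate point and that source's correction coordinate point, then normalize. The inverse-distance form and the epsilon guard are assumptions, not language from the claim:

```python
import math

def correction_weights(skin_point, correction_points, eps=1e-6):
    """Weight each light source by inverse distance to its correction point.

    skin_point: (x, y) skin color coordinate in the target coordinate system.
    correction_points: mapping of light-source type -> (x, y) correction point.
    Returns weights normalized to sum to 1; closer sources weigh more.
    """
    inverse = {
        source: 1.0 / (math.hypot(skin_point[0] - point[0],
                                  skin_point[1] - point[1]) + eps)
        for source, point in correction_points.items()
    }
    total = sum(inverse.values())
    return {source: value / total for source, value in inverse.items()}
```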
4. The method of claim 1, wherein the obtaining a corresponding second AWB coordinate point in the target coordinate system according to the second AWB parameter comprises:
determining the type of the light source corresponding to each second AWB gain value in the second AWB parameters respectively;
and acquiring, in the target coordinate system, a coordinate point corresponding to the type of light source, and taking the acquired coordinate point as the second AWB coordinate point corresponding to the second AWB gain value.
5. The method of claim 1, wherein the first weight comprises a plurality of first weight values corresponding to different types of light sources and the second weight comprises a plurality of second weight values corresponding to different types of light sources;
the determining, based on the first AWB coordinate point and the second AWB coordinate point, a first weight corresponding to the first AWB parameter, and a second weight corresponding to the second AWB parameter includes:
calculating a first weight value corresponding to each type of light source according to the distance between the first AWB coordinate point and a second AWB coordinate point corresponding to each second AWB gain value;
obtaining a predicted weight value corresponding to each type of light source, the predicted weight value being obtained by applying the preset white balance algorithm to the image to be processed;
and calculating a second weight value corresponding to each type of light source according to the first weight value and the predicted weight value corresponding to each type of light source.
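Claim 5 derives the second weight from the first weight and the algorithm's own predicted weight without fixing a formula. One plausible reading, sketched below purely as an assumption, scales each predicted weight by the share left over after the face-based weight:

```python
def second_weights(first_weights, predicted_weights):
    """Derive per-light-source second weights (one possible formula).

    Assumed reading of claim 5: the face-based first weight takes priority,
    and the preset algorithm's predicted weight is scaled by whatever share
    remains, i.e. w2 = predicted * (1 - w1).
    """
    return {
        source: predicted_weights[source] * (1.0 - first_weights[source])
        for source in predicted_weights
    }
```

With this reading, a fully trusted face prediction (w1 = 1) suppresses the scene-based estimate entirely, and an absent face (w1 = 0) leaves the preset algorithm's weights unchanged.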
6. The method according to claim 1, wherein the method further comprises:
acquiring the size of the face area;
judging whether the size of the face area is smaller than a preset size threshold;
and when the size of the face area is smaller than a preset size threshold, setting the first weight to be zero.
7. The method of claim 6, wherein the method further comprises:
when the size of the face area is smaller than the preset size threshold, accumulating the number of consecutive images meeting a preset condition;
and determining the first weight according to the number of the continuous images.
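Claims 6 and 7 gate the face-based weight by face size and a count of consecutive qualifying frames; the translation leaves the exact condition ambiguous, so the sketch below ramps the weight up over consecutive frames with a sufficiently large face. The ramp length, its linear shape, and all names are illustrative assumptions:

```python
class FaceWeightGate:
    """Gate the face-based AWB weight by face size (claims 6 and 7, sketched).

    Below the size threshold the first weight is forced to zero; a counter of
    consecutive frames with a large-enough face then ramps the weight back up,
    which avoids abrupt white balance jumps when a face appears.
    """

    def __init__(self, size_threshold, ramp_frames=10):
        self.size_threshold = size_threshold
        self.ramp_frames = ramp_frames
        self.count = 0  # consecutive frames meeting the size condition

    def update(self, face_size, base_weight):
        if face_size < self.size_threshold:
            self.count = 0
            return 0.0  # claim 6: small face -> first weight is zero
        self.count = min(self.count + 1, self.ramp_frames)
        return base_weight * self.count / self.ramp_frames
```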
8. An image processing apparatus based on face features, provided in an electronic device, comprising:
the image acquisition unit is used for acquiring an image to be processed and detecting a face area in the image to be processed;
the coordinate point prediction unit is used for calculating RGB values of the face area and predicting a first AWB coordinate point corresponding to the face area in a target coordinate system based on the RGB values of the face area; the target coordinate system is used for determining a preset relation between RGB values and AWB parameters;
a parameter determining unit, configured to determine a first AWB parameter of the image to be processed based on the first AWB coordinate point, where the first AWB parameter is a first AWB gain value;
the parameter acquisition unit is used for calculating the image to be processed by using a preset white balance algorithm to obtain a second AWB parameter of the image to be processed, and obtaining a corresponding second AWB coordinate point in the target coordinate system according to the second AWB parameter; wherein the second AWB parameter comprises a plurality of second AWB gain values corresponding to different types of light sources, each second AWB gain value corresponding to a second AWB coordinate point;
the weight determining unit is used for respectively determining a first weight corresponding to the first AWB parameter and a second weight corresponding to the second AWB parameter based on the first AWB coordinate point and the second AWB coordinate point;
an image processing unit for calculating a target AWB gain value of the image to be processed based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight;
wherein the calculating the RGB values of the face region includes:
determining a central area of the face area, and calculating RGB values of the central area;
determining a plurality of image blocks corresponding to the face area, and calculating RGB values of the image blocks;
selecting, according to the RGB value of the central area and the RGB value of each image block, one or more target image blocks from the plurality of image blocks corresponding to the face region;
calculating RGB values of an area formed by the one or more target image blocks, and taking the RGB values as the RGB values of the face area;
the predicting a first AWB coordinate point corresponding to the face region in the target coordinate system based on the RGB value of the face region includes:
determining a skin color coordinate point corresponding to the face region in a target coordinate system according to the RGB value of the face region;
acquiring skin color correction data corresponding to a plurality of light sources of different types corresponding to the face region;
calculating correction weight values corresponding to various types of light sources according to the skin color coordinate points and the skin color correction data;
respectively acquiring neutral color correction data corresponding to each type of light source;
according to neutral color correction data and correction weight values corresponding to the light sources of all types, calculating a neutral color prediction coordinate point corresponding to the face region in a target coordinate system, and taking the neutral color prediction coordinate point as the first AWB coordinate point;
the calculating the image to be processed by using a preset white balance algorithm to obtain the second AWB parameter of the image to be processed comprises:
dividing the image to be processed into a plurality of image blocks, and calculating RGB values of the image blocks;
determining the type of a light source to which each image block belongs based on the RGB value of each image block;
for each type of light source, respectively determining the image blocks belonging to the type of light source; determining a second AWB coordinate point corresponding to the type of light source in the target coordinate system according to the image blocks of the type of light source; and calculating a second AWB gain value corresponding to the type of light source according to the second AWB coordinate point;
wherein the calculating a target AWB gain value of the image to be processed based on the first AWB parameter and the first weight, and the second AWB parameter and the second weight comprises:
for each type of light source, adding the product of the first AWB gain value and the first weight value corresponding to the type of light source to the product of the second AWB gain value corresponding to the type of light source and the second weight value corresponding to the type of light source, so as to obtain the AWB gain value corresponding to the type of light source;
and adding the AWB gain values corresponding to all types of light sources to obtain the target AWB gain value of the image to be processed.
9. An electronic device comprising a processor and a memory, wherein the memory stores at least one instruction or at least one program, the at least one instruction or the at least one program being loaded and executed by the processor to implement the image processing method based on face features according to any one of claims 1 to 7.
10. A computer-readable storage medium, wherein at least one instruction or at least one program is stored in the computer-readable storage medium, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the image processing method based on facial features as claimed in any one of claims 1 to 7.
CN202210630448.6A 2022-06-06 2022-06-06 Image processing method, device, equipment and storage medium based on face characteristics Active CN114945087B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210630448.6A CN114945087B (en) 2022-06-06 2022-06-06 Image processing method, device, equipment and storage medium based on face characteristics


Publications (2)

Publication Number Publication Date
CN114945087A CN114945087A (en) 2022-08-26
CN114945087B true CN114945087B (en) 2023-10-03

Family

ID=82909127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210630448.6A Active CN114945087B (en) 2022-06-06 2022-06-06 Image processing method, device, equipment and storage medium based on face characteristics

Country Status (1)

Country Link
CN (1) CN114945087B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1311111A2 (en) * 2001-11-08 2003-05-14 Fuji Photo Film Co., Ltd. Method and apparatus for correcting white balance, method for correcting density and program recording medium
CN101005628A (en) * 2007-01-04 2007-07-25 四川长虹电器股份有限公司 Skin color signal correcting method
JP2008236101A (en) * 2007-03-19 2008-10-02 Ricoh Co Ltd Imaging device and imaging method
KR100977055B1 (en) * 2009-02-20 2010-08-19 주식회사 코아로직 Device and method for adjusting auto white balance(awb) and image processing apparatus comprising the same device
CN106973278A (en) * 2014-11-11 2017-07-21 怀效宁 A kind of AWB device and method with reference to face color character
CN107788948A (en) * 2016-09-02 2018-03-13 卡西欧计算机株式会社 The storage medium of diagnosis supporting device, the image processing method of diagnosis supporting device and storage program
CN108063891A (en) * 2017-12-07 2018-05-22 广东欧珀移动通信有限公司 Image processing method, device, computer readable storage medium and computer equipment
CN113627328A (en) * 2021-08-10 2021-11-09 安谋科技(中国)有限公司 Electronic device, image recognition method thereof, system on chip, and medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4702635B2 (en) * 2007-07-17 2011-06-15 富士フイルム株式会社 AUTO WHITE BALANCE CORRECTION VALUE CALCULATION DEVICE, METHOD, PROGRAM, AND IMAGING DEVICE
JP5610762B2 (en) * 2009-12-21 2014-10-22 キヤノン株式会社 Imaging apparatus and control method
KR101896386B1 (en) * 2011-11-22 2018-09-11 삼성전자주식회사 Device and method for adjusting white balance
TWI660633B (en) * 2018-04-13 2019-05-21 瑞昱半導體股份有限公司 White balance calibration method based on skin color data and image processing apparatus thereof




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant