WO2020048192A1 - Image processing method, electronic device, and computer-readable storage medium - Google Patents

Image processing method, electronic device, and computer-readable storage medium

Info

Publication number
WO2020048192A1
WO2020048192A1 (PCT/CN2019/092931; CN2019092931W)
Authority
WO
WIPO (PCT)
Prior art keywords
light effect
area
image
overexposed
region
Prior art date
Application number
PCT/CN2019/092931
Other languages
English (en)
French (fr)
Inventor
Yang Tao (杨涛)
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to EP19857827.0A priority Critical patent/EP3849170B1/en
Publication of WO2020048192A1 publication Critical patent/WO2020048192A1/zh
Priority to US17/193,428 priority patent/US20210192698A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/94Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/76Circuitry for compensating brightness variation in the scene by influencing the image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20004Adaptive image processing
    • G06T2207/20008Globally adaptive
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2621Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals

Definitions

  • the present application relates to the field of computer technology, and in particular, to an image processing method, an electronic device, and a computer-readable storage medium.
  • The electronic device can acquire an image by shooting, downloading, transmission, and the like. After the image is acquired, the electronic device can also perform some post-processing on it; for example, it can increase the brightness of the image, adjust the saturation of the image, or adjust the color temperature of the image, and it can also add light effects to the image. The added light effect can simulate the change of light intensity, so that the objects in the image show a lighting effect.
  • An image processing method, an electronic device, and a computer-readable storage medium are provided.
  • An image processing method includes:
  • An electronic device includes a memory and a processor.
  • the memory stores a computer program.
  • When the computer program is executed by the processor, the processor is caused to perform the following operations:
  • a computer-readable storage medium stores a computer program thereon.
  • When the computer program is executed by a processor, the following operations are implemented:
  • the image processing method, electronic device, and computer-readable storage medium described above can detect a face area in an image to be processed, and then detect an overexposed area in the face area.
  • the light effect intensity coefficient is obtained according to the overexposed area
  • the target light effect model is obtained according to the light effect intensity coefficient.
  • the light effect enhancement processing is performed. After the overexposed area of the face area is detected, adjusting the intensity of the light effect enhancement processing according to the overexposed area can avoid the distortion of the face area caused by the light effect enhancement processing and improve the accuracy of image processing.
  • FIG. 1 is a schematic diagram of an application environment of an image processing method according to an embodiment.
  • FIG. 2 is a flowchart of an image processing method according to an embodiment.
  • FIG. 3 is a flowchart of an image processing method in another embodiment.
  • FIG. 4 is a schematic diagram of performing a light effect enhancement process on a three-dimensional model in an embodiment.
  • FIG. 5 is a flowchart of an image processing method according to another embodiment.
  • FIG. 6 is a schematic diagram of a connected area in an embodiment.
  • FIG. 7 is a flowchart of an image processing method according to another embodiment.
  • FIG. 8 is a structural block diagram of an image processing apparatus according to an embodiment.
  • FIG. 9 is a schematic diagram of an image processing circuit in an embodiment.
  • The terms "first", "second", and the like used in this application can be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another element.
  • For example, a first client may be referred to as a second client, and similarly, the second client may be referred to as the first client. Both the first client and the second client are clients, but they are not the same client.
  • FIG. 1 is a schematic diagram of an application environment of an image processing method according to an embodiment.
  • the application environment includes an electronic device 10.
  • The electronic device 10 can collect a to-be-processed image through the installed camera 102, and then detect a face area of the collected to-be-processed image and detect an overexposed area in the face area.
  • the light effect intensity coefficient is obtained according to the overexposed area
  • the target light effect model is obtained according to the light effect intensity coefficient.
  • Light effect enhancement processing is performed on the image to be processed according to the target light effect model.
  • the electronic device 10 may be a personal computer, a mobile terminal, a personal digital assistant, a wearable electronic device, etc., and is not limited thereto.
  • FIG. 2 is a flowchart of an image processing method according to an embodiment. As shown in FIG. 2, the image processing method includes operations 202 to 208, wherein:
  • the image to be processed refers to an image requiring light enhancement processing.
  • the image to be processed may be a two-dimensional matrix composed of a plurality of pixels, and each pixel may have a corresponding pixel value, so that different patterns are formed by the arrangement of the pixels of different pixel values.
  • the resolution of the image to be processed can be expressed by the number of pixels arranged horizontally and vertically.
  • For example, the resolution of the image to be processed can be 640 * 320, which means that 640 pixels are arranged in each horizontal row of the image to be processed, and 320 pixels are arranged in each vertical column.
  • the manner in which the electronic device obtains the image to be processed is not limited.
  • the electronic device can directly capture the image to be processed through the installed camera, can also receive the image to be processed sent by other electronic devices, can also download the image to be processed from the web page, or directly find the image to be processed from the image stored locally on the electronic device. Processing images and the like are not limited here.
  • Operation 204 Detect a face area in the image to be processed, and detect an overexposed area in the face area.
  • face detection can be performed on the to-be-processed image to extract a face region in the to-be-processed image.
  • the face region refers to a region in which a face is located, and can be represented by a minimum rectangular region including the face region, or can be represented by a region included in a face contour, which is not limited herein.
  • Detecting a face region in an image to be processed may be implemented by any face detection algorithm.
  • the face detection algorithm may be an AdaBoost (Adaptive Boosting) algorithm, an SSD (Single Shot MultiBox Detector) algorithm, or a Convolutional Neural Networks (CNN) algorithm.
  • the electronic device can detect the overexposed area in the face area.
  • The overexposed area refers to a region in which the exposure is excessive. Whether a region is overexposed can be detected from the brightness of its pixels. Specifically, the electronic device can count the brightness of each pixel in the face area and obtain the area composed of pixels whose brightness is greater than a certain value, which is the overexposed area.
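The brightness check described above can be sketched as follows. The threshold value 240 is an assumed example; the patent only speaks of "a certain value".

```python
import numpy as np

# Illustrative sketch, not the patent's exact method: mark pixels whose
# brightness exceeds an assumed threshold as overexposed.
def overexposed_mask(luminance, threshold=240):
    """Return a boolean mask of pixels brighter than the threshold."""
    return luminance > threshold

# Tiny 2x2 example: two bright pixels, two dark ones.
luma = np.array([[10, 250],
                 [245, 30]], dtype=np.uint8)
mask = overexposed_mask(luma)
```

Here `mask` marks the two pixels with brightness 250 and 245 as overexposed.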
  • a light effect intensity coefficient is obtained according to the overexposed area
  • a target light effect model is obtained according to the light effect intensity coefficient, wherein the target light effect model is a model that simulates light changes.
  • When light effect enhancement processing is performed, the color and brightness of the pixels in the image to be processed may be changed. If there are overexposed areas in the image to be processed, performing light effect enhancement processing on it can cause severe distortion in those overexposed areas. Therefore, the electronic device needs to first detect an overexposed area in the face area, and then adjust the light effect intensity coefficient for performing light effect enhancement processing according to the overexposed area.
  • the target light effect model is a model that simulates changes in light, and through this target light effect model, light effect enhancement processing can be performed on the image to be processed.
  • Light enhancement processing refers to the process of adding light effects to an image.
  • the target light effect model can simulate changes in the direction, intensity, and color of light, and the electronic device can add light of different directions, strengths, and colors to the image to be processed through the target light effect model.
  • For example, the target light effect model can simulate the light changes produced by incandescent lamps, as well as those produced by tungsten filament lamps. The light produced by incandescent lamps is bluish, while the light produced by tungsten lamps is yellowish.
  • the intensity of the light effect enhancement processing can be adjusted according to the overexposed area.
  • the electronic device may obtain the brightness of the overexposed area, adjust the light effect intensity coefficient of the target light effect model according to the brightness of the overexposed area, and then perform light effect enhancement processing according to the target light effect model. For example, the higher the brightness of the overexposed region, the smaller the light intensity coefficient obtained, and the smaller the intensity of the light effect enhancement process.
  • Operation 208 Perform light enhancement processing on the image to be processed according to the target light effect model.
  • the target light effect model may be a model for performing light enhancement processing on a part of an area in an image to be processed, or a model for performing light enhancement processing on all areas in an image to be processed, which is not limited herein.
  • the electronic device using the target light effect model may be only performing light effect enhancement processing on a face region in an image to be processed, or may be performing light effect enhancement processing on the entire image to be processed.
  • the image to be processed is a two-dimensional matrix composed of a plurality of pixels, and each pixel has a corresponding pixel value. Therefore, after the electronic device obtains the target light effect model, the light effect enhancement parameter of each pixel point in the image to be processed can be calculated according to the target light effect model. After the electronic device calculates the light effect enhancement parameter, the light effect enhancement process may be performed on each pixel point in the image to be processed according to the light effect enhancement parameter. Specifically, the electronic device may perform the light effect enhancement processing by superimposing or multiplying the image to be processed by the light effect enhancement parameter, which is not limited herein. It can be understood that the value range of the pixel value in the image is generally [0,255], so the pixel value of the image to be processed after the light effect enhancement processing cannot be greater than 255.
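The superimpose-or-multiply step with the [0, 255] clamp mentioned above can be sketched as follows; the parameter values are made up for illustration.

```python
import numpy as np

# Illustrative sketch: apply per-pixel light effect enhancement parameters
# either multiplicatively or additively ("superimposing"), then clamp the
# result to the usual pixel value range [0, 255].
def apply_light_effect(image, params, mode="multiply"):
    image = image.astype(np.float32)
    if mode == "multiply":
        out = image * params
    else:  # "superimpose"
        out = image + params
    # Pixel values after enhancement cannot exceed 255.
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: the second pixel would reach 400 and is clamped to 255.
img = np.array([[100, 200]], dtype=np.uint8)
params = np.array([[1.5, 2.0]])
enhanced = apply_light_effect(img, params, mode="multiply")
```

Here `enhanced` is `[[150, 255]]`: 100 × 1.5 = 150, while 200 × 2.0 = 400 is clamped.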
  • For example, if the image to be processed is H0(x, y) and the target light effect model is P(x, y), the light-effect-enhanced image may be obtained by multiplying them, i.e., H0(x, y) × P(x, y).
  • the light effect enhancement processing may also be implemented in other ways, which is not limited herein.
  • each pixel in the image to be processed may correspond to one or more color channel values, and the electronic device may calculate the light effect enhancement parameter of the color channel value corresponding to each pixel according to the obtained target light effect model, and then according to The light enhancement parameter performs light enhancement processing on the color channel value of each pixel.
  • the image to be processed can correspond to four color channels
  • The obtained target light effect model can include four light effect sub-models, each corresponding to one color channel. The electronic device can then calculate, according to each light effect sub-model, the light effect enhancement parameters of the corresponding color channel in the image to be processed, and perform light effect enhancement processing on the color channel values according to the calculated light effect enhancement parameters.
  • the obtained image light effect enhancement effect may be the same.
  • For example, if, among the light effect enhancement parameters corresponding to the RGB three-channel values obtained by the electronic device, the parameter of the R channel is greater than the light effect enhancement parameters of the G channel and the B channel, then after the electronic device processes the image to be processed according to the obtained light effect enhancement parameters, the resulting light-effect-enhanced image has a reddish light effect compared to the image to be processed.
  • the image processing method provided in the foregoing embodiment may detect a face area in an image to be processed, and then detect an overexposed area in the face area.
  • the light effect intensity coefficient is obtained according to the overexposed area
  • the target light effect model is obtained according to the light effect intensity coefficient.
  • the light effect enhancement processing is performed. After the overexposed area of the face area is detected, adjusting the intensity of the light effect enhancement processing according to the overexposed area can avoid the distortion of the face area caused by the light effect enhancement processing and improve the accuracy of image processing.
  • FIG. 3 is a flowchart of an image processing method in another embodiment. As shown in FIG. 3, the image processing method includes operations 302 to 316, wherein:
  • Operation 302 Obtain a to-be-processed image.
  • the light effect enhancement processing of the image to be processed may be automatically triggered by the electronic device or manually triggered by a user, which is not limited herein.
  • a user can manually select whether to perform light enhancement processing on the captured image.
  • When the user clicks the button for light effect enhancement processing, the electronic device uses the captured image as the image to be processed and performs light effect enhancement processing on it.
  • Operation 304 Detect a face region in the image to be processed, and divide the pixel points of the face region into different pixel blocks.
  • the to-be-processed image is a two-dimensional matrix composed of several pixels, so the detected human face area will also include several pixels.
  • Specifically, the electronic device may divide the pixels contained in the detected face area into different pixel blocks, and then separately count the brightness values in the divided pixel blocks, using the counted brightness values to determine whether each pixel block is overexposed.
  • the size of the divided pixel block is not limited here.
  • the size of the pixel block can be expressed by m * n.
  • A size of m * n indicates that the pixel block contains m pixels in the horizontal direction and n pixels in the vertical direction.
  • the number of pixels in the horizontal direction and the vertical direction in the pixel block may be the same or different, which is not limited herein.
  • the divided pixel block may be 16 * 16 size or 10 * 4 size.
  • the first average brightness value of the pixels contained in the pixel block is counted.
  • each pixel block will contain multiple pixels, and each pixel point will correspond to a brightness value. Furthermore, the electronic device can count the average value of the brightness values of all the pixel points in each pixel block as the first brightness average value. Therefore, each pixel block will correspond to a first average brightness value, and the electronic device can determine the overexposed area according to the first average brightness value obtained by statistics.
  • the electronic device may define a 16 * 16 size rectangular frame in advance, and then traverse the face area through the 16 * 16 size rectangular frame.
  • The specific traversal process is as follows: first, the electronic device defines a starting position in the face area and places the rectangular frame at that position; the pixels within the rectangular frame then form a pixel block, and the first average brightness value of that pixel block is calculated. The rectangular frame is then moved to a different position each time, and the pixels within it at each position form a new pixel block, so that the first average brightness value of each pixel block formed is counted.
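The traversal above can be sketched as follows. A 2 × 2 block is used only to keep the example small; the embodiment mentions 16 × 16.

```python
import numpy as np

# Illustrative sketch of the block traversal: slide a fixed-size window over
# the face area and compute the first average brightness value of each block.
def block_brightness_means(face_luma, block_h=2, block_w=2):
    means = []
    h, w = face_luma.shape
    for y in range(0, h - block_h + 1, block_h):
        for x in range(0, w - block_w + 1, block_w):
            block = face_luma[y:y + block_h, x:x + block_w]
            # Record the block's top-left corner and its mean brightness.
            means.append(((y, x), float(block.mean())))
    return means

# A 2x4 face area: left half dark, right half fully bright.
face_luma = np.array([[0, 0, 255, 255],
                      [0, 0, 255, 255]], dtype=np.uint8)
means = block_brightness_means(face_luma)
```

This yields one dark block with mean 0.0 and one bright block with mean 255.0.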
  • a first pixel region is formed according to a pixel block whose first brightness average is greater than a brightness threshold, and an overexposed region is generated according to the first pixel region.
  • After the electronic device calculates the first average brightness value corresponding to each pixel block, it obtains the pixel blocks whose first average brightness value is greater than the brightness threshold, and forms a first pixel region from the obtained pixel blocks. Pixels with excessively high brightness values may be caused by overexposure; therefore, the electronic device may obtain an overexposed area according to the pixel blocks whose first average brightness value is greater than the brightness threshold.
  • the second brightness average value of the pixels included in the overexposed area is counted, and the light effect intensity coefficient is obtained according to the second brightness average value.
  • the average value of the brightness values of all the pixels included in the overexposed area may be calculated to obtain a second average brightness value, and then the light effect intensity coefficient may be obtained according to the second average brightness value.
  • Generally, the larger the second average brightness value, the smaller the light effect intensity coefficient, and the weaker the intensity of the corresponding light effect enhancement processing.
  • Operation 312 Obtain a target light effect model according to the light effect intensity coefficient.
  • the electronic device may define a reference light effect model in advance, and the reference light effect model may simulate changes in light, and specifically may simulate changes in light color, direction, intensity, etc. After the electronic device obtains the light effect intensity coefficient, the light effect intensity of the reference light effect model can be adjusted according to the light effect intensity coefficient, thereby obtaining the target light effect model.
  • For example, if the reference light effect model is P0(x, y) and the light effect intensity coefficient is r, the obtained target light effect model P(x, y) may be the reference light effect model scaled by r.
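One simple way to realize "the larger the second average brightness value, the smaller the light effect intensity coefficient", together with scaling the reference model, can be sketched as follows. The linear mapping below is an assumption; the patent does not fix a concrete formula.

```python
import numpy as np

# Assumed linear mapping: brightness 255 maps to coefficient 0 (no effect),
# brightness 0 maps to coefficient 1 (full effect). This is illustrative only.
def intensity_coefficient(second_brightness_mean, max_brightness=255.0):
    return 1.0 - second_brightness_mean / max_brightness

def target_light_effect(reference_model, coefficient):
    # Derive the target light effect model by scaling the reference model
    # P0(x, y) with the light effect intensity coefficient r.
    return coefficient * reference_model

r_bright = intensity_coefficient(246.0)  # strongly overexposed face
r_dim = intensity_coefficient(128.0)     # moderately bright face
scaled = target_light_effect(np.array([2.0, 4.0]), 0.5)
```

Here `r_bright < r_dim`: the brighter the overexposed area, the weaker the applied light effect.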
  • Operation 314 Obtain a depth image corresponding to the image to be processed, perform three-dimensional reconstruction according to the image to be processed and the depth image, and obtain a three-dimensional model corresponding to the face region.
  • the electronic device may process the two-dimensional image or the three-dimensional model, which is not limited herein.
  • the image to be processed is a two-dimensional image.
  • the electronic device may directly process the image to be processed, or may process the three-dimensional model obtained by performing three-dimensional reconstruction based on the image to be processed.
  • When an electronic device processes a three-dimensional model, it needs to first perform three-dimensional modeling according to the image to be processed to obtain the three-dimensional model. Specifically, the electronic device obtains a depth image corresponding to the image to be processed, and then performs three-dimensional reconstruction according to the image to be processed and the depth image.
  • the image to be processed may be used to represent information such as the color and texture of the object, and the depth image may be used to represent the distance between the object and the image acquisition device.
  • the electronic device can perform three-dimensional modeling on the face area according to the image to be processed and the depth image, and obtain a three-dimensional model corresponding to the face area.
  • the three-dimensional model may be used to represent a polygonal three-dimensional structure of an object.
  • the three-dimensional model can generally be represented by a three-dimensional mesh (3D mesh) structure, which is composed of the point cloud data of the object.
  • the point cloud data may generally include three-dimensional coordinates (XYZ), laser reflection intensity (Intensity), and color information (RGB), and finally is drawn into a three-dimensional grid according to the point cloud data.
  • light effect enhancement processing is performed on the three-dimensional model according to the target light effect model.
  • When the light effect enhancement processing is performed on the three-dimensional model, the obtained target light effect model is also a model for performing light effect enhancement processing on the three-dimensional model, and the preset reference light effect model is likewise a model for performing enhancement processing on the three-dimensional model; that is, the target light effect model is a model that simulates the change of light in three-dimensional space. After the electronic device obtains the three-dimensional model, the light effect enhancement processing may be performed on the three-dimensional model according to the target light effect model.
  • FIG. 4 is a schematic diagram of performing a light effect enhancement process on a three-dimensional model in an embodiment.
  • the electronic device performs three-dimensional reconstruction on the face area to obtain a three-dimensional model 402.
  • the obtained three-dimensional model 402 can be represented in a spatial three-dimensional coordinate system xyz.
  • the target light effect model that the electronic device performs the light effect enhancement processing on the three-dimensional model 402 can simulate the change of light in three-dimensional space.
  • the target light effect model can be represented in the three-dimensional space coordinate system xyz.
  • The change curve of the light can be represented in the spatial three-dimensional coordinate system xyz.
  • the operation of determining the overexposed area may include:
  • Operation 502 Obtain a second pixel region other than the first pixel region in the face region.
  • Specifically, the electronic device composes a first pixel area from the pixel blocks whose first average brightness value is greater than the brightness threshold in the face area, uses the region other than the first pixel area in the face area as the second pixel area, and then determines an overexposed area according to the obtained first pixel area and second pixel area.
  • the face region is binarized according to the first pixel region and the second pixel region.
  • After the electronic device determines the first pixel region and the second pixel region, it performs a binarization process according to the first pixel region and the second pixel region. For example, if the electronic device sets the brightness values of all pixels in the first pixel area to 1 and the brightness values of all pixels in the second pixel area to 0, the face area is binarized.
  • Operation 506 Determine an overexposed area according to the binarized face area.
  • the binarized face region can more easily distinguish the first pixel region and the second pixel region.
  • the electronic device determines the overexposed area according to the binarized face area. Specifically, since the first pixel region is a region composed of pixel blocks with higher brightness values, the first pixel region is considered to be more likely to be an overexposed region.
  • Specifically, the electronic device can compare the areas of the first pixel regions. If the area of a first pixel region is too small, that region is considered less likely to be overexposed; if the area of a first pixel region is large, that region is considered more likely to be an overexposed region, and the overexposed area may be generated according to the first pixel regions with larger areas.
  • the electronic device sets the brightness values of all the pixels in the first pixel area to non-zero brightness values and sets all the pixels in the second pixel area to 0 to binarize the face area.
  • the binarized face area thus obtained includes one or more connected areas, and the electronic device can determine the overexposed area according to the area of the connected area.
  • the electronic device obtains the connected areas in the binarized face area, and obtains the area ratio of each connected area to the face area; and generates the overexposed area according to the connected area whose area ratio is greater than the area threshold.
  • the area can be represented by the number of pixels included, the area of the connected area is the number of pixels included in the connected area, and the area of the face area is the number of pixels included in the face area.
  • After the electronic device obtains the area of each connected area in the face area, it can obtain the area ratio between the area of each connected area and the area of the face area, and then determine the overexposed area according to the obtained area ratios.
  • the electronic device may further perform a process of expanding and then corroding the binarized face area, and then obtain the connected areas in the face area after the expansion and erosion process, and obtain each connected area and The area ratio of the face area; an overexposed area is generated based on the connected area with the area ratio greater than the area threshold.
  • For example, if the area of a connected area is S1 and the area of the face area is S2, then when the area ratio S1/S2 is greater than the area threshold, the electronic device marks the connected area, and the marked connected areas finally compose the overexposed area.
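The connected-area filter described above can be sketched as follows, using a simple 4-connectivity flood fill on the binarized face area. The 0.1 area-ratio threshold is an assumed example value.

```python
import numpy as np
from collections import deque

# Illustrative sketch: find connected areas in the binarized face area and
# keep only those whose area ratio S1/S2 exceeds the (assumed) threshold.
def overexposed_from_connected_areas(binary, area_ratio_threshold=0.1):
    h, w = binary.shape
    visited = np.zeros((h, w), dtype=bool)
    overexposed = np.zeros((h, w), dtype=bool)
    total = binary.size  # S2: area of the face region, in pixels
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not visited[sy, sx]:
                # Flood-fill one connected area and collect its pixels.
                queue, pixels = deque([(sy, sx)]), []
                visited[sy, sx] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                # Mark the connected area only if S1/S2 exceeds the threshold.
                if len(pixels) / total > area_ratio_threshold:
                    for y, x in pixels:
                        overexposed[y, x] = True
    return overexposed

# A 4x4 face area: one 2x2 connected area (ratio 0.25) and one lone pixel
# (ratio 0.0625); only the former survives the 0.1 threshold.
binary = np.array([[1, 1, 0, 0],
                   [1, 1, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 1]], dtype=bool)
mask_out = overexposed_from_connected_areas(binary)
```

In practice a library routine (e.g. an OpenCV or SciPy connected-components function) would replace the hand-written flood fill; it is written out here only to keep the sketch self-contained.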
  • FIG. 6 is a schematic diagram of a connected area in an embodiment.
  • Specifically, the face region can be binarized; for example, the brightness values of all pixels in the first pixel region are set to 255 and the brightness values of all pixels in the second pixel region are set to 0, so that the binarized face area 60 is obtained.
  • the binarized face region 60 may include a connected region 602 and a non-connected region 604.
  • the operation of obtaining the light effect intensity coefficient may include:
  • When the image to be processed contains two or more face areas, the electronic device may detect the overexposed areas in each face area separately, and count the second average brightness value of the overexposed area in each face area, so as to obtain the light effect intensity coefficient according to the statistically obtained second average brightness values.
  • Operation 704 Obtain a light effect intensity coefficient according to the largest second brightness average.
  • When the electronic device detects two or more face regions, it can obtain the light effect intensity coefficient from the face region with the higher brightness. Specifically, the electronic device may take the maximum of the second brightness means computed for the individual face regions and derive the light effect intensity coefficient from that maximum. For example, if the image to be processed contains two face regions, with a second brightness mean of 241 for face region A and 246 for face region B, the electronic device calculates the light effect intensity coefficient from the value 246 of face region B.
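Selecting the largest second brightness mean and deriving the coefficient can be sketched as follows (a hedged illustration: the ratio form r = T / V2 and the threshold value 240 come from the examples elsewhere in this description, and the function name is hypothetical):

```python
def light_effect_intensity(face_overexposed_means, brightness_threshold=240):
    """Coefficient r = T / V2, using the largest second brightness mean
    among all detected face regions: brighter overexposure yields a
    smaller coefficient, hence weaker light effect enhancement."""
    v2 = max(face_overexposed_means)
    return brightness_threshold / v2

# Example from the text: two face regions with second brightness means
# 241 and 246; the larger value, 246, drives the coefficient.
r = light_effect_intensity([241, 246])
```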
  • The image processing method provided in the foregoing embodiments may detect the face region in an image to be processed and then detect the overexposed region within it. The light effect intensity coefficient is obtained according to the overexposed region, and the target light effect model is obtained according to the light effect intensity coefficient. Three-dimensional modeling is then performed according to the image to be processed and its corresponding depth image, and light effect enhancement processing is performed on the three-dimensional model according to the target light effect model. After the overexposed region of the face region is detected, adjusting the intensity of the light effect enhancement processing according to the overexposed region can avoid distortion of the face region caused by the enhancement and improves the accuracy of image processing.
  • FIG. 8 is a structural block diagram of an image processing apparatus according to an embodiment.
  • The image processing apparatus 800 includes an image acquisition module 802, an overexposure detection module 804, a model acquisition module 806, and a light effect processing module 808, wherein:
  • the image acquisition module 802 is configured to acquire an image to be processed.
  • An overexposure detection module 804 is configured to detect a face area in the image to be processed, and detect an overexposed area in the face area.
  • a model obtaining module 806 is configured to obtain a light effect intensity coefficient according to the overexposed area, and obtain a target light effect model according to the light effect intensity coefficient, wherein the target light effect model is a model that simulates a change in light.
  • a light effect processing module 808 is configured to perform light effect enhancement processing on the image to be processed according to the target light effect model.
  • The image processing apparatus provided in the foregoing embodiment may detect the face region in an image to be processed and then detect the overexposed region within it. The light effect intensity coefficient is obtained according to the overexposed region, the target light effect model is obtained according to the light effect intensity coefficient, and light effect enhancement processing is finally performed on the image to be processed according to the target light effect model. After the overexposed region of the face region is detected, adjusting the intensity of the light effect enhancement processing according to the overexposed region can avoid distortion of the face region caused by the enhancement and improves the accuracy of image processing.
  • The overexposure detection module 804 is further configured to divide the pixels of the face region into different pixel blocks, compute the first brightness mean of the pixels contained in each pixel block, form a first pixel region from the pixel blocks whose first brightness mean is greater than the brightness threshold, and generate the overexposed region according to the first pixel region.
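The block-partitioning step above can be sketched as follows (an illustrative pure-Python version; the 16-pixel block size and the 240 threshold are example values taken from the description, not mandated, and `first_pixel_region` is a hypothetical name):

```python
def first_pixel_region(gray, block=16, threshold=240):
    """Split the face region into block-by-block tiles, compute each tile's
    mean brightness (the 'first brightness mean'), and return the set of
    tile indices whose mean exceeds the brightness threshold."""
    rows, cols = len(gray), len(gray[0])
    bright_blocks = set()
    for by in range(0, rows, block):
        for bx in range(0, cols, block):
            tile = [gray[y][x]
                    for y in range(by, min(by + block, rows))
                    for x in range(bx, min(bx + block, cols))]
            if sum(tile) / len(tile) > threshold:
                bright_blocks.add((by // block, bx // block))
    return bright_blocks
```

The returned tile indices together make up the first pixel region from which the overexposed region is then generated.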
  • The overexposure detection module 804 is further configured to obtain the second pixel region of the face region other than the first pixel region, binarize the face region according to the first pixel region and the second pixel region, and determine the overexposed region according to the binarized face region.
  • The overexposure detection module 804 is further configured to obtain the connected regions in the binarized face region, compute the area ratio of each connected region to the face region, and generate the overexposed region from the connected regions whose area ratio is greater than the area threshold.
  • The model acquisition module 806 is further configured to compute the second brightness mean of the pixels contained in the overexposed region and obtain the light effect intensity coefficient according to the second brightness mean.
  • The model acquisition module 806 is further configured to, when two or more face regions are detected in the image to be processed, compute the second brightness mean of the overexposed region in each face region, and obtain the light effect intensity coefficient according to the largest second brightness mean.
  • The light effect processing module 808 is further configured to obtain the depth image corresponding to the image to be processed, perform three-dimensional reconstruction according to the image to be processed and the depth image to obtain the three-dimensional model corresponding to the face region, and perform light effect enhancement processing on the three-dimensional model according to the target light effect model.
  • The division of the modules in the above image processing apparatus is for illustration only. In other embodiments, the image processing apparatus may be divided into different modules as needed to implement all or part of its functions.
  • Each module in the above image processing apparatus may be implemented in whole or in part by software, hardware, and a combination thereof.
  • The above modules may be embedded, in hardware form, in or independent of the processor in the computer device, or may be stored in software form in the memory of the computer device, so that the processor can invoke them to perform the operations corresponding to each module.
  • each module in the image processing apparatus provided in the embodiments of the present application may be in the form of a computer program.
  • the computer program can be run on a terminal or a server.
  • the program module constituted by the computer program can be stored in the memory of the terminal or server.
  • When the computer program is executed by a processor, the operations of the methods described in the embodiments of the present application are implemented.
  • An embodiment of the present application further provides an electronic device.
  • the above electronic device includes an image processing circuit.
  • the image processing circuit may be implemented by using hardware and / or software components, and may include various processing units that define an ISP (Image Signal Processing) pipeline.
  • FIG. 9 is a schematic diagram of an image processing circuit in an embodiment. As shown in FIG. 9, for ease of description, only aspects of the image processing technology related to the embodiments of the present application are shown.
  • the image processing circuit includes an ISP processor 940 and a control logic 950.
  • The image data captured by the imaging device 910 is first processed by the ISP processor 940, which analyzes the image data to capture image statistics that can be used to determine one or more control parameters of the imaging device 910.
  • the imaging device 910 may include a camera having one or more lenses 912 and an image sensor 914.
  • the image sensor 914 may include a color filter array (such as a Bayer filter).
  • The image sensor 914 may obtain the light intensity and wavelength information captured by each imaging pixel of the image sensor 914 and provide a set of raw image data that can be processed by the ISP processor 940.
  • The sensor 920 (such as a gyroscope) may provide acquired image processing parameters (such as image stabilization parameters) to the ISP processor 940 based on the interface type of the sensor 920.
  • the sensor 920 interface may use a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 914 may also send the original image data to the sensor 920.
  • The sensor 920 may provide the raw image data to the ISP processor 940 for processing based on the interface type of the sensor 920, or the sensor 920 may store the raw image data in the image memory 930.
  • the ISP processor 940 processes the original image data pixel by pixel in a variety of formats.
  • each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 940 may perform one or more image processing operations on the original image data and collect statistical information about the image data.
  • the image processing operations may be performed with the same or different bit depth accuracy.
  • the ISP processor 940 may also receive pixel data from the image memory 930.
  • For example, the sensor 920 interface sends the raw image data to the image memory 930, and the raw image data in the image memory 930 is then provided to the ISP processor 940 for processing.
  • the image memory 930 may be a part of a memory device, a storage device, or a separate dedicated memory in an electronic device, and may include a DMA (Direct Memory Access) feature.
  • Upon receiving raw image data from the image sensor 914 interface, from the sensor 920 interface, or from the image memory 930, the ISP processor 940 may perform one or more image processing operations, such as temporal filtering.
  • the image data processed by the ISP processor 940 may be sent to the image memory 930 for further processing before being displayed.
  • The ISP processor 940 receives processing data from the image memory 930 and performs image data processing on that data in the raw domain and in the RGB and YCbCr color spaces.
  • the processed image data may be output to the display 980 for viewing by a user and / or further processed by a graphics engine or GPU (Graphics Processing Unit).
  • the output of the ISP processor 940 can also be sent to the image memory 930, and the display 980 can read image data from the image memory 930.
  • the image memory 930 may be configured to implement one or more frame buffers.
  • The output of the ISP processor 940 may also be sent to an encoder/decoder 970 to encode/decode the image data. The encoded image data can be saved and decompressed before being displayed on the display 980.
  • the image data processed by the ISP may be sent to the light effect module 960 to perform light effect processing on the image before being displayed.
  • the light effect processing performed by the light effect module 960 on the image data may include obtaining a light effect enhancement parameter of each pixel in the image to be processed, and performing light effect enhancement processing on the image to be processed according to the light effect enhancement parameter.
  • After the light effect module 960 performs light effect enhancement processing on the image data, the enhanced image data may be sent to the encoder/decoder 970 to be encoded/decoded. The encoded image data can be saved and decompressed before being displayed on the display 980.
  • the image data processed by the light effect module 960 may be directly sent to the display 980 for display without passing through the encoder / decoder 970.
  • the image data processed by the ISP processor 940 may also be processed by the encoder / decoder 970 first, and then processed by the light effect module 960.
  • the light effect module 960 or the encoder / decoder 970 may be a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) in a mobile terminal.
  • the statistical data determined by the ISP processor 940 may be sent to the control logic 950 unit.
  • the statistical data may include image information of the image sensor 914 such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 912 shading correction.
  • The control logic 950 may include a processor and/or a microcontroller that executes one or more routines (such as firmware). The one or more routines may determine, based on the received statistical data, the control parameters of the imaging device 910 and the control parameters of the ISP processor 940.
  • Control parameters of the imaging device 910 may include sensor 920 control parameters (such as gain and integration time for exposure control, or image stabilization parameters), camera flash control parameters, lens 912 control parameters (such as focal length for focusing or zooming), or a combination of these parameters.
  • ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 912 shading correction parameters.
  • An embodiment of the present application further provides a computer-readable storage medium.
  • One or more non-transitory computer-readable storage media containing computer-executable instructions which, when executed by one or more processors, cause the processors to perform the operations of the image processing method provided by the foregoing embodiments.
  • Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM), which is used as external cache memory.
  • RAM is available in various forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

An image processing method includes: acquiring an image to be processed; detecting a face region in the image to be processed, and detecting an overexposed region in the face region; obtaining a light effect intensity coefficient according to the overexposed region, and obtaining a target light effect model according to the light effect intensity coefficient, wherein the target light effect model is a model that simulates changes in light; and performing light effect enhancement processing on the image to be processed according to the target light effect model.

Description

Image processing method, electronic device, and computer-readable storage medium
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 2018110456593, filed with the China National Intellectual Property Administration on September 7, 2018 and entitled "Image processing method and apparatus, electronic device, and computer-readable storage medium", the entire contents of which are incorporated herein by reference.
Technical field
The present application relates to the field of computer technology, and in particular to an image processing method, an electronic device, and a computer-readable storage medium.
Background
An electronic device can acquire images by shooting, downloading, transmission, and so on, and can also apply some post-processing to an acquired image, for example increasing the brightness of the image, adjusting its saturation or color temperature, or adding a light effect to the image. An added light effect can simulate variations in light intensity so that objects in the image appear illuminated.
Summary
According to various embodiments of the present application, an image processing method, an electronic device, and a computer-readable storage medium are provided.
An image processing method includes:
acquiring an image to be processed;
detecting a face region in the image to be processed, and detecting an overexposed region in the face region;
obtaining a light effect intensity coefficient according to the overexposed region, and obtaining a target light effect model according to the light effect intensity coefficient, wherein the target light effect model is a model that simulates changes in light; and
performing light effect enhancement processing on the image to be processed according to the target light effect model.
An electronic device includes a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the following operations:
acquiring an image to be processed;
detecting a face region in the image to be processed, and detecting an overexposed region in the face region;
obtaining a light effect intensity coefficient according to the overexposed region, and obtaining a target light effect model according to the light effect intensity coefficient, wherein the target light effect model is a model that simulates changes in light; and
performing light effect enhancement processing on the image to be processed according to the target light effect model.
A computer-readable storage medium stores a computer program which, when executed by a processor, implements the following operations:
acquiring an image to be processed;
detecting a face region in the image to be processed, and detecting an overexposed region in the face region;
obtaining a light effect intensity coefficient according to the overexposed region, and obtaining a target light effect model according to the light effect intensity coefficient, wherein the target light effect model is a model that simulates changes in light; and
performing light effect enhancement processing on the image to be processed according to the target light effect model.
With the above image processing method, electronic device, and computer-readable storage medium, the face region in an image to be processed can be detected, and the overexposed region within the face region can then be detected. The light effect intensity coefficient is obtained according to the overexposed region, and the target light effect model is obtained according to the light effect intensity coefficient. Finally, light effect enhancement processing is performed on the image to be processed according to the target light effect model. After the overexposed region of the face region is detected, adjusting the intensity of the light effect enhancement processing according to the overexposed region can avoid distortion of the face region caused by the enhancement and improves the accuracy of image processing.
The details of one or more embodiments of the present application are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the present application will become apparent from the description, the drawings, and the claims.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present application or the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of an application environment of an image processing method in an embodiment.
FIG. 2 is a flowchart of an image processing method in an embodiment.
FIG. 3 is a flowchart of an image processing method in another embodiment.
FIG. 4 is a schematic diagram of performing light effect enhancement processing on a three-dimensional model in an embodiment.
FIG. 5 is a flowchart of an image processing method in yet another embodiment.
FIG. 6 is a schematic diagram of connected regions in an embodiment.
FIG. 7 is a flowchart of an image processing method in yet another embodiment.
FIG. 8 is a structural block diagram of an image processing apparatus in an embodiment.
FIG. 9 is a schematic diagram of an image processing circuit in an embodiment.
Detailed description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to explain the present application and are not intended to limit it.
It can be understood that the terms "first", "second", and the like used in the present application may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, without departing from the scope of the present application, a first client could be termed a second client, and similarly, a second client could be termed a first client. The first client and the second client are both clients, but they are not the same client.
FIG. 1 is a schematic diagram of an application environment of an image processing method in an embodiment. As shown in FIG. 1, the application environment includes an electronic device 10. The electronic device 10 can capture an image to be processed through an installed camera 102, detect the face region in the captured image, and detect the overexposed region in the face region. A light effect intensity coefficient is obtained according to the overexposed region, and a target light effect model is obtained according to the light effect intensity coefficient. Light effect enhancement processing is performed on the image to be processed according to the target light effect model. In an embodiment, the electronic device 10 may be, without limitation, a personal computer, a mobile terminal, a personal digital assistant, a wearable electronic device, or the like.
FIG. 2 is a flowchart of an image processing method in an embodiment. As shown in FIG. 2, the image processing method includes operations 202 to 208, wherein:
Operation 202: Acquire an image to be processed.
In an embodiment, the image to be processed is an image that needs light effect enhancement processing. Specifically, the image to be processed may be a two-dimensional matrix of pixels, each pixel having a corresponding pixel value, so that the arrangement of pixels with different pixel values forms different patterns. The resolution of the image to be processed can be expressed by the numbers of pixels arranged horizontally and vertically; for example, a resolution of 640*320 means that the image has 640 pixels in each horizontal direction and 320 pixels in each vertical direction.
Specifically, the way the electronic device acquires the image to be processed is not limited. For example, the electronic device may capture the image directly with an installed camera, receive it from another electronic device, download it from a web page, or retrieve it from images stored locally on the electronic device, which is not limited here.
Operation 204: Detect the face region in the image to be processed, and detect the overexposed region in the face region.
After the image to be processed is acquired, face detection can be performed on it to extract the face region. The face region is the region where a face is located; it may be represented by the smallest rectangular region containing the face, or by the region enclosed by the face contour, which is not limited here.
Detecting the face region in the image to be processed can be implemented with any face detection algorithm, for example an algorithm based on AdaBoost (Adaptive Boosting), the SSD (Single Shot MultiBox Detector) algorithm, or a CNN (Convolutional Neural Networks) algorithm, which is not limited here.
After detecting the face region, the electronic device can detect the overexposed region within it. The overexposed region is a region where exposure is excessive, and overexposure can generally be detected from the brightness of pixels. For example, the electronic device may compute the brightness of each pixel in the face region and take the region composed of pixels whose brightness is greater than a certain value as the overexposed region.
Operation 206: Obtain a light effect intensity coefficient according to the overexposed region, and obtain a target light effect model according to the light effect intensity coefficient, where the target light effect model is a model that simulates changes in light.
Light effect enhancement processing may change the color, brightness, and other attributes of the pixels in the image to be processed. If an overexposed region already exists in the image, performing light effect enhancement on it would seriously distort that region. Therefore, the electronic device first detects the overexposed region in the face region and then adjusts, according to the overexposed region, the light effect intensity coefficient used for the enhancement.
The target light effect model is a model that simulates changes in light, and the image to be processed can be enhanced through it. Light effect enhancement processing adds lighting effects to an image. Specifically, the target light effect model can simulate curves of variation in light direction, intensity, color, and so on, so through it the electronic device can add light of different directions, intensities, and colors to the image to be processed. For example, the target light effect model may simulate the light produced by an incandescent lamp or by a tungsten lamp; incandescent light is bluish, while tungsten light is yellowish.
In an embodiment, after the overexposed region is detected, the intensity of the light effect enhancement can be adjusted according to it. Specifically, the electronic device may obtain the brightness of the overexposed region, adjust the light effect intensity coefficient of the target light effect model according to that brightness, and then perform light effect enhancement according to the target light effect model. For example, the higher the brightness of the overexposed region, the smaller the resulting light effect intensity coefficient, and the weaker the applied enhancement.
Operation 208: Perform light effect enhancement processing on the image to be processed according to the target light effect model.
In an embodiment, the target light effect model may be a model that enhances only part of the image to be processed, or one that enhances the whole image, which is not limited here. For example, through the target light effect model the electronic device may enhance only the face region of the image to be processed, or the entire image.
Specifically, the image to be processed is a two-dimensional matrix of pixels, each with a corresponding pixel value. After obtaining the target light effect model, the electronic device can calculate a light effect enhancement parameter for each pixel of the image according to the model and then enhance each pixel according to its enhancement parameter. Specifically, the electronic device may apply the enhancement by overlay or by product, which is not limited here. It can be understood that pixel values generally lie in the range [0, 255], so a pixel value after light effect enhancement cannot exceed 255.
For example, assume the image to be processed is H_0(x, y) and the target light effect model is P(x, y). The image after light effect enhancement by overlay can then be expressed as H(x, y) = (1 + P(x, y)) * H_0(x, y), and the image after enhancement by product as H(x, y) = P(x, y) * H_0(x, y). It can be understood that light effect enhancement may also be implemented in other ways, which are not limited here.
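The overlay and product formulas above can be sketched as follows (a minimal illustration assuming 8-bit pixel values clipped to [0, 255]; the function name is hypothetical):

```python
def apply_light_effect(h0, p, mode="overlay"):
    """Apply a light effect model P(x, y) to an image H0(x, y) pixel by pixel.
    overlay: H = (1 + P) * H0;  product: H = P * H0.
    Results are rounded and clipped to the valid pixel range [0, 255]."""
    out = []
    for row_h, row_p in zip(h0, p):
        if mode == "overlay":
            row = [(1 + pv) * hv for hv, pv in zip(row_h, row_p)]
        else:
            row = [pv * hv for hv, pv in zip(row_h, row_p)]
        out.append([min(255, max(0, round(v))) for v in row])
    return out
```

The clipping step reflects the constraint stated above that enhanced pixel values cannot exceed 255.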
In an embodiment, when performing light effect enhancement on the image to be processed, the electronic device may also process each color channel of the image differently. Specifically, each pixel of the image to be processed may correspond to one or more color channel values, so the electronic device can calculate, according to the obtained target light effect model, light effect enhancement parameters for the color channel values of each pixel, and then enhance the color channel values of each pixel according to those parameters. For example, if the image to be processed corresponds to four color channels, the obtained target light effect model may contain four light effect sub-models, each handling one color channel; the electronic device can then calculate the enhancement parameters of the corresponding color channel according to each sub-model and perform light effect enhancement on the channel values according to the calculated parameters.
It can be understood that after the electronic device applies different light effect enhancement to the individual color channel values, the resulting images may exhibit different enhancement effects. For example, if, among the enhancement parameters obtained for the RGB channels, the parameter for the R channel is larger than those for the G and B channels, then after enhancement the resulting image appears reddish relative to the image to be processed.
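The per-channel idea can be illustrated with a single-pixel sketch (the gain values are hypothetical, chosen only to show the reddish shift described above, and the overlay form H = (1 + P) * H0 is reused from the earlier formulas):

```python
def enhance_channels(pixel, gains):
    """Apply a per-channel light effect enhancement parameter to one RGB
    pixel using the overlay form H = (1 + P) * H0, clipping to 255.
    A larger R gain than G/B gains shifts the result toward red."""
    return tuple(min(255, round((1 + g) * v)) for v, g in zip(pixel, gains))

# Hypothetical gains: R boosted more than G and B, giving a warm/reddish cast.
warm = enhance_channels((120, 120, 120), (0.4, 0.1, 0.1))
```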
With the image processing method provided in the above embodiment, the face region in an image to be processed can be detected, and the overexposed region within the face region can then be detected. The light effect intensity coefficient is obtained according to the overexposed region, and the target light effect model is obtained according to the light effect intensity coefficient. Finally, light effect enhancement processing is performed on the image to be processed according to the target light effect model. After the overexposed region of the face region is detected, adjusting the intensity of the light effect enhancement processing according to the overexposed region can avoid distortion of the face region caused by the enhancement and improves the accuracy of image processing.
FIG. 3 is a flowchart of an image processing method in another embodiment. As shown in FIG. 3, the image processing method includes operations 302 to 316, wherein:
Operation 302: Acquire an image to be processed.
In the embodiments provided in the present application, light effect enhancement of the image to be processed may be triggered automatically by the electronic device or manually by the user, which is not limited here. For example, when the electronic device captures an image, the user can manually choose whether to apply light effect enhancement to it; when the user taps the light effect enhancement button, the electronic device takes the captured image as the image to be processed and performs light effect enhancement on it.
Operation 304: Detect the face region in the image to be processed, and divide the pixels of the face region into different pixel blocks.
The image to be processed is a two-dimensional matrix of pixels, so the detected face region also contains a number of pixels. After detecting the face region in the image to be processed, the electronic device can divide the pixels contained in the face region into different pixel blocks, compute the brightness values within each block, and judge overexposure from the computed brightness values.
It can be understood that the size of the pixel blocks is not limited here. A block size can be expressed as m*n, meaning each block has m pixels in each horizontal direction and n pixels in each vertical direction. The numbers of pixels in the horizontal and vertical directions of a block may be the same or different, which is not limited here. For example, a pixel block may be 16*16 or 10*4 in size.
Operation 306: Compute the first brightness mean of the pixels contained in each pixel block.
After the electronic device divides the face region into different pixel blocks, each block contains multiple pixels, and each pixel corresponds to a brightness value. The electronic device can then compute the average brightness of all pixels in each block as the first brightness mean. Each block thus corresponds to one first brightness mean, and the electronic device can determine the overexposed region from the computed first brightness means.
In an embodiment, assuming 16*16 pixel blocks, the electronic device may predefine a 16*16 rectangular frame and traverse the face region with it. The specific traversal process is as follows: the electronic device first defines a starting position in the face region and places the rectangular frame there, so the pixels inside the frame form a pixel block and the first brightness mean of that block is computed; the frame is then moved to a different position each time, and after each move the pixels inside the frame form a pixel block, so the first brightness mean of each block formed in turn is computed.
Operation 308: Form a first pixel region from the pixel blocks whose first brightness mean is greater than the brightness threshold, and generate the overexposed region according to the first pixel region.
After computing the first brightness mean of each pixel block, the electronic device obtains the blocks whose first brightness mean is greater than the brightness threshold and forms the first pixel region from them. Pixels with excessively high brightness values may result from overexposure, so the electronic device can derive the overexposed region from the blocks whose first brightness mean exceeds the brightness threshold.
Operation 310: Compute the second brightness mean of the pixels contained in the overexposed region, and obtain the light effect intensity coefficient according to the second brightness mean.
After determining the overexposed region, the electronic device can calculate the average brightness of all pixels contained in it to obtain the second brightness mean, and then obtain the light effect intensity coefficient according to the second brightness mean. Generally, the larger the second brightness mean, the smaller the light effect intensity coefficient, and the weaker the corresponding light effect enhancement.
Specifically, when computing the second brightness mean of the pixels contained in the overexposed region, the electronic device can sum the brightness values of all pixels in the region, count the number of pixels it contains, and divide the sum by the number of pixels to obtain the second brightness mean. For example, if the overexposed region contains four pixels with brightness values 203, 186, 158, and 165, the second brightness mean is (203 + 186 + 158 + 165) / 4 = 178.
In an embodiment, the operation of obtaining the light effect intensity coefficient may include obtaining it from the above second brightness mean and the brightness threshold, specifically as the ratio of the brightness threshold to the second brightness mean. Assuming the second brightness mean is V2 and the brightness threshold is T, the resulting light effect intensity coefficient is r = T / V2.
For example, the brightness threshold may be T = 240. After computing the first brightness mean V1 of each pixel block, the electronic device can form the first pixel region from the blocks whose V1 is greater than 240 and then obtain the overexposed region from the first pixel region. The electronic device then calculates the second brightness mean V2 of the overexposed region, and the light effect intensity coefficient calculated from the second brightness mean and the brightness threshold is r = 240 / V2.
Operation 312: Obtain the target light effect model according to the light effect intensity coefficient.
In the embodiments provided in the present application, the electronic device may predefine a reference light effect model that simulates changes in light, specifically changes in light color, direction, intensity, and so on. After obtaining the light effect intensity coefficient, the electronic device can adjust the light effect intensity of the reference model according to the coefficient to obtain the target light effect model.
For example, assume the reference light effect model is P_o(x, y), the light effect intensity coefficient is r, and the resulting target light effect model is P(x, y). The formula by which the electronic device obtains the target light effect model from the reference model and the coefficient can then be expressed as P(x, y) = r * P_o(x, y).
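The coefficient and model scaling above, r = T / V2 followed by P = r * P_o, can be sketched together as follows (an illustrative helper; the threshold value 240 is the example value from the text, and the function name is hypothetical):

```python
def target_light_effect(reference_model, v2, threshold=240):
    """Scale a reference light effect model P_o by r = T / V2 to obtain
    the target model P = r * P_o: the brighter the overexposed region
    (larger V2), the weaker the applied light effect."""
    r = threshold / v2
    return r, [[r * p for p in row] for row in reference_model]

# A tiny 1x2 "model" scaled by the coefficient derived from V2 = 250.
r, target = target_light_effect([[1.0, 0.5]], v2=250)
```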
Operation 314: Obtain the depth image corresponding to the image to be processed, and perform three-dimensional reconstruction according to the image to be processed and the depth image to obtain the three-dimensional model corresponding to the face region.
In an embodiment, after obtaining the target light effect model, the electronic device may process either the two-dimensional image or a three-dimensional model, which is not limited here. The image to be processed is a two-dimensional image, so after obtaining the target light effect model the electronic device can process the image directly, or process the three-dimensional model obtained by three-dimensional reconstruction from the image.
When processing a three-dimensional model, the electronic device first performs three-dimensional modeling based on the image to be processed to obtain the model. Specifically, the electronic device obtains the depth image corresponding to the image to be processed and performs three-dimensional reconstruction from the image and the depth image. The image to be processed can represent information such as the color and texture of objects, and the depth image can represent the distance from objects to the image acquisition apparatus.
From the image to be processed and the depth image, the electronic device can build a three-dimensional model of the face region. Specifically, a three-dimensional model can represent the polygonal spatial structure of an object and is generally represented by a three-dimensional mesh (3D mesh) composed of the object's point cloud data. Point cloud data generally include three-dimensional coordinates (XYZ), laser reflection intensity, and color information (RGB), and the three-dimensional mesh is finally drawn from the point cloud data.
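The step from a depth image to colored 3-D points can be sketched as follows (a hedged illustration assuming a pinhole camera model with intrinsics fx, fy, cx, cy, which the text does not specify; laser reflection intensity is omitted, and the function name is hypothetical):

```python
def depth_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth map into 3-D points (XYZ) with per-point
    color (RGB), assuming a simple pinhole camera model."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z > 0:                      # skip pixels with no depth
                x = (u - cx) * z / fx
                y = (v - cy) * z / fy
                points.append(((x, y, z), rgb[v][u]))
    return points
```

A mesh for the face region could then be built over such points; the meshing step itself is outside this sketch.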
Operation 316: Perform light effect enhancement processing on the three-dimensional model according to the target light effect model.
It can be understood that when the electronic device performs light effect enhancement on a three-dimensional model, both the obtained target light effect model and the preset reference light effect model are models for enhancing a three-dimensional model; that is, the target light effect model simulates changes of light in three-dimensional space. After obtaining the three-dimensional model, the electronic device can perform light effect enhancement on it according to the target light effect model.
FIG. 4 is a schematic diagram of performing light effect enhancement on a three-dimensional model in an embodiment. As shown in FIG. 4, the electronic device performs three-dimensional reconstruction of the face region to obtain a three-dimensional model 402, which can be represented in the spatial three-dimensional coordinate system xyz. The target light effect model used to enhance the three-dimensional model 402 can simulate changes of light in three-dimensional space; specifically, the target light effect model can be represented in the coordinate system xyz as the variation curves, in that coordinate system, of the light generated by the light source center P.
In an embodiment, as shown in FIG. 5, the operation of determining the overexposed region may include:
Operation 502: Obtain the second pixel region of the face region other than the first pixel region.
The electronic device forms the first pixel region from the pixel blocks of the face region whose first brightness mean is greater than the brightness threshold, takes the part of the face region outside the first pixel region as the second pixel region, and then determines the overexposed region from the obtained first and second pixel regions.
Operation 504: Binarize the face region according to the first pixel region and the second pixel region.
After determining the first and second pixel regions, the electronic device performs binarization according to them. For example, the electronic device can binarize the face region by setting the brightness values of all pixels in the first pixel region to 1 and those of all pixels in the second pixel region to 0.
Operation 506: Determine the overexposed region according to the binarized face region.
The first and second pixel regions are easier to distinguish in the binarized face region, and the electronic device determines the overexposed region from it. Specifically, since the first pixel region is composed of pixel blocks with high brightness values, it is relatively likely to be an overexposed region. The electronic device can compare the area of the first pixel region: if the area is very small, the first pixel region is less likely to be an overexposed region; if the area is large, it is more likely to be one, and the overexposed region can be generated from the first pixel region with the larger area.
The electronic device can binarize the face region by setting the brightness values of all pixels in the first pixel region to a non-zero value and all pixels in the second pixel region to 0. The binarized face region thus obtained contains one or more connected regions, and the electronic device can determine the overexposed region from the areas of the connected regions.
Specifically, the electronic device obtains the connected regions in the binarized face region, computes the area ratio of each connected region to the face region, and generates the overexposed region from the connected regions whose area ratio is greater than the area threshold. The area can be represented by the number of pixels contained: the area of a connected region is the number of pixels it contains, and the area of the face region is the number of pixels the face region contains. After obtaining the area of each connected region in the face region, the electronic device can compute the ratio of each connected region's area to the face region's area and then determine the overexposed region from the resulting area ratios.
In an embodiment, the electronic device may also apply dilation followed by erosion to the binarized face region, then obtain the connected regions in the processed face region, compute the area ratio of each connected region to the face region, and generate the overexposed region from the connected regions whose area ratio is greater than the area threshold. For example, if the area of a connected region is S1 and the area of the face region is S2, and the ratio S1/S2 is greater than 0.1, the electronic device marks that connected region, and the marked connected regions finally make up the overexposed region.
FIG. 6 is a schematic diagram of connected regions in an embodiment. As shown in FIG. 6, the face region can be binarized by, for example, setting the brightness values of all pixels in the first pixel region to 255 and those of all pixels in the second pixel region to 0, yielding the binarized face region 60. The binarized face region 60 may contain a connected region 602 and a non-connected region 604.
In the embodiments provided in the present application, as shown in FIG. 7, the operation of obtaining the light effect intensity coefficient may include:
Operation 702: When two or more face regions are detected in the image to be processed, compute the second brightness mean of the overexposed region in each face region.
Specifically, if two or more face regions are detected in the image to be processed, the electronic device can detect the overexposed region in each face region separately, compute the second brightness mean of the overexposed region in each face region, and obtain the light effect intensity coefficient from the resulting second brightness means.
Operation 704: Obtain the light effect intensity coefficient according to the largest second brightness mean.
When the electronic device detects two or more face regions, it can obtain the light effect intensity coefficient from the face region with the higher brightness. Specifically, the electronic device can take the maximum of the second brightness means computed for the individual face regions and obtain the light effect intensity coefficient from that maximum. For example, if the image to be processed contains two face regions, with a second brightness mean of 241 for face region A and 246 for face region B, the electronic device can calculate the light effect intensity coefficient from the value 246 of face region B.
With the image processing method provided in the above embodiment, the face region in an image to be processed can be detected, and the overexposed region within the face region can then be detected. The light effect intensity coefficient is obtained according to the overexposed region, and the target light effect model is obtained according to the light effect intensity coefficient. Finally, three-dimensional modeling is performed according to the image to be processed and the corresponding depth image, and light effect enhancement processing is performed on the three-dimensional model according to the target light effect model. After the overexposed region of the face region is detected, adjusting the intensity of the light effect enhancement processing according to the overexposed region can avoid distortion of the face region caused by the enhancement and improves the accuracy of image processing.
It should be understood that although the operations in the flowcharts of FIGS. 2, 3, 5, and 7 are shown in sequence as indicated by the arrows, these operations are not necessarily performed in the order indicated. Unless explicitly stated herein, there is no strict order restriction on their execution, and they may be performed in other orders. Moreover, at least some of the operations in FIGS. 2, 3, 5, and 7 may include multiple sub-operations or stages, which are not necessarily completed at the same moment but may be performed at different moments, and whose execution order is not necessarily sequential; they may be performed in turn or alternately with other operations or with at least some of the sub-operations or stages of other operations.
FIG. 8 is a structural block diagram of an image processing apparatus in an embodiment. As shown in FIG. 8, the image processing apparatus 800 includes an image acquisition module 802, an overexposure detection module 804, a model acquisition module 806, and a light effect processing module 808, wherein:
the image acquisition module 802 is configured to acquire an image to be processed;
the overexposure detection module 804 is configured to detect the face region in the image to be processed and detect the overexposed region in the face region;
the model acquisition module 806 is configured to obtain a light effect intensity coefficient according to the overexposed region and obtain a target light effect model according to the light effect intensity coefficient, the target light effect model being a model that simulates changes in light; and
the light effect processing module 808 is configured to perform light effect enhancement processing on the image to be processed according to the target light effect model.
With the image processing apparatus provided in the above embodiment, the face region in an image to be processed can be detected, and the overexposed region within the face region can then be detected. The light effect intensity coefficient is obtained according to the overexposed region, and the target light effect model is obtained according to the light effect intensity coefficient. Finally, light effect enhancement processing is performed on the image to be processed according to the target light effect model. After the overexposed region of the face region is detected, adjusting the intensity of the light effect enhancement processing according to the overexposed region can avoid distortion of the face region caused by the enhancement and improves the accuracy of image processing.
In an embodiment, the overexposure detection module 804 is further configured to divide the pixels of the face region into different pixel blocks, compute the first brightness mean of the pixels contained in each pixel block, form a first pixel region from the blocks whose first brightness mean is greater than the brightness threshold, and generate the overexposed region according to the first pixel region.
In an embodiment, the overexposure detection module 804 is further configured to obtain the second pixel region of the face region other than the first pixel region, binarize the face region according to the first and second pixel regions, and determine the overexposed region according to the binarized face region.
In an embodiment, the overexposure detection module 804 is further configured to obtain the connected regions in the binarized face region, compute the area ratio of each connected region to the face region, and generate the overexposed region from the connected regions whose area ratio is greater than the area threshold.
In an embodiment, the model acquisition module 806 is further configured to compute the second brightness mean of the pixels contained in the overexposed region and obtain the light effect intensity coefficient according to the second brightness mean.
In an embodiment, the model acquisition module 806 is further configured to, when two or more face regions are detected in the image to be processed, compute the second brightness mean of the overexposed region in each face region, and obtain the light effect intensity coefficient according to the largest second brightness mean.
In an embodiment, the light effect processing module 808 is further configured to obtain the depth image corresponding to the image to be processed, perform three-dimensional reconstruction according to the image to be processed and the depth image to obtain the three-dimensional model corresponding to the face region, and perform light effect enhancement processing on the three-dimensional model according to the target light effect model.
The division of the modules in the above image processing apparatus is for illustration only; in other embodiments, the image processing apparatus may be divided into different modules as needed to implement all or part of its functions.
For specific limitations on the image processing apparatus, reference may be made to the limitations on the image processing method above, and details are not repeated here. Each module in the above image processing apparatus may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded, in hardware form, in or independent of the processor in the computer device, or stored in software form in the memory of the computer device, so that the processor can invoke them to perform the operations corresponding to each module.
The modules in the image processing apparatus provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program can run on a terminal or a server, and the program modules constituted by it can be stored in the memory of the terminal or server. When the computer program is executed by a processor, the operations of the methods described in the embodiments of the present application are implemented.
An embodiment of the present application further provides an electronic device. The electronic device includes an image processing circuit, which may be implemented with hardware and/or software components and may include various processing units that define an ISP (Image Signal Processing) pipeline. FIG. 9 is a schematic diagram of an image processing circuit in an embodiment. As shown in FIG. 9, for ease of description, only the aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in FIG. 9, the image processing circuit includes an ISP processor 940 and control logic 950. The image data captured by the imaging device 910 is first processed by the ISP processor 940, which analyzes the image data to capture image statistics that can be used to determine one or more control parameters of the imaging device 910. The imaging device 910 may include a camera with one or more lenses 912 and an image sensor 914. The image sensor 914 may include a color filter array (such as a Bayer filter); it may obtain the light intensity and wavelength information captured by each of its imaging pixels and provide a set of raw image data that can be processed by the ISP processor 940. The sensor 920 (such as a gyroscope) may provide acquired image processing parameters (such as image stabilization parameters) to the ISP processor 940 based on the sensor 920 interface type. The sensor 920 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above interfaces.
In addition, the image sensor 914 may also send the raw image data to the sensor 920, which may provide it to the ISP processor 940 for processing based on the sensor 920 interface type, or store it in the image memory 930.
The ISP processor 940 processes the raw image data pixel by pixel in multiple formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 940 may perform one or more image processing operations on the raw image data and collect statistics about the image data. The image processing operations may be performed with the same or different bit depth precision.
The ISP processor 940 may also receive pixel data from the image memory 930. For example, the sensor 920 interface sends the raw image data to the image memory 930, and the raw image data in the image memory 930 is then provided to the ISP processor 940 for processing. The image memory 930 may be part of a memory device, a storage device, or a separate dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the image sensor 914 interface, from the sensor 920 interface, or from the image memory 930, the ISP processor 940 may perform one or more image processing operations, such as temporal filtering. The image data processed by the ISP processor 940 may be sent to the image memory 930 for additional processing before being displayed. The ISP processor 940 receives processing data from the image memory 930 and performs image data processing on that data in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to the display 980 for viewing by the user and/or for further processing by a graphics engine or GPU (Graphics Processing Unit). In addition, the output of the ISP processor 940 may also be sent to the image memory 930, and the display 980 may read image data from the image memory 930. In an embodiment, the image memory 930 may be configured to implement one or more frame buffers. The output of the ISP processor 940 may also be sent to an encoder/decoder 970 to encode/decode the image data; the encoded image data can be saved and decompressed before being displayed on the display 980.
The image data processed by the ISP may be sent to the light effect module 960 so that light effect processing is applied to the image before it is displayed. The light effect processing performed by the light effect module 960 on the image data may include obtaining a light effect enhancement parameter for each pixel of the image to be processed and performing light effect enhancement on the image according to the enhancement parameters. After the light effect module 960 performs light effect enhancement on the image data, it may send the enhanced image data to the encoder/decoder 970 to be encoded/decoded; the encoded image data can be saved and decompressed before being displayed on the display 980. It can be understood that the image data processed by the light effect module 960 may also be sent directly to the display 980 without passing through the encoder/decoder 970. The image data processed by the ISP processor 940 may also be processed by the encoder/decoder 970 first and then by the light effect module 960. The light effect module 960 or the encoder/decoder 970 may be a CPU (Central Processing Unit) or GPU (Graphics Processing Unit) in a mobile terminal, or the like.
The statistics determined by the ISP processor 940 may be sent to the control logic 950 unit. For example, the statistics may include image sensor 914 statistics such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 912 shading correction. The control logic 950 may include a processor and/or a microcontroller that executes one or more routines (such as firmware), and the one or more routines may determine, based on the received statistics, the control parameters of the imaging device 910 and the control parameters of the ISP processor 940. For example, the control parameters of the imaging device 910 may include sensor 920 control parameters (such as gain and integration time for exposure control, or image stabilization parameters), camera flash control parameters, lens 912 control parameters (such as focal length for focusing or zooming), or a combination of these parameters. ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (for example, during RGB processing), as well as lens 912 shading correction parameters.
The following are operations for implementing the image processing method provided by the above embodiments using the image processing technology of FIG. 9.
An embodiment of the present application further provides a computer-readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions which, when executed by one or more processors, cause the processors to perform the operations of the image processing method provided by the above embodiments.
A computer program product containing instructions which, when run on a computer, causes the computer to perform the image processing method provided by the above embodiments.
Any reference to memory, storage, a database, or other media used in the present application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and their descriptions are specific and detailed, but they should not therefore be construed as limiting the scope of the patent application. It should be noted that a person of ordinary skill in the art may make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (20)

  1. An image processing method, comprising:
    acquiring an image to be processed;
    detecting a face region in the image to be processed, and detecting an overexposed region in the face region;
    obtaining a light effect intensity coefficient according to the overexposed region, and obtaining a target light effect model according to the light effect intensity coefficient, wherein the target light effect model is a model that simulates changes in light; and
    performing light effect enhancement processing on the image to be processed according to the target light effect model.
  2. The method according to claim 1, wherein the detecting an overexposed region in the face region comprises:
    dividing pixels of the face region into different pixel blocks;
    computing a first brightness mean of the pixels contained in each pixel block; and
    forming a first pixel region from the pixel blocks whose first brightness mean is greater than a brightness threshold, and generating the overexposed region according to the first pixel region.
  3. The method according to claim 2, wherein the generating the overexposed region according to the first pixel region comprises:
    obtaining a second pixel region of the face region other than the first pixel region;
    binarizing the face region according to the first pixel region and the second pixel region; and
    determining the overexposed region according to the binarized face region.
  4. The method according to claim 3, wherein the determining the overexposed region according to the binarized face region comprises:
    obtaining connected regions in the binarized face region, and obtaining an area ratio of each connected region to the face region; and
    generating the overexposed region from the connected regions whose area ratio is greater than an area threshold.
  5. The method according to claim 4, wherein the obtaining connected regions in the binarized face region comprises:
    obtaining the binarized face region, applying dilation processing and then erosion processing to the binarized face region, and obtaining the connected regions in the processed face region.
  6. The method according to claim 1, wherein the obtaining a light effect intensity coefficient according to the overexposed region comprises:
    computing a second brightness mean of the pixels contained in the overexposed region, and obtaining the light effect intensity coefficient according to the second brightness mean.
  7. The method according to claim 6, wherein the obtaining the light effect intensity coefficient according to the second brightness mean comprises:
    obtaining the brightness threshold used to generate the overexposed region; and
    taking a ratio of the brightness threshold to the second brightness mean as the light effect intensity coefficient.
  8. The method according to claim 6, wherein the computing a second brightness mean of the pixels contained in the overexposed region and obtaining the light effect intensity coefficient according to the second brightness mean comprises:
    when two or more face regions are detected in the image to be processed, computing the second brightness mean of the overexposed region in each face region; and
    obtaining the light effect intensity coefficient according to the largest second brightness mean.
  9. The method according to claim 1, wherein the performing light effect enhancement processing on the image to be processed according to the target light effect model comprises:
    calculating, according to the target light effect model, light effect enhancement parameters for the color channel values of each pixel, and performing light effect enhancement processing on the color channel values of each pixel according to the light effect enhancement parameters.
  10. The method according to any one of claims 1 to 9, wherein the performing light effect enhancement processing on the image to be processed according to the target light effect model comprises:
    obtaining a depth image corresponding to the image to be processed, and performing three-dimensional reconstruction according to the image to be processed and the depth image to obtain a three-dimensional model corresponding to the face region; and
    performing light effect enhancement processing on the three-dimensional model according to the target light effect model.
  11. An electronic device, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the following operations:
    acquiring an image to be processed;
    detecting a face region in the image to be processed, and detecting an overexposed region in the face region;
    obtaining a light effect intensity coefficient according to the overexposed region, and obtaining a target light effect model according to the light effect intensity coefficient, wherein the target light effect model is a model that simulates changes in light; and
    performing light effect enhancement processing on the image to be processed according to the target light effect model.
  12. The electronic device according to claim 11, wherein when performing the detecting an overexposed region in the face region, the processor further performs the following operations:
    dividing pixels of the face region into different pixel blocks;
    computing a first brightness mean of the pixels contained in each pixel block; and
    forming a first pixel region from the pixel blocks whose first brightness mean is greater than a brightness threshold, and generating the overexposed region according to the first pixel region.
  13. The electronic device according to claim 12, wherein when performing the generating the overexposed region according to the first pixel region, the processor further performs the following operations:
    obtaining a second pixel region of the face region other than the first pixel region;
    binarizing the face region according to the first pixel region and the second pixel region; and
    determining the overexposed region according to the binarized face region.
  14. The electronic device according to claim 13, wherein when performing the determining the overexposed region according to the binarized face region, the processor further performs the following operations:
    obtaining connected regions in the binarized face region, and obtaining an area ratio of each connected region to the face region; and
    generating the overexposed region from the connected regions whose area ratio is greater than an area threshold.
  15. The electronic device according to claim 14, wherein when performing the obtaining connected regions in the binarized face region, the processor further performs the following operations:
    obtaining the binarized face region, applying dilation processing and then erosion processing to the binarized face region, and obtaining the connected regions in the processed face region.
  16. The electronic device according to claim 11, wherein when performing the obtaining a light effect intensity coefficient according to the overexposed region, the processor further performs the following operation:
    computing a second brightness mean of the pixels contained in the overexposed region, and obtaining the light effect intensity coefficient according to the second brightness mean.
  17. The electronic device according to claim 16, wherein when performing the computing a second brightness mean of the pixels contained in the overexposed region and obtaining the light effect intensity coefficient according to the second brightness mean, the processor further performs the following operations:
    when two or more face regions are detected in the image to be processed, computing the second brightness mean of the overexposed region in each face region; and
    obtaining the light effect intensity coefficient according to the largest second brightness mean.
  18. The electronic device according to claim 11, wherein when performing the light effect enhancement processing on the image to be processed according to the target light effect model, the processor further performs the following operation:
    calculating, according to the target light effect model, light effect enhancement parameters for the color channel values of each pixel, and performing light effect enhancement processing on the color channel values of each pixel according to the light effect enhancement parameters.
  19. The electronic device according to any one of claims 11 to 18, wherein when performing the light effect enhancement processing on the image to be processed according to the target light effect model, the processor further performs the following operations:
    obtaining a depth image corresponding to the image to be processed, and performing three-dimensional reconstruction according to the image to be processed and the depth image to obtain a three-dimensional model corresponding to the face region; and
    performing light effect enhancement processing on the three-dimensional model according to the target light effect model.
  20. A computer-readable storage medium on which a computer program is stored, wherein when the computer program is executed by a processor, the operations of the method according to any one of claims 1 to 10 are implemented.
PCT/CN2019/092931 2018-09-07 2019-06-26 图像处理方法、电子设备、计算机可读存储介质 WO2020048192A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19857827.0A EP3849170B1 (en) 2018-09-07 2019-06-26 Image processing method, electronic device, and computer-readable storage medium
US17/193,428 US20210192698A1 (en) 2018-09-07 2021-03-05 Image Processing Method, Electronic Device, and Non-Transitory Computer-Readable Storage Medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811045659.3A CN109246354B (zh) 2018-09-07 2018-09-07 图像处理方法和装置、电子设备、计算机可读存储介质
CN201811045659.3 2018-09-07

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/193,428 Continuation US20210192698A1 (en) 2018-09-07 2021-03-05 Image Processing Method, Electronic Device, and Non-Transitory Computer-Readable Storage Medium

Publications (1)

Publication Number Publication Date
WO2020048192A1 true WO2020048192A1 (zh) 2020-03-12

Family

ID=65067433

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/092931 WO2020048192A1 (zh) 2018-09-07 2019-06-26 图像处理方法、电子设备、计算机可读存储介质

Country Status (4)

Country Link
US (1) US20210192698A1 (zh)
EP (1) EP3849170B1 (zh)
CN (1) CN109246354B (zh)
WO (1) WO2020048192A1 (zh)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109246354B (zh) * 2018-09-07 2020-04-24 Oppo广东移动通信有限公司 Image processing method and apparatus, electronic device, computer-readable storage medium
CN110033418B (zh) * 2019-04-15 2023-03-24 Oppo广东移动通信有限公司 Image processing method and apparatus, storage medium, and electronic device
CN110110778B (zh) * 2019-04-29 2023-04-25 腾讯科技(深圳)有限公司 Image processing method and apparatus, electronic device, and computer-readable storage medium
CN110223244B (zh) * 2019-05-13 2021-08-27 浙江大华技术股份有限公司 Image processing method and apparatus, electronic device, and storage medium
CN111507298B (zh) * 2020-04-24 2023-12-12 深圳数联天下智能科技有限公司 Face detection method and apparatus, computer device, and storage medium
CN112040091B (zh) * 2020-09-01 2023-07-21 先临三维科技股份有限公司 Camera gain adjustment method and apparatus, and scanning system
CN112348738B (zh) * 2020-11-04 2024-03-26 Oppo广东移动通信有限公司 Image optimization method and apparatus, storage medium, and electronic device
CN112653847B (zh) * 2020-12-17 2022-08-05 杭州艾芯智能科技有限公司 Automatic exposure method for a depth camera, computer device, and storage medium
CN114827482B (zh) * 2021-01-28 2023-11-03 抖音视界有限公司 Image brightness adjustment method and apparatus, electronic device, and medium
CN112950509B (zh) * 2021-03-18 2023-10-10 杭州海康威视数字技术股份有限公司 Image processing method and apparatus, and electronic device
CN117278865A (zh) * 2023-11-16 2023-12-22 荣耀终端有限公司 Image processing method and related apparatus

Citations (4)

Publication number Priority date Publication date Assignee Title
JP2008152097A (ja) * 2006-12-19 2008-07-03 Nikon Corp Continuous shooting control method and imaging apparatus
CN104994306A (zh) * 2015-06-29 2015-10-21 厦门美图之家科技有限公司 Imaging method and imaging apparatus for automatically adjusting exposure based on face brightness
CN108419028A (zh) * 2018-03-20 2018-08-17 广东欧珀移动通信有限公司 Image processing method and apparatus, computer-readable storage medium, and electronic device
CN109246354A (zh) * 2018-09-07 2019-01-18 Oppo广东移动通信有限公司 Image processing method and apparatus, electronic device, computer-readable storage medium

Family Cites Families (18)

Publication number Priority date Publication date Assignee Title
US6900805B2 (en) * 2002-08-29 2005-05-31 Nec Laboratories America, Inc. Torrance-sparrow off-specular reflection and linear subspaces for object recognition
US7542600B2 (en) * 2004-10-21 2009-06-02 Microsoft Corporation Video image quality
US8014034B2 (en) * 2005-04-13 2011-09-06 Acd Systems International Inc. Image contrast enhancement
JP4934326B2 (ja) * 2005-09-29 2012-05-16 富士フイルム株式会社 Image processing apparatus and processing method therefor
JP5049356B2 (ja) * 2007-02-28 2012-10-17 デジタルオプティックス・コーポレイション・ヨーロッパ・リミテッド Separation of directional lighting variability in statistical face modeling based on texture space decomposition
CN102006421A (zh) * 2009-09-01 2011-04-06 华晶科技股份有限公司 Method for processing an image containing a human face
US9754629B2 (en) * 2010-03-03 2017-09-05 Koninklijke Philips N.V. Methods and apparatuses for processing or defining luminance/color regimes
US8233789B2 (en) * 2010-04-07 2012-07-31 Apple Inc. Dynamic exposure metering based on face detection
US8488958B2 (en) * 2010-05-25 2013-07-16 Apple Inc. Scene adaptive auto exposure
US8441548B1 (en) * 2012-06-15 2013-05-14 Google Inc. Facial image quality assessment
US10558848B2 (en) * 2017-10-05 2020-02-11 Duelight Llc System, method, and computer program for capturing an image with correct skin tone exposure
CN203414661U (zh) * 2013-08-05 2014-01-29 杭州海康威视数字技术股份有限公司 Optical filter for suppressing local overexposure
US9275445B2 (en) * 2013-08-26 2016-03-01 Disney Enterprises, Inc. High dynamic range and tone mapping imaging techniques
US9508173B2 (en) * 2013-10-30 2016-11-29 Morpho, Inc. Image processing device having depth map generating unit, image processing method and non-transitory computer readable recording medium
JP6833415B2 (ja) * 2016-09-09 2021-02-24 キヤノン株式会社 Image processing apparatus, image processing method, and program
CN107506714B (zh) * 2017-08-16 2021-04-02 成都品果科技有限公司 Method for relighting a face image
US10552707B2 (en) * 2017-12-07 2020-02-04 Qualcomm Incorporated Methods and devices for image change detection
CN108573480B (zh) * 2018-04-20 2020-02-11 太平洋未来科技(深圳)有限公司 Ambient light compensation method and apparatus based on image processing, and electronic device

Non-Patent Citations (1)

Title
See also references of EP3849170A4 *

Also Published As

Publication number Publication date
EP3849170A4 (en) 2021-10-20
US20210192698A1 (en) 2021-06-24
EP3849170A1 (en) 2021-07-14
EP3849170B1 (en) 2023-12-20
CN109246354B (zh) 2020-04-24
CN109246354A (zh) 2019-01-18

Similar Documents

Publication Publication Date Title
WO2020048192A1 (zh) Image processing method, electronic device, and computer-readable storage medium
US11430103B2 (en) Method for image processing, non-transitory computer readable storage medium, and electronic device
CN109767467B (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN108734676B (zh) Image processing method and apparatus, electronic device, computer-readable storage medium
CN108717530B (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
CN108419028B (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
WO2020001197A1 (zh) Image processing method, electronic device, and computer-readable storage medium
US11431915B2 (en) Image acquisition method, electronic device, and non-transitory computer readable storage medium
KR20200044093A (ko) 이미지 처리 방법 및 장치, 전자 장치 및 컴퓨터-판독 가능 저장 매체
CN108716982B (zh) Optical element detection method and apparatus, electronic device, and storage medium
CN110493506B (zh) Image processing method and system
WO2019105305A1 (zh) Image brightness processing method, computer-readable storage medium, and electronic device
CN108600740B (zh) Optical element detection method and apparatus, electronic device, and storage medium
CN108616700B (zh) Image processing method and apparatus, electronic device, computer-readable storage medium
CN109685853B (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN108769523B (zh) Image processing method and apparatus, electronic device, computer-readable storage medium
CN109242794B (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
JP6525543B2 (ja) Image processing apparatus, image processing method, and program
CN109325905B (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
CN107454317B (zh) Image processing method and apparatus, computer-readable storage medium, and computer device
TWI708192B (zh) Image processing method, electronic device, and computer-readable storage medium
CN109191398B (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
CN108600631B (zh) Image processing method and apparatus, computer-readable storage medium, and electronic device
CN108629329B (zh) Image processing method and apparatus, electronic device, computer-readable storage medium
CN109446945B (zh) Three-dimensional model processing method and apparatus, electronic device, computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19857827

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019857827

Country of ref document: EP

Effective date: 20210407