CN107993209B - Image processing method, image processing device, computer-readable storage medium and electronic equipment

Info

Publication number: CN107993209B (application CN201711240703.1A)
Authority: CN (China)
Prior art keywords: beauty, area, parameter, image, target
Legal status: Expired - Fee Related
Other languages: Chinese (zh)
Other versions: CN107993209A
Inventor: 杜成鹏
Current and original assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application CN201711240703.1A filed by Guangdong Oppo Mobile Telecommunications Corp Ltd; publication of application CN107993209A, followed by grant and publication of CN107993209B

Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 5/00: Image enhancement or restoration
            • G06T 5/77: Retouching; Inpainting; Scratch removal
          • G06T 2207/00: Indexing scheme for image analysis or image enhancement
            • G06T 2207/30: Subject of image; Context of image processing
              • G06T 2207/30196: Human being; Person
                • G06T 2207/30201: Face
        • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
              • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions
                • G06V 40/161: Detection; Localisation; Normalisation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The application relates to an image processing method, an image processing apparatus, a computer-readable storage medium and an electronic device. The method comprises the following steps: acquiring a target region and a corresponding region area in an image to be processed; acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is a model for calculating beauty parameters; acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model; and performing beauty processing on the image to be processed according to the target beauty parameter. The image processing method, the image processing apparatus, the computer-readable storage medium and the electronic device improve the accuracy of image processing.

Description

Image processing method, image processing device, computer-readable storage medium and electronic equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a computer-readable storage medium, and an electronic device.
Background
Photographing is an indispensable skill in both work and life. To obtain a satisfactory photo, it is often necessary not only to adjust the shooting parameters during shooting but also to retouch the photo after shooting is completed. Beauty processing is one such retouching method: after beauty processing, the people in a photo look more attractive.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, a computer readable storage medium and an electronic device, which can improve the accuracy of image processing.
A method of image processing, the method comprising:
acquiring a target region and a corresponding region area in an image to be processed;
acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is a model for calculating beauty parameters;
acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model; and
performing beauty processing on the image to be processed according to the target beauty parameter.
An image processing apparatus, the apparatus comprising:
an image acquisition module, used for acquiring a target region and a corresponding region area in an image to be processed;
a model acquisition module, used for acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is a model for calculating beauty parameters;
a parameter acquisition module, used for acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model; and
a beauty processing module, used for performing beauty processing on the image to be processed according to the target beauty parameter.
A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the following steps:
acquiring a target region and a corresponding region area in an image to be processed;
acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is a model for calculating beauty parameters;
acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model; and
performing beauty processing on the image to be processed according to the target beauty parameter.
An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
acquiring a target region and a corresponding region area in an image to be processed;
acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is a model for calculating beauty parameters;
acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model; and
performing beauty processing on the image to be processed according to the target beauty parameter.
According to the image processing method, the image processing apparatus, the computer-readable storage medium and the electronic device described above, a corresponding beauty parameter model is acquired according to the region area corresponding to the target region in the image to be processed, a corresponding target beauty parameter is acquired according to the region area and the beauty parameter model, and beauty processing is performed on the image to be processed according to the acquired target beauty parameter. Therefore, different beauty processing can be performed on target regions with different areas, which improves the accuracy of the beauty processing.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a diagram of an application environment of an image processing method in one embodiment;
FIG. 2 is a flowchart of an image processing method in one embodiment;
FIG. 3 is a flowchart of an image processing method in another embodiment;
FIG. 4 is a flowchart of an image processing method in yet another embodiment;
FIG. 5 is a color histogram generated in one embodiment;
FIG. 6 is a flowchart of an image processing method in yet another embodiment;
FIG. 7 is a graph illustrating the variation of the beauty coefficients in one embodiment;
FIG. 8 is a diagram showing a configuration of an image processing apparatus according to an embodiment;
FIG. 9 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first acquisition module may be referred to as a second acquisition module, and similarly, a second acquisition module may be referred to as a first acquisition module, without departing from the scope of the present application. The first acquisition module and the second acquisition module are both acquisition modules, but they are not the same acquisition module.
FIG. 1 is a diagram of an application environment of an image processing method in one embodiment. As shown in FIG. 1, the application environment includes a user terminal 102 and a server 104. The user terminal 102 may be configured to capture an image to be processed and send it to the server 104. After receiving the image to be processed, the server 104 acquires a target region and a corresponding region area in the image to be processed; acquires a corresponding beauty parameter model according to the region area; acquires a corresponding target beauty parameter according to the region area and the beauty parameter model; and performs beauty processing on the image to be processed according to the target beauty parameter. Finally, the server 104 returns the beautified image to the user terminal 102. It can be understood that the server 104 may instead send the acquired target beauty parameter to the user terminal 102, and the user terminal 102 then performs the beauty processing on the image to be processed according to the target beauty parameter. The user terminal 102 is an electronic device located at the outermost periphery of the computer network, mainly used for inputting user information and outputting processing results; it may be, for example, a personal computer, a mobile terminal, a personal digital assistant or a wearable electronic device. The server 104 is a device for responding to service requests and providing computing services, such as one or more computers. It can be understood that in other embodiments provided in the present application, the application environment of the image processing method may include only the user terminal 102; that is, the user terminal 102 acquires the image to be processed and performs the beauty processing on it.
FIG. 2 is a flow diagram of a method of image processing in one embodiment. As shown in fig. 2, the image processing method includes steps 202 to 208. Wherein:
step 202, a target region and a corresponding region area in the image to be processed are obtained.
In one embodiment, the image to be processed refers to an image that needs beauty processing, and it may be acquired by a mobile terminal. The mobile terminal is provided with a camera for shooting: a user can initiate a shooting instruction through the mobile terminal, and after detecting the shooting instruction, the mobile terminal captures an image through the camera. The mobile terminal stores the captured images to form an image set. It can be understood that the image to be processed may also be acquired in other ways, which are not limited herein; for example, it may be downloaded from a web page or imported from an external storage device. Acquiring the image to be processed may specifically include: receiving a beauty instruction input by a user, and acquiring the image to be processed according to the beauty instruction, wherein the beauty instruction includes an image identifier. The image identifier is a unique identifier for distinguishing different images to be processed, and the image to be processed is acquired according to the image identifier. For example, the image identifier may be one or more of an image name, an image code, an image storage address and the like. Specifically, after acquiring the image to be processed, the mobile terminal may perform the beauty processing locally, or send the image to be processed to a server for beauty processing.
The target region is generally a region of particular interest to the user, and may specifically be a region that needs beauty processing in the image to be processed. For example, the target region may be a face region, a portrait region, a skin region, a lip region and the like, which are not limited herein. It can be understood that the image to be processed is composed of a plurality of pixel points, and the target region is composed of a subset of those pixel points. The region area refers to the size of the region occupied by the target region; it can be expressed as the total number of pixel points contained in the target region, or as the area ratio of the target region to the image to be processed.
Step 204, acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is a model for calculating beauty parameters.
Specifically, the beauty parameter model is a model for calculating beauty parameters. In general, the beauty parameter model may be represented as a functional model that expresses the functional relationship between an input variable and an output result. It can be understood that this functional relationship may be linear or non-linear. When an input variable is fed into the functional model, the corresponding output result is obtained. For example, the functional model may be represented as Y = X + 1, where Y is the output result and X is the input variable; when the input variable X is 1, the corresponding output result Y is 2. A correspondence between region areas and beauty parameter models is established in advance, and the corresponding beauty parameter model is obtained according to the region area. Different region areas may map to different beauty parameter models, so that the target beauty parameter can be calculated with a different beauty parameter model for each region area.
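As a minimal illustration, such a functional model can be sketched as follows; the linear form mirrors the Y = X + 1 example above, and the function name is an assumption for illustration rather than anything defined in this application:

```python
# Minimal sketch of a beauty parameter model as a functional model.
# The linear form mirrors the Y = X + 1 example above; real models may
# be non-linear, and this name is illustrative, not the patent's.

def beauty_parameter_model(x: float) -> float:
    """Map an input variable X (e.g. a region area) to an output result Y."""
    return x + 1.0

assert beauty_parameter_model(1.0) == 2.0  # input X = 1 gives output Y = 2
```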
Step 206, acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model.
In the embodiments provided in the present application, the target beauty parameter refers to a parameter for performing beauty processing on the target region. Beauty processing is a method of beautifying an image: for example, whitening or skin smoothing of a portrait in the image, or makeup, face-thinning or body-slimming processing of the portrait. The beauty parameter model may represent the functional relationship between the region area and the beauty parameter. After the region area and the beauty parameter model are acquired, the region area is used as the input variable of the beauty parameter model, the calculation is performed through the beauty parameter model, and the output result obtained is the target beauty parameter.
It can be understood that a target region is typically represented in the image as an independent connected region, that is, a closed region. For example, each face in the image corresponds to an independent connected region: when there are multiple faces in the image, there are multiple connected regions, one per face. If a plurality of target regions are detected in the image to be processed, the corresponding region area can be acquired for each target region, and a corresponding target beauty parameter can be acquired for each according to its region area and the beauty parameter model.
Step 208, performing beauty processing on the image to be processed according to the target beauty parameter.
After the target beauty parameter is acquired, beauty processing is performed on the image to be processed according to the target beauty parameter. It can be understood that the entire image may be processed, or only the target region. For example, whitening may be applied to the entire image to raise its overall brightness, or only to a skin region, while face-thinning is applied only to a face region. Specifically, the image to be processed is composed of a plurality of pixel points, each pixel point may be composed of a plurality of color channels, and each color channel represents one color component. For example, the image may be composed of the three RGB (Red, Green, Blue) channels, the three HSV (Hue, Saturation, Value) channels, or the three CMY (Cyan, Magenta, Yellow) channels. When the image is beautified, each color channel can be processed separately, and the degree of processing may differ per channel. Specifically, the beauty parameter corresponding to each color channel may be acquired according to the region area and a beauty parameter model corresponding to that color channel, and beauty processing may be performed on each color channel of the target region according to its beauty parameter.
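For illustration only, per-channel processing might be sketched as below; the simple per-channel gain and all names are assumptions, since the text only requires that each channel may be processed to a different degree:

```python
import numpy as np

# Sketch: apply a separate beauty parameter to each color channel of the
# target region. The per-channel gain is an illustrative assumption; the
# text only requires that each channel may be processed to a different
# degree according to its own beauty parameter.

def beautify_channels(region: np.ndarray, channel_params: dict) -> np.ndarray:
    """region: H x W x 3 array in RGB order; channel_params: gain per channel."""
    out = region.astype(np.float32)
    for i, name in enumerate(("R", "G", "B")):
        out[..., i] *= channel_params.get(name, 1.0)  # degree differs per channel
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: a mild whitening that brightens all channels, red slightly more.
# whitened = beautify_channels(face_region, {"R": 1.10, "G": 1.08, "B": 1.05})
```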
In the image processing method provided in the foregoing embodiment, a corresponding beauty parameter model is acquired according to the region area corresponding to the target region in the image to be processed, a corresponding target beauty parameter is acquired according to the region area and the beauty parameter model, and beauty processing is performed on the image to be processed according to the acquired target beauty parameter. Therefore, different beauty processing can be performed on target regions with different areas, which improves the accuracy of the beauty processing.
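The four steps 202 to 208 can be tied together as in the following self-contained sketch; every helper here is a simplified stand-in for the corresponding operation described above, not the patented implementation:

```python
import numpy as np

# End-to-end sketch of steps 202-208 with simplified stand-ins.

def detect_target_region(image: np.ndarray):
    """Step 202 stand-in: here the whole image is treated as the target region."""
    h, w = image.shape[:2]
    return (slice(0, h), slice(0, w)), 1.0  # region slices and area ratio

def select_parameter_model(area: float):
    """Step 204 stand-in: pick a model by region area (see FIG. 3 for intervals)."""
    return (lambda a: 0.5 * a) if area < 0.2 else (lambda a: a)

def apply_beauty(image: np.ndarray, region, param: float) -> np.ndarray:
    """Step 208 stand-in: brighten the target region by the computed parameter."""
    out = image.astype(np.float32)
    out[region] *= 1.0 + 0.2 * param
    return np.clip(out, 0, 255).astype(np.uint8)

def process_image(image: np.ndarray) -> np.ndarray:
    region, area = detect_target_region(image)        # step 202
    model = select_parameter_model(area)              # step 204
    target_param = model(area)                        # step 206
    return apply_beauty(image, region, target_param)  # step 208
```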
Fig. 3 is a flowchart of an image processing method in another embodiment. As shown in fig. 3, the image processing method includes steps 302 to 310. Wherein:
step 302, a target region and a corresponding region area in the image to be processed are obtained.
If the image processing is performed on a server, each mobile terminal may send its image to be processed to the server, and after receiving it, the server performs the beauty processing on the image to be processed. When the mobile terminal sends the image to be processed, it also sends a corresponding terminal identifier; after the server finishes processing, it looks up the corresponding mobile terminal according to the terminal identifier and sends the processed image back to that mobile terminal. The terminal identifier is a unique identifier of the user terminal: for example, it may be at least one of an IP (Internet Protocol) address, a MAC (Media Access Control) address and the like.
In one embodiment, the target region may be a face region in the image to be processed, or a skin region corresponding to the face region. Step 302 may accordingly include either of the following: detecting a face region in the image to be processed and acquiring the region area corresponding to the face region; or detecting a face region in the image to be processed, acquiring the corresponding skin region according to the face region, and acquiring the region area corresponding to the skin region. Specifically, the face region of the image to be processed may be obtained through a face detection algorithm; the face detection algorithm may include a detection method based on geometric features, an eigenface detection method, a linear discriminant analysis method, a detection method based on a hidden Markov model and the like, which are not limited herein. The skin region refers to the region where skin is located, and the method for acquiring the corresponding skin region according to the face region may specifically include the following steps:
step 402, generating a color histogram according to the color information corresponding to the face region.
The color histogram is used to describe the proportions of different colors in the face region, and the color information refers to the parameters used to represent the colors of the image. For example, in the HSV color space, the color information may include the H (Hue), S (Saturation) and V (Value) information of the colors in the image. The color information corresponding to the face region is acquired and can be divided into a number of small color intervals; the number of pixel points of the face region falling into each color interval is then counted, yielding the color histogram. The color histogram may be an RGB color histogram, an HSV color histogram or a YUV color histogram, which are not limited here. In the HSV color space, H represents an angle measure with a value range of 0° to 360°, measured counterclockwise from red, with red at 0°, green at 120° and blue at 240°. S represents how close the color is to a pure spectral color: the larger the proportion of the spectral color, the closer the color is to it and the higher its saturation; highly saturated colors are generally deep and vivid. V represents the brightness of the color: for a light-source color, the value is related to the brightness of the illuminant; for an object color, it is related to the transmittance or reflectance of the object; V typically ranges from 0% (black) to 100% (white). Specifically, the method for generating the HSV color histogram may include: converting the face region from the RGB color space to the HSV color space; quantizing the H, S and V components and composing the quantized components into a one-dimensional feature vector; determining the quantization levels of the H, S and V components of each pixel point according to its value in the HSV color space; calculating the corresponding feature vector from the quantization levels of each pixel point, and counting the number of pixel points falling on each quantization level according to the feature vectors; and generating the color histogram from the statistics.
The value of the feature vector can range from 0 to 255, for 256 values in total; that is, the HSV color space can be divided into 256 color intervals, with each color interval corresponding to one value of the feature vector. For example, the H component may be quantized to 16 levels, and the S component and the V component may each be quantized to 4 levels. The composed one-dimensional feature vector can then be expressed as:

L = H * Q_S * Q_V + S * Q_V + V;

where L represents the one-dimensional feature vector composed of the quantized H, S and V components, Q_S represents the number of quantization levels of the S component, and Q_V represents the number of quantization levels of the V component.
Step 404, acquiring the peak value and the corresponding color interval in the color histogram.
The peak refers to the maximum amplitude in the wave formed by the color histogram; it can be located by computing the first difference at each point of the color histogram, the peak value being the maximum atop a crest. After the peak value in the color histogram is obtained, the quantized color interval corresponding to the peak is acquired; this color interval may be the feature vector value in the HSV color space corresponding to the peak. FIG. 5 is a color histogram generated in one embodiment. As shown in FIG. 5, the vertical axis of the color histogram represents the distribution of pixel points, i.e., the number of pixel points in each color interval; the horizontal axis represents the feature vector of the HSV color space, i.e., the color intervals into which the HSV color space is divided. The color histogram in FIG. 5 includes a peak 502: the peak value corresponding to peak 502 is 850 and the corresponding color interval is 150, meaning that 850 pixel points in the image have a feature vector value of 150.
Step 406, dividing the skin color interval according to the color interval, and taking the region corresponding to the skin color interval in the face region as the skin region.
The skin color interval of the face region is derived from the color interval corresponding to the peak of the color histogram: a range can be preset, and the skin color interval is then calculated from the peak's color interval and the preset range. Optionally, the computer device may multiply the color interval corresponding to the peak by the preset range values; the preset range may include an upper limit and a lower limit, and the peak's color interval is multiplied by each to obtain the skin color interval. For example, if the computer device presets the range of the skin color interval as 80% to 120%, and the color interval corresponding to the peak of the color histogram has the value 150, the skin color interval is calculated as 120 to 180.
The region corresponding to the skin color interval in the face region is taken as the skin region. Optionally, the computer device may acquire the feature vector of each pixel point of the face region in the HSV color space and judge whether the feature vector falls within the skin color interval; if so, the corresponding pixel point can be marked as a pixel point of the skin region. For example, if the skin color interval is 120 to 180, the computer device may mark the pixel points of the face region whose HSV feature vectors lie between 120 and 180 as pixel points of the skin region.
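Steps 402 to 406 might be realized as in the sketch below, assuming OpenCV/NumPy, the 16/4/4 quantization, and the 80% to 120% range from the examples above; everything else is an illustrative assumption:

```python
import cv2
import numpy as np

# Sketch of steps 402-406: quantize HSV into the 256-value feature vector
# L = H*Qs*Qv + S*Qv + V (16 H levels, 4 S levels, 4 V levels), build the
# color histogram, take the color interval at its peak, widen it by the
# preset 80%-120% range, and mask the skin pixels.

QS, QV = 4, 4  # quantization levels of the S and V components

def skin_region_mask(face_bgr: np.ndarray) -> np.ndarray:
    hsv = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2HSV)    # OpenCV: H 0-179, S/V 0-255
    h = hsv[..., 0].astype(np.int32) * 16 // 180       # 16 levels for H
    s = hsv[..., 1].astype(np.int32) * QS // 256       # 4 levels for S
    v = hsv[..., 2].astype(np.int32) * QV // 256       # 4 levels for V
    feat = h * QS * QV + s * QV + v                    # one-dimensional feature vector

    hist = np.bincount(feat.ravel(), minlength=256)    # color histogram (step 402)
    peak_interval = int(np.argmax(hist))               # interval at the peak (step 404)

    lo, hi = 0.8 * peak_interval, 1.2 * peak_interval  # skin color interval (step 406)
    return (feat >= lo) & (feat <= hi)                 # True for skin-region pixels
```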
Step 304, determining the area interval in which the region area lies, and acquiring the beauty parameter model corresponding to the area interval.
Generally, if the area of the target region is too small, the detail information in the target region is relatively dense, and beauty processing easily causes loss of that detail, reducing the aesthetics of the image. The degree of beauty processing can therefore be correspondingly lightened, so that detail is better preserved and the image is not distorted. Specifically, region areas are divided into different area intervals, and different degrees of beauty processing are applied depending on the interval in which the region area falls. For example, the region area may be expressed as the area ratio of the target region to the image to be processed, so its value lies between 0 and 1: a region area of 0 means no target region exists in the image to be processed, and a region area of 1 means the entire image is the target region. The region area range is divided into intervals, with each area interval corresponding to one beauty parameter model. After the target region in the image to be processed is acquired, its region area is obtained, and the corresponding beauty parameter model is then acquired according to the area interval in which the region area lies. For example, the region area range can be divided into four intervals, 0-0.2, 0.2-0.6, 0.6-0.8 and 0.8-1, corresponding to model 1, model 2, model 3 and model 4 respectively. When the area of the target region is 0.5, the acquired beauty parameter model is model 2, as in the sketch below.
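A sketch of this interval lookup follows; the interval boundaries and model numbering come from the example above, while the concrete model functions are placeholders:

```python
# Sketch of step 304: choose a beauty parameter model by the area interval
# in which the region area (an area ratio in [0, 1]) lies. The boundaries
# follow the 0-0.2 / 0.2-0.6 / 0.6-0.8 / 0.8-1 example; the lambdas are
# placeholder models, not the patent's.

AREA_INTERVALS = [
    (0.2, lambda a: 0.5 * a),  # model 1: 0 - 0.2
    (0.6, lambda a: a),        # model 2: 0.2 - 0.6
    (0.8, lambda a: 1.2 * a),  # model 3: 0.6 - 0.8
    (1.0, lambda a: 1.0),      # model 4: 0.8 - 1
]

def get_parameter_model(area_ratio: float):
    if not 0.0 <= area_ratio <= 1.0:
        raise ValueError("region area ratio must lie in [0, 1]")
    for upper, model in AREA_INTERVALS:
        if area_ratio <= upper:
            return model
    return AREA_INTERVALS[-1][1]

model = get_parameter_model(0.5)  # 0.5 lies in 0.2-0.6, so model 2 is returned
```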
Step 306, acquiring the beauty basic parameter corresponding to the target region.
Step 308, calculating a beauty coefficient according to the region area and the beauty parameter model, and acquiring the target beauty parameter according to the beauty basic parameter and the beauty coefficient.
In one embodiment, each target region may have a corresponding beauty basic parameter, acquired according to the target region; different target regions may correspond to the same beauty basic parameter or to different ones. The beauty basic parameter is a reference value for beauty processing, and the beauty coefficient is a weight used to derive the beauty parameter. A beauty coefficient is calculated according to the region area and the beauty parameter model, and the target beauty parameter is then acquired from the beauty basic parameter and the beauty coefficient.
Step 310, performing beauty processing on the image to be processed according to the target beauty parameter.
After the target regions are acquired, a region identifier can be established for each target region, and the relationships among region identifiers, position coordinates and target beauty parameters are then established; the corresponding target region is found through its position coordinates, and beauty processing is performed on it according to its target beauty parameter. For example, suppose the image to be processed "pic.jpg" is detected to contain three face regions, with face identifiers face1, face2 and face3, whose corresponding target beauty parameters are level-1 whitening, level-2 whitening and level-1 acne removal respectively.
The image processing method provided in the foregoing embodiment divides region areas into a plurality of area intervals, acquires the corresponding beauty parameter model according to the area interval in which the region area of the target region lies, acquires the corresponding target beauty parameter according to the region area and the beauty parameter model, and performs beauty processing on the image to be processed according to the acquired target beauty parameter. Therefore, different beauty processing can be performed on target regions with different areas, which improves the accuracy of the beauty processing.
FIG. 6 is a flowchart of an image processing method in yet another embodiment. As shown in fig. 6, the image processing method includes steps 602 to 612. Wherein:
step 602, a target region and a corresponding region area in an image to be processed are obtained.
There may be one or more target regions in the image to be processed; for example, there may be one face or multiple faces in the image, with each face region taken as a target region. It can be understood that there may also be no target region in the image to be processed, in which case no beauty processing need be performed. A target region in the image to be processed can be obtained through a region mark, or detected directly in the image. A region mark is a mark indicating the range of the target region in the image; for example, the target region may be marked with a red rectangular frame, the region inside the frame being the target region. If a plurality of target regions exist in one image, a correspondence is established between each target region and the image identifier and position coordinates. The position coordinates represent the position of the target region in the image to be processed; for example, they may be the coordinates of the center of the target region in the image, or of its upper-left corner. After a target region is processed, the corresponding image is found through the image identifier, the specific position of the target region in the image is found through the position coordinates, and the target region is restored into the image.
Step 604, acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is a model for calculating beauty parameters.
Step 606, acquiring the person attribute features corresponding to the target region, and acquiring the corresponding beauty basic parameters according to the person attribute features, wherein the beauty basic parameters include a first beauty basic parameter and a second beauty basic parameter.
A person attribute feature is a feature indicating an attribute of a person in the image; for example, it may be one or more of a gender feature, an age feature, a race feature and the like. Specifically, a face region in the image to be processed is acquired, and the corresponding person attribute features are identified from the face region. Further, the person attribute features corresponding to the face region may be obtained through a feature recognition model. The feature recognition model is a model for recognizing person attribute features, obtained by training on a face sample set, i.e., an image set composed of a number of face images. For example, in supervised learning, each face image in the face sample set is labeled with a corresponding label marking its type, and the feature recognition model is obtained by training on the labeled set. The feature recognition model classifies the face region to obtain the corresponding person attribute feature. For example, face regions may be classified into yellow, black and white races, the resulting person attribute feature being one of the three; that is, one feature recognition model classifies according to a single criterion. If person attribute features of different dimensions are needed for a face region, they can be obtained through different feature recognition models.
Specifically, the person attribute features may include a race feature parameter, a gender feature parameter, an age feature parameter, a skin color feature parameter, a skin type feature parameter, a face shape feature parameter and a makeup feature parameter, which are not limited herein. For example, the race feature parameter corresponding to the face region is obtained through a race recognition model, the age feature parameter through an age recognition model, and the gender feature parameter through a gender recognition model. A relationship between person attribute features and beauty basic parameters is established in advance, and the corresponding beauty basic parameters are acquired according to the person attribute features. For example, the person attribute features may include male and female: when the face is recognized as male, the corresponding beauty basic parameter is a skin-smoothing treatment, and when recognized as female, it is a whitening treatment. The correspondence between person attribute features and beauty basic parameters can be set by the user or learned by the system from big data.
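A lookup of this kind might be sketched as follows; the male/female mapping echoes the example above, and the table values and names are illustrative assumptions:

```python
# Sketch of step 606: map recognized person attribute features to beauty
# basic parameters. The gender mapping echoes the example above; the
# numeric values are placeholders, not the patent's.

BASE_PARAMS_BY_GENDER = {
    "male":   {"softenP": 0.6, "skinBrightenP": 0.2},  # emphasis on skin smoothing
    "female": {"softenP": 0.3, "skinBrightenP": 0.6},  # emphasis on whitening
}

DEFAULT_BASE_PARAMS = {"softenP": 0.4, "skinBrightenP": 0.4}

def base_params_for(attributes: dict) -> dict:
    """attributes: output of the feature recognition models, e.g. {"gender": "female"}."""
    return BASE_PARAMS_BY_GENDER.get(attributes.get("gender"), DEFAULT_BASE_PARAMS)

params = base_params_for({"gender": "female"})  # -> whitening-weighted base params
```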
Step 608, calculating a beauty coefficient according to the region area and the beauty parameter model, acquiring a corresponding first beauty parameter according to the first beauty basic parameter and the beauty coefficient, and taking the second beauty basic parameter as the second beauty parameter.
Beauty processing may include different treatments such as skin smoothing, face thinning, eye enlarging, blemish removal, dark-circle removal, whitening and brightening, sharpening, lip reddening and eye brightening. Some treatments are affected by the area of the target region, while others are not directly related to it. For example, skin smoothing, face thinning, eye enlarging, blemish removal and dark-circle removal change the size and detail information of the face; if the face region is too small, such processing can deform the facial features and distort the image. Whitening, brightening, sharpening, lip reddening and eye brightening do not change sizes and cannot deform the facial features. The target beauty parameter may therefore include a first beauty parameter and a second beauty parameter: the first beauty parameter is influenced by the region area and has a corresponding relationship with it, while the second beauty parameter is not influenced by the region area. Accordingly, the beauty basic parameters include a first beauty basic parameter and a second beauty basic parameter. A beauty coefficient is calculated according to the region area and the beauty parameter model, the corresponding first beauty parameter is acquired according to the first beauty basic parameter and the beauty coefficient, and the second beauty basic parameter is taken as the second beauty parameter.
Further, when the area of the target region is too small, the target region may be exempted from beauty processing, or given only partial processing, for example only a small degree of skin smoothing or whitening, which ensures that the processed image is not severely distorted. An area threshold may be set, and the beauty coefficient divided into a first beauty coefficient and a second beauty coefficient. When the region area is larger than the area threshold, the first beauty coefficient is calculated according to the region area and the beauty parameter model, the corresponding first beauty parameter is acquired according to the first beauty basic parameter and the first beauty coefficient, and the second beauty basic parameter is taken as the second beauty parameter. When the region area is smaller than the area threshold, a first beauty coefficient and a second beauty coefficient are acquired, the first beauty parameter is acquired according to the first beauty coefficient and the first beauty basic parameter, and the second beauty parameter is acquired according to the second beauty coefficient and the second beauty basic parameter. In this case the first and second beauty coefficients are set to small values, so the finally acquired first and second beauty parameters are also small.
Step 610, acquiring the target beauty parameter according to the first beauty parameter and the second beauty parameter.
In one embodiment, let the beauty basic parameter be defaultParam, divided into a first beauty basic parameter adjustParam and a second beauty basic parameter unchangedParam. With beauty processing comprising skin smoothing, face thinning, eye enlarging, blemish removal, dark-circle removal, whitening and brightening, sharpening, lip reddening and eye brightening, then:
defaultParam = [adjustParam | unchangedParam];
adjustParam = [softenP, faceSlenderP, eyeLargerP, deblemishP, depouchP];
unchangedParam = [skinBrightenP, sharpP, lipP, eyeBrightenP];
where softenP represents the skin-smoothing basic parameter, faceSlenderP the face-thinning basic parameter, eyeLargerP the eye-enlarging basic parameter, deblemishP the blemish-removal basic parameter, and depouchP the dark-circle-removal basic parameter; skinBrightenP represents the whitening-and-brightening basic parameter, sharpP the sharpening basic parameter, lipP the lip-reddening basic parameter, and eyeBrightenP the eye-brightening basic parameter.
Let the beauty coefficient be Factor, comprising a first beauty coefficient adjustFactor and a second beauty coefficient unchangedFactor. The first beauty parameter can be acquired from the first beauty coefficient and the first beauty basic parameter, and the second beauty parameter from the second beauty coefficient and the second beauty basic parameter.
Factor = [adjustFactor | unchangedFactor];
adjustFactor = [softenF, faceSlenderF, eyeLargerF, deblemishF, depouchF];
unchangedFactor = [skinBrightenF, sharpF, lipF, eyeBrightenF];
where softenF represents the skin-smoothing coefficient, faceSlenderF the face-thinning coefficient, eyeLargerF the eye-enlarging coefficient, deblemishF the blemish-removal coefficient, and depouchF the dark-circle-removal coefficient; skinBrightenF represents the whitening-and-brightening coefficient, sharpF the sharpening coefficient, lipF the lip-reddening coefficient, and eyeBrightenF the eye-brightening coefficient.
Let the acquired area of the target region be FaceArea. The value range of the optimal area can be obtained through big data analysis or customized by the user; suppose it is defined as [stdArea_min, stdArea_max]. When the region area lies within this range, the first beauty basic parameter can be used directly as the first beauty parameter, i.e., the first beauty coefficient is adjustFactor = [1, 1, 1, 1, 1]. When the region area falls outside this range, the corresponding beauty coefficient can be obtained through a function: in general, the smaller the region area, the smaller the first beauty coefficient, and the larger the region area, the larger the first beauty coefficient. To keep the degree of beauty processing from becoming too strong or too weak, a lower limit minArea and an upper limit maxArea may be set for the region area; when the region area is above the upper limit or below the lower limit, the first beauty coefficient no longer changes. The limits may be customized; suppose they are defined as minArea = 0.25 * stdArea_min and maxArea = 1.75 * stdArea_max. The formula for obtaining the first beauty coefficient is then as follows:
adjustFactor increases linearly with FaceArea up to 1, when minArea < FaceArea < stdArea_min;
adjustFactor = 1, when stdArea_min ≤ FaceArea ≤ stdArea_max;
adjustFactor increases linearly with FaceArea from 1, when stdArea_max < FaceArea < maxArea;
adjustFactor remains at its maximum value, when FaceArea ≥ maxArea.
When the region area FaceArea is greater than minArea, the first beauty coefficient can be obtained according to the formula above and the first beauty parameter obtained from it, while the second beauty basic parameter is used directly as the second beauty parameter, i.e., the second beauty coefficient is [1, 1, 1, 1]. When the region area FaceArea is smaller than minArea, the target region may be exempted from beauty processing, or given only a small degree of it: the first and second beauty coefficients are defined as small values, the first beauty parameter is obtained from the first beauty coefficient and the first beauty basic parameter, the second beauty parameter from the second beauty coefficient and the second beauty basic parameter, and the resulting parameters are correspondingly small. For example, when FaceArea is smaller than minArea, the first beauty coefficient may be set to adjustFactor = [0.5, 0, 0, 0, 0] and the second beauty coefficient to unchangedFactor = [0, 0, 0, 0]. The target beauty parameter adjustParam is then acquired from the acquired beauty coefficient and the beauty basic parameter as follows:
adjustParam = defaultParam * Factor^T
FIG. 7 is a graph illustrating the variation of the beauty coefficient in one embodiment. As shown in FIG. 7, when the region area FaceArea is larger than minArea, the first beauty coefficient increases in a staged manner, divided into four stages in total. First stage: when minArea < FaceArea < stdArea_min, the first beauty coefficient increases linearly with the region area. Second stage: when stdArea_min < FaceArea < stdArea_max, the first beauty coefficient stays constant. Third stage: when stdArea_max < FaceArea < maxArea, the first beauty coefficient increases linearly with the region area. Fourth stage: when FaceArea > maxArea, the first beauty coefficient remains unchanged.
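One continuous implementation of these four stages is sketched below; the ratio form of the two linear stages is an illustrative assumption that reproduces the staged shape of FIG. 7 (coefficient equal to 1 across the optimal range), not the patented formula:

```python
# Sketch of the four-stage first beauty coefficient of FIG. 7. The ratio
# form of the linear stages is an assumption chosen for continuity; the
# patent's exact expression is not reproduced here.

def adjust_factor(face_area: float, std_min: float, std_max: float) -> float:
    min_area = 0.25 * std_min   # customizable lower limit from the text
    max_area = 1.75 * std_max   # customizable upper limit from the text
    if face_area <= min_area:
        # Below minArea the text instead uses small fixed coefficients,
        # e.g. adjustFactor = [0.5, 0, 0, 0, 0]; 0.0 here is a placeholder.
        return 0.0
    if face_area < std_min:
        return face_area / std_min   # stage 1: linear increase, reaching 1
    if face_area <= std_max:
        return 1.0                   # stage 2: constant over the optimal range
    if face_area < max_area:
        return face_area / std_max   # stage 3: linear increase from 1
    return max_area / std_max        # stage 4: constant (1.75 with these limits)
```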
Step 612, performing beauty processing on the image to be processed according to the target beauty parameter.
The system may include a plurality of beauty modules, each performing one kind of beauty processing. For example, the system may include a skin-smoothing module, a whitening module, an eye-enlarging module, a face-thinning module and a skin color adjustment module, which respectively perform skin smoothing, whitening, eye enlarging, face thinning and skin color adjustment on the image to be processed. In one embodiment, each beauty module may be a code function module through which its beauty processing is implemented. Each code function module corresponds to a flag bit, and the flag bit determines whether the corresponding processing is performed. For example, each beauty module corresponds to a flag Stat: when Stat is 1 or true, the beauty processing of that module needs to be performed; when Stat is 0 or false, it does not.
Specifically, the flag bits of the beauty modules are assigned according to the target beauty parameter, the modules that should run are determined from the flag bits, and the target beauty parameter is input into those modules to perform beauty processing on the image to be processed. For example, if the target beauty parameter includes whitening the face, the flag bit of the whitening module is assigned 1; if eye enlarging is not required, the flag bit of the eye-enlarging module is assigned 0. During beauty processing, each beauty module is traversed and its flag bit is checked to decide whether its processing is required. It can be understood that the processing done by the beauty modules is independent and the modules do not affect one another: if the image needs several kinds of beauty processing, it can be passed through the corresponding modules in sequence to obtain the final beautified image, as in the sketch below.
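For illustration, flag-bit dispatch over independent beauty modules might look like this sketch; the module names and the dictionary-based flags are assumptions:

```python
# Sketch of flag-bit dispatch: each beauty module has a flag Stat, and its
# processing runs only when the flag is 1/true. Module bodies are stand-ins.

def soften(img, p):   return img   # skin-smoothing stand-in
def whiten(img, p):   return img   # whitening stand-in
def big_eye(img, p):  return img   # eye-enlarging stand-in

MODULES = {"soften": soften, "whiten": whiten, "big_eye": big_eye}

def run_beauty(img, target_params: dict):
    for name, module in MODULES.items():     # traverse every beauty module
        stat = name in target_params         # flag bit derived from the params
        if stat:                             # run only when Stat is 1/true
            img = module(img, target_params[name])
    return img

# Example: whitening requested, eye enlarging not -> only whiten() runs.
# out = run_beauty(img, {"whiten": 1})
```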
When a plurality of target regions exist in the image to be processed, each target region can be traversed, the target beauty parameter corresponding to each acquired, and beauty processing performed on each region according to its parameter. If only the target region is beautified and the remaining area of the image is not, a noticeable difference between the target region and the rest may appear after processing; for example, after whitening the target region, its brightness is significantly higher than that of the remaining area, making the image look unnatural. Transition processing can therefore be applied to the boundary of the target region in the generated beauty image, so that the result looks more natural; one possible realization is sketched below.
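As one hedged realization of such transition processing, the target-region mask can be feathered and the beautified result alpha-blended back into the original; feathering is a common choice here, and the patent does not prescribe this specific method:

```python
import cv2
import numpy as np

# Sketch of boundary transition processing: feather the target-region mask
# with a Gaussian blur, then alpha-blend the beautified image back into the
# original so the region boundary fades smoothly.

def blend_with_transition(original: np.ndarray, beautified: np.ndarray,
                          mask: np.ndarray, ksize: int = 31) -> np.ndarray:
    """mask: float array in [0, 1], 1 inside the target region; ksize must be odd."""
    alpha = cv2.GaussianBlur(mask.astype(np.float32), (ksize, ksize), 0)
    alpha = alpha[..., None]  # broadcast the blend weight over color channels
    out = (alpha * beautified.astype(np.float32)
           + (1.0 - alpha) * original.astype(np.float32))
    return np.clip(out, 0, 255).astype(np.uint8)
```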
In the image processing method provided by the above embodiment, the corresponding beauty parameter model is acquired according to the region area corresponding to the target region in the image to be processed, the corresponding beauty basic parameter is acquired according to the person attribute features of the target region, and the corresponding beauty coefficient is obtained according to the region area and the beauty parameter model. The target beauty parameter is acquired from the beauty basic parameter and the beauty coefficient, and beauty processing is then performed on the image to be processed according to the acquired target beauty parameter. Therefore, different beauty processing can be performed on target regions with different areas, which improves the accuracy of the beauty processing.
Fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment. As shown in fig. 8, the image processing apparatus 800 includes an image acquisition module 802, a model acquisition module 804, a parameter acquisition module 806, and a beauty processing module 808. Wherein:
an image obtaining module 802, configured to obtain a target region and a corresponding region area in an image to be processed.
A model obtaining module 804, configured to obtain a corresponding beauty parameter model according to the area, where the beauty parameter model is a model used for calculating beauty parameters.
A parameter obtaining module 806, configured to obtain a corresponding target beauty parameter according to the area and the beauty parameter model.
And the beauty processing module 808 is configured to perform beauty processing on the image to be processed according to the target beauty parameter.
The image processing apparatus provided in the above embodiment obtains the corresponding beauty parameter model according to the region area corresponding to the target region in the image to be processed, obtains the corresponding target beauty parameter according to the region area and the beauty parameter model, and performs beauty processing on the image to be processed according to the obtained target beauty parameter. Therefore, different beauty processing can be performed on target regions with different areas, which improves the accuracy of the beauty processing.
In an embodiment, the image obtaining module 802 is further configured to detect a face region in the image to be processed, and obtain a region area corresponding to the face region; or detecting a face region in the image to be processed, acquiring a corresponding skin region according to the face region, and acquiring a region area corresponding to the skin region.
In one embodiment, the image obtaining module 802 is further configured to generate a color histogram according to the color information corresponding to the face region; acquiring a peak value in the color histogram and a corresponding color interval; and dividing skin color intervals according to the color intervals, and taking the area corresponding to the skin color intervals in the face area as a skin area.
In an embodiment, the model obtaining module 804 is further configured to determine an area interval where the area of the region is located, and obtain a beauty parameter model corresponding to the area interval.
In one embodiment, the parameter obtaining module 806 is further configured to obtain a beauty basic parameter corresponding to the target area; and calculating a beauty coefficient according to the area and the beauty parameter model, and acquiring a target beauty parameter according to the beauty basic parameter and the beauty coefficient.
In an embodiment, the parameter obtaining module 806 is further configured to obtain the person attribute features corresponding to the target region, and obtain the corresponding beauty basic parameters according to the person attribute features.
In one embodiment, the parameter obtaining module 806 is further configured to calculate a beauty coefficient according to the area and beauty parameter model, obtain a corresponding first beauty parameter according to the first beauty basic parameter and beauty coefficient, and use the second beauty basic parameter as a second beauty parameter; and acquiring a target beauty parameter according to the first beauty parameter and the second beauty parameter.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media embodying a computer program that, when executed by one or more processors, causes the processors to perform the steps of:
acquiring a target region and a corresponding region area in an image to be processed;
acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is a model for calculating beauty parameters;
acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model; and
performing beauty processing on the image to be processed according to the target beauty parameter.
In one embodiment, the obtaining of the target region and the corresponding region area in the image to be processed performed by the processor comprises one of the following methods:
detecting a face region in an image to be processed, and acquiring a region area corresponding to the face region;
detecting a face region in an image to be processed, acquiring a corresponding skin region according to the face region, and acquiring a region area corresponding to the skin region.
In one embodiment, the obtaining, by the processor, a corresponding skin region according to the face region includes:
generating a color histogram according to the color information corresponding to the face region;
acquiring a peak value in the color histogram and a corresponding color interval;
and dividing a skin color interval according to the color interval, and taking the area in the face region corresponding to the skin color interval as the skin region (a code sketch of this histogram-based segmentation is given below).
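A minimal sketch of this histogram-based segmentation, assuming the hue channel carries the color information, a 180-bin histogram, and a fixed half-width around the peak as the skin color interval; the embodiment prescribes only the histogram, the peak, the color interval, and the resulting skin region, so these specifics are illustrative.

```python
import cv2
import numpy as np

def skin_region_from_face(image_bgr, face_box, half_width=10):
    """Return a skin mask inside the face box and the skin region area."""
    x, y, w, h = face_box
    face = image_bgr[y:y + h, x:x + w]
    hue = cv2.cvtColor(face, cv2.COLOR_BGR2HSV)[:, :, 0]
    hist = cv2.calcHist([hue], [0], None, [180], [0, 180]).ravel()
    peak = int(np.argmax(hist))            # peak value of the color histogram
    lo = max(peak - half_width, 0)         # skin color interval divided
    hi = min(peak + half_width, 179)       # around the peak's color interval
    mask = ((hue >= lo) & (hue <= hi)).astype(np.uint8)
    return mask, int(mask.sum())           # skin region mask and its pixel area
```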
In one embodiment, the obtaining of the corresponding beauty parameter model according to the area of the region performed by the processor comprises:
and determining the area interval in which the region area falls, and obtaining the beauty parameter model corresponding to the area interval (a code sketch of this interval lookup is given below).
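The interval lookup might look like the sketch below; the interval boundaries and the per-interval models are invented for illustration, since the embodiment requires only that each area interval map to its own beauty parameter model.

```python
# Hypothetical area intervals (in pixels) and their parameter models.
AREA_MODELS = [
    (0, 20_000, lambda area: 0.4 + area / 100_000),        # small region
    (20_000, 80_000, lambda area: 0.6 + area / 400_000),   # medium region
    (80_000, float("inf"), lambda area: 0.8),              # large region
]

def beauty_parameter_model(region_area):
    """Return the model whose area interval contains the region area."""
    for lo, hi, model in AREA_MODELS:
        if lo <= region_area < hi:
            return model
    raise ValueError("no model covers this region area")
```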
In one embodiment, the obtaining of the corresponding target beauty parameter according to the region area and the beauty parameter model performed by the processor comprises:
acquiring beauty basic parameters corresponding to the target region;
and calculating a beauty coefficient according to the region area and the beauty parameter model, and acquiring a target beauty parameter according to the beauty basic parameter and the beauty coefficient (a code sketch of this combination is given below).
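A sketch of combining the two quantities; the multiplicative combination is an assumption, since the embodiment states only that the target beauty parameter is obtained from the beauty basic parameter and the beauty coefficient.

```python
def target_beauty_parameter(region_area, base_parameter, model):
    """Weight the base parameter by the model-derived beauty coefficient."""
    coefficient = model(region_area)      # beauty coefficient acts as a weight
    return base_parameter * coefficient   # assumed multiplicative combination

# Usage with the interval lookup sketched earlier:
#   model = beauty_parameter_model(region_area)
#   smoothing = target_beauty_parameter(region_area, 0.5, model)
```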
In one embodiment, the obtaining, performed by the processor, of the beauty basic parameters corresponding to the target region includes:
acquiring character attribute features corresponding to the target region, and acquiring the corresponding beauty basic parameters according to the character attribute features (a code sketch of this attribute lookup is given below).
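One purely illustrative lookup from character attribute features to beauty basic parameters; the embodiment does not fix the feature set, so gender and age group, and all the values below, are assumptions.

```python
# Hypothetical table: (gender, age group) -> beauty basic parameters.
BASE_PARAMETERS = {
    ("female", "young"): {"whitening": 0.6, "smoothing": 0.5},
    ("female", "adult"): {"whitening": 0.5, "smoothing": 0.6},
    ("male", "young"): {"whitening": 0.3, "smoothing": 0.4},
    ("male", "adult"): {"whitening": 0.2, "smoothing": 0.3},
}

def base_parameters_for(attributes):
    """Return the beauty basic parameters for the given attribute features."""
    return BASE_PARAMETERS.get(attributes, {"whitening": 0.4, "smoothing": 0.4})
```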
In one embodiment, the beauty base parameter includes a first beauty base parameter and a second beauty base parameter;
the obtaining of the corresponding target beauty parameter according to the region area and the beauty parameter model executed by the processor comprises:
calculating a beauty coefficient according to the region area and the beauty parameter model, acquiring a corresponding first beauty parameter according to the first beauty basic parameter and the beauty coefficient, and taking the second beauty basic parameter as a second beauty parameter;
and acquiring a target beauty parameter according to the first beauty parameter and the second beauty parameter (a code sketch of this two-group combination is given below).
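A sketch of the two-group combination; which concrete parameters belong to the first (coefficient-scaled) group and which to the second (pass-through) group is an assumption made here for illustration.

```python
def combined_beauty_parameters(region_area, model, first_base, second_base):
    """Scale the first group by the beauty coefficient; keep the second as-is."""
    coefficient = model(region_area)
    first = {name: value * coefficient for name, value in first_base.items()}
    second = dict(second_base)        # second basic parameters used directly
    return {**first, **second}        # the resulting target beauty parameters

# e.g. first_base={"smoothing": 0.6} (area-sensitive),
#      second_base={"whitening": 0.5} (area-insensitive)
```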
An embodiment of the application also provides an electronic device. The electronic device includes an image processing circuit, which may be implemented using hardware and/or software components and may include various processing units defining an ISP (Image Signal Processing) pipeline. FIG. 9 is a schematic diagram of an image processing circuit in one embodiment. As shown in FIG. 9, for convenience of explanation, only the aspects of the image processing technique related to the embodiments of the present application are shown.
As shown in FIG. 9, the image processing circuit includes an ISP processor 940 and control logic 950. Image data captured by an imaging device 910 is first processed by the ISP processor 940, which analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 910. The imaging device 910 may include a camera having one or more lenses 912 and an image sensor 914. The image sensor 914 may include an array of color filters (e.g., Bayer filters), may acquire the light intensity and wavelength information captured by each of its imaging pixels, and may provide a set of raw image data that can be processed by the ISP processor 940. A sensor 920 (e.g., a gyroscope) may provide acquired image-processing parameters (e.g., anti-shake parameters) to the ISP processor 940 based on the type of the sensor 920 interface. The sensor 920 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above.
In addition, the image sensor 914 may also send raw image data to the sensor 920; the sensor 920 may then provide the raw image data to the ISP processor 940 based on the type of the sensor 920 interface, or store the raw image data in an image memory 930.
The ISP processor 940 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 940 may perform one or more image processing operations on the raw image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit-depth precision.
The ISP processor 940 may also receive image data from the image memory 930. For example, the sensor 920 interface sends raw image data to the image memory 930, and the raw image data in the image memory 930 is then provided to the ISP processor 940 for processing. The image memory 930 may be part of a memory device, a storage device, or a separate dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the image sensor 914 interface, the sensor 920 interface, or the image memory 930, the ISP processor 940 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 930 for additional processing before being displayed. The ISP processor 940 may also receive processed data from the image memory 930 and perform image data processing in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display 980 for viewing by a user and/or for further processing by a graphics processing unit (GPU). Further, the output of the ISP processor 940 may also be sent to the image memory 930, and the display 980 may read image data from the image memory 930. In one embodiment, the image memory 930 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 940 may be transmitted to an encoder/decoder 970 in order to encode/decode the image data. The encoded image data may be saved, and decompressed before being displayed on the display 980.
The processing of the image data by the ISP processor 940 includes VFE (Video Front End) processing and CPP (Camera Post Processing). The VFE processing of the image data may include modifying the contrast or brightness of the image data, modifying digitally recorded lighting state data, performing compensation processing (e.g., white balance, automatic gain control, gamma correction, etc.) on the image data, performing filter processing on the image data, and the like. The CPP processing of the image data may include scaling the image and providing a preview frame and a record frame to each path; the CPP may use different codecs to process the preview frame and the record frame. The image data processed by the ISP processor 940 may be sent to a beauty module 960 for beauty processing before being displayed. The beauty processing performed by the beauty module 960 may include whitening, freckle removal, skin smoothing, face thinning, acne removal, eye enlargement, and the like. The beauty module 960 may be a Central Processing Unit (CPU), a GPU, a coprocessor, or the like. The data processed by the beauty module 960 may be transmitted to the encoder/decoder 970 in order to encode/decode the image data. The encoded image data may be saved, and decompressed before being displayed on the display 980. Alternatively, the beauty module 960 may be located between the encoder/decoder 970 and the display 980, i.e., the beauty module may perform beauty processing on the image after it has been formed. The encoder/decoder 970 may be a CPU, a GPU, a coprocessor, or the like in the mobile terminal.
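The data flow just described can be condensed into a non-normative sketch; every stage function below is a placeholder stub standing in for a hardware block of FIG. 9, and none of the names come from a real driver API.

```python
import numpy as np

def vfe_processing(raw):        # stub for white balance, gamma, filtering, etc.
    return np.clip(raw.astype(np.float32) * 1.1, 0, 255).astype(np.uint8)

def cpp_processing(frame):      # stub for scaling into preview/record paths
    preview = frame[::2, ::2]   # downscaled preview frame
    return preview, frame       # (preview path, record path)

def beauty_module(frame, params):  # stub for whitening/smoothing processing
    boost = params.get("whitening", 0.0) * 20
    return np.clip(frame.astype(np.float32) + boost, 0, 255).astype(np.uint8)

def process_frame(raw_frame, target_params):
    """Raw frame -> VFE -> CPP -> beauty module (then encoder/display)."""
    frame = vfe_processing(raw_frame)
    preview, _record = cpp_processing(frame)
    return beauty_module(preview, target_params)
```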
The statistical data determined by the ISP processor 940 may be transmitted to the control logic 950. For example, the statistical data may include image sensor 914 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 912 shading correction, and the like. The control logic 950 may include a processor and/or a microcontroller that executes one or more routines (e.g., firmware); the routines may determine control parameters of the imaging device 910 and control parameters of the ISP processor 940 based on the received statistical data. For example, the control parameters of the imaging device 910 may include sensor 920 control parameters (e.g., gain, integration time for exposure control), camera flash control parameters, lens 912 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 912 shading correction parameters.
The image processing method provided by the above-described embodiments can be implemented using the image processing technology of FIG. 9.
An embodiment of the application also provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method provided by the above embodiments.
Any reference to memory, storage, a database, or other medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
acquiring a target region and a corresponding region area in an image to be processed; the target region is an independent connected region;
acquiring a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is used for calculating beauty parameters according to the region area; wherein different region areas correspond to different beauty parameter models;
acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model; the target beauty parameter comprises a beauty basic parameter and a beauty coefficient; the beauty basic parameter represents a reference value for the beauty processing; the beauty coefficient is used for representing the weight for acquiring the target beauty parameter;
and performing beauty processing on the image to be processed according to the target beauty parameter.
2. The image processing method according to claim 1, wherein the acquiring the target region and the corresponding region area in the image to be processed comprises one of the following methods:
detecting a face region in an image to be processed, and acquiring a region area corresponding to the face region;
detecting a face region in an image to be processed, acquiring a corresponding skin region according to the face region, and acquiring a region area corresponding to the skin region.
3. The image processing method according to claim 2, wherein the obtaining the corresponding skin region according to the face region comprises:
generating a color histogram according to the color information corresponding to the face region;
acquiring a peak value in the color histogram and a corresponding color interval;
and dividing a skin color interval according to the color interval, and taking the area in the face region corresponding to the skin color interval as the skin region.
4. The image processing method according to claim 1, wherein said obtaining a corresponding beauty parameter model according to the area of the region comprises:
and determining the area interval in which the region area falls, and obtaining the beauty parameter model corresponding to the area interval.
5. The image processing method according to any one of claims 1 to 4, wherein the obtaining of the corresponding target beauty parameter according to the region area and the beauty parameter model comprises:
acquiring beauty basic parameters corresponding to the target region;
and calculating a beauty coefficient according to the region area and the beauty parameter model, and acquiring a target beauty parameter according to the beauty basic parameters and the beauty coefficient.
6. The image processing method according to claim 5, wherein the obtaining of the beauty basic parameters corresponding to the target region comprises:
acquiring character attribute features corresponding to the target region, and acquiring the corresponding beauty basic parameters according to the character attribute features.
7. The image processing method according to claim 5, wherein the beauty base parameter includes a first beauty base parameter and a second beauty base parameter;
the obtaining of the corresponding target beauty parameter according to the region area and the beauty parameter model comprises:
calculating a beauty coefficient according to the region area and the beauty parameter model, acquiring a corresponding first beauty parameter according to the first beauty basic parameter and the beauty coefficient, and taking the second beauty basic parameter as a second beauty parameter;
and acquiring a target beauty parameter according to the first beauty parameter and the second beauty parameter.
8. An image processing apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for acquiring a target region and a corresponding region area in an image to be processed; the target region is an independent connected region;
the model obtaining module is used for obtaining a corresponding beauty parameter model according to the region area, wherein the beauty parameter model is used for calculating beauty parameters according to the region area; wherein different region areas correspond to different beauty parameter models;
the parameter acquisition module is used for acquiring a corresponding target beauty parameter according to the region area and the beauty parameter model; the target beauty parameter comprises a beauty basic parameter and a beauty coefficient; the beauty basic parameter represents a reference value for the beauty processing; the beauty coefficient is used for representing the weight for acquiring the target beauty parameter;
and the beauty processing module is used for performing beauty processing on the image to be processed according to the target beauty parameter.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the image processing method according to any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the image processing method of any of claims 1 to 7.
CN201711240703.1A 2017-11-30 2017-11-30 Image processing method, image processing device, computer-readable storage medium and electronic equipment Expired - Fee Related CN107993209B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711240703.1A CN107993209B (en) 2017-11-30 2017-11-30 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711240703.1A CN107993209B (en) 2017-11-30 2017-11-30 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN107993209A CN107993209A (en) 2018-05-04
CN107993209B true CN107993209B (en) 2020-06-12

Family

ID=62034731

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711240703.1A Expired - Fee Related CN107993209B (en) 2017-11-30 2017-11-30 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN107993209B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108765273B (en) * 2018-05-31 2021-03-09 Oppo广东移动通信有限公司 Virtual face-lifting method and device for face photographing
CN108764370B (en) * 2018-06-08 2021-03-12 Oppo广东移动通信有限公司 Image processing method, image processing device, computer-readable storage medium and computer equipment
CN108921798B (en) * 2018-06-14 2021-06-22 北京微播视界科技有限公司 Image processing method and device and electronic equipment
CN109003246A (en) * 2018-08-26 2018-12-14 朱丽萍 Eye repairs graph parameter detection method
CN109919029A (en) * 2019-01-31 2019-06-21 深圳和而泰数据资源与云技术有限公司 Black eye kind identification method, device, computer equipment and storage medium
CN110136054B (en) * 2019-05-17 2024-01-09 北京字节跳动网络技术有限公司 Image processing method and device
CN111275650B (en) * 2020-02-25 2023-10-17 抖音视界有限公司 Beauty treatment method and device
CN112887665B (en) * 2020-12-30 2023-07-18 重庆邮电大学移通学院 Video image processing method and related device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065290B (en) * 2013-01-23 2016-05-18 广东欧珀移动通信有限公司 In photo, carry out the apparatus and method of flesh correction
TW201445454A (en) * 2013-05-22 2014-12-01 Asustek Comp Inc Image processing system and method of promoting human face recognition
CN103413270A (en) * 2013-08-15 2013-11-27 北京小米科技有限责任公司 Method and device for image processing and terminal device
CN103605975B (en) * 2013-11-28 2018-10-19 小米科技有限责任公司 A kind of method, apparatus and terminal device of image procossing
CN103632165B (en) * 2013-11-28 2017-07-04 小米科技有限责任公司 A kind of method of image procossing, device and terminal device
CN104751419B (en) * 2015-03-05 2018-03-27 广东欧珀移动通信有限公司 A kind of photo vision-control method and terminal
CN106210516B (en) * 2016-07-06 2019-05-14 Oppo广东移动通信有限公司 One kind is taken pictures processing method and terminal
CN106210521A (en) * 2016-07-15 2016-12-07 深圳市金立通信设备有限公司 A kind of photographic method and terminal
CN106326849A (en) * 2016-08-17 2017-01-11 北京小米移动软件有限公司 Beauty processing method and device
CN107124548A (en) * 2017-04-25 2017-09-01 深圳市金立通信设备有限公司 A kind of photographic method and terminal
CN107341762B (en) * 2017-06-16 2021-04-16 Oppo广东移动通信有限公司 Photographing processing method and device and terminal equipment
CN107301626B (en) * 2017-06-22 2020-11-06 成都品果科技有限公司 Buffing algorithm suitable for shooting images by mobile equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A study for facial beauty prediction model; Junying Gan et al.; 2015 International Conference on Wavelet Analysis and Pattern Recognition (ICWAPR); 2015-12-31; pp. 8-13 *
Research and Implementation of an Android-based Face Beautification App; Ouyang Jiechen et al.; Computer Technology and Development; 2016-03-31; Vol. 26, No. 3; pp. 9-13 *

Also Published As

Publication number Publication date
CN107993209A (en) 2018-05-04

Similar Documents

Publication Publication Date Title
CN107730445B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN107808136B (en) Image processing method, image processing device, readable storage medium and computer equipment
CN107993209B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107730444B (en) Image processing method, image processing device, readable storage medium and computer equipment
EP3477931B1 (en) Image processing method and device, readable storage medium and electronic device
CN107730446B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN107680128B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107945135B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN107818305B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107886484B (en) Beautifying method, beautifying device, computer-readable storage medium and electronic equipment
CN107451969B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107424198B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107766831B (en) Image processing method, image processing device, mobile terminal and computer-readable storage medium
CN108537749B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108537155B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN108024107B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107509031B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107563976B (en) Beauty parameter obtaining method and device, readable storage medium and computer equipment
CN107862659B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN107862658B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107862653B (en) Image display method, image display device, storage medium and electronic equipment
CN108846807B (en) Light effect processing method and device, terminal and computer-readable storage medium
CN108111749B (en) Image processing method and device
CN107800965B (en) Image processing method, device, computer readable storage medium and computer equipment
CN108717530B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200612