CN107862658B - Image processing method, image processing device, computer-readable storage medium and electronic equipment - Google Patents


Info

Publication number
CN107862658B
CN107862658B (application CN201711054076.2A)
Authority
CN
China
Prior art keywords
image
channel
beauty
processed
beautifying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711054076.2A
Other languages
Chinese (zh)
Other versions
CN107862658A (en)
Inventor
杜成鹏
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201711054076.2A
Publication of CN107862658A
Application granted
Publication of CN107862658B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting; G06T 3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING; G06V 40/16 Human faces, e.g. facial parts, sketches or expressions; G06V 40/168 Feature extraction, face representation; G06V 40/169 Holistic features and representations, i.e. based on the facial image taken as a whole
    • G06T 2207/20 Special algorithmic details; G06T 2207/20212 Image combination; G06T 2207/20221 Image fusion, image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The application relates to an image processing method, an image processing apparatus, a computer-readable storage medium and an electronic device. The method comprises the following steps: acquiring an image to be processed; acquiring a brightness value corresponding to each channel image in the image to be processed, and obtaining a beauty parameter corresponding to each channel image according to the brightness value; performing beauty processing on each channel image according to the beauty parameters; and fusing the beautified channel images to obtain a beauty image. The image processing method, the image processing apparatus, the computer-readable storage medium and the electronic device improve the accuracy of image processing.

Description

Image processing method, image processing device, computer-readable storage medium and electronic equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a computer-readable storage medium, and an electronic device.
Background
Photographing is an indispensable skill in both work and life. To take a satisfactory picture, it is necessary not only to adjust the shooting parameters during shooting but also to improve the picture itself after shooting is completed. Beauty processing is one method of beautifying photographs; after beauty processing, the people in a photograph appear more attractive.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, a computer readable storage medium and an electronic device, which can improve the accuracy of image processing.
A method of image processing, the method comprising:
acquiring an image to be processed;
acquiring a brightness value corresponding to each channel image in the image to be processed, and acquiring a beauty parameter corresponding to each channel image according to the brightness value;
performing face beautifying processing on each channel image according to the face beautifying parameters;
and fusing the channel images after the beautifying processing to obtain a beautifying image.
An image processing apparatus, the apparatus comprising:
the image acquisition module is used for acquiring an image to be processed;
the parameter acquisition module is used for acquiring the brightness value corresponding to each channel image in the image to be processed and acquiring the beauty parameter corresponding to each channel image according to the brightness value;
the beautifying processing module is used for respectively carrying out beautifying processing on each channel image according to the beautifying parameters;
and the image fusion module is used for fusing the image of each channel after the beautifying processing to obtain a beautifying image.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring an image to be processed;
acquiring a brightness value corresponding to each channel image in the image to be processed, and acquiring a beauty parameter corresponding to each channel image according to the brightness value;
performing face beautifying processing on each channel image according to the face beautifying parameters;
and fusing the channel images after the beautifying processing to obtain a beautifying image.
An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
acquiring an image to be processed;
acquiring a brightness value corresponding to each channel image in the image to be processed, and acquiring a beauty parameter corresponding to each channel image according to the brightness value;
performing face beautifying processing on each channel image according to the face beautifying parameters;
and fusing the channel images after the beautifying processing to obtain a beautifying image.
According to the image processing method, the image processing apparatus, the computer-readable storage medium and the electronic device, the brightness value of each channel image in the image to be processed is first obtained, the beauty parameter of each channel image is obtained according to the brightness value, and beauty processing is then performed on each channel image according to the obtained beauty parameters. Different beauty processing can therefore be applied to each channel image, so that the beauty processing is optimized and the image processing is more accurate.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a diagram of an application environment of an image processing method in one embodiment;
FIG. 2 is a flow diagram of a method of image processing in one embodiment;
FIG. 3 is a flow chart of an image processing method in another embodiment;
FIG. 4 is a schematic diagram of obtaining depth information in one embodiment;
FIG. 5 is a flowchart of an image processing method in yet another embodiment;
FIG. 6 is a flowchart of an image processing method in yet another embodiment;
FIG. 7 is a diagram showing a configuration of an image processing apparatus according to an embodiment;
FIG. 8 is a schematic diagram showing the configuration of an image processing system according to an embodiment;
FIG. 9 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first acquisition module may be referred to as a second acquisition module, and similarly, a second acquisition module may be referred to as a first acquisition module, without departing from the scope of the present application. The first acquisition module and the second acquisition module are both acquisition modules, but they are not the same acquisition module.
FIG. 1 is a diagram of an embodiment of an application environment of an image processing method. As shown in fig. 1, the application environment includes a user terminal 102 and a server 104. The user terminal 102 may be configured to collect an image to be processed, generate the image to be processed, and then send the image to be processed to the server 104. After receiving the image to be processed, the server 104 obtains a brightness value corresponding to each channel image in the image to be processed, and obtains a beauty parameter corresponding to each channel image according to the brightness value; performing face beautifying treatment on each channel image according to the face beautifying parameters; and fusing the image of each channel after the beautifying treatment to obtain a beautifying image. Finally, the server 104 returns the beauty image to the user terminal 102. It is understood that the user terminal 102 may send a collection of images to the server 104, the collection of images including a plurality of images. After receiving the image set, the server 104 performs a beautifying process on the images in the image set. The user terminal 102 is an electronic device located at the outermost periphery of the computer network and mainly used for inputting user information and outputting a processing result, and may be, for example, a personal computer, a mobile terminal, a personal digital assistant, a wearable electronic device, or the like. The server 104 is a device, such as one or more computers, for responding to service requests while providing computing services. It is understood that in other embodiments provided in the present application, the application environment of the image processing method may include only the user terminal 102, that is, the user terminal 102 is used for acquiring the image to be processed and performing the beauty processing on the image to be processed.
FIG. 2 is a flow diagram of a method of image processing in one embodiment. As shown in fig. 2, the image processing method includes steps 202 to 208. Wherein:
step 202, acquiring an image to be processed.
In one embodiment, the image to be processed refers to an image that needs beauty processing. The image to be processed may be acquired by a mobile terminal. The mobile terminal is provided with a camera that can be used for shooting; a user can initiate a shooting instruction through the mobile terminal, and after detecting the shooting instruction, the mobile terminal collects images through the camera. The mobile terminal stores the collected images to form an image set. It is understood that the image to be processed may also be acquired in other ways, which are not limited herein. For example, the image to be processed may be downloaded from a web page, imported from an external storage device, and so on. Acquiring the image to be processed may specifically include: receiving a beauty instruction input by a user, and acquiring the image to be processed according to the beauty instruction, wherein the beauty instruction comprises an image identifier. The image identifier is a unique identifier used to distinguish different images to be processed, and the image to be processed is obtained according to the image identifier. For example, the image identifier may be one or more of an image name, an image code, an image storage address, and the like. Specifically, after acquiring the image to be processed, the mobile terminal may perform the beauty processing locally, or send the image to be processed to the server for beauty processing.
Step 204, acquiring a brightness value corresponding to each channel image in the image to be processed, and acquiring a beauty parameter corresponding to each channel image according to the brightness value.
Specifically, the image to be processed is composed of a plurality of pixel points, each pixel point may be composed of a plurality of color channels, and each color channel represents a color component. For example, the image may be composed of the three RGB (Red, Green, Blue) channels or the three CMY (Cyan, Magenta, Yellow) channels. During image processing, each color component of the image can be extracted through a function and processed separately. For example, in Matlab, an image named "rainbow.jpg" can be read with the imread() function as im = imread('rainbow.jpg'), and the RGB color components can then be extracted as r = im(:,:,1), g = im(:,:,2) and b = im(:,:,3). When the image is beautified, the image formed by the pixels of each color channel in the image to be processed can be beautified separately, and the processing of each color channel can differ.
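The Matlab channel extraction above can be sketched equivalently in Python with NumPy; the helper name below is illustrative, not from the patent:

```python
import numpy as np

def split_channels(image):
    """Split an H x W x 3 RGB image array into its R, G and B channel images."""
    return image[:, :, 0], image[:, :, 1], image[:, :, 2]

# Tiny 1 x 2 RGB test image: one pure-red pixel and one pure-blue pixel.
img = np.array([[[255, 0, 0], [0, 0, 255]]], dtype=np.uint8)
r, g, b = split_channels(img)
```

Each returned array is a single-channel image that can then be beautified independently, as the text describes.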
Brightness refers to how light or dark the image is, and the brightness value can represent the degree of brightness deviation of the image. Obtaining the beauty parameter according to the brightness value may specifically include: acquiring a brightness reference value corresponding to each channel image, and obtaining the beauty parameter of each channel image according to the difference between the acquired brightness value and the brightness reference value. For example, standard RGB channel values for a skin tone may first be established; assume the standard skin tone corresponds to R, G and B channel values of 233, 206 and 188, respectively. When a skin area is whitened, the brightness values of the three RGB channels may be obtained separately and compared with the standard channel values: the larger a channel's difference, the deeper the corresponding whitening degree, that is, the larger the corresponding beauty parameter. Specifically, the brightness values of the channel images may be obtained separately, and the beauty parameter corresponding to each channel image obtained according to the acquired brightness values. The beauty parameter is a parameter for performing beauty processing on the image, and it can reflect the degree of that processing. For example, when the image is subjected to buffing (skin-smoothing) treatment, the corresponding beauty parameter may be a beauty level; the levels may be divided into 1, 2 and 3, with the degree of buffing increasing gradually from level 1 to level 3.
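The reference-value comparison can be sketched as follows; the level thresholds (20 and 60) and the dictionary-based interface are assumptions for illustration, not values from the patent:

```python
# Standard skin-tone reference values for the R, G, B channels,
# taken from the example in the text (233, 206, 188).
REFERENCE = {"R": 233, "G": 206, "B": 188}

def beauty_parameters(channel_means, thresholds=(20, 60)):
    """Map each channel's brightness deviation from its reference value to a
    beauty level 1-3: the larger the deviation, the larger the parameter."""
    params = {}
    for name, mean in channel_means.items():
        diff = abs(REFERENCE[name] - mean)
        if diff < thresholds[0]:
            params[name] = 1
        elif diff < thresholds[1]:
            params[name] = 2
        else:
            params[name] = 3
    return params

# R deviates by 13, G by 56, B by 88, giving levels 1, 2 and 3.
params = beauty_parameters({"R": 220, "G": 150, "B": 100})
```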
Step 206, performing beauty processing on each channel image according to the beauty parameters.
Beauty processing is a method of beautifying an image, and in particular of beautifying a portrait in an image. In general, the beauty processing may be performed on the entire image or on only one region of the image. For example, the beauty processing may include whitening, buffing, face-thinning and slimming; whitening and buffing improve the brightness and smoothness of the image and so may be applied to the entire image, while face-thinning and slimming may be applied only to the region where the portrait is located. The brightness value of the image to be processed and the beauty parameter have a corresponding relation: the beauty parameter of each channel image is obtained according to the brightness value, and each channel image is then beautified according to its beauty parameters. It is understood that the correspondence between brightness value and beauty parameter may be a linear or a non-linear functional relationship. For example, an RGB image may include an R-channel image, a G-channel image and a B-channel image; if the brightness values of the three channel images are 50, 210 and 130 and the corresponding beauty degrees are levels 1, 3 and 2, then the R-channel, G-channel and B-channel images need to be subjected to level-1, level-3 and level-2 beauty processing, respectively.
Step 208, fusing the beautified channel images to obtain a beauty image.
In one embodiment, image fusion refers to the process of combining a plurality of images to generate a target image. After each channel image of the image to be processed has been beautified, the beautified channel images are fused to obtain the final beauty image. The beauty processing is performed according to the brightness value of each channel image: the lower the brightness value, the more serious the brightness deviation. The corresponding beauty parameters are obtained according to the brightness deviation, and each channel image is beautified separately. For example, when buffing is performed and the brightness value of the G-channel image is the highest, a lighter degree of buffing may be applied to the G-channel image in order to retain the detail information of its bright areas.
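The fusion step can be sketched as a simple channel re-stack, assuming the beautified channels are NumPy arrays of equal shape (an illustrative sketch, not the patent's specified fusion method):

```python
import numpy as np

def fuse_channels(r, g, b):
    """Recombine independently beautified channel images into one RGB image."""
    return np.stack([r, g, b], axis=-1)

# Three flat 2 x 2 channel images standing in for beautified channels.
r = np.full((2, 2), 200, dtype=np.uint8)
g = np.full((2, 2), 180, dtype=np.uint8)
b = np.full((2, 2), 160, dtype=np.uint8)
fused = fuse_channels(r, g, b)
```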
The image processing method, the image processing apparatus, the computer-readable storage medium and the electronic device provided in the above embodiments first obtain the brightness value of each channel image in the image to be processed, obtain the beauty parameter of each channel image according to the brightness value, and then perform beauty processing on each channel image according to the obtained beauty parameters. Different beauty processing can therefore be applied to each channel image, so that the beauty processing is optimized and the image processing is more accurate.
Fig. 3 is a flowchart of an image processing method in another embodiment. As shown in fig. 3, the image processing method includes steps 302 to 310. Wherein:
step 302, acquiring an image to be processed.
In an embodiment, the image to be processed may be acquired by the mobile terminal; after acquisition, the image may be beautified locally on the mobile terminal or sent to the server for beauty processing. If the image is beautified on the server, a set of images to be processed may be sent to the server, where the set refers to a collection of one or more images to be processed. Each mobile terminal can send its image set to the server, and after receiving it, the server beautifies the images in the set. When the mobile terminal sends the image set, it also sends the corresponding terminal identifier; after the server finishes processing, it finds the corresponding mobile terminal according to the terminal identifier and returns the processed image set to it. The terminal identifier is a unique identifier of the user terminal. For example, the terminal identifier may be at least one of an IP (Internet Protocol) address, a MAC (Media Access Control) address, and the like.
Step 304, acquiring a target area in the image to be processed.
In general, the user focuses not on the entire image but on a certain region of it. For example, a user generally pays more attention to the region where a person is located, or the region where a person's face is located. The target area is the area the user pays attention to; when the beauty parameters are obtained, the brightness value of the whole image need not be obtained, only the brightness value of the target area. For example, the target area may be a human face area, a portrait area, a skin area, a lip area, and so on, without limitation. Specifically, the target area may be a face area or a portrait area in the image to be processed, where the face area is the area occupied by the face of a portrait and the portrait area is the area occupied by the whole portrait. Acquiring the target region may specifically include: detecting a face area in the image to be processed and taking the face area as the target area; and/or detecting a face region in the image to be processed, acquiring a portrait region according to the face region, and taking the portrait region as the target area.
It is easy to understand that the image to be processed is composed of a plurality of pixel points, and the face area is the area composed of the pixel points corresponding to the face. Specifically, the face region may be obtained through a face detection algorithm, which may include a detection method based on geometric features, an eigenface method, a linear discriminant analysis method, a detection method based on a hidden Markov model, and the like, without limitation. Generally, when an image is acquired by an image acquisition device, a depth map corresponding to the image can be acquired at the same time, with each pixel point in the depth map corresponding to a pixel point in the image. The pixel points in the depth map represent the depth information of the corresponding pixels in the image, i.e. the distance from the object corresponding to the pixel point to the image acquisition device. For example, the depth information may be obtained with two cameras, and the depth information corresponding to a pixel point may be 1 metre, 2 metres or 3 metres. Acquiring the portrait area may specifically include: acquiring the image to be processed and the corresponding depth information; detecting a face region in the image to be processed, and acquiring the portrait region in the image to be processed according to the face region and the depth information. Generally, the portrait and the face are on the same vertical plane, so the depth information of the portrait and the depth information of the face fall within the same range.
Therefore, after the face region is obtained, the depth information corresponding to the face region can be obtained from the depth map, then the depth information corresponding to the portrait region can be obtained according to the depth information corresponding to the face region, and then the portrait region in the image to be processed can be obtained according to the depth information corresponding to the portrait region.
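The depth-based portrait selection described above can be sketched as a mask over the depth map; the 0.5-metre tolerance is an assumed value for illustration, not one the patent specifies:

```python
import numpy as np

def portrait_mask(depth_map, face_depth, tolerance=0.5):
    """Mark pixels whose depth lies within `tolerance` metres of the face
    depth; the portrait is assumed to lie on roughly the same plane."""
    return np.abs(depth_map - face_depth) <= tolerance

# Depths in metres: the left column is near the face plane, the right is background.
depth = np.array([[1.0, 1.2, 3.0],
                  [1.1, 1.0, 3.2]])
mask = portrait_mask(depth, face_depth=1.0)
```

The boolean mask can then be used to extract the portrait region from the image to be processed.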
FIG. 4 is a schematic diagram of obtaining depth information in one embodiment. As shown in FIG. 4, the distance Tc between the first camera 402 and the second camera 404 is known. The first camera 402 and the second camera 404 respectively capture images corresponding to the object 406, and a first included angle A1 and a second included angle A2 can be obtained from these images. The point where the vertical line from the object 406 meets the horizontal line between the first camera 402 and the second camera 404 is the intersection point 408. Assume that the distance from the first camera 402 to the intersection 408 is Tx; then the distance from the intersection 408 to the second camera 404 is Tc - Tx. The depth information of the object 406, i.e. the vertical distance from the object 406 to the intersection 408, is Ts. From the triangle formed by the first camera 402, the object 406 and the intersection 408, the following formula can be obtained:
tan(A1) = Ts / Tx
Similarly, from the triangle formed by the second camera 404, the object 406 and the intersection 408, the following formula can be obtained:
tan(A2) = Ts / (Tc - Tx)
The depth information of the object 406 can then be obtained from the above formulas as:
Ts = Tc * tan(A1) * tan(A2) / (tan(A1) + tan(A2))
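The stereo-triangulation relationship derived above can be sketched numerically; the function below is an illustrative implementation, not code from the patent:

```python
import math

def depth_from_angles(t_c, a1, a2):
    """Depth Ts of the object given the camera baseline Tc and the two
    included angles A1, A2 (in radians):
    Ts = Tc * tan(A1) * tan(A2) / (tan(A1) + tan(A2))."""
    return t_c * math.tan(a1) * math.tan(a2) / (math.tan(a1) + math.tan(a2))

# With both angles at 45 degrees the object sits midway, at depth Tc / 2.
depth = depth_from_angles(1.0, math.pi / 4, math.pi / 4)
```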
step 306, obtaining the brightness value corresponding to each channel image of the target area, and obtaining the beauty parameter corresponding to each channel image according to the brightness value.
A brightness value corresponding to each channel image of the target area is acquired, and the beauty parameter corresponding to each channel image is obtained according to the brightness value. For example, the brightness values of the CMY channel images corresponding to the face area are obtained; if the brightness value of the C-channel image is the largest, the buffing degree of the C-channel image is the lowest. It can be understood that, when performing the beauty processing, the whole image need not be processed: only the target area may be processed, in which case the beauty parameter corresponding to each channel image of the target area is obtained according to the brightness value, and each channel image of the target area is beautified according to that parameter. Generally, the image to be processed may include one or more target regions, each of which may be an independent connected region extracted from the image. When the brightness value of the target area is obtained and the image contains two or more target areas, either the multiple target areas may be treated as a whole, obtaining one brightness value per channel image and deriving the beauty parameters from it, or the brightness value of each channel image may be obtained for each target area separately, deriving beauty parameters for each target area individually.
For example, if the image to be processed includes a face 1 and a face 2, when the beauty parameters are obtained, the face 1 and the face 2 may be used as a whole to obtain the brightness values of the RGB three-channel image, and the beauty parameters of the RGB three-channel image corresponding to the image to be processed are obtained through the brightness values. Or the brightness values of the face 1 and the face 2 may be obtained respectively, and the beauty parameters corresponding to the face 1 and the face 2 are obtained respectively according to the obtained brightness values. Specifically, the brightness value of an RGB three-channel image corresponding to the face 1 is obtained, and the beauty parameters of the RGB three-channel image corresponding to the face 1 are respectively obtained according to the obtained brightness value; acquiring brightness values of the RGB three-channel images corresponding to the face 2, and respectively acquiring beauty parameters of the RGB three-channel images corresponding to the face 2 according to the acquired brightness values.
Specifically, when face areas are beautified, their areas may differ: the main face that needs to be highlighted is generally larger, while the face of a passer-by is smaller. Moreover, when a face area is small, buffing or similar processing may blur the facial features. Therefore, when performing beauty processing, the area of each target region may be obtained; if the area is smaller than an area threshold, no beauty processing is performed on it, and only target areas whose area exceeds the threshold are beautified. Step 306 may therefore also be preceded by: acquiring the area of the target region, and acquiring the face areas whose area is larger than the area threshold. The target area is composed of a plurality of pixel points, so its area can be expressed as the total number of pixel points it contains, or as the ratio of its area to that of the image to be processed.
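The area-threshold filtering can be sketched as below; the 1% area-ratio threshold and the region dictionary format are assumptions for illustration:

```python
def faces_to_beautify(face_regions, image_pixels, area_ratio_threshold=0.01):
    """Keep only face regions large enough to beautify safely; very small
    faces (e.g. passers-by) are skipped so buffing cannot blur their features."""
    return [f for f in face_regions
            if f["pixel_count"] / image_pixels > area_ratio_threshold]

# A large main face and a small passer-by face in a 640 x 480 image.
faces = [{"id": "main", "pixel_count": 40000},
         {"id": "passerby", "pixel_count": 900}]
kept = faces_to_beautify(faces, image_pixels=640 * 480)
```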
Step 308, performing beauty processing on each channel image according to the beauty parameters.
In one embodiment, the beauty parameter of each channel image in the image to be processed is acquired according to the brightness value of the target area, and the beauty processing is performed on each channel image in the image to be processed according to the acquired beauty parameter. Or only processing the target area, that is, obtaining the brightness value of the target area, obtaining the beauty parameter corresponding to each channel image of the target area according to the brightness value, and performing beauty processing on each channel image of the target area according to the obtained beauty parameter. For example, the brightness values of the RGB three-channel images corresponding to the skin area may be acquired, the whitening levels of the RGB three-channel images of the skin area are respectively acquired according to the acquired brightness values, and then the whitening processing of the corresponding degree is respectively performed on the RGB three-channel images of the skin area according to the acquired whitening levels.
And step 310, fusing the channel images subjected to the beautifying processing to obtain a beautifying image.
In one embodiment, if only the target area in the image to be processed is beautified, and the remaining area of the image to be processed except the target area is not beautified, a significant difference between the target area and the remaining area may be caused after the processing. For example, after the whitening treatment is performed on the target area, the luminance of the target area is significantly higher than that of the remaining area, thus making the image look unnatural. Then the boundary of the target area can be subjected to transition processing in the generated beauty image, so that the obtained beauty image looks more natural.
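The patent does not specify how the boundary transition is performed. A minimal sketch, assuming a simple feathered alpha blend (the box-blur radius and function names are illustrative choices):

```python
import numpy as np

def feather_mask(mask, radius=2):
    """Soften a binary mask into an alpha map with a simple box blur,
    so the beautified region fades into the untouched area."""
    m = mask.astype(float)
    h, w = m.shape
    padded = np.pad(m, radius, mode="edge")
    out = np.zeros_like(m)
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (2 * radius + 1) ** 2

def blend_with_transition(original, beautified, mask, radius=2):
    """Alpha-blend the beautified target region over the original image."""
    alpha = feather_mask(mask, radius)[..., None]  # broadcast over channels
    return alpha * beautified + (1 - alpha) * original
```

Deep inside the target region the result is fully beautified, far outside it is untouched, and pixels near the boundary are interpolated, which avoids the visible brightness seam described above.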
The image processing method provided in the foregoing embodiment first obtains the brightness value of each channel image of the target area in the image to be processed, obtains the beauty parameter of each channel image according to the brightness value, and then performs beauty processing on each channel image according to the obtained beauty parameter. Therefore, different beautifying treatments can be performed on each channel image, so that the beautifying treatment is optimized, and the image treatment is more accurate.
FIG. 5 is a flowchart of an image processing method in yet another embodiment. As shown in fig. 5, the image processing method includes steps 502 to 512. Wherein:
step 502, acquiring an image to be processed.
Step 504, obtaining a brightness value corresponding to each channel image in the image to be processed.
It can be understood that the image to be processed is composed of a plurality of pixel points arranged according to a certain rule. Generally, the image to be processed is a two-dimensional pixel array; each pixel point has a corresponding channel value, and its position in the image can be represented by a position coordinate. For example, the image to be processed may be 640 × 480, meaning that it has 640 pixels along its length and 480 pixels along its width, for a total of 640 × 480 = 307,200 pixels, i.e., about 0.3 megapixels. To obtain the brightness value corresponding to each channel image in the image to be processed, each channel image of the image to be processed may first be obtained, each pixel in the channel image traversed, and the brightness value computed from the channel values so obtained. For example, the G-channel image of the image to be processed is obtained, each pixel point in the G-channel image is traversed to obtain its G-channel value, and the average of the G-channel values of all pixel points is taken as the brightness value of the G-channel image. It can be understood that if only the brightness values of the channel images within a certain region are to be counted, only the pixel points in that region need to be traversed to obtain the brightness values of the corresponding channel images.
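The per-channel brightness computation described above (mean channel value over all pixels, optionally restricted to a region) can be sketched as follows; the function name and dictionary return format are assumptions for illustration.

```python
import numpy as np

def channel_brightness(image, region_mask=None):
    """Mean channel value per RGB channel, optionally restricted to a region.

    image: H x W x 3 array; region_mask: optional H x W boolean mask
    selecting the pixels to traverse (e.g. a face region).
    """
    brightness = {}
    for i, name in enumerate("RGB"):
        channel = image[..., i]
        if region_mask is not None:
            channel = channel[region_mask]  # traverse only the region's pixels
        brightness[name] = float(channel.mean())
    return brightness
```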
Step 506, acquiring corresponding character attribute characteristics according to the image to be processed.
The person attribute feature refers to a feature indicating a person attribute of a person in an image, and for example, the person attribute feature may refer to one or more of a gender feature, an age feature, a race feature, and the like. The face region in the image to be processed may be first acquired, and then the corresponding person attribute may be identified according to the face region. Specifically, a face region in the image to be processed is obtained, and the character attribute characteristics corresponding to the face region are obtained through the feature recognition model. The feature recognition model is a model for recognizing character attribute features, and is obtained by training a face sample set. The face sample set is an image set formed by a plurality of face images, and a feature recognition model is obtained through training according to the face sample set. For example, in supervised learning, each face image in the face sample set is labeled with a corresponding label for marking the type of the face image, and a feature recognition model can be obtained through training the face sample set. The feature recognition model can classify the face region to obtain corresponding character attribute features. For example, the face area may be divided into a yellow person, a black person and a white person, and the obtained corresponding person attribute feature is one of the yellow person, the black person and the white person. That is, the classification by the feature recognition model is based on the same criterion. It can be understood that, if people attribute features of different dimensions of the face region are to be obtained, the people attribute features can be obtained through different feature recognition models respectively. 
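The patent does not specify the form of the feature recognition model trained on the face sample set. As a toy stand-in, a nearest-centroid classifier over labelled feature vectors illustrates the idea of training on labelled samples and then classifying a face region into one attribute category; the class design and feature representation are assumptions, not the patent's model.

```python
import numpy as np

class NearestCentroidRecognizer:
    """Toy feature recognition model: fit on a labelled face sample set,
    then classify a face region's feature vector by nearest class centroid.
    All categories come from one criterion, as the text requires."""

    def fit(self, features, labels):
        self.centroids = {
            label: np.mean([f for f, l in zip(features, labels) if l == label], axis=0)
            for label in set(labels)
        }
        return self

    def predict(self, feature):
        # Assign the label whose centroid is closest to the feature vector.
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(feature - self.centroids[label]))
```

Separate instances of such a model, each trained on its own labelled set, would correspond to the different recognition models (race, age, gender) mentioned below.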
Specifically, the character attribute feature may include a race feature parameter, a gender feature parameter, an age feature parameter, a skin color feature parameter, a skin type feature parameter, a face style feature parameter, and a makeup feature parameter, which are not limited herein. For example, race feature parameters corresponding to the face region are obtained through the race recognition model, age feature parameters corresponding to the face region are obtained according to the age recognition model, and gender feature parameters corresponding to the face region are obtained according to the gender recognition model.
And step 508, obtaining the beauty parameters corresponding to the channel images according to the character attribute characteristics and the brightness values.
In one embodiment, the beauty parameters may include a beauty category parameter and a beauty level parameter. The beauty category parameter is a parameter indicating a beauty treatment category, and the beauty degree parameter is a parameter indicating a beauty treatment degree. For example, the beauty category parameter may be a whitening treatment, a peeling treatment, etc., and the beauty degree parameter may be classified into five grades of 1 grade, 2 grade, 3 grade, 4 grade, 5 grade, etc. The degree of beauty treatment increases from level 1 to level 5. After the character attribute features and the brightness values of the images to be processed are obtained, the beauty parameters corresponding to the channel images can be obtained according to the character attribute features and the brightness values. The character attribute characteristics correspond to the beauty category parameters, and the corresponding beauty category parameters can be obtained according to the character attribute characteristics. The brightness value corresponds to the beauty degree parameter, and the corresponding beauty degree parameter can be obtained according to the brightness value. For example, when the face in the image is recognized as a male, the image is subjected to a peeling process, and when the face in the image is recognized as a female, the image is subjected to a whitening process. Specifically, a beauty category parameter corresponding to the image to be processed is obtained according to the character attribute characteristics; and acquiring the beauty degree parameter corresponding to each channel image according to the brightness value. 
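The mapping in this embodiment (attribute features select the beauty category; per-channel brightness selects a 1-to-5 degree) can be sketched as below. The brightness reference value and the step of 32 per level are illustrative assumptions; the patent specifies only that different brightness values yield different degree parameters.

```python
def beauty_parameters(attributes, brightness, reference=128.0):
    """Map person attributes to a beauty category and per-channel
    brightness values to beauty degree levels 1-5."""
    # Category from attributes, e.g. male -> peeling, otherwise whitening.
    category = "peeling" if attributes.get("gender") == "male" else "whitening"
    degrees = {}
    for name, value in brightness.items():
        diff = reference - value          # darker channels get stronger treatment
        level = int(min(5, max(1, 1 + diff // 32)))
        degrees[name] = level
    return {"category": category, "degrees": degrees}
```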
It can be understood that a plurality of faces may exist in the image to be processed, and when a plurality of face regions exist in the image to be processed, each face region may be identified respectively, and the character attribute features and the brightness values corresponding to each face region may be obtained respectively, and then the face beautifying processing may be performed on each face region respectively.
And step 510, performing beauty treatment on each channel image according to the beauty parameters.
The beauty parameters comprise a beauty category parameter and a beauty degree parameter, and each channel image is beautified according to both. Generally, the beauty category parameter is the same for each channel image, while the corresponding beauty degree parameters may differ. For example, if the image is to undergo a peeling (skin-smoothing) treatment, the peeling process is applied to each channel image, but the degree of peeling may differ between channel images.
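As a sketch of steps 510 and 512 for the whitening category: each channel is brightened by an amount proportional to its own degree parameter, then the channels are re-fused into one image. The per-level gain of 5% is an illustrative assumption.

```python
import numpy as np

def whiten_channels(image, degrees, strength_per_level=0.05):
    """Apply per-channel whitening: each channel gets a gain proportional
    to its beauty degree, then the channels are fused back together."""
    out = image.astype(float)
    for i, name in enumerate("RGB"):
        gain = 1.0 + strength_per_level * degrees[name]
        out[..., i] = np.clip(out[..., i] * gain, 0, 255)
    return out.astype(np.uint8)   # fused beauty image
```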
And step 512, fusing the channel images subjected to the beautifying processing to obtain a beautifying image.
The image processing method provided in the foregoing embodiment first obtains the brightness value of each channel image in the image to be processed, obtains the beauty parameter of each channel image according to the brightness value, and then performs beauty processing on each channel image according to the obtained beauty parameter. Therefore, different beautifying treatments can be performed on each channel image, so that the beautifying treatment is optimized, and the image treatment is more accurate.
FIG. 6 is a flowchart of an image processing method in yet another embodiment. As shown in fig. 6, the image processing method includes steps 602 to 614. Wherein:
step 602, acquiring an image to be processed.
Step 604, detecting a face region in the image to be processed, and obtaining a brightness value corresponding to each channel image of the face region.
And 606, obtaining character attribute characteristics corresponding to the face region through a characteristic recognition model, wherein the characteristic recognition model is obtained through training of a face sample set.
Step 608, a beauty category parameter corresponding to the image to be processed is obtained according to the character attribute feature, and the beauty category parameter is a parameter representing a beauty processing category.
Step 610, obtaining a beauty degree parameter corresponding to each channel image according to the brightness value, wherein the beauty degree parameter is a parameter representing a beauty treatment degree.
And step 612, performing beauty treatment on each channel image according to the beauty category parameter and the beauty degree parameter.
And 614, fusing the channel images subjected to the beautifying processing to obtain a beautifying image.
The image processing method provided in the above embodiment includes first obtaining a face region in an image to be processed, obtaining a brightness value of each channel image corresponding to the face region, obtaining a beauty parameter of each channel image according to the brightness value, and then performing beauty processing on each channel image according to the obtained beauty parameter. Therefore, different beautifying treatments can be performed on each channel image, so that the beautifying treatment is optimized, and the image treatment is more accurate.
Fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment. As shown in fig. 7, the image processing apparatus 700 includes an image acquisition module 702, a parameter acquisition module 704, a beauty processing module 706, and an image fusion module 708. Wherein:
an image obtaining module 702, configured to obtain an image to be processed.
A parameter obtaining module 704, configured to obtain a brightness value corresponding to each channel image in the image to be processed, and obtain a beauty parameter corresponding to each channel image according to the brightness value.
A beauty processing module 706, configured to perform beauty processing on each channel image according to the beauty parameters.
An image fusion module 708, configured to fuse the channel images after the beauty processing to obtain a beauty image.
The image processing apparatus provided in the above embodiment first obtains the brightness value of each channel image in the image to be processed, obtains the beauty parameter of each channel image according to the brightness value, and then performs the beauty processing on each channel image according to the obtained beauty parameter. Therefore, different beautifying treatments can be performed on each channel image, so that the beautifying treatment is optimized, and the image treatment is more accurate.
In one embodiment, the parameter obtaining module 704 is further configured to obtain a target region in the image to be processed; and acquiring a brightness value corresponding to each channel image of the target area, and acquiring a beauty parameter corresponding to each channel image according to the brightness value.
In one embodiment, the parameter obtaining module 704 is further configured to detect a face region in the image to be processed, and use the face region as a target region; and/or detecting a face region in the image to be processed, acquiring a portrait region according to the face region, and taking the portrait region as a target region.
In one embodiment, the parameter obtaining module 704 is further configured to obtain corresponding person attribute features according to the image to be processed; and acquiring beauty parameters corresponding to the channel images according to the character attribute characteristics and the brightness values.
In an embodiment, the parameter obtaining module 704 is further configured to obtain a face region in the image to be processed, and obtain a feature of a person corresponding to the face region through a feature recognition model, where the feature recognition model is obtained through training a face sample set.
In an embodiment, the parameter obtaining module 704 is further configured to obtain a beauty category parameter corresponding to the image to be processed according to the character attribute feature, where the beauty category parameter is a parameter representing a beauty processing category; and acquiring a beauty degree parameter corresponding to each channel image according to the brightness value, wherein the beauty degree parameter is a parameter representing beauty treatment degree.
In an embodiment, the beauty processing module 706 is further configured to perform a beauty process on each channel image according to the beauty category parameter and the beauty degree parameter.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
FIG. 8 is a diagram showing a configuration of an image processing system according to an embodiment. As shown in fig. 8, the image processing system includes a feature layer 802, an adaptation layer 804, and a processing layer 806. The feature layer 802 is configured to obtain an image to be processed and to obtain brightness values from it. It then performs face detection on the image to be processed and obtains the corresponding character attribute features from the face region found by the detection. The character attribute features may include a race characteristic parameter, a gender characteristic parameter, an age characteristic parameter, a skin color characteristic parameter, a skin type characteristic parameter, a face shape characteristic parameter, and a makeup characteristic parameter, which are not limited herein. The feature layer 802 sends the obtained brightness values and character attribute features to the adaptation layer 804, and the adaptation layer 804 obtains the corresponding beauty parameters according to the brightness values and character attribute features of the image to be processed and sends the beauty parameters to the processing layer 806. The processing layer 806 performs the beauty processing on the image to be processed according to the received beauty parameters, and then outputs the image after the beauty processing. The beautifying treatment may include, but is not limited to, skin polishing, skin whitening, eye enlarging, face thinning, skin color adjustment, speckle removal, eye brightening, pouch removal, tooth whitening, lip beautifying, and the like.
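The three-layer flow of fig. 8 can be sketched end to end. All concrete values here (the stubbed attribute result, the brightness threshold of 128, the additive gain) are illustrative assumptions; a real system would run the recognition models described above in the feature layer.

```python
import numpy as np

def feature_layer(image):
    """Feature layer 802: per-channel brightness plus (stubbed) person
    attribute features."""
    brightness = {c: float(image[..., i].mean()) for i, c in enumerate("RGB")}
    attributes = {"gender": "female"}   # placeholder for model output
    return brightness, attributes

def adaptation_layer(brightness, attributes):
    """Adaptation layer 804: maps features to beauty parameters."""
    category = "peeling" if attributes.get("gender") == "male" else "whitening"
    degrees = {c: 1 if v >= 128 else 3 for c, v in brightness.items()}
    return {"category": category, "degrees": degrees}

def processing_layer(image, params):
    """Processing layer 806: per-channel beauty processing, fused output."""
    out = image.astype(float)
    for i, c in enumerate("RGB"):
        out[..., i] = np.clip(out[..., i] + 5 * params["degrees"][c], 0, 255)
    return out.astype(np.uint8)
```

A usage example: `processing_layer(img, adaptation_layer(*feature_layer(img)))` runs the whole pipeline on an H × W × 3 array.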
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media embodying a computer program that, when executed by one or more processors, causes the processors to perform the steps of:
acquiring an image to be processed;
acquiring a brightness value corresponding to each channel image in the image to be processed, and acquiring a beauty parameter corresponding to each channel image according to the brightness value;
performing face beautifying processing on each channel image according to the face beautifying parameters;
and fusing the channel images after the beautifying processing to obtain a beautifying image.
In an embodiment, the obtaining, by the processor, a brightness value corresponding to each channel image in the image to be processed, and obtaining a beauty parameter corresponding to each channel image according to the brightness value includes:
acquiring a target area in the image to be processed;
and acquiring a brightness value corresponding to each channel image of the target area, and acquiring a beauty parameter corresponding to each channel image according to the brightness value.
In one embodiment, the acquiring the target region in the image to be processed performed by the processor comprises at least one of the following methods:
detecting a face area in the image to be processed, and taking the face area as a target area;
and detecting a face area in the image to be processed, acquiring a portrait area according to the face area, and taking the portrait area as a target area.
In one embodiment, the method performed by the processor further comprises:
acquiring corresponding character attribute characteristics according to the image to be processed;
the obtaining of the beauty parameters corresponding to the channel images according to the brightness values includes:
and acquiring beauty parameters corresponding to the channel images according to the character attribute characteristics and the brightness values.
In one embodiment, the obtaining of the corresponding person attribute feature according to the image to be processed performed by the processor includes:
and acquiring a face region in the image to be processed, and acquiring character attribute characteristics corresponding to the face region through a characteristic identification model, wherein the characteristic identification model is obtained through training of a face sample set.
In an embodiment, the obtaining, by the processor, the beauty parameters corresponding to the respective channel images according to the character attribute features and the brightness values includes:
acquiring a beauty category parameter corresponding to the image to be processed according to the character attribute characteristics, wherein the beauty category parameter is a parameter representing a beauty processing category;
and acquiring a beauty degree parameter corresponding to each channel image according to the brightness value, wherein the beauty degree parameter is a parameter representing beauty treatment degree.
In one embodiment, the performing, by the processor, the respective beautifying processing on the channel images according to the beautifying parameters includes:
and performing beautifying processing on each channel image according to the beautifying category parameter and the beautifying degree parameter.
The embodiment of the application also provides the electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 9 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 9, for convenience of explanation, only aspects of the image processing technique related to the embodiments of the present application are shown.
As shown in fig. 9, the image processing circuit includes an ISP processor 940 and a control logic 950. The image data captured by the imaging device 910 is first processed by the ISP processor 940, and the ISP processor 940 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 910. The imaging device 910 may include a camera having one or more lenses 912 and an image sensor 914. Image sensor 914 may include an array of color filters (e.g., Bayer filters), and image sensor 914 may acquire light intensity and wavelength information captured with each imaging pixel of image sensor 914 and provide a set of raw image data that may be processed by ISP processor 940. The sensor 920 (e.g., a gyroscope) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 940 based on the type of interface of the sensor 920. The sensor 920 interface may utilize an SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
In addition, image sensor 914 may also send raw image data to sensor 920, sensor 920 may provide raw image data to ISP processor 940 based on the type of interface of sensor 920, or sensor 920 may store raw image data in image memory 930.
The ISP processor 940 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 940 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 940 may also receive image data from image memory 930. For example, the sensor 920 interface sends raw image data to the image memory 930, and the raw image data in the image memory 930 is then provided to the ISP processor 940 for processing. The image Memory 930 may be a part of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from image sensor 914 interface or from sensor 920 interface or from image memory 930, ISP processor 940 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to image memory 930 for additional processing before being displayed. ISP processor 940 may also receive from image memory 930 processed data for image data processing in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display 980 for viewing by a user and/or further Processing by a Graphics Processing Unit (GPU). Further, the output of ISP processor 940 may also be sent to image memory 930 and display 980 may read image data from image memory 930. In one embodiment, image memory 930 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 940 may be transmitted to an encoder/decoder 970 for encoding/decoding image data. The encoded image data may be saved and decompressed before being displayed on a display 980 device.
The step of the ISP processor 940 processing the image data includes: the image data is subjected to VFE (Video Front End) Processing and CPP (Camera Post Processing). The VFE processing of the image data may include modifying the contrast or brightness values of the image data, modifying digitally recorded lighting status data, performing compensation processing (e.g., white balance, automatic gain control, gamma correction, etc.) on the image data, performing filter processing on the image data, etc. CPP processing of image data may include scaling an image and providing a preview frame and a record frame to each path. Among other things, the CPP may use different codecs to process the preview and record frames. The image data processed by the ISP processor 940 may be sent to a beauty module 960 for beauty processing of the image before being displayed. The beautifying module 960 may beautify the image data, including: whitening, removing freckles, buffing, thinning face, removing acnes, enlarging eyes and the like. The beauty module 960 may be a Central Processing Unit (CPU), a GPU, a coprocessor, or the like. The data processed by the beauty module 960 may be transmitted to the encoder/decoder 970 in order to encode/decode image data. The encoded image data may be saved and decompressed before being displayed on a display 980 device. The beauty module 960 may also be located between the encoder/decoder 970 and the display 980, i.e., the beauty module performs beauty processing on the decoded image. The encoder/decoder 970 may be a CPU, GPU, coprocessor, or the like in the mobile terminal.
The statistical data determined by the ISP processor 940 may be transmitted to the control logic 950 unit. For example, the statistical data may include image sensor 914 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 912 shading correction, and the like. The control logic 950 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of the imaging device 910 and control parameters of the ISP processor 940 based on the received statistical data. For example, the control parameters of the imaging device 910 may include sensor 920 control parameters (e.g., gain, integration time for exposure control), camera flash control parameters, lens 912 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 912 shading correction parameters.
The image processing method provided by the above-described embodiment can be realized by using the image processing technology in fig. 9.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method provided by the above embodiments.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
receiving a beautifying instruction input by a user to obtain an image to be processed;
acquiring each channel image of an image to be processed, traversing each pixel point in each channel image, taking an average value of channel values of all pixel points in each channel image as a brightness value corresponding to each channel image, and acquiring a beauty parameter corresponding to each channel image according to a difference value between the brightness value corresponding to each channel image and a brightness reference value of each channel image; wherein, different difference values correspond to different beauty parameters;
performing face beautifying processing on each channel image according to the face beautifying parameters;
and fusing the channel images after the beautifying processing to obtain a beautifying image.
2. The image processing method according to claim 1, wherein the obtaining each channel image of the image to be processed, traversing each pixel point in each channel image, taking an average value of channel values of all pixel points in each channel image as a brightness value corresponding to each channel image, and obtaining a beauty parameter corresponding to each channel image according to a difference between each brightness value and a brightness reference value comprises:
acquiring a target area in the image to be processed;
traversing each pixel point in each channel image of the target area, taking the average value of all pixel point channel values in each channel image of the target area as the brightness value corresponding to each channel image of the target area, and acquiring the beauty parameter corresponding to each channel image according to the difference value of each brightness value and the brightness reference value.
3. The image processing method according to claim 2, wherein the acquiring the target region in the image to be processed comprises at least one of:
detecting a face area in the image to be processed, and taking the face area as a target area;
and detecting a face area in the image to be processed, acquiring a portrait area according to the face area, and taking the portrait area as a target area.
4. The image processing method according to any one of claims 1 to 3, characterized in that the method further comprises:
acquiring corresponding character attribute characteristics according to the image to be processed;
the obtaining of the beauty parameter corresponding to each channel image according to the difference between each brightness value and the brightness reference value includes:
and acquiring beauty parameters corresponding to the channel images according to the character attribute characteristics and the difference value between each brightness value and the brightness reference value.
5. The image processing method of claim 4, wherein the obtaining of the corresponding person attribute feature from the image to be processed comprises:
and acquiring a face region in the image to be processed, and acquiring character attribute characteristics corresponding to the face region through a characteristic identification model, wherein the characteristic identification model is obtained through training of a face sample set.
6. The image processing method of claim 4, wherein the obtaining of the beauty parameter corresponding to each channel image according to the human attribute feature and the difference between each brightness value and a brightness reference value comprises:
acquiring a beauty category parameter corresponding to the image to be processed according to the character attribute characteristics, wherein the beauty category parameter is a parameter representing a beauty processing category;
and acquiring a beauty degree parameter corresponding to each channel image according to the difference value between each brightness value and the brightness reference value, wherein the beauty degree parameter is a parameter representing the beauty treatment degree.
7. The image processing method according to claim 6, wherein the performing of the beautifying processing on each channel image according to the beauty parameters respectively comprises:
performing beautifying processing on each channel image according to the beauty category parameter and the beauty degree parameter.
8. An image processing apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for receiving a beautifying instruction input by a user to acquire an image to be processed; the beautifying instruction comprises an image identifier;
the parameter acquisition module is used for acquiring each channel image of an image to be processed, traversing each pixel point in each channel image, taking the average value of channel values of all pixel points in each channel image as the brightness value corresponding to each channel image, and acquiring the beauty parameter corresponding to each channel image according to the difference value between the brightness value corresponding to each channel image and the brightness reference value of each channel image; wherein, different difference values correspond to different beauty parameters;
the beautifying processing module is used for performing beautifying processing on each channel image respectively according to the beauty parameters;
and the image fusion module is used for fusing the channel images after the beautifying processing to obtain a beautified image.
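The parameter acquisition, beautifying processing, and image fusion modules of claim 8 can be sketched end to end. This is an illustrative sketch only: the plain-Python RGB representation (rows of `(r, g, b)` tuples), the fixed reference value of 128 per channel, and the gain-based stand-in for the beautifying step are all assumptions, since the patent does not specify the beautifying algorithm itself:

```python
REFERENCE = 128  # assumed per-channel brightness reference value

def split_channels(image):
    """Parameter acquisition, step 1: one flat list of values per channel."""
    return [[px[c] for row in image for px in row] for c in range(3)]

def beauty_param(channel):
    """Mean channel value vs. reference -> per-channel parameter (a gain here)."""
    mean = sum(channel) / len(channel)
    return 1.0 + (REFERENCE - mean) / 256  # stand-in mapping, not the patent's

def beautify(channel, param):
    """Beautifying module stand-in: scale and clamp each channel value."""
    return [min(255, max(0, round(v * param))) for v in channel]

def fuse(channels, width):
    """Image fusion module: reassemble per-channel lists into RGB rows."""
    pixels = list(zip(*channels))
    return [pixels[i:i + width] for i in range(0, len(pixels), width)]

image = [[(100, 150, 200), (110, 140, 190)],
         [(105, 145, 195), (115, 135, 185)]]
channels = split_channels(image)
processed = [beautify(ch, beauty_param(ch)) for ch in channels]
result = fuse(processed, width=2)
```

With the 2x2 toy image above, the red channel (mean 107.5, below the reference) is brightened while the green and blue channels are attenuated, so each channel receives its own correction before fusion, as the claim describes.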
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the image processing method according to any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the image processing method of any of claims 1 to 7.
CN201711054076.2A 2017-10-31 2017-10-31 Image processing method, image processing device, computer-readable storage medium and electronic equipment Active CN107862658B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711054076.2A CN107862658B (en) 2017-10-31 2017-10-31 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711054076.2A CN107862658B (en) 2017-10-31 2017-10-31 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN107862658A CN107862658A (en) 2018-03-30
CN107862658B true CN107862658B (en) 2020-09-22

Family

ID=61696508

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711054076.2A Active CN107862658B (en) 2017-10-31 2017-10-31 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN107862658B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108769520B (en) * 2018-05-31 2021-04-13 康键信息技术(深圳)有限公司 Electronic device, image processing method, and computer-readable storage medium
CN108961157B (en) * 2018-06-19 2021-06-01 Oppo广东移动通信有限公司 Picture processing method, picture processing device and terminal equipment
CN108898169B (en) * 2018-06-19 2021-06-01 Oppo广东移动通信有限公司 Picture processing method, picture processing device and terminal equipment
CN109144369B (en) * 2018-09-21 2020-10-20 维沃移动通信有限公司 Image processing method and terminal equipment
CN109345603B (en) * 2018-09-29 2021-08-31 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN109360254B (en) * 2018-10-15 2023-04-18 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN112446832A (en) * 2019-08-31 2021-03-05 华为技术有限公司 Image processing method and electronic equipment
CN113763284A (en) * 2021-09-27 2021-12-07 北京市商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016102753A (en) * 2014-11-28 2016-06-02 大日本印刷株式会社 Portable terminal equipment, color determination program for skin, and color determination method for skin
CN106530252A (en) * 2016-11-08 2017-03-22 北京小米移动软件有限公司 Image processing method and device
CN106611402A (en) * 2015-10-23 2017-05-03 腾讯科技(深圳)有限公司 Image processing method and device
CN107274354A * 2017-05-22 2017-10-20 奇酷互联网络科技(深圳)有限公司 Image processing method, device and mobile terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105654435B * 2015-12-25 2018-09-11 武汉鸿瑞达信息技术有限公司 Face skin softening and whitening method
CN106056552A (en) * 2016-05-31 2016-10-26 努比亚技术有限公司 Image processing method and mobile terminal
CN106447606A (en) * 2016-10-31 2017-02-22 南京维睛视空信息科技有限公司 Rapid real-time video beautifying method
CN107180415B (en) * 2017-03-30 2020-08-14 北京奇艺世纪科技有限公司 Skin beautifying processing method and device in image
CN107301626B * 2017-06-22 2020-11-06 成都品果科技有限公司 Skin buffing algorithm suitable for images captured by mobile devices


Also Published As

Publication number Publication date
CN107862658A (en) 2018-03-30

Similar Documents

Publication Publication Date Title
CN107862658B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
EP3477931B1 (en) Image processing method and device, readable storage medium and electronic device
CN107730444B (en) Image processing method, image processing device, readable storage medium and computer equipment
CN107808136B (en) Image processing method, image processing device, readable storage medium and computer equipment
CN107730445B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN107818305B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107680128B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107886484B (en) Beautifying method, beautifying device, computer-readable storage medium and electronic equipment
CN107766831B (en) Image processing method, image processing device, mobile terminal and computer-readable storage medium
WO2021022983A1 (en) Image processing method and apparatus, electronic device and computer-readable storage medium
CN107993209B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108024107B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107509031B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108717530B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107945135B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN107734253B (en) Image processing method, image processing device, mobile terminal and computer-readable storage medium
CN107493432B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108108415B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN107862653B (en) Image display method, image display device, storage medium and electronic equipment
CN107730446B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN107368806B (en) Image rectification method, image rectification device, computer-readable storage medium and computer equipment
CN110580428A (en) image processing method, image processing device, computer-readable storage medium and electronic equipment
CN109712177B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107800965B (en) Image processing method, device, computer readable storage medium and computer equipment
CN108055452A (en) Image processing method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant