CN111062891A - Image processing method, device, terminal and computer readable storage medium

Image processing method, device, terminal and computer readable storage medium

Info

Publication number
CN111062891A
CN111062891A CN201911307944.2A
Authority
CN
China
Prior art keywords
skin
skin color
target
color
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911307944.2A
Other languages
Chinese (zh)
Inventor
阎法典
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911307944.2A priority Critical patent/CN111062891A/en
Publication of CN111062891A publication Critical patent/CN111062891A/en
Legal status: Pending

Classifications

    • G06T5/77
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present application belongs to the technical field of image processing, and in particular relates to an image processing method, an image processing apparatus, a terminal, and a computer-readable storage medium. The image processing method includes: acquiring an image to be processed, and detecting the skin areas of a target person in the image to be processed and the skin color type of each skin area; obtaining the target skin color corresponding to the skin color type of each skin area; and performing skin color processing on each skin area using a pre-established mapping table corresponding to its skin color type and target skin color, so as to obtain a target image corresponding to the image to be processed. This solves the technical problem that the skin color processing mode in image processing is overly uniform.

Description

Image processing method, device, terminal and computer readable storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a terminal, and a computer-readable storage medium.
Background
With the popularization of intelligent photographing equipment, more and more devices can beautify the images they capture. For example, a person in an image may receive skin color processing such as whitening or ruddiness enhancement.
However, current skin color processing, such as whitening or adding ruddiness, applies a single uniform adjustment to the image to be processed and cannot meet the requirements of different users.
Disclosure of Invention
The embodiments of the application provide an image processing method, an image processing apparatus, a terminal, and a computer-readable storage medium, which can solve the technical problem that the skin color processing mode in image processing is overly uniform.
A first aspect of an embodiment of the present application provides an image processing method, including:
acquiring an image to be processed, and detecting skin areas of a target person and skin color types of all the skin areas in the image to be processed;
obtaining target skin colors respectively corresponding to the skin color types of the skin areas;
and respectively carrying out skin color processing on each skin area by utilizing a pre-established mapping table corresponding to the skin color type and the target skin color of each skin area to obtain a target image corresponding to the image to be processed.
A second aspect of the embodiments of the present application provides an image processing apparatus, including:
the detection unit is used for acquiring an image to be processed and detecting skin areas of a target person and skin color types of all the skin areas in the image to be processed;
an obtaining unit, configured to obtain target skin colors respectively corresponding to the skin color types of the skin areas;
and the skin color processing unit is used for respectively carrying out skin color processing on each skin area by utilizing a pre-established mapping table corresponding to the skin color type and the target skin color of each skin area to obtain a target image corresponding to the image to be processed.
A third aspect of the embodiments of the present application provides a terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the above method.
In the embodiment of the application, the skin areas of the target person in the image to be processed and the skin color type of each skin area are detected, the target skin color corresponding to each skin color type is obtained, and each skin area is then processed using the pre-established mapping table corresponding to its skin color type and target skin color. As a result, when the image to be processed undergoes skin color processing, only the skin areas of the target person are processed, so the colors of portions outside the skin areas do not change excessively. In addition, skin areas with different skin color types can receive different kinds of skin color processing; that is, the pre-established mapping tables convert them to different target skin colors. This solves the technical problem of an overly uniform skin color processing mode, meets the different skin color processing requirements of different users, and optimizes the image processing effect for the image to be processed.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be regarded as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of an implementation of an image processing method provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a pre-established first skin color correspondence table provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of a specific implementation of a mapping table generation process provided in an embodiment of the present application;
fig. 4 is a schematic flowchart of a specific implementation of training a convolutional neural network model provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Skin color adjustment is an important component of skin-beautifying processing. Currently, when an image is whitened or made ruddier, the same skin color processing is generally applied directly to the whole image to be processed. However, this causes portions outside the skin areas, such as the image background, to change color excessively. Moreover, different users have different skin color processing requirements for people with different skin color types; a single processing mode cannot satisfy them all, so the image processing effect is poor.
Based on these problems, embodiments of the present application provide an image processing method, an image processing apparatus, a terminal, and a computer-readable storage medium, which can optimize an image processing effect of an image to be processed in an image processing process.
Fig. 1 shows a schematic implementation flow diagram of an image processing method provided by an embodiment of the present application, which is applied to a terminal, and can be executed by an image processing apparatus configured on the terminal, and is suitable for a situation where an image processing effect on an image to be processed needs to be optimized. The terminal may be an intelligent terminal such as a smart phone, a cosmetic mirror, a tablet computer, a Personal Computer (PC), or a learning machine, and the image processing method may include steps 101 to 103.
Step 101, acquiring an image to be processed, and detecting skin areas of a target person and skin color types of all the skin areas in the image to be processed.
In the embodiment of the application, the image of the user to be processed may be acquired in real time by a shooting device of the terminal, such as its camera, or obtained from the local storage of the terminal.
The skin areas of the target person in the image to be processed may include one or more areas. Moreover, a skin area is not limited to the face: any skin exposed outside the clothing, such as the palm, arm, or neck, also belongs to the skin areas of the target person. In addition, different target persons may have different skin color types.
In addition, the skin color types may include: white skin color, yellow skin color, wheat skin color, bright white skin color, black skin color, and other skin color types.
Step 102, obtaining target skin colors respectively corresponding to the skin color types of the skin areas.
In practical applications, different users may have different skin color processing requirements for each skin color type. For example, the requirements may differ between users in different regions or of different ethnicities.
Specifically, users in Asian regions may prefer that a target person with a yellow skin color be processed toward a fair, rosy, translucent skin color, while users in Latin American regions may prefer that a target person with a yellow skin color be processed toward a wheat skin color, and so on. It is therefore necessary to obtain the target skin color corresponding to the skin color type of each skin area, that is, the skin color effect the user expects for that area, and to process the skin color of each skin area into its target skin color. Doing so meets the different requirements of different users, solves the technical problem of an overly uniform skin color processing mode, and optimizes the processing effect of the image to be processed.
Specifically, in some embodiments of the present application, the obtaining of the target skin color corresponding to the skin color type of each skin area may include: detecting the registration area of the terminal, searching a pre-established first skin color correspondence table according to the registration area, and obtaining the target skin colors respectively corresponding to the skin color types of the skin areas.
For example, fig. 2 shows a schematic diagram of a pre-established first skin color correspondence table provided in an embodiment of the present application; the table records the correspondence between the skin color types corresponding to various areas and the target skin colors.
The registration area of the terminal may refer to a network registration area corresponding to a wireless network used by the terminal or a home area of a SIM card of the terminal.
In practical application, users whose terminals are registered in the same area generally share similar skin color preferences. Therefore, by detecting the registration area of the terminal and searching the first skin color correspondence table according to that area, target skin colors that meet the user's requirements (i.e., that correspond to the registration area of the terminal) can be obtained for the skin color type of each skin area.
In other embodiments of the present application, the obtaining of the target skin color corresponding to the skin color type of each skin area may alternatively include: detecting the language use type corresponding to the terminal, searching a pre-established second skin color correspondence table according to the language use type, and obtaining the target skin colors respectively corresponding to the skin color types of the skin areas.
The second skin color correspondence table records the correspondence between the skin color types corresponding to various language use types and the target skin colors.
In practical application, just as users whose terminals share a registration area generally share skin color preferences, users whose terminals share a language use type generally do as well. Therefore, by detecting the language use type corresponding to the terminal and searching the second skin color correspondence table according to it, target skin colors that meet the user's requirements (i.e., that correspond to the terminal's language use type) can be obtained for the skin color type of each skin area.
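The two correspondence tables amount to nested lookups keyed by region or language. The following is a minimal Python sketch of the idea; all region names, language codes, skin color labels, and target labels are hypothetical placeholders, not values from this application.

```python
# Hypothetical contents of the first and second skin color correspondence tables.
FIRST_TABLE = {  # registration area -> {skin color type: target skin color}
    "east_asia": {"yellow": "fair_rosy", "wheat": "bright_white"},
    "latin_america": {"yellow": "wheat", "wheat": "wheat"},
}

SECOND_TABLE = {  # language use type -> {skin color type: target skin color}
    "zh-CN": {"yellow": "fair_rosy"},
    "es": {"yellow": "wheat"},
}

def target_skin_color(skin_type: str, region: str | None = None,
                      language: str | None = None) -> str:
    """Look up the target skin color for one detected skin color type."""
    if region in FIRST_TABLE:
        return FIRST_TABLE[region].get(skin_type, skin_type)
    if language in SECOND_TABLE:
        return SECOND_TABLE[language].get(skin_type, skin_type)
    return skin_type  # no entry: leave the skin color unchanged

print(target_skin_color("yellow", region="east_asia"))  # -> fair_rosy
```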
It should be noted that the above are only examples of ways to obtain the target skin color corresponding to the skin color type of each skin area; in some embodiments of the present application, the target skin colors may be obtained in other ways. The first and second skin color correspondence tables may be established by surveying the requirements of users in different areas or with different language use types.
Step 103, performing skin color processing on each skin area using the pre-established mapping table corresponding to the skin color type and the target skin color of that skin area, to obtain a target image corresponding to the image to be processed.
In the embodiment of the application, each skin area is processed using the pre-established mapping table corresponding to its skin color type and target skin color. Therefore, when the image to be processed undergoes skin color processing, only the skin areas of the target person are processed, so the colors of portions outside the skin areas do not change excessively; meanwhile, skin areas with different skin color types can receive different kinds of skin color processing. This solves the technical problem of an overly uniform skin color processing mode, meets the different skin color processing requirements of different users, and optimizes the image processing effect for the image to be processed.
In the embodiment of the application, when the image to be processed contains target persons with various skin color types, different mapping tables can be used for them, so that different kinds of skin color processing are performed on target persons with different skin color types.
It should be noted that each pre-established mapping table corresponds simultaneously to the skin color type of a skin area and to the target skin color of that skin color type.
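To make step 103 concrete, the following is a minimal sketch, assuming the pre-established mapping table has been materialized as a 256×256×256×3 array and that a boolean mask for each skin area is available from step 101; the names and shapes are illustrative assumptions, not the application's specification.

```python
import numpy as np

def apply_mapping_table(image: np.ndarray, skin_mask: np.ndarray,
                        lut: np.ndarray) -> np.ndarray:
    """image: H x W x 3 uint8; skin_mask: H x W bool marking one skin area;
    lut: 256 x 256 x 256 x 3 uint8 table for that area's skin color type
    and target skin color."""
    r = image[..., 0].astype(np.intp)
    g = image[..., 1].astype(np.intp)
    b = image[..., 2].astype(np.intp)
    mapped = lut[r, g, b]               # replacement color for every pixel
    out = image.copy()
    out[skin_mask] = mapped[skin_mask]  # pixels outside the skin area keep their color
    return out
```

Each skin area is processed with the table for its own skin color type, so areas of different types receive different adjustments within the same image.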
In some embodiments of the present application, before the skin color processing is performed on each skin area with the pre-established mapping tables, the method may include: obtaining the pre-established mapping table between each skin color type and its target skin color.
Specifically, the obtaining of the pre-established mapping table between each skin color type and its target skin color may include: obtaining, through a third-party application, the pre-established mapping table between each skin color type and its target skin color.
For example, a pre-established mapping table between a yellow skin color and its corresponding target skin color (e.g., a wheat skin color or a bright white skin color) may be obtained through a third-party application.
Specifically, when the pre-established mapping table between each skin color type and its target skin color is obtained through a third-party application, a first sample skin color image of each skin color type may first be acquired; the third-party application then adjusts the skin color of the first sample skin color image until a target image with the target skin color is obtained, and the mapping file produced during this adjustment is exported and used as the pre-established mapping table. The target skin color is the skin color that meets the user's requirements for each skin color type, and it can be reached by iteratively adjusting the skin color of the first sample skin color image.
In some embodiments of the present application, when the first sample skin color images and images of each target skin color are already available, as shown in fig. 3, the obtaining of the pre-established mapping table between each skin color type and its target skin color may alternatively include steps 301 to 306.
Step 301, a first sample skin color image of each skin color type and a second sample skin color image of each target skin color are obtained.
For example, if the first sample skin color images include a yellow-skin sample image and a wheat-skin sample image, the target skin color corresponding to the yellow skin color is the wheat skin color, and the target skin color corresponding to the wheat skin color is the bright white skin color, then the second sample skin color images include the wheat-skin sample image and a bright-white sample image.
Accordingly, the pre-established mapping tables obtained from the first and second sample skin color images include: a mapping table between the yellow skin color and its corresponding wheat target skin color, and a mapping table between the wheat skin color and its corresponding bright white target skin color.
step 302, obtaining a first color vector of each skin color type according to the pixel value of each pixel point in the first sample skin color image of each skin color type; and obtaining a second color vector of each target skin color according to the pixel value of each pixel point in the second sample skin color image of each target skin color.
The first color vector and the second color vector are color vectors that can represent colors of the first sample skin-color image and the second sample skin-color image.
The colors of the pixels within the first sample skin-color image are largely similar to one another, as are the colors of the pixels within the second sample skin-color image. Therefore, a single first color vector can represent the color of the first sample skin-color image, and a single second color vector can represent the color of the second sample skin-color image.
For example, the first color vector may be formed from the per-channel average, median, or most frequent color value of the red, green, and blue (R, G, B) channels of the pixels in the first sample skin color image; the second color vector may be formed in the same way from the pixels in the second sample skin color image.
Specifically, when the pre-established mapping table between the yellow skin color and its corresponding wheat target skin color is obtained, the first color vector may be the per-channel color average (a1, a2, a3) of the R, G, B channels of the pixels in the yellow-skin sample image, and the second color vector may be the per-channel color average (b1, b2, b3) of the R, G, B channels of the pixels in the wheat-skin sample image.
Step 303, calculate a difference vector between the first color vector of each skin tone type and the second color vector of the corresponding target skin tone.
For example, if the first color vector is (a1, a2, a3) and the second color vector is (b1, b2, b3), the difference vector between the second color vector and the first color vector is (b1-a1, b2-a2, b3-a3), or (a1-b1, a2-b2, a3-b 3).
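A minimal numpy sketch of steps 302 and 303 under the mean-value variant; the two sample images below are random stand-ins for the real first and second sample skin color images, and the function name is an assumption.

```python
import numpy as np

# Random stand-ins for the first (yellow) and second (wheat) sample
# skin color images; shapes and values are illustrative only.
yellow_sample = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
wheat_sample = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)

def color_vector(sample: np.ndarray, how: str = "mean") -> np.ndarray:
    """Reduce a sample skin color image to one representative (R, G, B) vector."""
    pixels = sample.reshape(-1, 3).astype(np.float64)
    if how == "mean":
        return pixels.mean(axis=0)
    if how == "median":
        return np.median(pixels, axis=0)
    raise ValueError(f"unsupported reduction: {how}")

first = color_vector(yellow_sample)   # (a1, a2, a3), step 302
second = color_vector(wheat_sample)   # (b1, b2, b3), step 302
diff = second - first                 # (b1-a1, b2-a2, b3-a3), step 303
```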
Step 304, obtain an initial mapping table.
In some embodiments of the present application, the initial mapping table is a lookup table that records R, G, B color vectors for different shades of three color channels.
For example, taking 256 gray levels per channel, the initial mapping table is a lookup table of 256³ color vectors obtained by combining 256 levels of R color values, 256 levels of G color values, and 256 levels of B color values.
Specifically, in some embodiments of the present application, the initial mapping table may be stored as a two-dimensional table obtained by unrolling, along the Z axis, a three-dimensional table whose X axis is the R color value, whose Y axis is the G color value, and whose Z axis is the B color value; the coordinate of each entry equals the color vector recorded there. For example, the entry at coordinate (100, 100, 100) is the color vector (100, 100, 100).
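The initial mapping table can be sketched as an identity lookup table, assuming 256 levels per channel as above:

```python
import numpy as np

# Identity table: the entry stored at coordinate (r, g, b) is the color
# vector (r, g, b) itself, e.g. lut0[100, 100, 100] == (100, 100, 100).
levels = np.arange(256, dtype=np.int16)
r, g, b = np.meshgrid(levels, levels, levels, indexing="ij")
lut0 = np.stack([r, g, b], axis=-1)  # shape (256, 256, 256, 3), about 100 MB
assert tuple(lut0[100, 100, 100]) == (100, 100, 100)
```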
Step 305, superposing the difference vector onto the target color vectors in the initial mapping table to obtain an adjusted initial mapping table, where a target color vector is a color vector equal to the pixel value of some pixel of the first sample skin color image of the skin color type.
In this embodiment, the adjusted initial mapping table may be generated as: LUT1(R, G, B) = LUT0(R, G, B) + (Skin - Blemish) × is_blemish(R, G, B).
Here, LUT0(R, G, B) is the color vector of the initial mapping table; Skin - Blemish is the difference vector; and is_blemish(R, G, B) indicates whether the color vector (R, G, B) in the initial mapping table equals the pixel value (i.e., the color vector) of some pixel of the first sample skin color image: is_blemish(R, G, B) = 1 if it does, and is_blemish(R, G, B) = 0 if it does not.
Because the pre-established mapping table is used to perform skin color processing on each skin area and convert its skin color to the target skin color required by the user, only the color vectors equal to pixel values of the first sample skin color image (the target color vectors) need to be adjusted in the initial mapping table.
Specifically, in the embodiment of the present application, after the initial mapping table is obtained, the color vectors in it that equal pixel values of pixels of the first sample skin color image, i.e., the target color vectors, are determined; then the R, G, B color values of each target color vector are superposed with the R, G, B color values of the difference vector to obtain the adjusted initial mapping table.
For example, if the difference vector is (30, 20, 20) and the pixel value of a pixel of the first sample skin color image is (100, 100, 100), then the target color vector for that pixel is (100, 100, 100), and it is adjusted to (100+30, 100+20, 100+20); that is, the color vector recorded at coordinate (100, 100, 100) becomes (130, 120, 120).
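Putting steps 304 and 305 together, a sketch of the adjustment under the formula above; the function and variable names are assumptions, not from this application.

```python
import numpy as np

def adjust_initial_table(lut0: np.ndarray, sample: np.ndarray,
                         diff: np.ndarray) -> np.ndarray:
    """Add the difference vector only at the target color vectors, i.e. at
    coordinates whose color vector equals the pixel value of some pixel of
    the first sample skin color image (is_blemish = 1 there, 0 elsewhere)."""
    lut1 = lut0.astype(np.int32)                   # room for overflow before clipping
    is_target = np.zeros((256, 256, 256), dtype=bool)
    px = np.unique(sample.reshape(-1, 3), axis=0)  # distinct sample pixel values
    is_target[px[:, 0], px[:, 1], px[:, 2]] = True
    lut1[is_target] += np.round(diff).astype(np.int32)
    return np.clip(lut1, 0, 255)                   # keep entries in the valid range
```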
Step 306, performing smoothing processing on the adjusted initial mapping table to obtain a pre-established mapping table corresponding to each skin color type and the target skin color thereof.
After the initial mapping table is adjusted, the color values of adjacent coordinates may differ greatly. To avoid this, in some embodiments of the present application, the target color vectors and the color vectors in their neighborhoods are smoothed after adjustment to obtain the pre-established mapping table, so that the adjustment amplitude varies continuously between adjacent color vectors. This prevents the problem that, when the color values of a color vector and its neighbors in the pre-established mapping table differ too much, skin color processing with the table changes the texture of the processed skin area and introduces blurring.
Specifically, in some embodiments of the present application, the smoothing may use mean filtering, median filtering, bilateral filtering, or Gaussian filtering. Gaussian filtering performs a weighted average over the whole table: the value at each point is obtained as a weighted average of its own value and the other values in its neighborhood.
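A sketch of step 306 using Gaussian filtering, one of the filters listed above; the sigma value is an assumed parameter, and scipy's gaussian_filter stands in for whatever smoothing an implementation actually uses.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_table(lut1: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Smooth each color channel of the adjusted table over its three
    coordinate axes so that neighboring entries change gradually."""
    out = lut1.astype(np.float32)
    for c in range(3):  # filter the R, G and B planes independently
        out[..., c] = gaussian_filter(out[..., c], sigma=sigma)
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```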
In some embodiments of the present application, in the step 101, detecting the skin area of the target person and the skin color type of each skin area in the image to be processed may include: and inputting the image to be processed into a preset convolutional neural network model, and outputting the skin area of the target person in the image to be processed and the skin color type of each skin area by using the preset convolutional neural network model.
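The application does not fix a network architecture, so the following PyTorch sketch only illustrates the interface: a fully convolutional model that maps the image to a per-pixel skin color type, from which the skin areas and their types can be read off. The class count and layer sizes are assumptions.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 6  # assumed: background plus the five skin color types listed earlier

class SkinSegNet(nn.Module):
    """Deliberately tiny fully convolutional stand-in; the application does
    not specify the architecture of the preset model."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, NUM_CLASSES, kernel_size=1),  # per-pixel class logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

model = SkinSegNet().eval()
image = torch.rand(1, 3, 224, 224)          # placeholder image to be processed
with torch.no_grad():
    label_map = model(image).argmax(dim=1)  # 1 x H x W skin color type per pixel
# The skin area of type k is then simply the mask (label_map == k).
```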
Before the image to be processed is input into a preset convolutional neural network model, training the convolutional neural network model to be trained to obtain the preset convolutional neural network model;
specifically, as shown in fig. 4, the training of the convolutional neural network model to be trained may include: step 401 to step 403.
Step 401, obtaining a plurality of third sample pictures, and obtaining skin areas of sample persons marked in advance in each third sample picture and skin color types of each skin area.
In this embodiment of the application, one or more sample persons may be present in the third sample picture, and the skin color type of each sample person may not be the same.
Step 402, inputting a target third sample picture in the third sample pictures into a convolutional neural network model to be trained, and outputting the skin area of the sample person in the target third sample picture and the skin color type of each skin area by the convolutional neural network model to be trained.
In this embodiment, the target third sample picture refers to any one of the plurality of third sample pictures. In the embodiment of the application, a large number of third sample pictures are used in turn to train the convolutional neural network model to be trained, so that the resulting preset convolutional neural network model can identify target persons with various skin areas and various skin color types.
Step 403, calculate the similarity between the skin areas of the sample person and the skin color types of the skin areas output by the convolutional neural network model to be trained for the target third sample picture, and the pre-marked skin areas and skin color types of the target third sample picture. If the similarity is smaller than the similarity threshold, adjust the parameters of the convolutional neural network model to be trained and train it again with the same target third sample picture; repeat until the similarity is greater than or equal to the similarity threshold, or the number of training rounds on this picture is greater than or equal to a first count threshold, and then continue training with the next target third sample picture among the plurality of third sample pictures. The preset convolutional neural network model is obtained when the total number of training rounds is greater than or equal to a second count threshold, or the change rate of the similarity is smaller than a change-rate threshold.
For example, suppose 100 third sample pictures are obtained. Any one of them is input into the convolutional neural network model to be trained, which outputs the skin areas of the sample persons in that picture and the skin color type of each skin area, and the similarity is calculated. If the similarity is greater than the similarity threshold, the next third sample picture is used for training. This continues until the total number of training rounds is greater than or equal to the second count threshold, or the change rate of the similarity is smaller than the change-rate threshold, at which point the preset convolutional neural network model is obtained.
The similarity between the skin areas output by the convolutional neural network model to be trained and the pre-labeled skin areas of the sample person may be calculated with the structural similarity index (SSIM), cosine similarity, histogram comparison, or any other method that measures the similarity between skin areas; this application does not limit the choice.
In the embodiment of the application, when the similarity is greater than or equal to the similarity threshold, or the number of training rounds on the current third sample picture is greater than or equal to the first count threshold, the convolutional neural network model is considered able to identify the skin areas of the sample person and the skin color type of each skin area in that picture with sufficient accuracy. When the total number of training rounds is greater than or equal to the second count threshold, or the change rate of the similarity is smaller than the change-rate threshold, the accuracy of the skin areas and skin color types output by the model has stabilized, which indicates that training of the convolutional neural network model is complete.
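The stopping logic of steps 401 to 403 can be summarized as a control-flow sketch; `fit_step` (one parameter update) and `similarity` are caller-supplied stand-ins, and every default threshold below is an assumed value, not one from this application.

```python
def train_until_stable(model, samples, fit_step, similarity,
                       sim_thresh=0.95, max_rounds_per_pic=50,
                       max_total_rounds=10_000, rate_thresh=1e-4):
    """Control-flow sketch of steps 401 to 403."""
    total, prev_sim = 0, None
    for picture, annotation in samples:        # pre-marked third sample pictures
        for _ in range(max_rounds_per_pic):    # first count threshold
            prediction = model(picture)
            sim = similarity(prediction, annotation)
            total += 1
            stable = prev_sim is not None and abs(sim - prev_sim) < rate_thresh
            prev_sim = sim
            if total >= max_total_rounds or stable:
                return model                   # accuracy has stabilized
            if sim >= sim_thresh:
                break                          # picture learned; move to the next
            fit_step(model, prediction, annotation)
    return model
```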
It should be noted that, in some embodiments of the present application, the skin areas of the target person and the skin color type of each skin area may also be identified by other target recognition algorithms, for example a Local Binary Pattern (LBP) algorithm or a support vector machine model combined with oriented-gradient features. Compared with such detectors, a convolutional neural network model can detect the skin areas and skin color types in the image to be processed more accurately and quickly, which is why the preset convolutional neural network model is the preferred choice here.
While, for purposes of simplicity of explanation, the foregoing method embodiments have been described as a series of acts, it will be appreciated by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders according to the present application.
Fig. 5 shows a schematic structural diagram of an image processing apparatus 500 provided in an embodiment of the present application, which includes a detection unit 501, an acquisition unit 502, and a skin color processing unit 503.
The detection unit 501 is configured to acquire an image to be processed, and detect a skin area of a target person and a skin color type of each skin area in the image to be processed;
an obtaining unit 502, configured to obtain target skin colors respectively corresponding to the skin color types of the skin areas;
a skin color processing unit 503, configured to perform skin color processing on each skin area by using a pre-established mapping table corresponding to the skin color type and the target skin color of each skin area, respectively, to obtain a target image corresponding to the image to be processed.
It should be noted that, for convenience and brevity of description, the specific working process of the image processing apparatus 500 described above may refer to the corresponding process of the method described in fig. 1 to fig. 4, and is not described herein again.
As shown in fig. 6, the present application provides a terminal for implementing the image processing method, including: a processor 61, a memory 62, one or more input devices 63 (only one shown in fig. 6), and one or more output devices 64 (only one shown in fig. 6). The processor 61, memory 62, input device 63 and output device 64 are connected by a bus 65.
It should be understood that, in the embodiment of the present Application, the Processor 61 may be a Central Processing Unit (CPU), and the Processor may also be other general processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The input device 63 may include a virtual keyboard, a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 64 may include a display, a speaker, etc.
The memory 62 may include a read-only memory and a random access memory, and provides instructions and data to the processor 61. Some or all of the memory 62 may also include non-volatile random access memory. For example, the memory 62 may also store device type information.
The memory 62 stores a computer program that can be executed by the processor 61, and the computer program is, for example, a program of an image processing method. The processor 61 implements the steps in the image processing method embodiments, such as the steps 101 to 103 shown in fig. 1, when executing the computer program. Alternatively, the processor 61 may implement the functions of the modules/units in the device embodiments, such as the functions of the units 501 to 503 shown in fig. 5, when executing the computer program.
The computer program may be divided into one or more modules/units, which are stored in the memory 62 and executed by the processor 61 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used for describing the execution process of the computer program in the terminal for image processing. For example, the computer program may be divided into a detection unit, an acquisition unit and a skin color processing unit, and each unit has the following specific functions:
the detection unit is used for acquiring an image to be processed and detecting skin areas of a target person and skin color types of all the skin areas in the image to be processed;
an obtaining unit, configured to obtain target skin colors respectively corresponding to the skin color types of the skin areas;
and the skin color processing unit is used for respectively carrying out skin color processing on each skin area by utilizing a pre-established mapping table corresponding to the skin color type and the target skin color of each skin area to obtain a target image corresponding to the image to be processed.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal are merely illustrative, and for example, the division of the above-described modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above may be implemented by a computer program, which may be stored in a computer readable storage medium and used by a processor to implement the steps of the embodiments of the methods described above. The computer program includes computer program code, and the computer program code may be in a source code form, an object code form, an executable file or some intermediate form. The computer readable medium may include: any entity or device capable of carrying the above-described computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier signal, telecommunications signal, software distribution medium, and the like. It should be noted that the computer readable medium described above may include content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media that does not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An image processing method, comprising:
acquiring an image to be processed, and detecting skin areas of a target person and skin color types of all the skin areas in the image to be processed;
obtaining target skin colors respectively corresponding to the skin color types of the skin areas;
and respectively carrying out skin color processing on each skin area by utilizing a pre-established mapping table corresponding to the skin color type and the target skin color of each skin area to obtain a target image corresponding to the image to be processed.
2. The image processing method according to claim 1, wherein the obtaining of the target skin color corresponding to the skin color type of each skin area comprises:
detecting a registration area of a terminal, searching a pre-established first skin color correspondence table according to the registration area, and obtaining target skin colors respectively corresponding to the skin color types of the skin areas, wherein the first skin color correspondence table records the correspondence between the skin color types corresponding to various areas and the target skin colors;
or,
detecting a language use type corresponding to a terminal, searching a pre-established second skin color correspondence table according to the language use type, and obtaining target skin colors respectively corresponding to the skin color types of the skin areas, wherein the second skin color correspondence table records the correspondence between the skin color types corresponding to various language use types and the target skin colors.
3. The image processing method according to claim 1, wherein, before the skin color processing is performed on each skin area using the pre-established mapping table corresponding to the skin color type and the target skin color of each skin area, the method comprises:
a pre-established mapping table corresponding between each skin color type and its target skin color is obtained.
4. The image processing method of claim 3, wherein said obtaining a pre-established mapping table of correspondence between each skin tone type and its target skin tone comprises:
and obtaining a mapping table generated after the skin color adjustment is carried out on the first sample skin color image of each skin color type through a third party application to obtain a target image corresponding to the target skin color.
5. The image processing method of claim 3, wherein said obtaining a pre-established mapping table of correspondence between each skin tone type and its target skin tone further comprises:
acquiring a first sample skin color image of each skin color type and a second sample skin color image of each target skin color;
obtaining a first color vector of each skin color type according to the pixel value of each pixel point in the first sample skin color image of each skin color type; obtaining a second color vector of each target skin color according to the pixel value of each pixel point in the second sample skin color image of each target skin color;
calculating a difference vector between the first color vector of each skin tone type and the second color vector of the corresponding target skin tone;
acquiring an initial mapping table;
superposing the difference vector onto the target color vectors in the initial mapping table to obtain an adjusted initial mapping table, wherein a target color vector is a color vector equal to the pixel value of a pixel of the first sample skin color image of each skin color type;
and smoothing the adjusted initial mapping table to obtain a pre-established mapping table corresponding to each skin color type and the target skin color.
6. The image processing method according to claim 1, wherein the detecting skin color types of the skin area of the target person and each skin area in the image to be processed comprises:
and inputting the image to be processed into a preset convolutional neural network model, and outputting the skin area of the target person in the image to be processed and the skin color type of each skin area by using the preset convolutional neural network model.
7. The image processing method according to claim 6, before inputting the image to be processed into a preset convolutional neural network model, comprising:
training a convolutional neural network model to be trained to obtain the preset convolutional neural network model;
training the convolutional neural network model to be trained to obtain the preset convolutional neural network model comprises the following steps:
obtaining a plurality of third sample pictures, and obtaining skin areas of sample figures marked in advance in each third sample picture and skin color types of all the skin areas;
inputting a target third sample picture in the plurality of third sample pictures into a convolutional neural network model to be trained, and outputting the skin area of the sample figure in the target third sample picture and the skin color type of each skin area by using the convolutional neural network model to be trained;
calculating the similarity between the skin areas of the sample person and the skin color types of the skin areas in the target third sample picture output by the convolutional neural network model to be trained, and the pre-marked skin areas and skin color types of the target third sample picture; if the similarity is smaller than a similarity threshold, adjusting the parameters of the convolutional neural network model to be trained and training it again with the target third sample picture, until the similarity is greater than or equal to the similarity threshold or the number of training rounds on the target third sample picture is greater than or equal to a first count threshold, and then training the convolutional neural network model to be trained with the next target third sample picture among the plurality of third sample pictures; and obtaining the preset convolutional neural network model when the total number of training rounds of the convolutional neural network model to be trained is greater than or equal to a second count threshold, or the change rate of the similarity is smaller than a change-rate threshold.
8. An image processing apparatus characterized by comprising:
the detection unit is used for acquiring an image to be processed and detecting skin areas of a target person and skin color types of all the skin areas in the image to be processed;
an obtaining unit, configured to obtain target skin colors respectively corresponding to the skin color types of the skin areas;
and the skin color processing unit is used for respectively carrying out skin color processing on each skin area by utilizing a pre-established mapping table corresponding to the skin color type and the target skin color of each skin area to obtain a target image corresponding to the image to be processed.
9. A terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN201911307944.2A 2019-12-16 2019-12-16 Image processing method, device, terminal and computer readable storage medium Pending CN111062891A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911307944.2A CN111062891A (en) 2019-12-16 2019-12-16 Image processing method, device, terminal and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911307944.2A CN111062891A (en) 2019-12-16 2019-12-16 Image processing method, device, terminal and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN111062891A true CN111062891A (en) 2020-04-24

Family

ID=70302245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911307944.2A Pending CN111062891A (en) 2019-12-16 2019-12-16 Image processing method, device, terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111062891A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5900860A (en) * 1995-10-20 1999-05-04 Brother Kogyo Kabushiki Kaisha Color conversion device for converting an inputted image with a color signal in a specific color range into an output image with a desired specific color
US6678407B1 (en) * 1998-03-31 2004-01-13 Nec Corporation Method and device of light source discrimination, skin color correction, and color image correction, and storage medium thereof capable of being read by computer
US20070031033A1 (en) * 2005-08-08 2007-02-08 Samsung Electronics Co., Ltd. Method and apparatus for converting skin color of image
US20070065006A1 (en) * 2005-09-22 2007-03-22 Adobe Systems Incorporated Color correction based on skin color
CN107437072A (en) * 2017-07-18 2017-12-05 维沃移动通信有限公司 A kind of image processing method, mobile terminal and computer-readable recording medium
CN110287809A (en) * 2019-06-03 2019-09-27 Oppo广东移动通信有限公司 Image processing method and Related product
CN110443747A (en) * 2019-07-30 2019-11-12 Oppo广东移动通信有限公司 Image processing method, device, terminal and computer readable storage medium
CN110443769A (en) * 2019-08-08 2019-11-12 Oppo广东移动通信有限公司 Image processing method, image processing apparatus and terminal device

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111881789A (en) * 2020-07-14 2020-11-03 深圳数联天下智能科技有限公司 Skin color identification method and device, computing equipment and computer storage medium
CN111831193A (en) * 2020-07-27 2020-10-27 北京思特奇信息技术股份有限公司 Automatic skin changing method, device, electronic equipment and storage medium
CN112150392A (en) * 2020-09-30 2020-12-29 普联技术有限公司 Low-illumination image restoration method and device
CN112150392B (en) * 2020-09-30 2024-03-19 普联技术有限公司 Low-illumination image restoration method and device
CN113411507A (en) * 2021-05-10 2021-09-17 深圳数联天下智能科技有限公司 Skin measurement image acquisition method, device, equipment and storage medium
CN113570581A (en) * 2021-07-30 2021-10-29 北京市商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium
CN113610723A (en) * 2021-08-03 2021-11-05 展讯通信(上海)有限公司 Image processing method and related device
CN113610723B (en) * 2021-08-03 2022-09-13 展讯通信(上海)有限公司 Image processing method and related device
CN113947568A (en) * 2021-09-26 2022-01-18 北京达佳互联信息技术有限公司 Image processing method and device, electronic equipment and storage medium
CN113947568B (en) * 2021-09-26 2024-03-29 北京达佳互联信息技术有限公司 Image processing method and device, electronic equipment and storage medium
WO2024055333A1 (en) * 2022-09-16 2024-03-21 深圳传音控股股份有限公司 Image processing method, smart device, and storage medium

Similar Documents

Publication Publication Date Title
CN111062891A (en) Image processing method, device, terminal and computer readable storage medium
CN110443747B (en) Image processing method, device, terminal and computer readable storage medium
JP7413400B2 (en) Skin quality measurement method, skin quality classification method, skin quality measurement device, electronic equipment and storage medium
CN109829930B (en) Face image processing method and device, computer equipment and readable storage medium
WO2019228473A1 (en) Method and apparatus for beautifying face image
CN108701217A (en) A kind of face complexion recognition methods, device and intelligent terminal
CN110111245B (en) Image processing method, device, terminal and computer readable storage medium
CN110852160A (en) Image-based biometric identification system and computer-implemented method
CN108734126B (en) Beautifying method, beautifying device and terminal equipment
CN110363088B (en) Self-adaptive skin inflammation area detection method based on multi-feature fusion
CN113222973B (en) Image processing method and device, processor, electronic equipment and storage medium
CN109919030B (en) Black eye type identification method and device, computer equipment and storage medium
CN109242760B (en) Face image processing method and device and electronic equipment
CN110569784A (en) Human body size measuring method and system, storage medium and electronic equipment
CN111444555A (en) Temperature measurement information display method and device and terminal equipment
CN109583330B (en) Pore detection method for face photo
Santos et al. Eye gaze as a human-computer interface
CN113642358B (en) Skin color detection method, device, terminal and storage medium
CN111080754B (en) Character animation production method and device for connecting characteristic points of head and limbs
US10909351B2 (en) Method of improving image analysis
CN113610723B (en) Image processing method and related device
CN113128373B (en) Image processing-based color spot scoring method, color spot scoring device and terminal equipment
CN113421197B (en) Processing method and processing system of beautifying image
CN112785683B (en) Face image adjusting method and device
CN111080743B (en) Character drawing method and device for connecting head and limb characteristic points

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination