CN111260581B - Image processing method, device and storage medium


Info

Publication number
CN111260581B
Authority
CN
China
Prior art keywords
pixel
hair
target pixel
direction corresponding
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010055818.9A
Other languages
Chinese (zh)
Other versions
CN111260581A (en)
Inventor
刘莹
侯沛宏
马重阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202010055818.9A
Publication of CN111260581A
Application granted
Publication of CN111260581B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to an image processing method, apparatus, and storage medium. The image processing method includes: identifying an image to determine a hair region in it, determining the hair direction corresponding to each pixel in the hair region, and blurring each pixel along its corresponding hair direction to obtain a processed image. By determining the hair direction for each pixel in the hair region and blurring the pixels along that direction, the contour of the hair is preserved while the hair becomes smooth, glossy, and textured, yielding hair with edge texture and improving the beautifying effect of the image.

Description

Image processing method, device and storage medium
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an image processing method, apparatus, and storage medium.
Background
When processing images that contain human or animal hair, it is necessary to identify the hair region in the image and blur it so that the hair becomes smooth, glossy, and textured, thereby meeting users' beautifying requirements and improving the user experience.
Currently, hair regions in images are blurred mainly with the Gaussian blur technique. However, the main function of Gaussian blur is smoothing: it reduces detail features and differences between pixels and smooths the entire image. After a hair region is processed with Gaussian blur, the hair region and the other regions of the image are all smoothed; hair with edge texture cannot be obtained, and the beautifying effect of the image is reduced.
Disclosure of Invention
The disclosure provides an image processing method, an image processing apparatus, and a storage medium, which at least solve the problem that hair with edge texture cannot be obtained in image processing. The technical scheme of the present disclosure is as follows:
according to a first aspect of an embodiment of the present disclosure, there is provided an image processing method including:
identifying an image to determine a hair region in the image;
determining a hair direction corresponding to each pixel in the hair region;
and respectively carrying out blurring processing on each pixel along the hair direction corresponding to each pixel to obtain a processed image.
Optionally, before the blurring processing is performed on each pixel along the hair direction corresponding to each pixel, the method further includes:
performing Gaussian blur processing on the hair region.
Optionally, the determining the hair direction corresponding to each pixel in the hair area includes:
obtaining a minimum gradient value of a first target pixel, where the first target pixel is any one of the pixels;
and determining the hair direction corresponding to the first target pixel according to the minimum gradient value of the first target pixel.
Optionally, the determining, according to the minimum gradient value of the first target pixel, the hair direction corresponding to the first target pixel includes:
taking a first direction, to which the minimum gradient value of the first target pixel points, as the hair direction corresponding to the first target pixel; or
taking the first direction and a second direction as the hair direction corresponding to the first target pixel, where the second direction is the direction opposite to the first direction.
Optionally, the determining the hair direction corresponding to each pixel in the hair area includes:
determining a first associated pixel corresponding to a first target pixel, where the first target pixel is any one of the pixels and the first associated pixel is the pixel adjacent to the first target pixel whose gray value differs least from that of the first target pixel;
and determining the hair direction corresponding to the first target pixel according to the first associated pixel.
Optionally, the determining, according to the first associated pixel, the hair direction corresponding to the first target pixel includes:
taking the direction of the straight line on which the first target pixel and the first associated pixel lie as the hair direction corresponding to the first target pixel; or
taking the first target pixel as an endpoint and the direction from that endpoint toward the first associated pixel as the hair direction corresponding to the first target pixel.
Optionally, the blurring processing is performed on each pixel along the hair direction corresponding to each pixel, so as to obtain a processed image, which includes:
determining a preset number of second associated pixels along the hair direction corresponding to a second target pixel, where the second target pixel is any one of the pixels;
determining a second replacement pixel according to the second target pixel and the preset number of second associated pixels, where the color value of the second replacement pixel is a weighted average of the color value of the second target pixel and the color values of the preset number of second associated pixels;
and replacing the second target pixel with the second replacement pixel to obtain the processed image.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
an identification module configured to identify an image to determine a hair region in the image;
a determining module configured to determine a hair direction for each pixel in the hair region;
and the first processing module is configured to blur each pixel along the hair direction corresponding to each pixel respectively to obtain a processed image.
Optionally, the apparatus further includes: a second processing module configured to perform Gaussian blur processing on the hair region before each pixel is blurred along the hair direction corresponding to each pixel.
Optionally, the determining module includes:
an acquisition unit configured to acquire a minimum gradient value of a first target pixel, the first target pixel being any one of the pixels;
and the first determining unit is configured to determine the hair direction corresponding to the first target pixel according to the minimum gradient value of the first target pixel.
Optionally, the first determining unit is specifically configured to take a first direction, to which the minimum gradient value of the first target pixel points, as the hair direction corresponding to the first target pixel; or
to take the first direction and a second direction as the hair direction corresponding to the first target pixel, where the second direction is the direction opposite to the first direction.
Optionally, the determining module includes:
a second determining unit configured to determine a first associated pixel corresponding to a first target pixel, the first target pixel being any one of the pixels and the first associated pixel being the pixel adjacent to the first target pixel whose gray value differs least from that of the first target pixel;
and a third determining unit configured to determine a hair direction corresponding to the first target pixel according to the first associated pixel.
Optionally, the third determining unit is specifically configured to take the direction of the straight line on which the first target pixel and the first associated pixel lie as the hair direction corresponding to the first target pixel; or
to take the first target pixel as an endpoint and the direction from that endpoint toward the first associated pixel as the hair direction corresponding to the first target pixel.
Optionally, the first processing module is specifically configured to determine a preset number of second associated pixels along the hair direction corresponding to a second target pixel, where the second target pixel is any one of the pixels;
determine a second replacement pixel according to the second target pixel and the preset number of second associated pixels, where the color value of the second replacement pixel is a weighted average of the color value of the second target pixel and the color values of the preset number of second associated pixels;
and replace the second target pixel with the second replacement pixel to obtain the processed image.
According to a third aspect of the embodiments of the present disclosure, there is provided another image processing apparatus including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the image processing method as provided in the first aspect of the embodiments of the present disclosure.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a storage medium storing instructions which, when executed by a processor of an image processing apparatus, enable the image processing apparatus to perform the image processing method provided in the first aspect of the embodiments of the present disclosure.
According to a fifth aspect of embodiments of the present disclosure there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method as provided in the first aspect of embodiments of the present disclosure.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
the image processing method, the device and the storage medium provided by the embodiment of the disclosure can identify the image, determine the hair area in the image, determine the hair direction corresponding to each pixel in the hair area, and blur each pixel along the hair direction corresponding to each pixel to obtain the processed image. The hair direction corresponding to each pixel in the hair area is determined, the pixels are subjected to blurring treatment along the hair direction, the outline of the hair can be kept while the hair becomes smooth, glossy and textured, the hair with the edge texture is obtained, and the beautifying effect of the image is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure and do not constitute an undue limitation on the disclosure.
FIG. 1 is a flowchart illustrating a method of image processing according to an exemplary embodiment;
FIG. 2 is a partial schematic view of a hair region shown in accordance with an exemplary embodiment;
FIG. 3 is a flowchart illustrating another image processing method according to an exemplary embodiment;
FIG. 4 is a partial schematic view of another hair region shown in accordance with an exemplary embodiment;
fig. 5 is a block diagram of an image processing apparatus according to an exemplary embodiment;
FIG. 6 is a block diagram of another image processing apparatus shown according to an exemplary embodiment;
fig. 7 is a block diagram of yet another image processing apparatus according to an exemplary embodiment.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present disclosure, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the disclosure described herein may be capable of operation in sequences other than those illustrated or described herein. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
Fig. 1 is a flowchart illustrating an image processing method according to an exemplary embodiment, and referring to fig. 1, the image processing method provided in this embodiment may be applied to image processing to improve the edge texture of hair in an image. The image processing method provided in this embodiment may be executed by an image processing apparatus, which is typically implemented in software and/or hardware, and the image processing apparatus may be provided in an electronic device, and the method includes the steps of:
step 101, the image is identified to determine a hair region in the image.
The image may be an image stored in the electronic device in advance, or an image captured by the electronic device in real time. The hair region is the region of the image in which human or animal hair is located; for example, in an image containing a person, the hair region is the region where the person's hair lies, and in an image containing an animal, it is the region where the animal's fur lies.
In this embodiment, the image may be first identified and the hair region in the image determined. For example, a predetermined number of sample image data may be acquired in advance, the sample image data may be labeled, and the target recognition model may be obtained through training of the labeled sample image data. And identifying the image through the target identification model, and determining the hair area in the image. The specific training to obtain the target recognition model and the process of determining the hair region in the image by the target recognition model may refer to the prior art, and this embodiment will not be described in detail herein.
It should be noted that identifying the image and determining the hair region in the image may also be implemented in other manners, which is not limited in this embodiment.
Step 102, determining a hair direction corresponding to each pixel in the hair region.
The direction, at a given pixel, of the hair on which that pixel lies is taken as the hair direction corresponding to the pixel. Referring to fig. 2, which is a partial schematic view of a hair region according to an exemplary embodiment, pixel 0 is one of the pixels on hair 202, and the direction of hair 202 at pixel 0 (the direction in which straight line 201 lies) is the hair direction corresponding to pixel 0.
In this embodiment, after determining the hair region in the image, the hair direction corresponding to each pixel in the hair region may be determined in turn. In practical use, the method for determining the hair direction corresponding to each pixel may be set according to the requirement, which is not limited in this embodiment.
And 103, respectively carrying out blurring processing on each pixel along the hair direction corresponding to each pixel to obtain a processed image.
In this embodiment, after determining the hair direction corresponding to each pixel in the hair region, blurring processing may be performed on each pixel along its corresponding hair direction. Referring to fig. 2, after the hair direction corresponding to pixel 0 is determined, pixel 0 may be blurred along the direction of line 201. For example, if the image is an RGB image, pixel 4 and pixel 8 can be determined in the hair direction corresponding to pixel 0 (the direction of line 201), and the channel values X1, Y1, and Z1 of pixel 0 in the R (red), G (green), and B (blue) channels are acquired, together with the channel values X2, Y2, and Z2 of pixel 4 and the channel values X3, Y3, and Z3 of pixel 8 in those channels. The weighted averages X = 0.5×X1 + 0.25×X2 + 0.25×X3 in the R channel, Y = 0.5×Y1 + 0.25×Y2 + 0.25×Y3 in the G channel, and Z = 0.5×Z1 + 0.25×Z2 + 0.25×Z3 in the B channel are then calculated. X, Y, and Z are the channel values of the new pixel corresponding to pixel 0, and pixel 0 is replaced with this new pixel. The same processing is performed for each pixel in the hair region, resulting in the processed image. In actual use, the number of pixels selected along the hair direction and the weight assigned to each pixel may be set as required, which is not limited in this embodiment.
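The weighted average above can be sketched in a few lines of pure Python (a hedged illustration; the helper name and the sample pixel values are ours, not the patent's):

```python
# Directional blur of one pixel, as in the worked example: the new value of
# pixel 0 is a weighted average of its own RGB channels and those of the
# pixels lying on its hair direction, with weights 0.5 / 0.25 / 0.25.

def blend_along_direction(center, neighbors, center_w=0.5):
    """Weighted-average an (R, G, B) pixel with the pixels on its hair line."""
    nw = (1.0 - center_w) / len(neighbors)  # remaining weight, split evenly
    return tuple(
        center_w * ch + nw * sum(n[i] for n in neighbors)
        for i, ch in enumerate(center)
    )

# Pixel 0 blended with pixels 4 and 8 (sample values):
p0, p4, p8 = (200, 120, 80), (160, 100, 60), (100, 40, 20)
new_p0 = blend_along_direction(p0, [p4, p8])  # (165.0, 95.0, 60.0)
```

With two neighbors and `center_w=0.5`, each neighbor receives weight 0.25, matching the 0.5/0.25/0.25 weights of the worked example.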
Alternatively, in the blurring process, the plain averages of pixel 0, pixel 4, and pixel 8 in the R, G, and B channels may be calculated to obtain the channel values X, Y, and Z of the new pixel in those channels; the specific blurring method applied to each pixel may be set as required.
In the prior art, a hair region in an image is mainly blurred through Gaussian blurring. With reference to fig. 2, Gaussian blurring uses a Gaussian template to determine the pixels close to pixel 0: for example, when the radius of the template is 1 pixel, the 8 pixels 1 to 8 surrounding pixel 0 are selected, the weighted averages of pixels 0 to 8 in the R, G, and B channels are calculated to obtain the channel values of the new pixel corresponding to pixel 0, pixel 0 is replaced with the new pixel, and each pixel in the hair region is processed in the same way in turn to obtain the processed image. Because Gaussian blurring uses all pixels within the radius (pixels 1 to 8), blurring pixel 0 reduces the gradation and detail between pixel 0 and all of its neighbors, so that pixel 0 and the surrounding pixels all become smooth. In this embodiment, by contrast, pixels are selected only along the direction of line 201 when blurring pixel 0, so the gradation and detail between pixel 0 and the other pixels not in the hair direction (pixels 1 to 3 and pixels 5 to 7) are preserved: hair 202 becomes smooth along the direction of line 201 while its contour is retained.
In this embodiment, the image is identified, a hair region in the image is determined, the hair direction corresponding to each pixel in the hair region is determined, and each pixel is blurred along its corresponding hair direction to obtain a processed image. By determining the hair direction for each pixel and blurring along that direction, the contour of the hair is preserved while the hair becomes smooth, glossy, and textured, yielding hair with edge texture and improving the beautifying effect of the image.
Fig. 3 is a flow chart illustrating another image processing method according to an exemplary embodiment, which may include the steps of:
in step 301, the image is identified to determine a hair region in the image.
Step 302, a gaussian blur process is performed on the hair area.
In this embodiment, after determining the hair region in the image, the gaussian blur processing may be first performed on the hair region. The specific process of the gaussian blur processing of the hair region may refer to the gaussian blur processing process in the prior art, and this embodiment will not be described in detail here.
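The Gaussian pre-blur itself is standard; as a minimal pure-Python sketch (illustrative only, using a common 3x3 integer approximation of a Gaussian kernel, which is our choice, not the patent's):

```python
# 3x3 Gaussian-like kernel; weights sum to 16.
KERNEL = [[1, 2, 1],
          [2, 4, 2],
          [1, 2, 1]]

def gaussian_at(gray_img, r, c):
    """Blur the interior pixel at (r, c) of a grayscale grid with KERNEL."""
    total = sum(
        KERNEL[dr + 1][dc + 1] * gray_img[r + dr][c + dc]
        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
    )
    return total / 16

region = [[10, 10, 10],
          [10, 90, 10],
          [10, 10, 10]]
blurred_center = gaussian_at(region, 1, 1)  # isolated bright pixel -> 30.0
```

In practice an existing library routine would be applied over the whole hair region; this sketch only shows the per-pixel weighted sum that such a routine computes.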
In step 303, a hair direction corresponding to each pixel in the hair area is determined.
In this embodiment, determining the hair direction corresponding to each pixel in the hair region may be achieved by:
mode one: obtaining a minimum gradient value of a first target pixel, wherein the first target pixel is any one pixel in each pixel;
and determining the hair direction corresponding to the first target pixel according to the minimum gradient value of the first target pixel.
In this embodiment, the hair direction corresponding to each pixel may be determined according to the pixel's minimum gradient value. After the hair region is determined, the gradient values of each pixel may first be calculated, the minimum among those gradient values determined, and the hair direction corresponding to the pixel derived from that minimum gradient. Referring to fig. 4, which is a partial schematic view of another hair region according to an exemplary embodiment, take pixel 0 in fig. 4 as the first target pixel: first, the channel values of pixel 0 in the R, G, and B channels are determined, and the gray value of pixel 0 is calculated from them; the gray values of pixels 1 to 8 are calculated in the same way. Then the gradient values between pixel 0 and each of pixels 1 to 8 are calculated from these gray values, and the minimum gradient value is determined. Finally, the hair direction corresponding to pixel 0 may be determined from that minimum gradient. The process of calculating a pixel's gray value from its R, G, and B channel values, and of calculating gradient values from gray values, may refer to the prior art and is not described in detail here.
A gradient value indicates the rate of change of the gray value between the first target pixel and an adjacent pixel: the smaller the rate of change, the closer the colors of the two pixels, which indicates that they belong to the same feature, such as two pixels on the same hair. As shown in fig. 4, the direction of the minimum gradient of pixel 0 is 45 degrees (the direction indicated by arrow 401), meaning that among pixels 1 to 8 the rate of change of the gray value is smallest between pixel 0 and pixel 7; it can therefore be determined that pixel 0 and pixel 7 lie on the same hair, and that the direction indicated by arrow 401 is the direction, at pixel 0, of the hair on which pixel 0 lies.
In this embodiment, the first direction to which the minimum gradient of the first target pixel points may be taken as the hair direction corresponding to the first target pixel; alternatively, the first direction and a second direction, opposite to the first, may together be taken as the hair direction. As shown in fig. 4, if the direction of the minimum gradient of pixel 0 is 45 degrees, that first direction may be taken as the hair direction corresponding to pixel 0, i.e., the direction indicated by arrow 401 in fig. 4. Alternatively, the first direction and its opposite (the direction opposite to that of arrow 401 in fig. 4) are together taken as the hair direction corresponding to pixel 0, i.e., the direction of the straight line on which arrow 401 lies.
Due to the continuity of hair, in actual use, taking both the first direction and the second direction as the hair direction of the first target pixel and blurring the first target pixel along both directions can make the hair smoother along its path. As shown in fig. 4, blurring each pixel on hair 402 along the direction of hair 402 makes the path of hair 402 smoother.
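Mode one can be sketched as follows (a hedged pure-Python illustration; the BT.601 gray-value weights and the helper names are assumptions, not taken from the patent):

```python
# Gray value from RGB (ITU-R BT.601 luma weights -- an assumed formula),
# then the 8-neighbour whose gray value changes least from the target
# pixel gives the direction of minimum gradient, i.e. the hair direction.

OFFSETS = [(-1, -1), (-1, 0), (-1, 1),
           (0, -1),           (0, 1),
           (1, -1),  (1, 0),  (1, 1)]

def gray(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def min_gradient_direction(gray_img, r, c):
    """Offset (dr, dc) toward the neighbour with the smallest gray change."""
    return min(
        OFFSETS,
        key=lambda o: abs(gray_img[r + o[0]][c + o[1]] - gray_img[r][c]),
    )

# Gray grid where the up-right neighbour is closest to the centre value,
# like pixel 7 at 45 degrees in fig. 4:
g = [[90, 90, 62],
     [90, 60, 90],
     [90, 90, 90]]
direction = min_gradient_direction(g, 1, 1)  # (-1, 1): up and to the right
```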
Mode two: determining a first associated pixel corresponding to a first target pixel, where the first target pixel is any one of the pixels and the first associated pixel is the pixel adjacent to the first target pixel whose gray value differs least from that of the first target pixel;
and determining the hair direction corresponding to the first target pixel according to the first associated pixel.
In this embodiment, the differences between the gray value of the first target pixel and the gray values of its adjacent pixels may be calculated directly, and the pixel with the smallest difference determined to be the first associated pixel. As shown in fig. 4, the differences between the gray value of pixel 0 and the gray values of pixels 1 to 8 may be calculated in turn; pixel 7, having the smallest gray-value difference, is the first associated pixel, and the hair direction corresponding to pixel 0 is determined from it. The smaller the difference between the gray values of adjacent pixels, the closer their colors.
Specifically, the direction of the straight line on which the first target pixel and the first associated pixel lie may be taken as the hair direction corresponding to the first target pixel; or, taking the first target pixel as an endpoint, the direction from that endpoint toward the first associated pixel may be taken as the hair direction. As shown in fig. 4, the direction of the line on which pixel 0 and pixel 7 lie (i.e., the line on which arrow 401 lies) may be determined to be the hair direction corresponding to pixel 0; or, with pixel 0 as the endpoint, the direction from pixel 0 toward pixel 7 (the direction indicated by arrow 401) is determined to be the hair direction corresponding to pixel 0. As in mode one, taking the whole line through the first target pixel and the first associated pixel as the hair direction can make the hair smoother.
In the second mode, the difference between the gray value of the first target pixel and the gray value of the adjacent pixel is directly calculated, the first associated pixel is determined, and the hair direction corresponding to the first target pixel is determined according to the first associated pixel.
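Mode two, with the option of returning both directions of the line through the target and associated pixels, might look like this (a sketch under the same assumptions as above; all names are ours):

```python
# The first associated pixel is the adjacent pixel with the smallest
# absolute gray-value difference; the hair direction is the line through
# the target pixel and that associated pixel, expressed as both offsets.

OFFSETS = [(-1, -1), (-1, 0), (-1, 1),
           (0, -1),           (0, 1),
           (1, -1),  (1, 0),  (1, 1)]

def hair_line_directions(gray_img, r, c):
    """Both offsets of the line through the target and its associated pixel."""
    dr, dc = min(
        OFFSETS,
        key=lambda o: abs(gray_img[r + o[0]][c + o[1]] - gray_img[r][c]),
    )
    return (dr, dc), (-dr, -dc)

g = [[90, 90, 62],
     [90, 60, 90],
     [90, 90, 90]]
both = hair_line_directions(g, 1, 1)  # ((-1, 1), (1, -1))
```

Returning both offsets reflects the observation that, because hair is continuous, blurring in both directions along the line keeps the hair smooth along its whole path.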
And step 304, respectively carrying out blurring processing on each pixel along the hair direction corresponding to each pixel to obtain a processed image.
The blurring process is performed on each pixel along the hair direction corresponding to each pixel, so as to obtain a processed image, which can be realized through the following steps:
step 3041, determining a preset number of second associated pixels along the hair direction corresponding to a second target pixel, where the second target pixel is any one of the pixels;
step 3042, determining a second replacement pixel according to the second target pixel and the preset number of second associated pixels, where the color value of the second replacement pixel is a weighted average of the color value of the second target pixel and the color values of the preset number of second associated pixels;
step 3043, replacing the second target pixel with the second replacement pixel to obtain the processed image.
in this embodiment, a preset number of second associated pixels may be determined along the hair direction corresponding to the second target pixel, and a second replacement pixel is obtained according to the second target pixel and the preset number of second associated pixels, where the color value of the second replacement pixel is obtained by weighted average of the color values of the second target pixel and the preset number of second associated pixels. As shown in fig. 3, two second associated pixels (pixel 7 and pixel 21) may be determined along the direction indicated by arrow 401, the channel values of pixel 0, pixel 7 and pixel 21 in the R channel, the G channel and the B channel respectively are obtained, the weighted average of pixel 0, pixel 7 and pixel 21 in the R channel, the weighted average of pixel 7 and pixel 21 in the G channel and the weighted average in the B channel are calculated respectively, and the channel values of the second replacement pixel in the R channel, the G channel and the B channel respectively are obtained, and the pixel 0 is replaced by the second replacement pixel. Similarly, each pixel in the hair region is processed identically, resulting in a processed image. The weight of each second associated pixel may be set according to the requirement, for example, the weights of the pixel 7 and the pixel 21 may be the same or different, or the line where the arrow 401 is located may be also determined, and the pixel 3, the pixel 13, the pixel 7, and the pixel 21 are determined to be second replacement pixels, which is not limited in this embodiment.
In practical use, after determining the hair direction corresponding to each pixel, other methods may be used to process the pixels along the hair direction, which is not limited in this embodiment.
In this embodiment, a hair region in the image is first determined, Gaussian blur is then applied to the hair region, and finally each pixel in the hair region is blurred along its corresponding hair direction. The uniform Gaussian blur first makes the entire hair region smooth and soft; blurring each pixel along its hair direction then smooths the hair along its strands while preserving the hair contour, yielding smooth, glossy, and textured hair with well-defined edges.
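The two-stage smoothing described above can be sketched on a single grayscale channel as follows. This is a simplified illustration under stated assumptions: the Gaussian kernel size and sigma, the single associated pixel on each side of the target, and the equal weights are all choices made for the sketch, not parameters from the patent.

```python
import numpy as np

def gaussian_blur(channel, sigma=1.0, radius=2):
    """Separable Gaussian blur on a 2-D array (edge-padded)."""
    ax = np.arange(-radius, radius + 1)
    k = np.exp(-ax ** 2 / (2.0 * sigma ** 2))
    k /= k.sum()
    padded = np.pad(channel, radius, mode="edge")
    # Horizontal pass, then vertical pass.
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, tmp)

def beautify_hair(gray, mask, directions):
    """Sketch of the pipeline: (1) uniform Gaussian blur inside the hair
    mask, then (2) per-pixel blur along each pixel's hair direction.

    gray: 2-D float image; mask: boolean hair region;
    directions: (H, W, 2) array of (dy, dx) hair directions.
    """
    h, w = gray.shape
    out = gray.astype(np.float64).copy()
    out[mask] = gaussian_blur(gray.astype(np.float64))[mask]
    result = out.copy()
    for y, x in zip(*np.nonzero(mask)):
        dy, dx = directions[y, x]
        acc, n = 0.0, 0
        for t in (-1, 0, 1):  # the target pixel plus one neighbor each way
            ny = min(max(int(round(y + t * dy)), 0), h - 1)
            nx = min(max(int(round(x + t * dx)), 0), w - 1)
            acc += out[ny, nx]
            n += 1
        result[y, x] = acc / n
    return result
```

Pixels outside the mask are left untouched, which is how the hair contour is preserved.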
Referring to fig. 5, fig. 5 is a block diagram of an image processing apparatus according to an exemplary embodiment. The image processing apparatus 500 may be applied to image processing, and may include: an identification module 501, a determination module 502 and a first processing module 503.
The identification module 501 is configured to identify an image to determine a hair region in the image.
The determination module 502 is configured to determine a hair direction for each pixel in the hair region.
The first processing module 503 is configured to blur each pixel along the hair direction corresponding to each pixel, to obtain a processed image.
In this embodiment, the identification module identifies the image and determines a hair region in the image, the determination module determines the hair direction corresponding to each pixel in the hair region, and the first processing module blurs each pixel along the hair direction corresponding to that pixel to obtain a processed image. Because each pixel is blurred along its own hair direction, the hair becomes smooth, glossy, and textured while its contour is preserved, yielding hair with well-defined edges and improving the beautifying effect of the image.
Optionally, a second processing module may be further included, the second processing module being configured to perform gaussian blur processing on the hair area before blur processing on each pixel along the hair direction corresponding to each pixel, respectively.
Alternatively, the determining module 502 may include an acquiring unit and a first determining unit.
The acquisition unit is configured to acquire a minimum gradient value of a first target pixel, which is any one of each pixel.
The first determining unit is configured to determine a hair direction corresponding to the first target pixel based on the minimum gradient value of the first target pixel.
Optionally, the first determining unit is specifically configured to take, as the hair direction corresponding to the first target pixel, a first direction to which the minimum gradient value of the first target pixel points; or the first direction and the second direction are taken as the hair direction corresponding to the first target pixel, wherein the second direction is the opposite direction of the first direction.
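As a rough illustration of how the minimum-gradient direction could be obtained, the sketch below takes the direction of least intensity change at a pixel to be the unit vector perpendicular to the local image gradient. The central-difference gradient, the 90° rotation convention, and the horizontal fallback for flat neighborhoods are all assumptions for this sketch, not the patent's method.

```python
import numpy as np

def min_gradient_direction(gray, y, x):
    """Return (dy, dx), the unit direction of least intensity change
    at (y, x) on a 2-D grayscale image, as a proxy for the strand
    direction. Sketch only: uses central differences with clamping.
    """
    h, w = gray.shape
    # Central differences (clamped at the image borders).
    gx = (gray[y, min(x + 1, w - 1)] - gray[y, max(x - 1, 0)]) / 2.0
    gy = (gray[min(y + 1, h - 1), x] - gray[max(y - 1, 0), x]) / 2.0
    n = np.hypot(gx, gy)
    if n == 0:
        return (0.0, 1.0)  # flat neighborhood: fall back to horizontal
    # Rotate the gradient (gy, gx) by 90° to get the perpendicular.
    return (gx / n, -gy / n)
```

On an image of horizontal stripes, the gradient points vertically and the returned direction is horizontal, i.e. along the stripes.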
Alternatively, the determining module may include a second determining unit and a third determining unit.
The second determining unit is configured to determine a first associated pixel corresponding to a first target pixel, the first target pixel being any one of the pixels, and the first associated pixel being the pixel that is adjacent to the first target pixel and whose gray value differs least from that of the first target pixel;
the third determination unit is configured to determine a hair direction corresponding to the first target pixel from the first associated pixel.
Optionally, the third determining unit is specifically configured to take a straight line direction in which the first target pixel and the first associated pixel are located as a hair direction corresponding to the first target pixel; or taking the first target pixel as an endpoint, and taking the direction of the endpoint pointing to the first associated pixel as the hair direction corresponding to the first target pixel.
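The "first associated pixel" rule described here, i.e. the adjacent pixel whose gray value differs least from the target, can be sketched as follows. The 8-connected neighborhood and the first-scanned tie-break are assumptions made for the sketch.

```python
import numpy as np

def direction_from_associated_pixel(gray, y, x):
    """Find the 8-neighbor of (y, x) whose gray value differs least from
    the target pixel, and return the unit direction (dy, dx) from the
    target towards that neighbor. Illustrative sketch only.
    """
    h, w = gray.shape
    best, best_off = None, (0, 1)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue  # skip the target pixel itself
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                diff = abs(float(gray[ny, nx]) - float(gray[y, x]))
                if best is None or diff < best:
                    best, best_off = diff, (dy, dx)
    dy, dx = best_off
    n = (dy * dy + dx * dx) ** 0.5
    return (dy / n, dx / n)
```

Either the line through the two pixels or the ray from the target towards the associated pixel can then serve as the hair direction, matching the two options above.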
Optionally, the first processing module is specifically configured to determine a preset number of second associated pixels along the hair direction corresponding to a second target pixel, where the second target pixel is any one of the pixels; determine a second replacement pixel from the second target pixel and the preset number of second associated pixels, where the color value of the second replacement pixel is the weighted average of the color values of the second target pixel and the preset number of second associated pixels; and replace the second target pixel with the second replacement pixel to obtain the processed image.
Referring to fig. 6, fig. 6 is a block diagram of another image processing apparatus according to an exemplary embodiment. The image processing apparatus 600 includes:
a processor 601.
A memory 602 for storing instructions executable by the processor 601.
Wherein the processor 601 is configured to execute executable instructions stored in the memory 602 to implement the image processing method in the embodiment shown in fig. 1 or 3.
In an exemplary embodiment, a storage medium is also provided, such as the memory 602, comprising instructions executable by the processor 601 of the image processing apparatus 600 to perform the image processing method of the embodiment shown in fig. 2 or fig. 3. Alternatively, the storage medium may be a non-transitory computer-readable storage medium, which may be, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method in the embodiment as shown in fig. 1 or fig. 3 is also provided.
Referring to fig. 7, fig. 7 is a block diagram of yet another image processing apparatus, which apparatus 700 may include one or more of the following components, according to an example embodiment: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the apparatus 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 702 can include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
Memory 704 is configured to store various types of data to support operations at device 700. Examples of such data include instructions for any application or method operating on the apparatus 700, contact data, phonebook data, messages, pictures, videos, and the like. The memory 704 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 706 provides power to the various components of the device 700. The power components 706 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 700.
The multimedia component 708 includes a screen that provides an output interface between the device 700 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 708 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 700 is in an operational mode, such as a shooting mode or a video mode. Each of the front and rear cameras may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a Microphone (MIC) configured to receive external audio signals when the device 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 714 includes one or more sensors for providing status assessment of various aspects of the apparatus 700. For example, the sensor assembly 714 may detect an on/off state of the device 700, a relative positioning of the components, such as a display and keypad of the apparatus 700, a change in position of the apparatus 700 or one component of the apparatus 700, the presence or absence of user contact with the apparatus 700, an orientation or acceleration/deceleration of the apparatus 700, and a change in temperature of the apparatus 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate communication between the apparatus 700 and other devices in a wired or wireless manner. The apparatus 700 may access a wireless network based on a communication standard, such as WiFi, an operator network (e.g., 2G, 3G, 4G, or 5G), or a combination thereof. In one exemplary embodiment, the communication component 716 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the above-described image processing methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 704, comprising instructions executable by the processor 720 of the apparatus 700 to perform the above-described image processing method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. An image processing method, comprising:
identifying an image to determine a hair region in the image;
determining the hair direction corresponding to each pixel in the hair region, wherein the hair direction corresponding to a pixel is the direction, at that pixel, of the hair strand on which the pixel lies;
respectively carrying out blurring processing on each pixel along the hair direction corresponding to each pixel to obtain a processed image;
and blurring each pixel along the hair direction corresponding to the pixel to obtain a processed image, wherein the blurring processing comprises the following steps:
determining a preset number of second associated pixels along the hair direction corresponding to the second target pixels, wherein the second target pixels are any one pixel in each pixel;
determining a second replacement pixel according to the second target pixel and the second associated pixels with the preset number, wherein the color value of the second replacement pixel is a weighted average value of the color value of the second target pixel and the color value of the second associated pixels with the preset number;
and replacing the second target pixel with the second replacement pixel to obtain the processed image.
2. The method according to claim 1, further comprising, before said blurring each of said pixels in the hair direction corresponding to each of said pixels, respectively:
and carrying out Gaussian blur treatment on the hair area.
3. The method of claim 1, wherein the determining a hair direction for each pixel in the hair region comprises:
obtaining a minimum gradient value of a first target pixel, wherein the first target pixel is any one pixel in each pixel;
and determining the hair direction corresponding to the first target pixel according to the minimum gradient value of the first target pixel.
4. A method according to claim 3, wherein said determining the hair direction corresponding to the first target pixel from the minimum gradient value of the first target pixel comprises:
taking the first direction to which the minimum gradient value of the first target pixel points as the hair direction corresponding to the first target pixel; or
and taking the first direction and the second direction as the hair direction corresponding to the first target pixel, wherein the second direction is the opposite direction of the first direction.
5. The method of claim 1, wherein the determining a hair direction for each pixel in the hair region comprises:
determining a first associated pixel corresponding to a first target pixel, wherein the first target pixel is any one pixel in each pixel, and the first associated pixel is the pixel that is adjacent to the first target pixel and whose gray value differs least from that of the first target pixel;
and determining the hair direction corresponding to the first target pixel according to the first associated pixel.
6. The method of claim 5, wherein determining the hair direction corresponding to the first target pixel from the first associated pixel comprises:
taking the straight line direction in which the first target pixel and the first associated pixel are located as the hair direction corresponding to the first target pixel; or
and taking the first target pixel as an endpoint, and taking the direction of pointing the endpoint to the first associated pixel as the hair direction corresponding to the first target pixel.
7. An image processing apparatus, comprising:
an identification module configured to identify an image to determine a hair region in the image;
a determining module configured to determine the hair direction corresponding to each pixel in the hair region, wherein the hair direction corresponding to a pixel is the direction, at that pixel, of the hair strand on which the pixel lies;
the first processing module is configured to blur each pixel along the hair direction corresponding to each pixel to obtain a processed image;
The first processing module is specifically configured to determine a preset number of second associated pixels along a hair direction corresponding to a second target pixel, where the second target pixel is any one pixel in each pixel;
determining a second replacement pixel according to the second target pixel and the second associated pixels with the preset number, wherein the color value of the second replacement pixel is a weighted average value of the color value of the second target pixel and the color value of the second associated pixels with the preset number;
and replacing the second target pixel with the second replacement pixel to obtain the processed image.
8. The apparatus as recited in claim 7, further comprising: a second processing module configured to perform Gaussian blur processing on the hair area before each pixel is blurred along the hair direction corresponding to the pixel.
9. The apparatus of claim 7, wherein the means for determining comprises:
an acquisition unit configured to acquire a minimum gradient value of a first target pixel, the first target pixel being any one of the pixels;
and a first determining unit configured to determine the hair direction corresponding to the first target pixel according to the minimum gradient value of the first target pixel.
10. The apparatus according to claim 9, wherein the first determining unit is specifically configured to take a first direction to which the minimum gradient value of the first target pixel points as the hair direction corresponding to the first target pixel; or
and taking the first direction and the second direction as the hair direction corresponding to the first target pixel, wherein the second direction is the opposite direction of the first direction.
11. The apparatus of claim 7, wherein the determining module comprises:
a second determining unit configured to determine a first associated pixel corresponding to a first target pixel, the first target pixel being any one of the pixels, and the first associated pixel being the pixel that is adjacent to the first target pixel and whose gray value differs least from that of the first target pixel;
and a third determining unit configured to determine a hair direction corresponding to the first target pixel according to the first associated pixel.
12. The apparatus according to claim 11, wherein the third determining unit is specifically configured to take the straight line direction in which the first target pixel and the first associated pixel are located as the hair direction corresponding to the first target pixel; or
and taking the first target pixel as an endpoint, and taking the direction of pointing the endpoint to the first associated pixel as the hair direction corresponding to the first target pixel.
13. An image processing apparatus, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the image processing method of any one of claims 1 to 6.
14. A storage medium storing instructions which, when executed by a processor of an image processing apparatus, cause the image processing apparatus to perform the image processing method of any one of claims 1 to 6.
CN202010055818.9A 2020-01-17 2020-01-17 Image processing method, device and storage medium Active CN111260581B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010055818.9A CN111260581B (en) 2020-01-17 2020-01-17 Image processing method, device and storage medium


Publications (2)

Publication Number Publication Date
CN111260581A CN111260581A (en) 2020-06-09
CN111260581B true CN111260581B (en) 2023-09-26

Family

ID=70950690

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010055818.9A Active CN111260581B (en) 2020-01-17 2020-01-17 Image processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN111260581B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112330570B (en) * 2020-11-27 2024-03-12 北京达佳互联信息技术有限公司 Image processing method, device, electronic equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102970463A (en) * 2011-08-31 2013-03-13 索尼公司 Image processing device, image processing method and program
CN103914861A (en) * 2013-01-08 2014-07-09 腾讯科技(武汉)有限公司 Picture processing method and device
CN107103619A (en) * 2017-04-19 2017-08-29 腾讯科技(上海)有限公司 A kind of processing method of hair grain direction, apparatus and system
CN107256555A (en) * 2017-05-25 2017-10-17 腾讯科技(上海)有限公司 A kind of image processing method, device and storage medium
CN107392099A (en) * 2017-06-16 2017-11-24 广东欧珀移动通信有限公司 Extract the method, apparatus and terminal device of hair detailed information
CN108198127A (en) * 2017-11-27 2018-06-22 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN109087377A (en) * 2018-08-03 2018-12-25 北京字节跳动网络技术有限公司 Method and apparatus for handling image
CN109816764A (en) * 2019-02-02 2019-05-28 深圳市商汤科技有限公司 Image generating method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007264828A (en) * 2006-03-27 2007-10-11 Fujifilm Corp Image processing method and device and printer


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Hair removal methods: A comparative study for dermoscopy images; Qaisar Abbas et al.; Biomedical Signal Processing and Control; Vol. 6, No. 4; entire document *
Real-time enhanced rendering for image abstraction; Wang Shandong et al.; Journal of Computer-Aided Design & Computer Graphics; Vol. 25, No. 2; entire document *
Individual pig identification based on Gabor direction histograms and pig hair pattern features; Zhao Wei et al.; Jiangsu Agricultural Sciences; Vol. 46; entire document *

Also Published As

Publication number Publication date
CN111260581A (en) 2020-06-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant