CN113129315B - Pore region acquisition method, pore region acquisition device, computer equipment and storage medium


Info

Publication number
CN113129315B
CN113129315B
Authority
CN
China
Prior art keywords
area
target
pore
image
target position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110363508.8A
Other languages
Chinese (zh)
Other versions
CN113129315A
Inventor
乔峤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Rongzhifu Technology Co ltd
Original Assignee
Xi'an Rongzhifu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Rongzhifu Technology Co ltd
Priority to CN202110363508.8A
Publication of CN113129315A
Application granted
Publication of CN113129315B


Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration › G06T5/20 Image enhancement or restoration by the use of local operators › G06T5/30 Erosion or dilatation, e.g. thinning
    • G06T5/70
    • G06T5/90
    • G06T7/00 Image analysis › G06T7/10 Segmentation; Edge detection › G06T7/11 Region-based segmentation
    • G06T7/10 Segmentation; Edge detection › G06T7/13 Edge detection
    • G06T7/10 Segmentation; Edge detection › G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T2207/00 Indexing scheme for image analysis or image enhancement › G06T2207/20 Special algorithmic details › G06T2207/20024 Filtering details
    • G06T2207/20 Special algorithmic details › G06T2207/20036 Morphological image processing
    • G06T2207/20 Special algorithmic details › G06T2207/20112 Image segmentation details › G06T2207/20132 Image cropping
    • G06T2207/00 Indexing scheme for image analysis or image enhancement › G06T2207/30 Subject of image; Context of image processing › G06T2207/30196 Human being; Person › G06T2207/30201 Face

Abstract

The application discloses a pore region acquisition method, a pore region acquisition device, computer equipment and a storage medium, and belongs to the technical field of image processing. The method comprises the following steps: preprocessing a face image to be detected to obtain a target binarized image; for each target position area in the target binarized image, calculating the roundness-like degree corresponding to the target position area, wherein a target position area is an image area composed of the pixel points in the binarized image whose pixel value is a first value, and the roundness-like degree indicates how closely the contour shape of the target position area approximates a circle; and acquiring each pore region in the face image to be detected according to the roundness-like degree corresponding to each target position area. By using the roundness-like feature to acquire the pore regions from the target position areas, the method eliminates the influence of elongated contours such as hair and wrinkles, and improves the accuracy of pore region acquisition in facial images.

Description

Pore region acquisition method, pore region acquisition device, computer equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a pore region acquiring method, a pore region acquiring device, a computer device, and a storage medium.
Background
Facial skin detection is an important means for skin aging and anti-aging research, and is also one of important indexes for objectively evaluating the effect of skin care products.
With the improvement of living standards, facial beauty and skin care are receiving more and more attention. Existing methods for detecting pores in a facial image mainly work as follows: a digital facial image is acquired at close range by a camera or a dedicated instrument; the acquired facial image is converted to grayscale, and pores are then segmented from the grayscale image by threshold segmentation. That is, because pores show an obvious color difference from pixels at other positions after grayscale processing, a suitable threshold can be chosen to extract them, thereby realizing pore recognition and extraction.
In the above technical solution, the recognition result is easily affected by facial features with elongated contours, such as hair and wrinkles, in the facial image, which reduces recognition accuracy.
Disclosure of Invention
The embodiment of the application provides a pore region acquisition method, a pore region acquisition device, computer equipment and a storage medium, which can improve the accuracy of pore region acquisition in facial images.
In one aspect, an embodiment of the present application provides a pore region acquiring method, including:
preprocessing a face image to be detected to obtain a target binarized image;
calculating the roundness-like degree corresponding to each target position area in the target binarized image, wherein a target position area is an image area composed of the pixel points in the binarized image whose pixel value is a first value, and the roundness-like degree indicates how closely the contour shape of the target position area approximates a circle;
and acquiring each pore region in the face image to be detected according to the roundness-like degree corresponding to each target position area.
In another aspect, an embodiment of the present application provides a pore region acquiring apparatus, including:
the image acquisition module is used for preprocessing the face image to be detected to obtain a target binarized image;
the roundness-like degree calculation module is used for calculating the roundness-like degree corresponding to each target position area in the target binarized image, wherein a target position area is an image area composed of the pixel points in the binarized image whose pixel value is a first value, and the roundness-like degree indicates how closely the contour shape of the target position area approximates a circle;
and the pore acquisition module is used for acquiring each pore region in the face image to be detected according to the roundness-like degree corresponding to each target position area.
In another aspect, an embodiment of the present application provides a computer device, including a memory and a processor, where the memory stores a computer program which, when executed by the processor, causes the processor to implement the pore region acquisition method according to the above aspect and any optional implementation thereof.
In another aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored thereon which, when executed by a processor, implements the pore region acquisition method described above and any optional implementation thereof.
The technical scheme provided by the embodiment of the application at least comprises the following beneficial effects:
A target binarized image is obtained by preprocessing a face image to be detected; for each target position area in the target binarized image, the corresponding roundness-like degree is calculated, wherein a target position area is an image area composed of the pixel points in the binarized image whose pixel value is a first value, and the roundness-like degree indicates how closely the contour shape of the target position area approximates a circle; each pore region in the face image to be detected is then acquired according to the roundness-like degree corresponding to each target position area. By using the roundness-like feature to acquire the pore regions from the target position areas, the method eliminates the influence of elongated contours such as hair and wrinkles, and improves the accuracy of pore region acquisition in facial images.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a method flow diagram of a pore region acquisition method provided in an exemplary embodiment of the present application;
FIG. 2 is an image schematic of a target binarized image according to an exemplary embodiment of the present application;
FIG. 3 is a method flow diagram of a pore region acquisition method provided in an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of various sub-regions of a face image to be detected after cropping, according to an exemplary embodiment of the present application;
FIG. 5 is a schematic image view of a target location area of FIG. 2 in accordance with an exemplary embodiment of the present application;
fig. 6 is a block diagram of a pore region acquiring apparatus according to an exemplary embodiment of the present application;
fig. 7 is a schematic diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
References herein to "a plurality" mean two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
It should be noted that the terms "first," "second," "third," and "fourth," etc. in the description and claims of the present application are used for distinguishing between different objects and not for describing a particular sequential order. The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
The solution provided by the application can be used in terminals used in daily life. For ease of understanding, the terms and the application architecture involved in the embodiments of the present application are briefly introduced below.
Local Edge-Preserving (LEP) filtering performs multi-scale decomposition on an image, decomposing it into a base layer and two detail layers. According to the characteristics of each decomposed layer and the enhancement requirement, the base layer is preserved while S-shaped curve enhancement is applied to the detail layers, thereby enhancing the image's detail layers.
Gaussian filtering is a linear smoothing filter suitable for eliminating Gaussian noise and is widely used in the noise-reduction stage of image processing.
Homomorphic filtering is an image processing method combining frequency-domain filtering and gray-level transformation. Relying on the illumination-reflectance model of an image as the basis for frequency-domain processing, it compresses the brightness range and enhances contrast to improve image quality. The method attenuates low frequencies and boosts high frequencies, adjusts the gray-scale range of the image, eliminates uneven illumination, and enhances details in dark areas without losing details in bright areas.
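The homomorphic filtering described above can be sketched in a few lines of numpy. The Gaussian high-frequency-emphasis transfer function and all parameter values below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def homomorphic_filter(img, gamma_l=0.5, gamma_h=2.0, c=1.0, d0=10.0):
    """Log -> FFT -> high-frequency emphasis -> inverse FFT -> exp.

    gamma_l < 1 suppresses the low-frequency illumination component,
    gamma_h > 1 boosts the high-frequency reflectance (detail) component.
    Parameter values are illustrative, not taken from the patent.
    """
    rows, cols = img.shape
    log_img = np.log1p(img.astype(np.float64))        # log of I = L * R
    spectrum = np.fft.fftshift(np.fft.fft2(log_img))
    u = np.arange(rows) - rows / 2
    v = np.arange(cols) - cols / 2
    dist2 = u[:, None] ** 2 + v[None, :] ** 2         # squared distance to DC
    transfer = (gamma_h - gamma_l) * (1 - np.exp(-c * dist2 / d0 ** 2)) + gamma_l
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * transfer))
    return np.clip(np.expm1(np.real(filtered)), 0, 255)

# A uniform (pure-illumination) patch is strongly attenuated, since it
# consists only of the low-frequency component the filter suppresses.
uniform = np.full((16, 16), 100.0)
result = homomorphic_filter(uniform)
```

On real skin images the same call compresses the slowly varying illumination while keeping pore-scale detail.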
Morphological filtering performs opening and closing operations on the image based on basic dilation and erosion, so as to eliminate noise points, surface burrs, pits and the like, making the image contour smoother.
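As a minimal illustration of the dilation and erosion building blocks, the numpy sketch below implements binary opening with a square structuring element; the kernel size and the test pattern are illustrative choices, not values from the patent.

```python
import numpy as np

def dilate(img, k=3):
    # Binary dilation with a k x k square element: a pixel becomes 1
    # if any pixel in its neighborhood is 1.
    pad = k // 2
    p = np.pad(img, pad)
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out |= p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def erode(img, k=3):
    # Binary erosion: a pixel stays 1 only if its whole neighborhood is 1.
    pad = k // 2
    p = np.pad(img, pad, constant_values=1)
    out = np.ones_like(img)
    for dy in range(k):
        for dx in range(k):
            out &= p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def opening(img, k=3):
    # Erosion followed by dilation removes specks smaller than the element.
    return dilate(erode(img, k), k)

mask = np.zeros((10, 10), dtype=np.uint8)
mask[1, 1] = 1                 # isolated noise pixel
mask[4:8, 4:8] = 1             # solid 4 x 4 block
cleaned = opening(mask)        # noise pixel removed, block preserved
```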
With the improvement of living standards, the pursuit of beauty has grown, and since the face often determines the first impression one makes on others, people spend more time and energy on facial care. Pores are an important index of skin condition: factors such as age, sleep quality and skin moisture influence the shape and size of facial pores, so understanding the pore condition helps in caring for the skin.
The detection of pores in a facial image typically comprises two important steps: image processing and threshold segmentation. The image processing step mainly addresses uneven illumination and uneven brightness caused by the uneven surface of the skin when the face image is captured, making the contrast between pores and background more obvious and the boundaries better defined.
For example, in a conventional image processing method for detecting pores, a binarized image may be obtained by performing LEP filtering, Gaussian filtering, homomorphic filtering, morphological filtering and the like on the acquired face image. A threshold segmentation method is then applied to the processed facial image; its principle is that, because pores show an obvious color difference from normal skin after image processing, a suitable threshold can be chosen to extract them.
After the pores in a facial image are obtained, they often need to be evaluated. At present, a doctor can evaluate and diagnose the condition of the pores based on subjective experience and objective facts, but pores are influenced by many factors and no unified pore evaluation standard exists. Existing pore evaluation methods include the following two: one is evaluation against standard pore photos; the other is a grading method. The photo evaluation method relies on subjective judgment: a photo to be evaluated is compared with a standard photo to give an evaluation result. The grading method divides pores into several grades according to their number and size. However, due to insufficient screening of pores, facial wrinkles, hair and the like, which are similar in size to pores, also have a certain influence on pore grading, and no objective theoretical method has been proposed for the density measure that is important to pore grading. The current grading method is also coarse and cannot reflect differences among facial pore pictures of the same grade, so how to objectively and accurately score each facial pore picture is of great research significance.
In the above scheme, when determining pores only from the characteristic that pores show an obvious color difference from pixels at other positions after grayscale processing, the threshold segmentation method is easily affected by facial features with elongated contours, such as hair and wrinkles, in the facial image, so the recognition result is inaccurate and recognition accuracy is reduced.
To improve the accuracy of pore region acquisition in facial images, the present application uses the roundness-like feature to acquire pore regions from the target position areas, eliminating the influence of elongated contours such as hair and wrinkles.
Referring to fig. 1, a method flowchart of a pore region acquiring method according to an exemplary embodiment of the present application is shown. As shown in fig. 1, the pore region acquisition method may include the following steps.
Step 101, preprocessing a face image to be detected, and obtaining a target binarized image.
The face image to be detected is any face image that has been captured or received. For example, the computer device may capture the user's face through a camera to obtain a face image, which can serve as the face image to be detected; or the computer device may receive a face image sent by another computer device or terminal and use the received image as the face image to be detected.
Optionally, in the present application, the preprocessing may sequentially include: cropping the face image, performing grayscale processing on the cropped image, and performing homomorphic filtering, local equalization, binarization and morphological processing on the grayscale image. Cropping the face image may mean cropping the forehead area, left face area and right face area out of the face image to be detected to obtain the corresponding sub-regions. Alternatively, the face image to be detected may also refer to the face image corresponding to each sub-region obtained by cropping.
Step 102, for each target position area in the target binarized image, calculating the roundness-like degree corresponding to the target position area, where a target position area is an image area composed of the pixel points in the binarized image whose pixel value is a first value, and the roundness-like degree indicates how closely the contour shape of the target position area approximates a circle.
The pixel value of each pixel point in the binarized image is 0 or 1. In this step, the first value may be 0 or 1, depending on the value that the computer device assigns, during the above preprocessing, to the pixel points of the face image to be detected that may belong to a pore area. For example, if after preprocessing each pixel that may belong to a pore has the value 1, the first value is 1; if each such pixel has the value 0, the first value is 0.
Referring to fig. 2, an image schematic diagram of a target binarized image according to an exemplary embodiment of the present application is shown. As shown in fig. 2, the target binarized image 200 contains pixel points 201 and target position areas 202. Each pixel has a pixel value of "0" or "1", where 1 represents pore and 0 represents non-pore, and each target position area 202 in the target binarized image 200 is an image area composed of pixels whose value is 1.
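The target position areas can be extracted from the binarized image as connected components of first-value pixels. The sketch below uses 4-connectivity, which is an assumption; the patent does not specify the connectivity rule.

```python
import numpy as np
from collections import deque

def connected_regions(binary):
    # Each target position area is a 4-connected group of pixels whose
    # value equals the first value (1 here). Breadth-first flood fill.
    seen = np.zeros_like(binary, dtype=bool)
    regions = []
    h, w = binary.shape
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] == 1 and not seen[sy, sx]:
                queue = deque([(sy, sx)])
                seen[sy, sx] = True
                pixels = []
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny, nx] == 1 and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                regions.append(pixels)
    return regions

img = np.array([[1, 1, 0, 0],
                [0, 1, 0, 1],
                [0, 0, 0, 1]], dtype=np.uint8)
regions = connected_regions(img)   # two separate target position areas
```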
After the target position areas are obtained, the computer device can calculate the roundness-like degree corresponding to each target position area according to its contour shape.
Step 103, acquiring each pore area in the face image to be detected according to the roundness-like degree corresponding to each target position area.
Optionally, the computer device may compare the roundness-like degree corresponding to each target position area with a preset roundness-like threshold, and determine each target position area whose roundness-like degree is greater than the preset threshold as a pore area in the face image to be detected, thereby screening out the pore areas.
In summary, a target binarized image is obtained by preprocessing a face image to be detected; for each target position area in the target binarized image, the corresponding roundness-like degree is calculated, where a target position area is an image area composed of the pixel points in the binarized image whose pixel value is a first value, and the roundness-like degree indicates how closely the contour shape of the target position area approximates a circle; each pore region in the face image to be detected is then acquired according to the roundness-like degree corresponding to each target position area. By using the roundness-like feature to acquire the pore regions from the target position areas, the method eliminates the influence of elongated contours such as hair and wrinkles, and improves the accuracy of pore region acquisition in facial images.
In a possible implementation, the present application can evaluate the pores of the face image to be detected through the obtained pore areas, evaluating them against a uniform standard and improving the accuracy of pore evaluation. Referring to fig. 3, a method flowchart of a pore region acquisition method according to an exemplary embodiment of the present application is shown. As shown in fig. 3, the pore region acquisition method may include the following steps.
Step 301, preprocessing a face image to be detected, and obtaining a target binarized image.
Optionally, during preprocessing of the face image to be detected, the computer device may crop the face image to obtain the forehead region, left face region and right face region of the face image to be detected as separate sub-regions, and execute the subsequent steps on each sub-region. For example, please refer to fig. 4, which illustrates a schematic diagram of the sub-regions obtained by cropping a face image to be detected according to an exemplary embodiment of the present application. As shown in fig. 4, by cropping the face image 400 to be detected, a forehead sub-region image 401, a left face sub-region image 402 and a right face sub-region image 403 are obtained.
Alternatively, in the cropping scheme shown in fig. 4, the computer device extracts feature points of the face image to be detected using the 68-point face feature point detection mode of the opencv function library, crops the left face part, the right face part and the forehead part, and takes the corresponding sub-regions obtained by cropping according to the actual conditions of the detection process. For example, the upper boundary of the picture is taken as the upper boundary of the forehead region, the ordinate of feature point 17 as its lower boundary, and the abscissas of feature points 0 and 16 as its left and right boundaries, so as to obtain the forehead region; the ordinates of feature points 47 and 33 are taken as the upper and lower boundaries, and the abscissas of feature points 36 and 45 as the left and right boundaries, so as to obtain the cheek region.
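Once the landmark coordinates are available, the cropping itself is plain array slicing. The sketch below assumes the 68-point landmarks have already been detected (the detector is not shown); the landmark positions used are hypothetical values for illustration only.

```python
import numpy as np

def crop_forehead(image, landmarks):
    # Forehead region per the description above: upper boundary is the top
    # of the picture, lower boundary the ordinate of point 17, and left and
    # right boundaries the abscissas of points 0 and 16.
    # landmarks: {index: (x, y)} in the standard 68-point scheme.
    bottom = landmarks[17][1]
    left = landmarks[0][0]
    right = landmarks[16][0]
    return image[0:bottom, left:right]

img = np.arange(100 * 100).reshape(100, 100)
# Hypothetical landmark positions, for illustration only.
pts = {0: (20, 60), 16: (80, 60), 17: (25, 30)}
forehead = crop_forehead(img, pts)   # 30 rows x 60 columns
```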
After each sub-region is obtained, grayscale processing can be performed on the image of each sub-region to obtain its grayscale image, and homomorphic filtering and local equalization are then applied in turn to enhance the grayscale image so that the pore parts are highlighted. This picture preprocessing method can reduce the influence of oily shine in the facial image on pore extraction.
Binarization processing is then performed on the grayscale-processed image to obtain an initial binarized image. Optionally, the computer device may screen the pixel values of each pixel in the grayscale-processed image, marking pixels smaller than a preset pixel value as 1 and pixels larger than it as 0, to obtain the initial binarized image. For example, if the preset pixel value is 25, the computer device screens the pixel value of each pixel point of the grayscale-processed image, marks pixels with a value smaller than 25 as 1 and pixels with a value larger than 25 as 0, and thereby obtains the initial binarized image.
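The thresholding step above can be sketched as follows. The description only specifies the smaller-than and larger-than cases, so assigning the equal-to case to 0 is an assumption made here.

```python
import numpy as np

def initial_binarize(gray, preset=25):
    # Pixels darker than the preset value are marked 1 (candidate pore),
    # all others 0. The equal-to case is assigned 0 (an assumption).
    return (gray < preset).astype(np.uint8)

gray = np.array([[10, 30],
                 [25,  5]], dtype=np.uint8)
binary = initial_binarize(gray)   # [[1 0] [0 1]]
```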
Optionally, morphological processing can further be performed on the initial binarized image to eliminate some noise, and the morphologically processed initial binarized image is used as the target binarized image.
Step 302, for each target position area in the target binarized image, calculating the roundness-like degree corresponding to the target position area, where a target position area is an image area composed of the pixel points in the binarized image whose pixel value is a first value, and the roundness-like degree indicates how closely the contour shape of the target position area approximates a circle.
After the computer device obtains the target binarized image, it can obtain, according to the pixel values of the pixel points in the binarized image, the image areas composed of pixel points whose value is the first value, i.e. the target position areas, and calculate the roundness-like degree corresponding to each. The target position areas here are the same as those in fig. 2 and are not described again.
Optionally, for a first target position area among the target position areas, the computer device obtains a first pixel number and a second pixel number, where the first target position area is any one of the target position areas, the first pixel number is the total number of pixel points occupied by the area contour of the first target position area, and the second pixel number is the total number of pixel points within the area enclosed by that contour; the roundness-like degree of the first target position area is then calculated from the first pixel number and the second pixel number.
That is, for each target position area, the computer device obtains the total number of pixel points occupied by its area contour and the total number of pixel points within the area enclosed by that contour, and from these calculates the corresponding roundness-like degree. Referring to fig. 5, an image of a target position area according to an exemplary embodiment of the present application is shown. As shown in fig. 5, the target binarized image 500 includes a first target position area 501 and a first contour 502. The computer device may obtain the total number of pixels occupied by the first contour 502 as the first pixel number of the first target position area 501, and the total number of pixels within the area enclosed by the first contour 502 (i.e., the pixels of the first target position area 501) as the second pixel number of the first target position area 501.
Alternatively, the computer device may substitute the first pixel number and the second pixel number into a first calculation formula to calculate the roundness-like degree of the first target position area.
The first calculation formula is:
e = 4πS / C²
where e is the roundness-like degree, S is the second pixel number, and C is the first pixel number.
For example, in fig. 5 above, if the computer device obtains a first pixel number of 50 and a second pixel number of 2600 for the first target position area, then according to the first calculation formula the roundness-like degree of the first target position area is 4π × 2600 / 50² = 4.16π.
For each target position area, the computer device may calculate the roundness-like degree according to the first calculation formula, thereby obtaining the roundness-like degree corresponding to each target position area.
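Under the first calculation formula e = 4πS/C², the roundness-like degree of a region can be computed directly from its binary mask. Counting contour pixels as region pixels with at least one 4-neighbour outside the region is an assumption; the patent does not state how the contour pixels are counted.

```python
import math
import numpy as np

def roundness_like(region):
    # S: total pixels of the region; C: contour pixels (region pixels
    # with at least one 4-neighbour outside the region). e = 4*pi*S / C^2.
    ys, xs = np.nonzero(region)
    s = len(ys)
    padded = np.pad(region, 1)
    c = 0
    for y, x in zip(ys + 1, xs + 1):
        if not (padded[y - 1, x] and padded[y + 1, x]
                and padded[y, x - 1] and padded[y, x + 1]):
            c += 1
    return 4 * math.pi * s / c ** 2

# A compact square scores higher than an elongated line, which is exactly
# how the roundness-like feature filters out hair- and wrinkle-like shapes.
square = np.zeros((12, 12), dtype=np.uint8)
square[1:11, 1:11] = 1
line = np.zeros((3, 22), dtype=np.uint8)
line[1, 1:21] = 1
square_score = roundness_like(square)
line_score = roundness_like(line)
```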
Step 303, acquiring each pore area in the face image to be detected according to the roundness-like degree corresponding to each target position area.
Optionally, after obtaining the roundness-like degree corresponding to each target position area, the computer device may screen them against a preset roundness-like threshold. For example, with a preset threshold of 0.7, a target position area whose roundness-like degree is greater than 0.7 is taken as a pore area, and one whose roundness-like degree is not greater than 0.7 is not. Through this screening, the individual pore regions are obtained.
Step 304, acquiring the relative area and density of the pores in the target binarized image, where the relative area indicates the proportion of the total area of the pore regions to the total area of the target binarized image.
Optionally, after obtaining the pore areas, the computer device may continue to evaluate the pores. The relative area and density of the pores in the target binarized image may be obtained as follows: the relative area is obtained from the total number of pixel points occupied by the pore regions in the target binarized image and the total number of pixel points of the target binarized image; the barycentric coordinates of each pore region in the pixel coordinate system of the target binarized image are obtained from the zero-order moment and first-order moments of each pore region; and the density of the pores in the target binarized image is calculated from the barycentric coordinates of the pore regions.
For example, if the total number of pixel points occupied by the pore regions in the target binarized image is 9000 and the total number of pixel points in the target binarized image is 36000, the relative area calculated by the computer device is 9000/36000 = 0.25.
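The relative-area computation of step 304 is a single ratio; a minimal sketch, with the function name being illustrative:

```python
def relative_area(pore_pixel_count: int, total_pixel_count: int) -> float:
    # Fraction of the target binarized image covered by pore regions.
    return pore_pixel_count / total_pixel_count

m = relative_area(9000, 36000)  # 0.25, matching the worked example above
```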
Optionally, the zero-order moment is the sum of the pixel values of an image region, and the first-order moments are the sums of each pixel value weighted by the abscissa or the ordinate of its pixel. For each pore region, the zero-order moment of the region in the target binarized image is calculated according to a second calculation formula, and the first-order moments of the region are calculated according to a third and a fourth calculation formula;
the second calculation formula is: $M_{00} = \sum_{I}\sum_{J} V(I,J)$;
the third calculation formula is: $M_{10} = \sum_{I}\sum_{J} I \cdot V(I,J)$;
the fourth calculation formula is: $M_{01} = \sum_{I}\sum_{J} J \cdot V(I,J)$;
$M_{00}$ is the zero-order moment, $M_{10}$ and $M_{01}$ are the first-order moments, $V(I,J)$ is the pixel value of any pixel point in the first pore region, the first pore region being any one of the pore regions, $I$ is the abscissa in the pixel coordinate system of the target binarized image, and $J$ is the ordinate in the pixel coordinate system of the target binarized image.
Optionally, the computer device may substitute the zero-order moment and the first-order moments of each pore region into a fifth calculation formula to calculate the barycentric coordinates of each pore region;
wherein the fifth calculation formula is: $x_{0p} = M_{10p}/M_{00p}$, $y_{0p} = M_{01p}/M_{00p}$;
$x_{0p}$ and $y_{0p}$ respectively represent the abscissa and the ordinate of the barycentric coordinates of the p-th pore region, $M_{00p}$ represents the zero-order moment of the p-th pore region, and $M_{10p}$ and $M_{01p}$ represent the first-order moments of the p-th pore region.
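The moment and barycenter computations described above can be sketched with NumPy; the mask and the function name are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def centroid_from_moments(mask: np.ndarray):
    # M00 sums the pixel values; M10 and M01 weight each pixel value by
    # its column index (I, abscissa) and row index (J, ordinate).
    J, I = np.indices(mask.shape)      # J = row index, I = column index
    V = mask.astype(np.float64)
    m00 = V.sum()
    m10 = (I * V).sum()
    m01 = (J * V).sum()
    return m10 / m00, m01 / m00        # (x0, y0) = (M10/M00, M01/M00)

mask = np.zeros((5, 5))
mask[1:4, 2:5] = 1.0                   # a 3x3 block of "pore" pixels
x0, y0 = centroid_from_moments(mask)   # centre of the block: (3.0, 2.0)
```

For a binary mask the barycenter reduces to the mean pixel position, but the moment form also handles grayscale weights unchanged.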
Optionally, the computer device may obtain respective barycentric coordinates of each pore region according to the second to fifth calculation formulas, and calculate the density according to the respective barycentric coordinates of each pore region.
Optionally, the computer device may substitute the barycentric coordinates of each pore region into a sixth calculation formula to calculate the abscissa variance and the ordinate variance of the pore regions in the pixel coordinate system of the target binarized image, and take the abscissa variance and the ordinate variance as the density of pores in the target binarized image.
Wherein the sixth calculation formula is: $Varx = \frac{1}{N}\sum_{p=1}^{N}(x_{0p}-\bar{x})^{2}$, $Vary = \frac{1}{N}\sum_{p=1}^{N}(y_{0p}-\bar{y})^{2}$; Varx represents the abscissa variance, Vary represents the ordinate variance, N represents the total number of pore regions in the binarized image, p represents the p-th pore region, and $\bar{x} = \frac{1}{N}\sum_{p=1}^{N}x_{0p}$ and $\bar{y} = \frac{1}{N}\sum_{p=1}^{N}y_{0p}$ are the means of the barycentric abscissas and ordinates.
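The density measure, the per-axis variance of the barycenters, can be sketched as follows; the sample centroids are made up for illustration:

```python
import numpy as np

def pore_density(centroids):
    # Population variance (the 1/N form of the sixth formula) of the
    # centroids' x and y coordinates; small variances mean the detected
    # pores cluster tightly, large ones mean they are spread out.
    pts = np.asarray(centroids, dtype=np.float64)
    var_x = pts[:, 0].var()   # NumPy's default ddof=0 matches the 1/N form
    var_y = pts[:, 1].var()
    return var_x, var_y

var_x, var_y = pore_density([(10, 20), (12, 20), (14, 26)])
```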
Step 305, obtaining a test score of pores in the target binarized image according to the relative area and density, wherein the test score is used for indicating an evaluation result of the pores in the target binarized image.
Optionally, when scoring the pores, the obtained relative area M, abscissa variance Varx and ordinate variance Vary may be considered together, and the test score of the pores in the target binarized image is obtained according to these three quantities.
In one possible implementation, a relative area threshold, an abscissa variance threshold and an ordinate variance threshold may be set in the computer device; the obtained relative area M, abscissa variance Varx and ordinate variance Vary are each compared against the corresponding threshold, a score calculation formula is selected according to the comparison results, and the test score of the pores is calculated according to the selected formula.
Referring to table 1, a correspondence between a determination result and a score calculation formula according to an exemplary embodiment of the present application is shown.
TABLE 1
Wherein M1, M2, M3 and M4 are as given in table 1; the computer device queries the corresponding score calculation formula according to the comparison results of the obtained relative area M, abscissa variance Varx and ordinate variance Vary against the relative area threshold and the variance thresholds, as shown in table 1.
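Since the concrete thresholds and score formulas of table 1 are not reproduced in the text, the following sketch only illustrates the dispatch structure of the scoring step; every threshold and returned score is a placeholder, not a value from the patent (table 1 in the patent supplies the actual formulas):

```python
def pore_score(m, var_x, var_y, m_thresh=0.2, var_thresh=50.0):
    # Compare each statistic against its threshold, then dispatch to one
    # of four score branches; the placeholder constants below stand in
    # for the table's score calculation formulas.
    small_area = m <= m_thresh
    clustered = var_x <= var_thresh and var_y <= var_thresh
    if small_area and clustered:
        return 90.0   # few pore pixels, tightly grouped
    if small_area:
        return 75.0   # few pore pixels, spread out
    if clustered:
        return 60.0   # many pore pixels, tightly grouped
    return 40.0       # many pore pixels, spread out
```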
In the above method, since the target binarized image may be the left-face or right-face sub-region of the face image to be detected, when scoring the cheek region of the face image to be detected, the computer device may average the scores of the left-face sub-region and the right-face sub-region and use the average as the score of the cheek region of the face image to be detected.
In summary, a target binarized image is obtained by preprocessing the face image to be detected; for each target position area in the target binarized image, the corresponding roundness is calculated, where a target position area is an image area formed by the pixel points whose pixel values are a first value in the binarized image, and the roundness indicates how close the contour shape of the target position area is to a circle; each pore region in the face image to be detected is then obtained according to the roundness corresponding to each target position area. By adopting the roundness-like characteristic to select pore regions from the target position areas, the influence of slender contours such as hairs and wrinkles is eliminated, and the accuracy of acquiring pore regions in a facial image is improved.
In addition, the center of gravity of each pore contour is calculated from the zero-order and first-order moments, so that one coordinate point represents the position of one pore; the variances of the abscissas and ordinates are then calculated, variance thresholds are set, and the variance thresholds together with the relative area of the pores serve as the pore scoring criteria, improving the accuracy of pore evaluation.
The following are device embodiments of the present application, which may be used to perform method embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
Referring to fig. 6, which shows a block diagram of a pore region acquiring apparatus according to an exemplary embodiment of the present application, the pore region acquiring apparatus 600 may include: an image acquisition module 601, a circularity-like calculation module 602, and a pore acquisition module 603.
The image obtaining module 601 is configured to pre-process a face image to be detected, and obtain a target binarized image.
The roundness calculation module 602 is configured to calculate, for each target position area in the target binarized image, the roundness corresponding to that target position area, where a target position area is an image area formed by the pixel points whose pixel values are a first value in the binarized image, and the roundness indicates how close the contour shape of the target position area is to a circle.
The pore obtaining module 603 is configured to obtain each pore area in the face image to be detected according to the roundness corresponding to each target position area.
In summary, a target binarized image is obtained by preprocessing the face image to be detected; for each target position area in the target binarized image, the corresponding roundness is calculated, where a target position area is an image area formed by the pixel points whose pixel values are a first value in the binarized image, and the roundness indicates how close the contour shape of the target position area is to a circle; each pore region in the face image to be detected is then obtained according to the roundness corresponding to each target position area. By adopting the roundness-like characteristic to select pore regions from the target position areas, the influence of slender contours such as hairs and wrinkles is eliminated, and the accuracy of acquiring pore regions in a facial image is improved.
Referring to fig. 7, a schematic diagram of a computer device according to an exemplary embodiment of the present application is shown. As shown in fig. 7, the computer device may include: radio frequency (RF) circuitry 710, memory 720, input unit 730, display unit 740, sensor 750, audio circuitry 760, WiFi module 770, processor 780, and power supply 790. In the above embodiment, the computer device may be a terminal. Those skilled in the art will appreciate that the terminal structure shown in fig. 7 does not limit the computer device, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
Alternatively, the computer device may be a cell phone, tablet computer, electronic book reader, smart glasses, smart watch, MP3 (Moving Picture Experts Group Audio Layer III) player, MP4 (Moving Picture Experts Group Audio Layer IV) player, notebook computer, laptop computer, desktop computer, and the like.
The various constituent elements of the computer device are described below in conjunction with FIG. 7:
The RF circuit 710 may be configured to receive and transmit signals during messaging or a call; specifically, it receives downlink information from the base station and hands it to the processor 780 for processing, and sends uplink data to the base station. Typically, the RF circuitry 710 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (Low Noise Amplifier, LNA), a duplexer, and the like. In addition, the RF circuitry 710 may also communicate with networks and other devices via wireless communications. The wireless communications may use any communication standard or protocol, including, but not limited to, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 720 may be used to store software programs and modules that the processor 780 performs various functional applications and data processing of the computer device by running the software programs and modules stored in the memory 720. The memory 720 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to the use of the computer device (such as audio data, phonebooks, etc.), and the like. In addition, memory 720 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The input unit 730 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the computer device. In particular, the input unit 730 may include a touch panel 731 and other input devices 732. The touch panel 731, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on or thereabout the touch panel 731 using any suitable object or accessory such as a finger, a stylus, etc.), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch panel 731 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device and converts it into touch point coordinates, which are then sent to the processor 780, and can receive commands from the processor 780 and execute them. In addition, the touch panel 731 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 730 may include other input devices 732 in addition to the touch panel 731. In particular, the other input devices 732 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, mouse, joystick, etc.
The display unit 740 may be used to display information input by a user or provided to the user as well as various menus of the computer device. The display unit 740 may include a display panel 741; alternatively, the display panel 741 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 731 may cover the display panel 741; when the touch panel 731 detects a touch operation on or near it, the operation is transferred to the processor 780 to determine the type of touch event, and the processor 780 then provides a corresponding visual output on the display panel 741 according to the type of touch event. Although in fig. 7 the touch panel 731 and the display panel 741 are two separate components implementing the input and output functions of the computer device, in some embodiments the touch panel 731 and the display panel 741 may be integrated to implement the input and output functions of the computer device.
The computer device may also include at least one sensor 750, such as a light sensor, a motion sensor, and other sensors. In particular, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 741 according to the brightness of ambient light and a proximity sensor that may turn off the display panel 741 and/or the backlight when the computer device is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and direction when stationary, and can be used for recognizing the gesture of the computer equipment (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and knocking), and the like; other sensors such as gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc. that may also be configured with the computer device are not described in detail herein.
The audio circuit 760, the speaker 761 and the microphone 762 may provide an audio interface between the user and the computer device. The audio circuit 760 may transmit the electrical signal converted from received audio data to the speaker 761, which converts it into a sound signal for output; on the other hand, the microphone 762 converts collected sound signals into electrical signals, which the audio circuit 760 receives and converts into audio data; the audio data is then processed by the processor 780 and sent, for example, to another computer device via the RF circuit 710, or output to the memory 720 for further processing.
WiFi belongs to a short-distance wireless transmission technology, and computer equipment can help a user to send and receive e-mails, browse web pages, access streaming media and the like through the WiFi module 770, so that wireless broadband Internet access is provided for the user. Although fig. 7 shows a WiFi module 770, it is understood that it does not belong to the essential constitution of the computer device, and can be omitted entirely as required within the scope of not changing the essence of the invention.
The processor 780 is a control center of the computer device, connects various parts of the entire computer device using various interfaces and lines, and performs various functions of the computer device and processes data by running or executing software programs and/or modules stored in the memory 720, and calling data stored in the memory 720, thereby performing overall monitoring of the computer device. Optionally, the processor 780 may include one or more processing units; preferably, the processor 780 may integrate an application processor that primarily processes operating systems, user interfaces, applications, etc., with a modem processor that primarily processes wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 780.
The computer device also includes a power supply 790 (e.g., a battery) for powering the various components, which may be logically connected to the processor 780 through a power management system, such as to perform charge, discharge, and power management functions by the power management system.
Although not shown, the computer device may further include a camera, a bluetooth module, etc., which will not be described herein.
The embodiments of the present application also disclose a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the method in the method embodiments.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Those skilled in the art will also appreciate that the embodiments described in the specification are all alternative embodiments and that the acts and modules referred to are not necessarily required in the present application.
In various embodiments of the present application, it should be understood that the size of the sequence numbers of the above processes does not mean that the execution sequence of the processes is necessarily sequential, and the execution sequence of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-accessible memory. Based on such understanding, the technical solution of the present application, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server or a network device, etc., and in particular may be a processor in the computer device) to perform part or all of the steps of the methods of the various embodiments of the present application.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods of the above embodiments may be implemented by a program that instructs associated hardware, the program may be stored in a computer readable storage medium including Read-Only Memory (ROM), random access Memory (Random Access Memory, RAM), programmable Read-Only Memory (Programmable Read-Only Memory, PROM), erasable programmable Read-Only Memory (Erasable Programmable Read Only Memory, EPROM), one-time programmable Read-Only Memory (OTPROM), electrically erasable programmable Read-Only Memory (EEPROM), compact disc Read-Only Memory (Compact Disc Read-Only Memory, CD-ROM) or other optical disk Memory, magnetic disk Memory, tape Memory, or any other medium that can be used for carrying or storing data that is readable by a computer.
The foregoing describes, by way of example, the pore region acquisition method, apparatus, computer device and storage medium disclosed in the embodiments of the present application; specific examples are used herein to illustrate the principles and embodiments of the present application, and the above examples are provided only to help understand the methods and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific embodiments and application scope based on the ideas of the present application; in view of the above, the content of this specification should not be construed as limiting the present application.

Claims (9)

1. A method of pore region acquisition, the method comprising:
preprocessing a face image to be detected to obtain a target binarized image;
calculating the similar roundness corresponding to each target position area in the target binarized image, wherein a target position area is an image area formed by the pixel points whose pixel values are a first value in the binarized image, and the similar roundness is used for indicating how close the contour shape of the target position area is to a circle;
acquiring each pore region in the face image to be detected according to the roundness corresponding to each target position region;
acquiring the relative area and density of pores in the target binarized image, wherein the relative area is used for indicating the proportion of the total area of each pore area to the total area of the target binarized image;
and obtaining a test score of pores in the target binarized image according to the relative area and the density, wherein the test score is used for indicating an evaluation result of the pores in the target binarized image.
2. The method of claim 1, wherein the calculating, for each target location area in the target binarized image, a roundness value corresponding to each target location area comprises:
For a first target position area in each target position area, acquiring a first pixel number and a second pixel number, wherein the first target position area is any one position area in each target position area, the first pixel number is the total number of all pixel points occupied by the area outline of the first target position area, and the second pixel number is the total number of all pixel points in the area surrounded by the area outline of the first target position area;
and calculating the roundness of the first target position area according to the first pixel number and the second pixel number.
3. The method of claim 2, wherein calculating the roundness of the first target location area from the first number of pixels and the second number of pixels comprises:
substituting the first pixel number and the second pixel number into a first calculation formula to calculate the roundness of the first target position area;
wherein the first calculation formula is: $\text{roundness} = \frac{4\pi S}{C^{2}}$; S is the second pixel number and C is the first pixel number.
4. The method of claim 1, wherein said obtaining the relative area and density of pores in the target binarized image comprises:
acquiring the relative area according to the total number of pixel points occupied by the pore areas in the target binarized image and the total number of pixel points of the target binarized image;
acquiring the barycentric coordinates of each pore region in the pixel coordinate system of the target binarized image according to the zero-order and first-order moments of each pore region in the target binarized image;
and calculating the density according to the gravity center coordinates of each pore region.
5. The method of claim 4, wherein the obtaining, from the respective zeroth and first moments of the respective pore regions in the target binarized image, respective barycentric coordinates of the respective pore regions in a pixel coordinate system of the target binarized image comprises:
for each pore region, calculating the respective zero-order moment of each pore region in the target binarized image according to a second calculation formula, and calculating the respective first-order moment of each pore region in the target binarized image according to a third calculation formula and a fourth calculation formula;
wherein the second calculation formula is: $M_{00} = \sum_{I}\sum_{J} V(I,J)$;
the third calculation formula is: $M_{10} = \sum_{I}\sum_{J} I \cdot V(I,J)$;
the fourth calculation formula is: $M_{01} = \sum_{I}\sum_{J} J \cdot V(I,J)$;
$M_{00}$ is the zero-order moment, $M_{10}$ and $M_{01}$ are the first-order moments, $V(I,J)$ is the pixel value of any pixel point in a first pore region, the first pore region being any one of the pore regions, $I$ is the abscissa in the pixel coordinate system of the target binarized image, and $J$ is the ordinate in the pixel coordinate system of the target binarized image;
substituting the zero-order moment and the first-order moment of each pore region into a fifth calculation formula respectively, and calculating the gravity center coordinates of each pore region;
wherein the fifth calculation formula is: $x_{0p} = M_{10p}/M_{00p}$, $y_{0p} = M_{01p}/M_{00p}$;
$x_{0p}$ and $y_{0p}$ respectively represent the abscissa and the ordinate of the barycentric coordinates of the p-th pore region, $M_{00p}$ represents the zero-order moment of the p-th pore region, and $M_{10p}$ and $M_{01p}$ represent the first-order moments of the p-th pore region.
6. The method of claim 4, wherein said calculating said density from the respective barycentric coordinates of said respective pore regions comprises:
substituting the gravity center coordinates of each pore region into a sixth calculation formula, and calculating the abscissa variance and the ordinate variance of each pore region in the pixel coordinate system;
And taking the abscissa variance and the ordinate variance as the density.
Wherein the sixth calculation formula is: $Varx = \frac{1}{N}\sum_{p=1}^{N}(x_{0p}-\bar{x})^{2}$, $Vary = \frac{1}{N}\sum_{p=1}^{N}(y_{0p}-\bar{y})^{2}$; Varx represents the abscissa variance, Vary represents the ordinate variance, N represents the total number of pore regions in the binarized image, p represents the p-th pore region, and $\bar{x} = \frac{1}{N}\sum_{p=1}^{N}x_{0p}$ and $\bar{y} = \frac{1}{N}\sum_{p=1}^{N}y_{0p}$ are the means of the barycentric abscissas and ordinates.
7. A pore region acquisition apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for preprocessing the face image to be detected and acquiring a target binarized image;
the roundness-like calculation module is used for calculating the roundness-like degree corresponding to each target position area in the target binarized image, wherein the target position area is an image area formed by each pixel point with a first pixel value of the pixel points in the binarized image, and the roundness-like degree is used for indicating the approaching degree of the outline shape of the target position area to the circle;
the pore acquisition module is used for acquiring each pore region in the face image to be detected according to the roundness corresponding to each target position region;
the apparatus further comprises:
the first acquisition module is used for acquiring the relative area and density of pores in the target binary image, wherein the relative area is used for indicating the proportion of the total area of each pore area to the total area of the target binary image;
And the second acquisition module is used for obtaining a test score of pores in the target binarized image according to the relative area and the density, wherein the test score is used for indicating an evaluation result of the pores in the target binarized image.
8. A computer device comprising a memory and a processor, the memory having stored therein a computer program which, when executed by the processor, causes the processor to implement the pore region acquisition method as claimed in any one of claims 1 to 6.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the pore region acquisition method as claimed in any one of claims 1 to 6.
CN202110363508.8A 2021-04-02 2021-04-02 Pore region acquisition method, pore region acquisition device, computer equipment and storage medium Active CN113129315B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110363508.8A CN113129315B (en) 2021-04-02 2021-04-02 Pore region acquisition method, pore region acquisition device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110363508.8A CN113129315B (en) 2021-04-02 2021-04-02 Pore region acquisition method, pore region acquisition device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113129315A CN113129315A (en) 2021-07-16
CN113129315B true CN113129315B (en) 2024-04-09

Family

ID=76774788

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110363508.8A Active CN113129315B (en) 2021-04-02 2021-04-02 Pore region acquisition method, pore region acquisition device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113129315B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102445257B1 (en) * 2022-02-23 2022-09-23 주식회사 룰루랩 Method and apparatus for detecting pores based on artificial neural network and visualizing the detected pores

Citations (3)

Publication number Priority date Publication date Assignee Title
CN109063598A (en) * 2018-07-13 2018-12-21 北京科莱普云技术有限公司 Face pore detection method, device, computer equipment and storage medium
CN112037162A (en) * 2019-05-17 2020-12-04 华为技术有限公司 Facial acne detection method and equipment
CN112258396A (en) * 2020-12-17 2021-01-22 恒银金融科技股份有限公司 Method for scaling character image

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US10368795B2 (en) * 2014-06-30 2019-08-06 Canfield Scientific, Incorporated Acne imaging methods and apparatus

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN109063598A (en) * 2018-07-13 2018-12-21 北京科莱普云技术有限公司 Face pore detection method, device, computer equipment and storage medium
CN112037162A (en) * 2019-05-17 2020-12-04 华为技术有限公司 Facial acne detection method and equipment
CN112258396A (en) * 2020-12-17 2021-01-22 恒银金融科技股份有限公司 Method for scaling character image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Nermin Kamal Abdel Wahab; Elsayed Essa Hemayed; Magda Bahaa Fayek. HEARD: An automatic human EAR detection technique. 2012 International Conference on Engineering and Technology (ICET). 2012, full text. *
Research on Detection and Quantitative Evaluation of Pores in Face Images; Feng Xianghui; China Masters' Theses Full-text Database; full text *
Design and Implementation of a Portable Facial Skin Detection and Evaluation System; Zhang Jingyuan; China Masters' Theses Full-text Database; main text, page 7 line 1 to page 43 line 17, Figures 2-5 to 2-14 *

Also Published As

Publication number Publication date
CN113129315A (en) 2021-07-16

Similar Documents

Publication Publication Date Title
CN107633207B (en) AU characteristic recognition methods, device and storage medium
CN107977674B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108259758B (en) Image processing method, image processing apparatus, storage medium, and electronic device
US9665763B2 (en) Finger/non-finger determination for biometric sensors
CN110163806B (en) Image processing method, device and storage medium
CN109346061B (en) Audio detection method, device and storage medium
US11653873B2 (en) Skin detection device and product information determination method, device and system
EP2879095A1 (en) Method, apparatus and terminal device for image processing
CN108875594B (en) Face image processing method, device and storage medium
CN110287918B (en) Living body identification method and related product
CN103745235A (en) Human face identification method, device and terminal device
CN110570460B (en) Target tracking method, device, computer equipment and computer readable storage medium
CN107451454B (en) Unlocking control method and related product
CN107832784A (en) A kind of method of image beautification and a kind of mobile terminal
CN108198159A (en) A kind of image processing method, mobile terminal and computer readable storage medium
CN110765924A (en) Living body detection method and device and computer-readable storage medium
CN107091704A (en) Pressure detection method and device
CN113129315B (en) Pore region acquisition method, pore region acquisition device, computer equipment and storage medium
CN112995757B (en) Video clipping method and device
CN115482157A (en) Image processing method and device and computer equipment
CN111553854A (en) Image processing method and electronic equipment
CN109359543B (en) Portrait retrieval method and device based on skeletonization
CN114943976B (en) Model generation method and device, electronic equipment and storage medium
CN111402271A (en) Image processing method and electronic equipment
CN110598762A (en) Audio-based trip mode detection method and device and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant