CN112818916A - Method for automatically and truly measuring related parameters of aesthetic standards of human faces - Google Patents
- Publication number
- CN112818916A (application CN202110207706.5A)
- Authority
- CN
- China
- Prior art keywords
- face image
- feature points
- distance
- real
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
Abstract
The invention discloses a method for automatically measuring the real-world values of parameters related to facial aesthetic standards. It belongs to the field of image processing and addresses the complexity and high cost of existing approaches to measuring such parameters. In an offline stage, a reference face is selected and the real size of one of its pixels is measured. In the testing stage, the real size of a unit pixel of any face image can then be computed conveniently and accurately, from which the parameters related to facial aesthetic standards follow. A computer program only needs to store the unit-pixel real distance of the reference face image and the reference image's feature points, and can quickly compute the unit-pixel real size of any face image at test time. The method therefore has the major advantages of small storage, accurate measurement, and convenience of use.
Description
Technical Field
The invention belongs to the field of image processing, and particularly relates to face measurement technology.
Background
With rising social and economic standards, people pay increasing attention to their appearance, and spending at institutions such as skin management centers, beauty parlors, and plastic surgery hospitals keeps growing. For all of these, collecting basic size information about the facial features is an essential step. Plastic surgery hospitals in particular must accurately measure the real size of facial features, including aesthetic analyses such as the "three courts and five eyes" proportions and the dimensions of each part of the face (eyebrows, eyes, nose, mouth, chin, and so on), as a necessary condition for surgical decisions. Skin management centers likewise need accurate physical sizes of skin pores, wrinkles, and pigmented spots to guide care and treatment.
The prior art has the following drawbacks: either a graduated ruler must be placed in every photograph and the real size of a unit pixel computed manually, or expensive laser or 3D ranging devices are required.
Disclosure of Invention
In order to solve the technical problem, the invention provides a method for automatically and truly measuring related parameters of human face aesthetic standards.
The technical scheme adopted by the invention is as follows: a method for automatically measuring the real values of parameters related to facial aesthetic standards, comprising the following steps:
S1, calculating the real size of a unit pixel of the reference face image;
S2, mapping the feature points of the test face image onto the feature points of the reference face image by a similarity transformation;
S3, calculating the pixel distance between any two feature points of the transformed test face;
S4, calculating the real distance between those two feature points on the transformed test face image from the pixel distance obtained in step S3;
and S5, calculating the real size of a unit pixel of the test face image from the pixel distance of step S3 and the real distance of step S4, and from it obtaining the parameters related to the aesthetic standards of the test face.
Step S1 includes:
S11, arbitrarily selecting two feature points from the reference face image and measuring the real distance between them;
S12, calculating the pixel distance between the same two feature points on the reference face image;
and S13, calculating the real size of a unit pixel of the reference face image from the real distance of step S11 and the pixel distance of step S12.
The feature points in the reference face image and the test face image are obtained using face detection and face feature point detection techniques.
The calculation formula of the real size of the unit pixel of the reference face image is as follows:
Rr=Dr/Dp
wherein, Rr is the real size of a unit pixel of the reference face image, Dr is the real distance between two arbitrarily selected feature points of the reference face image, and Dp is the pixel distance between the two arbitrarily selected feature points of the reference face image.
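This offline computation can be sketched as follows; the function name and coordinates are illustrative, not from the patent, and the worked numbers are the ones used later in the description (Dr = 33.4 mm, Dp = 167 px):

```python
import math

def unit_pixel_size(p, q, real_distance_mm):
    """Real-world size of one pixel (mm/px): the measured real distance
    between feature points p and q divided by their pixel distance."""
    pixel_distance = math.hypot(p[0] - q[0], p[1] - q[1])  # Euclidean distance Dp
    return real_distance_mm / pixel_distance               # Rr = Dr / Dp

# Reference-face numbers from the patent's worked example:
# Dr = 33.4 mm, Dp = 167 px  ->  Rr = 0.2 mm/px
Rr = unit_pixel_size((0, 0), (167, 0), 33.4)
```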
The calculation formula of step S2 is:

x' = s*(x*cosθ - y*sinθ) + tx
y' = s*(x*sinθ + y*cosθ) + ty

where θ is the rotation angle, s is the scale factor, tx is the translation in the x direction, and ty is the translation in the y direction.
The calculation formula of step S4 is:
Drc=Dpc*Rr
wherein Dpc is the pixel distance between two arbitrarily selected feature points on the transformed test face image, and Drc is the real distance between the same two feature points on the transformed test face image.
The beneficial effects of the invention are as follows: only a reference face needs to be selected in an offline stage and the real size of its unit pixel measured. In the testing stage, the real size of a unit pixel of any face image can then be computed conveniently and accurately. A computer program needs to store only the unit-pixel real distance of the reference face image and the reference image's feature points, and quickly computes the unit-pixel real size of any face image at test time, giving the method the major advantages of small storage, accurate measurement, and convenience of use.
Drawings
FIG. 1 is a flowchart of calculating a true dimension of a unit pixel of a reference face image according to an embodiment of the present invention;
FIG. 2 is a face image for calculating the true size of a unit pixel of a reference face image according to an embodiment of the present invention;
FIG. 3 is a flowchart for calculating the true size of a unit pixel of a face image according to an embodiment of the present invention;
FIG. 4 is a face image for performing face and face feature point detection on a test face image according to an embodiment of the present invention;
FIG. 5 is a face image for calculating a pixel distance between two feature points of a test face image according to an embodiment of the present invention;
FIG. 6 is a face image in which feature points of a test face image are transformed to feature points of a reference face according to an embodiment of the present invention;
FIG. 7 is a face image obtained by calculating a pixel distance between two feature points of a transformed test face image according to an embodiment of the present invention;
FIG. 8 is a face image labeled with the three-court aesthetic measurements of a test face image according to an embodiment of the present invention;
FIG. 9 is a face image labeled with the five-eye aesthetic measurements of a test face image according to an embodiment of the present invention;
FIG. 10 is a face image labeled with the golden-triangle aesthetic measurement of a test face image according to an embodiment of the present invention.
Detailed Description
In order to facilitate understanding of the contents of the present invention, the following technical terms are first explained:
1. face detection and feature point detection
Face detection refers to finding the position of a face in an image. Face feature point detection refers to locating the facial features (eyebrows, eyes, nose, mouth, and contour) as key points within the face region.
2. Human face feature point similarity transformation
The feature-point similarity transformation means that the figure formed by the face feature points changes from one state to another while its shape is preserved; its size, orientation, and position may change. It can be represented by the following matrix form:

[x']     [cosθ  -sinθ] [x]   [tx]
[y'] = s*[sinθ   cosθ] [y] + [ty]

The transformation has four degrees of freedom: the rotation angle θ, the scale s, and the translations tx and ty.
3. Solving similarity transformation matrix
The similarity transformation has four degrees of freedom, i.e. four unknown variables; in principle, two point correspondences yield four equations, which is enough to solve for them. Since we have many corresponding point pairs, we instead compute an optimal solution by least squares.
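The patent does not name a specific least-squares algorithm, so the following is an assumed but standard way to solve for the four parameters from many point pairs (the closed-form Umeyama solution):

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares similarity transform (s, R, t) mapping src -> dst.
    src, dst: (N, 2) arrays of corresponding feature points.
    Closed-form Umeyama solution via SVD of the cross-covariance."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance matrix
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))        # guard against reflections
    D = np.diag([1.0, d])
    R = U @ D @ Vt                            # optimal rotation
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(S) @ D) / var_src    # optimal scale
    t = mu_d - s * R @ mu_s                   # optimal translation
    return s, R, t
```

Mapping the test-face feature points with `s * R @ p + t` then places them in the reference image's scale space.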
4. Parameters related to human face aesthetic standard
The face has many widely accepted aesthetic metrics, such as the "three courts and five eyes" proportions and the golden triangle.
The "three courts and five eyes" rule is a common standard for the proportions of a person's face length and width: in its simplest form, the head is divided vertically into three equal parts (the three courts) and its width into five equal parts (the five eyes).
The golden triangle is the inverted triangle formed by the centers of the two orbits (the pupil centers) and the center of the piriform aperture (the midpoint of the columella).
The method of the invention comprises two stages:
A. Offline stage
As shown in fig. 1, in this stage a frontal reference face picture is prepared, and its feature points are obtained using face detection and face feature point detection techniques. The real distance between any two feature points is measured with a measuring tool and denoted Dr; the pixel distance between the same two points, denoted Dp, is their Euclidean distance.
For pixels p(x1, y1) and q(x2, y2), where (x1, y1) and (x2, y2) are the image coordinates of p and q, the Euclidean distance between the two pixels is defined as:

Dp = sqrt((x1 - x2)^2 + (y1 - y2)^2)

On average, each pixel of the selected reference picture then represents a true distance of:
Rr=Dr/Dp
To reduce error, the real distances and pixel distances between several different feature-point pairs can be measured and the resulting values averaged into Rr. As shown in fig. 2, the real distance Dr between the two feature points at the right eye corner of the reference face is 33.4 mm and the pixel distance Dp is 167 px (px = pixel), so Rr = 33.4/167 = 0.2 mm/px;
B. Measurement stage. As shown in fig. 3, it comprises the following sub-steps:
b1, detecting the face and the face characteristic points of the tested face image, as shown in fig. 4.
B2, selecting two feature points of the test face (here, the two feature points at the right eye corner) and calculating the pixel distance between them, i.e. their Euclidean distance, by the same procedure used for Dp in stage A; the pixel distance Dpc between the two right-eye-corner feature points of the test face is 40 px, as shown in fig. 5.
B3, performing the similarity transformation on the feature points of the test face image to map them onto the feature points of the reference face image. As shown in fig. 6, the black points are the feature points of the reference image and the white points are the transformed feature points of the test image.
B4, calculating the pixel distance between the transformed test-face feature points; the pixel distance between the two right-eye-corner feature points of the transformed test face is 147 px, as shown in fig. 7.
B5, calculating the real size of a unit pixel of the test face image.
In the offline stage, the real size of a unit pixel of the reference face image was computed as Rr = 33.4/167 = 0.2 mm/px. After the similarity transformation maps the test-face feature points onto the reference-face feature points, the transformed points and the reference feature points lie in the same scale space, i.e. they share the same unit-pixel real size. Using the transformed pixel distance of 147 px from step B4, the real distance between the two transformed test-face feature points, denoted Drc, is:

Drc = 147px * 0.2mm/px = 29.4mm

The real size of a unit pixel of the test face image, denoted Rc, then follows from Drc and the untransformed pixel distance Dpc = 40 px of step B2:

Rc = Drc/Dpc = 29.4mm/40px = 0.735mm/px
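Steps B2 through B5 chain together as follows; the function and argument names are illustrative, and the numbers are the worked values from the description (40 px before transformation, 147 px after, Rr = 0.2 mm/px):

```python
def unit_pixel_size_of_test_image(dpc_original_px, dpc_transformed_px, rr_mm_per_px):
    """Real size of one pixel of the test image (mm/px).
    dpc_original_px: pixel distance between two feature points on the
        original test image (step B2).
    dpc_transformed_px: pixel distance between the same two points after
        similarity-transforming the test landmarks onto the reference face (B4).
    rr_mm_per_px: unit-pixel real size of the reference image (offline stage)."""
    drc_mm = dpc_transformed_px * rr_mm_per_px  # real distance, shared scale space
    return drc_mm / dpc_original_px             # Rc = Drc / Dpc

# Worked example from the description:
Rc = unit_pixel_size_of_test_image(40, 147, 0.2)  # -> 0.735 mm/px
```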
B6, measuring aesthetic parameters on the test face image.
Once the real size Rc of a unit pixel of the test face image is known, the real distance between any two points, and hence the related aesthetic index parameters, can be calculated. Take the "three courts and five eyes" proportions and the golden triangle as examples:
three-family five-eye related aesthetic criteria calculation:
three parts: the length ratio of the face is divided into three equal parts, the part from the forehead hairline to the eyebrow bone is called as the upper atrium, the part from the eyebrow bone to the nasal floor is called as the middle atrium, and the part from the nasal floor to the chin is called as the lower atrium.
According to the real size of the unit pixel of the face image to be tested and the pixel distance of the three-family on the picture, the real distance of the three-family is known by multiplying the real size of the unit pixel of the face image to be tested and the pixel distance of the three-family on the picture. As shown in fig. 8, the three dimensions from top to bottom are: 73.52mm, 78.50mm and 75.23 mm.
Definition of the five eyes: the width of an ideal face equals five eye-lengths, taking the length of one eye as the unit. The first eye runs from the left hairline to the left outer canthus, the second from the left outer canthus to the left inner canthus, the third is the distance between the two inner canthi, the fourth runs from the right inner canthus to the right outer canthus, and the fifth from the right outer canthus to the right hairline.
Multiplying the real size of a unit pixel of the test face image by the pixel distance of each of the five eyes on the picture gives their real distances. As shown in fig. 9, the five eye sizes from left to right are: 32.83 mm, 30.29 mm, 41.30 mm, 30.57 mm, and 30.68 mm.
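The three-court and five-eye measurements both reduce to scaling per-segment pixel lengths by Rc; a minimal sketch, where the five pixel widths are hypothetical and Rc = 0.735 mm/px is the worked value from step B5:

```python
def segment_lengths_mm(pixel_lengths, rc_mm_per_px):
    """Convert per-segment pixel distances (e.g. the five eye widths
    measured between adjacent landmarks) to real lengths in mm."""
    return [px * rc_mm_per_px for px in pixel_lengths]

# Hypothetical pixel widths for the five eye segments, scaled by Rc.
widths_mm = segment_lengths_mm([40, 40, 55, 40, 40], 0.735)
```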
Calculation of the golden-triangle aesthetic standard:
The golden triangle is the inverted triangle formed by the centers of the two orbits (the pupil centers) and the center of the piriform aperture (the midpoint of the columella). From the real size of a unit pixel of the test face image and the position of each vertex of the golden triangle on the picture, the real side lengths and the angles of the golden triangle can be obtained; in fig. 10 only the most important angle of the golden triangle is marked, at 67.38 degrees.
It will be appreciated by those of ordinary skill in the art that the embodiments described herein are intended to help the reader understand the principles of the invention, which is not limited to the specifically described embodiments and examples. Various modifications and alterations will be apparent to those skilled in the art; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention falls within the scope of the claims of the present invention.
Claims (6)
1. A method for automatically and truly measuring parameters related to aesthetic standards of human faces is characterized by comprising the following steps:
s1, calculating to obtain the real size of the unit pixel of the reference face image;
S2, mapping the feature points of the test face image onto the feature points of the reference face image by a similarity transformation;
s3, calculating the pixel distance between any two feature points of the transformed test face;
s4, calculating the real distance between any two feature points on the transformed test face image according to the pixel distance between the two feature points of the transformed test face image;
and S5, calculating the true size of the unit pixel of the tested face image according to the pixel distance calculated in the step S3 and the true distance calculated in the step S4.
2. The method for automatically and truly measuring the parameters related to the aesthetic standard of the human face as claimed in claim 1, wherein the step S1 comprises:
S11, arbitrarily selecting two feature points from the reference face image and measuring the real distance between them;
s12, calculating the pixel distance between the two characteristic points of the reference face image;
and S13, calculating the real size of the unit pixel of the reference face image according to the real distance of the step S11 and the pixel distance of the step S12.
3. The method according to claim 2, wherein the characteristic points in the reference face image or the test face image are obtained by face detection and face characteristic point detection.
4. The method according to claim 3, wherein the calculation formula of the real size of the unit pixel of the reference face image is as follows:
Rr=Dr/Dp
wherein, Rr is the real size of a unit pixel of the reference face image, Dr is the real distance between two arbitrarily selected feature points of the reference face image, and Dp is the pixel distance between the two arbitrarily selected feature points of the reference face image.
5. The method for automatically and truly measuring the parameters related to the aesthetic standard of the human face as claimed in claim 4, wherein the calculation formula of step S2 is:

x' = s*(x*cosθ - y*sinθ) + tx
y' = s*(x*sinθ + y*cosθ) + ty

where θ is the rotation angle, s is the scale factor, tx is the translation in the x direction, and ty is the translation in the y direction.
6. The method for automatically and truly measuring the parameters related to the aesthetic standard of the human face as claimed in claim 5, wherein the calculation formula of step S4 is:
Drc=Dpc*Rr
wherein Dpc is the pixel distance between two arbitrarily selected feature points on the transformed test face image, and Drc is the real distance between the same two feature points on the transformed test face image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110207706.5A CN112818916A (en) | 2021-02-25 | 2021-02-25 | Method for automatically and truly measuring related parameters of aesthetic standards of human faces |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112818916A true CN112818916A (en) | 2021-05-18 |
Family
ID=75865422
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110207706.5A Pending CN112818916A (en) | 2021-02-25 | 2021-02-25 | Method for automatically and truly measuring related parameters of aesthetic standards of human faces |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112818916A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150131853A1 (en) * | 2013-11-08 | 2015-05-14 | Electronics And Telecommunications Research Institute | Stereo matching system and method for generating disparity map using same |
CN109657607A (en) * | 2018-12-17 | 2019-04-19 | 中新智擎科技有限公司 | A kind of human face target distance measuring method, device and storage medium based on recognition of face |
CN109961006A (en) * | 2019-01-30 | 2019-07-02 | 东华大学 | A kind of low pixel multiple target Face datection and crucial independent positioning method and alignment schemes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20210518 |