CN109886099B - Method for establishing age evaluation standard model - Google Patents

Method for establishing age evaluation standard model Download PDF

Info

Publication number
CN109886099B
Authority
CN
China
Prior art keywords
matrix
age
face image
images
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910027338.9A
Other languages
Chinese (zh)
Other versions
CN109886099A (en)
Inventor
吕宁
陈晨
朱双四
牛振兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201910027338.9A priority Critical patent/CN109886099B/en
Publication of CN109886099A publication Critical patent/CN109886099A/en
Application granted granted Critical
Publication of CN109886099B publication Critical patent/CN109886099B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for establishing an age evaluation standard model, which comprises the following steps: establishing a face image sample set; establishing a plurality of groups of face image pairs according to the face image sample set; establishing a first matrix according to the plurality of groups of face image pairs; assigning values to the first matrix to generate a second matrix; generating a cost matrix according to the second matrix; and generating the age evaluation standard model according to the cost matrix. In this application, the first matrix is established from the real ages of the face images; the ages of the two face images in each face image pair are then estimated to obtain an evaluation score, which is combined with the first matrix to assign a value to each coordinate of the first matrix and generate the second matrix; the second matrix is used to generate the judgment standard model. Because human perception is taken into account, the judgment standard model of this application makes the judgment of age estimation results more accurate.

Description

Method for establishing age evaluation standard model
Technical Field
The invention relates to the technical field of data processing, in particular to a method for establishing an age evaluation standard model.
Background
Age estimation has long been a hot problem in computer vision and has great application prospects in profile analysis and business management. Early studies on age estimation mainly followed two approaches: multi-class classification and metric regression. In the multi-classification approach, the class labels are considered independent of each other, so the ordinal relationship between adjacent ages is not taken into account. In the metric regression approach, the age labels are modeled as numerical variables under the assumption that the differences between adjacent label values represent equal distances; however, according to facial measurement, the facial changes corresponding to different adjacent age groups differ, that is, the distances are not equal, so the metric regression approach is not rigorous.
The currently prevailing methods for age estimation are based on ordinal regression models and use MAE (Mean Absolute Error) as the criterion for measuring the quality of the age estimation result. Specifically, MAE is defined as the mean absolute error between the true value and the predicted value:

MAE = (1/N) Σ_{i=1}^{N} |y_i − f(x_i)|

where y_i denotes the true age label and f(x_i) the predicted age. Under this MAE error model, the prediction cost between adjacent ages is the same in every age period. However, according to human facial measurement, this cost should vary from one age period to another. For example, mis-predicting an age of 12 as 13 and mis-predicting 36 as 37 appear clearly different to a human observer, while the MAE assigns both errors the same value. In other words, the MAE does not take human perception into account, which has a great impact on the study of age estimation.
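As a concrete illustration, here is a minimal Python sketch of the MAE criterion just described; the function and the example ages are illustrative, not data from the patent.

```python
import numpy as np

def mean_absolute_error(true_ages, predicted_ages):
    """MAE = (1/N) * sum(|y_i - f(x_i)|): every one-year error costs the
    same, regardless of which age period it occurs in."""
    true_ages = np.asarray(true_ages, dtype=float)
    predicted_ages = np.asarray(predicted_ages, dtype=float)
    return np.mean(np.abs(true_ages - predicted_ages))

# Hypothetical example: a one-year error at ages 12->13 and at 36->37
# contributes identically to the MAE, which is exactly the limitation
# pointed out above.
print(mean_absolute_error([13, 37], [12, 36]))  # -> 1.0
```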
Disclosure of Invention
In order to overcome the problems in the prior art, the invention provides a method for establishing an age assessment standard model. Specific embodiments are as follows:
the embodiment of the invention provides a method for establishing an age evaluation criterion model, which comprises the following steps:
establishing a face image sample set;
establishing a plurality of groups of face image pairs according to the face image sample set;
establishing a first matrix according to the plurality of groups of face image pairs;
assigning values to the first matrix to generate a second matrix;
generating a cost matrix according to the second matrix;
and generating the age evaluation standard model according to the cost matrix.
In a specific embodiment, the establishing of the face image sample set includes:
acquiring N face images, wherein N is an integer greater than or equal to 2;
performing similarity screening on the N face images to obtain M sample images, wherein M is less than or equal to N;
wherein the face image sample set comprises the M sample images.
In a specific embodiment, establishing a plurality of sets of face image pairs from the face image sample set includes:
randomly selecting two images from the face image sample set to generate a group of face image pairs;
and generating a plurality of groups of the human face image pairs.
In a specific embodiment, building a first matrix from a plurality of sets of the pairs of face images includes:
generating position coordinates of the human face image pair according to the real ages of the two human face images in the human face image pair;
generating a plurality of position coordinates according to the plurality of groups of face image pairs;
generating the first matrix from a plurality of the position coordinates.
In a specific embodiment, assigning the first matrix and generating the second matrix includes:
acquiring an age difference judgment result of each group of the face image pairs;
assigning the position coordinate with the age difference judgment result to generate an age difference matrix;
and carrying out normalization processing on the age difference matrix to generate the second matrix.
In a specific embodiment, after generating the second matrix, the method further includes:
setting an age difference threshold;
acquiring the real ages of two face images in the face image pair, and solving the absolute value of the age difference according to the real ages;
and when the absolute value of the age difference is larger than the age difference threshold value, assigning 1 to the position coordinates of the human face image pair.
In a specific embodiment, generating the cost matrix according to the second matrix includes:
acquiring an absolute cost matrix;
and carrying out operation processing on the second matrix and the absolute cost matrix to generate the cost matrix.
In a specific embodiment, establishing the position coordinates of the pair of face images according to the ages of the two face images in the pair of face images includes:
acquiring the real ages of two face images in the face image pair;
acquiring an age threshold;
processing the real age according to the age threshold to generate a second age value;
tagging the second age value as a position coordinate of the pair of face images.
In a specific embodiment, processing the real age according to the age threshold includes:
subtracting the age threshold from the true age.
In a specific embodiment, obtaining the age difference determination result of each group of the face image pairs includes:
estimating the ages of the two face images in the face image pair to obtain an estimated result;
calculating to obtain an estimated age difference value according to the estimation result;
comparing the estimated age difference with the actual age difference to obtain an evaluation score of the human face image pair;
and marking the evaluation score as the age difference judgment result of the face image pair.
Compared with the prior art, the invention has the beneficial effects that:
1. According to the method for establishing the age estimation judgment standard model provided herein, the first matrix is established according to the real ages of the face images; the ages of the two face images in each face image pair are then estimated to obtain an evaluation score, which is combined with the first matrix to assign a value to each coordinate of the first matrix and generate the second matrix; the judgment standard model is then generated from the second matrix.
2. By assigning values to the first matrix, the embodiment of the invention is the first to introduce human perceptual capability, so that the prediction results better fit human facial metrology.
Drawings
FIG. 1 is a logic diagram of the method for establishing an age estimation criteria model according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Example one
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1, fig. 1 is a logic diagram of a method for establishing an age estimation and judgment standard model according to the present invention. The embodiment of the invention provides a method for establishing an age evaluation criterion model, which comprises the following steps:
the method comprises the steps of establishing a face image sample set, wherein the face image sample set needs numerous pictures, the age of a face image needs a relatively stable region, specifically, the face image can be collected on a social network, such as a human network, the human network is a social network for students, a user can publish photos and other contents, and the like, and the face image sample set is widely used by Asian students. Even for those who have already been graduate, the classmates and friends at one time can be contacted by the people. Therefore, the approximate range of the user's age on the top of the human is between 15-40 years, based on which we set the age width of the face image sample set to be [15,70], besides, the birthday time of the user can be obtained from the network, and therefore, the real age of the face image of the user can be obtained.
After N face images are acquired from the network, they are screened, identical images are removed, and M sample images are finally obtained and taken as the face image sample set. It should be noted that not only the face images but also the real age of each face image needs to be obtained.
Further, the images in the face image sample set are paired. The pairing is pairwise: every two images form a pair, so the M sample images can form C_M^2 image pairs.
Further, position coordinates are established for each image pair. Specifically, each image pair includes a first image and a second image with an age difference between them. For example, in a certain image pair the first image is a face image of a middle-aged woman with a true age of 45 and the second image is a face image of a middle-aged man with a true age of 47; with an age threshold of 15 and origin coordinates of (15, 15), the position coordinates of this image pair are (30, 32), indicating that the pair is located in row 30, column 32 of the first matrix. If the first image is 16 years old and the second image is 17 years old, the coordinates of the pair are (1, 2), i.e., row 1, column 2 of the first matrix; if the first image is 17 and the second image is 16, the coordinates are (2, 1), i.e., row 2, column 1; if the first image is 18 and the second image is 36, the coordinates are (3, 21), i.e., row 3, column 21; if the first image is 70 and the second image is 69, the coordinates are (55, 54), i.e., row 55, column 54. Each image pair has one position coordinate, so there are C_M^2 position coordinates in total, and the coordinate values of each position coordinate are the second age values of the two face images in the pair.
A first matrix is thereby established (an example matrix is shown as an image in the original document). The first matrix is an age matrix whose rows correspond to the ages of the first image and whose columns correspond to the ages of the second image.
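To make the coordinate construction concrete, the following sketch (an interpretation of the text, with hypothetical image labels and ages) subtracts the age threshold from each true age to obtain a pair's row and column, matching the 45/47 → (30, 32) example above.

```python
import itertools

AGE_THRESHOLD = 15   # lower bound of the [15, 70] age width in this embodiment

def pair_coordinates(age_first, age_second):
    """Shift both true ages by the age threshold to get a (row, col) position."""
    return age_first - AGE_THRESHOLD, age_second - AGE_THRESHOLD

# Hypothetical sample set: image label -> true age (illustrative values only).
sample_ages = {"A": 45, "B": 47, "C": 16, "D": 17}

# Pair every two images and record where each pair falls in the first matrix.
pairs_at_coordinate = {}
for (img1, a1), (img2, a2) in itertools.combinations(sample_ages.items(), 2):
    coord = pair_coordinates(a1, a2)
    pairs_at_coordinate.setdefault(coord, []).append(img1 + img2)

print(pair_coordinates(45, 47))   # -> (30, 32), matching the example above
print(pairs_at_coordinate)
```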
Further, each image pair is evaluated for age. Ten evaluators are selected to assess a given image pair, and each gives an age difference value, so ten age difference values are obtained in total.
Each of the ten age differences is compared with the actual age difference of the image pair: an accurate judgment scores 1 point and an inaccurate judgment scores 0 points, and the points are then summed. For example:
the true age of the first image is 45 years old, the true age of the second image is 47 years old, and the difference between the true ages is 2 years old;
the first evaluator judges the age difference of the two images to be 3 years old; the first evaluator judges the age difference of the two images to be 2 years old; the third evaluator judges the age difference of the two images to be 4 years old; and the like to form an array 3,2,4,3,2,5,1,2,4, 2. Then, the estimated age differences are compared with the actual age differences one by one, so that scoring results 0,1,0,0,1,0,0,1,0 and 1 can be obtained, then the scoring results are accumulated, an evaluation score of a face image pair is obtained to be 4, the evaluation score is an age difference judgment result, and the judgment of the estimation accuracy is shown to be 40% of the chance.
It should be noted that, in the image sample set, one age may have a plurality of images. For example, suppose image A is 30 years old, image B is 20, image C is 30, image D is 30 and image E is 20. In this case the image pairs are diverse: AB (30, 20), AC (30, 30), AD (30, 30), AE (30, 20), BC (20, 30), BD (20, 30), BE (20, 20), CD (30, 30), CE (30, 20) and DE (30, 20). Subtracting the age threshold 15 from each age gives the position coordinates AB (15, 5), AC (15, 15), AD (15, 15), AE (15, 5), BC (5, 15), BD (5, 15), BE (5, 5), CD (15, 15), CE (15, 5) and DE (15, 5). It follows that several image pairs can share the same position coordinate: the pairs at coordinate (15, 5) are AB, AE, CE and DE; the pairs at (15, 15) are AC, AD and CD; the pairs at (5, 15) are BC and BD; and the pair at (5, 5) is BE.
In this case the position coordinate is assigned a value as follows. Each image pair is scored as described above (not repeated here). Taking the pairs at coordinate (15, 5) as an example: the AB pair scores 4 points, the AE pair 6 points, the CE pair 4 points and the DE pair 6 points; the scores of the four image pairs are then averaged, (4 + 6 + 4 + 6) / 4 = 20 / 4 = 5, so the value assigned to position coordinate (15, 5) is 5.
Following the earlier example, the position coordinates of the image pair are (30, 32) and its age difference judgment result is 4; the judgment result is assigned to that position coordinate, i.e., entry (30, 32) is set to 4. Doing this for every position coordinate generates the age difference matrix (shown as an example matrix image in the original document).
Normalizing the age difference matrix means unifying its values to the interval [0, 1]; specifically, each entry is divided by the number of evaluators (10 in this embodiment), i.e.

P(k, σ) = D(k, σ) / 10,

where D is the age difference matrix. The result is the second matrix, which expresses the proportion of evaluators able to perceive the age difference between different ages; in other words, it represents human subjective perceptual capability in matrix form.
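Putting the assignment and normalization steps together, here is a minimal sketch under my reading of the text: scores of all pairs sharing a coordinate are averaged into the age difference matrix, which is then divided by the assumed number of evaluators (10) to obtain second-matrix entries in [0, 1].

```python
import numpy as np

N_EVALUATORS = 10
SIZE = 70 - 15 + 1   # coordinates 0..55 for the [15, 70] age width

def build_second_matrix(scored_pairs):
    """scored_pairs: list of ((row, col), score) tuples.
    Average the scores per coordinate, then divide by the number of
    evaluators so every entry lies in [0, 1]."""
    sums = np.zeros((SIZE, SIZE))
    counts = np.zeros((SIZE, SIZE))
    for (row, col), score in scored_pairs:
        sums[row, col] += score
        counts[row, col] += 1
    age_diff_matrix = np.divide(sums, counts,
                                out=np.zeros_like(sums), where=counts > 0)
    return age_diff_matrix / N_EVALUATORS   # assumed normalization to [0, 1]

# Hypothetical pairs sharing coordinate (15, 5), as in the AB/AE/CE/DE example.
scored = [((15, 5), 4), ((15, 5), 6), ((15, 5), 4), ((15, 5), 6), ((30, 32), 4)]
P = build_second_matrix(scored)
print(P[15, 5])    # -> 0.5  (average score 5 out of 10 evaluators)
print(P[30, 32])   # -> 0.4
```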
Further, after the second matrix is generated, it needs to be optimized, specifically as shown in formula (1):

P̃(k, σ) = 1,        if σ > Δ;
P̃(k, σ) = P(k, σ),  if σ ≤ Δ.        (1)

Here P̃(k, σ) is the optimized second matrix, k is the true age of the first image, k + σ is the true age of the second image, σ is the absolute age difference between the first image and the second image, P(k, σ) is the second matrix before optimization, Δ is the age difference threshold, and K is the maximum age.
For example, in this embodiment the age width of the face image sample set is [15, 70], the maximum age K is 70, the maximum age difference is 55, and the age difference threshold is set to 55. When the age difference of an image pair exceeds 55 years, the difference between the two face images is so large that it can almost always be recognized, and there is essentially a 100% chance of judging the estimation accurately. In this case, after the above processing, the corresponding position coordinate in the second matrix is uniformly assigned the value 1, regardless of its evaluation score.
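A sketch of the thresholding in formula (1) as reconstructed above: wherever the absolute age difference between the row and column ages exceeds the threshold Δ, the second-matrix entry is forced to 1; otherwise the evaluators' normalized score is kept. The toy matrix and threshold below are made up for illustration.

```python
import numpy as np

def optimize_second_matrix(P, delta):
    """Return P~: entries whose |row - col| age difference exceeds the
    threshold delta are set to 1; all other entries are left unchanged."""
    size = P.shape[0]
    rows, cols = np.indices((size, size))
    sigma = np.abs(rows - cols)          # absolute age difference per cell
    P_opt = P.copy()
    P_opt[sigma > delta] = 1.0
    return P_opt

# Toy example with a small 4x4 matrix and a threshold of 2.
P = np.full((4, 4), 0.4)
print(optimize_second_matrix(P, delta=2))
# entries with |row - col| == 3 become 1.0, the rest stay at 0.4
```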
Further, an absolute cost matrix A is obtained, as shown in formula (2):

A = |y_i − f(x_i)|,        (2)

where y_i represents the true age of the face image and f(x_i) the estimated age.
For example, if y_i and f(x_i) are 18 and 20 respectively, then A equals 2. Subtracting the age threshold 15 from both ages gives the position coordinates (3, 5) for this image pair, so the value in row 3, column 5 of the absolute cost matrix is 2. In this way the absolute cost matrix A corresponding to the second matrix can be obtained.
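Following equation (2) and the 18-versus-20 example, the absolute cost matrix appears to reduce to |row − column| after the age-threshold shift; the sketch below builds it that way, which is an inference from the example rather than code from the patent.

```python
import numpy as np

def absolute_cost_matrix(size):
    """A[r, c] = |r - c|: the absolute error between the (shifted) true
    age r and the (shifted) estimated age c, as in equation (2)."""
    rows, cols = np.indices((size, size))
    return np.abs(rows - cols).astype(float)

A = absolute_cost_matrix(56)   # 56 x 56 for the [15, 70] age width
print(A[3, 5])                 # -> 2.0, matching the 18-vs-20 example above
```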
The cost matrix C is the result of an operation on the absolute cost matrix A and the optimized second matrix P̃. The specific operation process is illustrated in the original document with example matrices (shown as images): example values are set for A and P̃, and the resulting cost matrix C is given.
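The source shows the combination of A and the optimized second matrix P̃ only through example matrices in its figures. The sketch below assumes an element-wise product, which fits the later description of combining MAE error values with the perception matrix, but the exact operation should be checked against the original figures.

```python
import numpy as np

def cost_matrix(A, P_opt):
    """Assumed combination: weight each absolute age error by the fraction
    of evaluators who can perceive that age difference (element-wise)."""
    return A * P_opt

# Toy 4x4 illustration: absolute errors weighted by a uniform 0.5 perception score.
A = np.abs(np.subtract.outer(np.arange(4), np.arange(4))).astype(float)
P_opt = np.full((4, 4), 0.5)
print(cost_matrix(A, P_opt))   # each |row - col| halved by the 0.5 perception weight
```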
This embodiment generates a new evaluation model by combining the existing MAE error values with the second matrix of human perceptual capability, as shown in formula (3):

E = (1/N) Σ_{i=1}^{N} C(y_i, f(x_i)),        (3)

where E is the new evaluation criterion, N represents the total number of images used to estimate age at test time, and C(y_i, f(x_i)) represents the corresponding element of the cost matrix C.
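Finally, a sketch of the criterion in formula (3): the cost-matrix entries indexed by each test image's true and estimated age are averaged. Indexing C with threshold-shifted ages is my assumption, following the coordinate convention used earlier.

```python
import numpy as np

AGE_THRESHOLD = 15

def perceptual_error(true_ages, predicted_ages, C):
    """(1/N) * sum_i C(y_i, f(x_i)), indexing C with threshold-shifted ages."""
    total = 0.0
    for y, f in zip(true_ages, predicted_ages):
        total += C[y - AGE_THRESHOLD, f - AGE_THRESHOLD]
    return total / len(true_ages)

# Toy cost matrix: plain absolute error, so the result reduces to the MAE.
C = np.abs(np.subtract.outer(np.arange(56), np.arange(56))).astype(float)
print(perceptual_error([18, 45], [20, 45], C))   # -> 1.0
```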
With the standard model provided by this embodiment, the human judgment process is introduced into the evaluation model and the influence of human subjective perception on age evaluation is fully and reasonably considered, so the judgment of age estimation results is more accurate and better suited to research on human facial metrology.
According to the method for establishing the age estimation judgment standard model provided herein, the first matrix is established according to the real ages of the face images; the ages of the two face images in each face image pair are then estimated to obtain an evaluation score, which is combined with the first matrix to assign a value to each coordinate of the first matrix and generate the second matrix; the judgment standard model is then generated from the second matrix.
In summary, the method for establishing an age assessment judgment standard model provided by the invention has been explained above with specific examples, which are intended only to help in understanding the scheme and the core idea of the invention. A person skilled in the art may make changes to the specific embodiments and the scope of application according to the idea of the present invention. Accordingly, the content of this specification should not be construed as limiting the invention, and the scope of the invention is defined by the appended claims.

Claims (8)

1. A method for establishing an age assessment standard model is characterized by comprising the following steps:
establishing a face image sample set;
establishing a plurality of groups of face image pairs according to the face image sample set;
establishing a first matrix according to the plurality of groups of face image pairs;
assigning values to the first matrix to generate a second matrix;
generating a cost matrix according to the second matrix;
generating the age assessment criterion model according to the cost matrix,
specifically, according to the real ages of two face images in the face image pair, generating position coordinates of the face image pair, and according to a plurality of groups of face image pairs, generating a plurality of position coordinates; generating the first matrix from a plurality of the position coordinates; obtaining estimated age difference values of two face images in the face image pair, and comparing the estimated age difference values with actual age difference values to obtain evaluation scores of the face image pair; marking the evaluation score as the age difference determination result of the face image pair; and assigning the position coordinate with the age difference judgment result to generate an age difference matrix, and generating the second matrix according to the age difference matrix.
2. The method of claim 1, wherein creating a sample set of facial images comprises:
acquiring N face images, wherein N is an integer greater than or equal to 2;
performing similarity screening on the N face images to obtain M sample images, wherein M is less than or equal to N;
wherein the face image sample set comprises the M sample images.
3. The method of claim 1, wherein creating a plurality of sets of face image pairs from the sample set of face images comprises:
randomly selecting two images from the face image sample set to generate a group of face image pairs;
and generating a plurality of groups of the human face image pairs.
4. The method of claim 1, wherein assigning the first matrix and generating the second matrix comprises:
acquiring an age difference judgment result of each group of the face image pairs;
assigning the position coordinate with the age difference judgment result to generate an age difference matrix;
and carrying out normalization processing on the age difference matrix to generate the second matrix.
5. The method of claim 4, wherein after generating the second matrix, further comprising:
setting an age difference threshold;
acquiring the real ages of two face images in the face image pair, and solving the absolute value of the age difference according to the real ages;
and when the absolute value of the age difference is larger than the age difference threshold value, assigning 1 to the position coordinates of the human face image pair.
6. The method of claim 4, wherein generating a cost matrix from the second matrix comprises:
acquiring an absolute cost matrix;
and carrying out operation processing on the second matrix and the absolute cost matrix to generate the cost matrix.
7. The method of claim 1, wherein establishing the position coordinates of the pair of face images according to the ages of the two face images in the pair of face images comprises:
acquiring the real ages of two face images in the face image pair;
acquiring an age threshold;
processing the real age according to the age threshold to generate a second age value;
tagging the second age value as a position coordinate of the pair of face images.
8. The method of claim 7, wherein processing the true age according to the age threshold comprises:
subtracting the age threshold from the true age.
CN201910027338.9A 2019-01-11 2019-01-11 Method for establishing age evaluation standard model Active CN109886099B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910027338.9A CN109886099B (en) 2019-01-11 2019-01-11 Method for establishing age evaluation standard model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910027338.9A CN109886099B (en) 2019-01-11 2019-01-11 Method for establishing age evaluation standard model

Publications (2)

Publication Number Publication Date
CN109886099A CN109886099A (en) 2019-06-14
CN109886099B (en) 2020-11-10

Family

ID=66925807

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910027338.9A Active CN109886099B (en) 2019-01-11 2019-01-11 Method for establishing age evaluation standard model

Country Status (1)

Country Link
CN (1) CN109886099B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009271885A (en) * 2008-05-12 2009-11-19 Panasonic Corp Age estimation method and age estimation device
CN105678381A (en) * 2016-01-08 2016-06-15 浙江宇视科技有限公司 Gender classification network training method, gender classification method and related device
KR20170025162A (en) * 2015-08-27 2017-03-08 연세대학교 산학협력단 Method and Apparatus for Transforming Facial Age on Facial Image
CN107330412A (en) * 2017-07-06 2017-11-07 湖北科技学院 A kind of face age estimation method based on depth rarefaction representation
CN107977633A (en) * 2017-12-06 2018-05-01 平安科技(深圳)有限公司 Age recognition methods, device and the storage medium of facial image
CN109035250A (en) * 2018-09-11 2018-12-18 中国科学技术大学 Establish the method and device, age prediction technique and device of age prediction model

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009271885A (en) * 2008-05-12 2009-11-19 Panasonic Corp Age estimation method and age estimation device
KR20170025162A (en) * 2015-08-27 2017-03-08 연세대학교 산학협력단 Method and Apparatus for Transforming Facial Age on Facial Image
CN105678381A (en) * 2016-01-08 2016-06-15 浙江宇视科技有限公司 Gender classification network training method, gender classification method and related device
CN107330412A (en) * 2017-07-06 2017-11-07 湖北科技学院 A kind of face age estimation method based on depth rarefaction representation
CN107977633A (en) * 2017-12-06 2018-05-01 平安科技(深圳)有限公司 Age recognition methods, device and the storage medium of facial image
CN109035250A (en) * 2018-09-11 2018-12-18 中国科学技术大学 Establish the method and device, age prediction technique and device of age prediction model

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
A new method to estimate ages of facial image for large database; Chen, YW et al.; Multimedia Tools and Applications; 2016-03-31; full text *
Face Verification Based on the Age Progression Rules; Fang K et al.; IEICE Transactions on Information and Systems; 2015-12-31; full text *
Facial Age Estimation With Age Difference; Hu Z et al.; IEEE Transactions on Image Processing; 2017-12-31; full text *
Research on age estimation techniques for face images; Wang Xianmei et al.; Journal of Image and Graphics; 2012-06-30; full text *
Cross-age face verification based on ensemble face-pair distance learning; Wu Jiaqi et al.; Pattern Recognition and Artificial Intelligence; 2017-12-31; full text *

Also Published As

Publication number Publication date
CN109886099A (en) 2019-06-14

Similar Documents

Publication Publication Date Title
Reinecke et al. Predicting users' first impressions of website aesthetics with a quantification of perceived visual complexity and colorfulness
Abudarham et al. Reverse engineering the face space: Discovering the critical features for face identification
CN106817251B (en) Link prediction method and device based on node similarity
CN109615060B (en) CTR estimation method, CTR estimation device and computer-readable storage medium
JP5633080B2 (en) Attribute value estimation device, attribute value estimation method, program, and recording medium
US8774533B2 (en) Quantifying social affinity from a plurality of images
Bo et al. Particle pollution estimation from images using convolutional neural network and weather features
Yadav et al. Recognizing age-separated face images: Humans and machines
CN107743225B (en) A method of it is characterized using multilayer depth and carries out non-reference picture prediction of quality
JP2022027473A5 (en)
CN111367872A (en) User behavior analysis method and device, electronic equipment and storage medium
CN111860101A (en) Training method and device for face key point detection model
De Alcaraz‐Fossoul et al. Ridge width correlations between inked prints and powdered latent fingerprints
Wang et al. How real is reality? A perceptually motivated system for quantifying visual realism in digital images
Madrid-Herrera et al. Human image complexity analysis using a fuzzy inference system
CN109886099B (en) Method for establishing age evaluation standard model
US20190117147A1 (en) Evaluation method for site of color irregularity and color irregularity site evaluation apparatus
Ciocca et al. Complexity perception of texture images
CN111080722A (en) Color migration method and system based on significance detection
Babnik et al. Optimization-based improvement of face image quality assessment techniques
Del Giudice et al. Mosaic brains? A methodological critique of Joel et al.(2015)
CN107610101B (en) Method for measuring visual balance quality of digital image
CN110210522A (en) The training method and device of picture quality Fraction Model
Lin et al. Wood color classification based on color spatial features and k-means algorithm
JP2015001859A (en) Information processing apparatus, information processing system, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant