WO2013005447A1 - Face Impression Analysis Method, Beauty Counseling Method, and Face Image Generation Method - Google Patents
- Publication number: WO2013005447A1 (application PCT/JP2012/004404)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- face
- impression
- subject
- population
- tendency
- Prior art date
Classifications
- G06Q50/10 — Information and communication technology [ICT] specially adapted for services in specific business sectors
- G06F18/2135 — Feature extraction based on approximation criteria, e.g. principal component analysis
- G06T7/593 — Depth or shape recovery from multiple images, from stereo images
- G06V10/7715 — Feature extraction, e.g. by transforming the feature space; mappings, e.g. subspace methods
- G06V20/64 — Three-dimensional objects (scene-specific elements)
- G06V40/168 — Feature extraction; face representation
- G06V40/178 — Estimating age from a face image; using age information for improving recognition
- G06T2207/10012 — Stereo images (image acquisition modality)
- G06T2207/30201 — Face (subject of image)
Definitions
- the present invention relates to a face impression analysis method, a beauty counseling method using the face impression analysis method, a face image generation method, and a face impression analysis device and a face impression analysis system that realize the face impression analysis method.
- For young women, desired impressions other than “looking younger” include a “small face” and an “adult face”.
- An adult face is the opposite of a baby face.
- The degree of adult face and apparent age are different concepts. Apparent age is an index of how old a subject looks.
- The degree of adult face is an index of whether the subject's face is adult-like or child-like, regardless of apparent age.
- Patent Document 1 describes a method for predicting age-related changes of a subject's face from a two-dimensional face image of the subject.
- In this method, an average face for each age is generated from a large number of two-dimensional images, and the average face and the subject's face are measured using the dimensions and positions of the face outline, upper eyelid, mouth corners, nasolabial fold, lower jaw, and the like as factors.
- Patent Document 2 describes measuring three-dimensional shape information of a head including the face with an apparatus and calculating the curvature distribution of the surface at each point of the face in order to evaluate the face shape.
- Patent Document 3 describes an operation method that generates a homologous model, in which the number of data points (nodes) and the topology are unified among three-dimensional shape models of human heads, and thereby enables multivariate analysis such as principal component analysis to be executed with a relatively small amount of data. That is, Patent Document 3 relates to a method for generating a homologous model.
- Patent Document 4 describes performing principal component analysis on a shape feature vector of a two-dimensional frontal face image of a subject to obtain the first principal component, and reconstructing the image by changing the coefficient of the first principal component. This makes it possible to change the apparent age, facial expression, body shape, and the like of the subject's face image.
- In the face impression analysis method of the present invention, the expression amount of a feature quantity in the subject's face is calculated from face shape information and one or more feature quantities, and the degree of the impression tendency of the subject's facial features is obtained based on that expression amount.
- The face shape information is information representing the shape of the subject's face.
- Each feature quantity is obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of a plurality of people.
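As a sketch of this multivariate-analysis step, principal component analysis over a population of homologous face models can be written as follows; the array layout, function names, and component count are illustrative assumptions, not specifics from the patent.

```python
import numpy as np

def extract_basis_vectors(population_faces, n_components=15):
    """PCA on population face shapes.

    population_faces: (n_people, 3 * n_vertices) array, each row the
    flattened x/y/z coordinates of one homologous face model.
    Returns the average face, the top principal-component basis vectors,
    and the standard deviation (sigma) of each component's coefficient.
    """
    mean_face = population_faces.mean(axis=0)
    centered = population_faces - mean_face
    # SVD of the centered data matrix yields the principal axes directly:
    # the rows of vt are the orthonormal basis vectors (eigenvectors).
    _, singular_values, vt = np.linalg.svd(centered, full_matrices=False)
    basis_vectors = vt[:n_components]
    sigmas = singular_values[:n_components] / np.sqrt(len(population_faces) - 1)
    return mean_face, basis_vectors, sigmas
```

Using the SVD rather than an explicit covariance matrix keeps the computation stable when the number of coordinates far exceeds the number of people in the population.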
- The first beauty counseling method of the present invention is a beauty counseling method using the above face impression analysis method, characterized by outputting beauty information previously associated with any feature quantity whose calculated expression amount is equal to or greater than a predetermined value.
- The second beauty counseling method of the present invention is a beauty counseling method using the above face impression analysis method, in which the population is classified into a plurality of groups based on how closely the weighting-coefficient tendencies of a plurality of basis vectors highly correlated with the impression tendency coincide, the group to which the subject belongs is determined from the subject's expression amounts, and beauty information previously associated with that group is output.
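One plausible way to form such groups is cluster analysis on the weighting coefficients of the correlated basis vectors. The patent's examples mention cluster analysis but not a specific algorithm, so the minimal k-means below, and the advice mapping in the second function, are illustrative assumptions.

```python
import numpy as np

def cluster_subjects(weight_coeffs, k=4, n_iter=50, seed=0):
    """Group subjects by the tendency of their basis-vector weights.

    weight_coeffs: (n_people, n_basis) weighting coefficients for the
    basis vectors correlated with the impression tendency.
    Returns a cluster label (0..k-1) for each subject via minimal k-means.
    """
    rng = np.random.default_rng(seed)
    centers = weight_coeffs[rng.choice(len(weight_coeffs), k, replace=False)].astype(float)
    for _ in range(n_iter):
        # Assign each subject to the nearest cluster center.
        dists = np.linalg.norm(weight_coeffs[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its current members.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = weight_coeffs[labels == j].mean(axis=0)
    return labels

def beauty_info_for(subject_coeffs, weight_coeffs, advice, k=4):
    """Output the advice pre-associated (hypothetically) with the subject's group."""
    labels = cluster_subjects(weight_coeffs, k=k)
    centers = np.array([weight_coeffs[labels == j].mean(axis=0) for j in range(k)])
    group = np.linalg.norm(centers - subject_coeffs, axis=1).argmin()
    return advice[group]
```

The subject is assigned to the group whose center is nearest to the subject's own coefficient vector, matching the idea of "the group to which the subject belongs" above.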
- The face image generation method of the present invention calculates the expression amount of a feature quantity in the subject's face from face shape information and one or more feature quantities, changes that expression amount in the face shape information, and generates an impression change image, in which the impression tendency of the subject's facial features is altered, based on the changed face shape information.
- The face shape information is information representing the shape of the subject's face surface.
- Each feature quantity is obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of a plurality of people.
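Changing the expression amount can be sketched as shifting the subject's coefficient along one basis vector and rebuilding the coordinates. This assumes the PCA decomposition described earlier; the function name and signature are illustrative.

```python
import numpy as np

def change_impression(face, mean_face, basis_vectors, order, delta):
    """Shift the expression amount of the `order`-th component in a face shape.

    face, mean_face: (3 * n_vertices,) flattened coordinate vectors of the
    subject's homologous model and the population average face.
    basis_vectors:   (n_basis, 3 * n_vertices) orthonormal PCA basis.
    delta:           amount added to the current weighting coefficient
                     (e.g. a multiple of that component's sigma).
    Returns the changed face shape and the new coefficient.
    """
    v = basis_vectors[order]
    coeff = (face - mean_face) @ v  # current expression amount
    # Because the basis vectors are orthonormal, adding delta * v changes
    # only this component's coefficient; all other components are untouched.
    return face + delta * v, coeff + delta
```

Feeding the returned shape back into rendering yields the "impression change image" of the method above, with the subject's other facial characteristics preserved.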
- The face impression analysis apparatus of the present invention includes face shape acquisition means, storage means, face component analysis means, and face impression determination means.
- The face shape acquisition means acquires face shape information representing the shape of the subject's face surface.
- The storage means stores one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of a plurality of people, together with the impression tendencies of facial features associated with those feature quantities.
- The face component analysis means calculates the expression amount of each feature quantity in the subject's face from the face shape information and the feature quantities.
- The face impression determination means refers to the storage means and acquires the impression tendency, or its degree, based on the feature quantities and the expression amounts.
- The face impression analysis system of the present invention includes receiving means, storage means, face component analysis means, face impression determination means, and transmission means.
- The receiving means receives face shape information representing the shape of the subject's face surface through a network.
- The storage means stores one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of a plurality of people, together with the impression tendencies of facial features associated with those feature quantities.
- The face component analysis means calculates the expression amount of each feature quantity in the subject's face from the face shape information and the feature quantities.
- The face impression determination means refers to the storage means and acquires the impression tendency, or its degree, based on the feature quantities and the expression amounts.
- The transmission means transmits output information indicating the acquired impression tendency, or its degree, through the network.
- Here, the impression tendency of facial features is an attribute relating to the appearance of the face that is perceived from the three-dimensional shape of the whole or part of the face.
- The degree of the impression tendency of facial features refers to how salient that attribute is.
- FIG. 1 is a functional block diagram showing the face impression analysis apparatus according to the first embodiment of the present invention. A flowchart of the face impression analysis method according to the first embodiment. A table showing the principal component analysis results for the population analysis model according to the first embodiment and Example 1.
- A table showing the characteristic shape changes carried by the first through fifteenth principal components corresponding to the analysis results of FIG. 3. An example of the trend information table based on the analysis results of FIG. 3.
- For each of the first through fifteenth principal components, a figure in which (a) is a perspective view of a virtual shape with only that principal component set to −3σ, (b) is a perspective view of the overall average face, and (c) is a perspective view of a face shape with only that principal component set to +3σ.
- A table showing t-test results.
- (a) to (f) are six images in which the weighting coefficient of the ninth-order basis vector is varied from +1σ to +3σ and from −1σ to −3σ.
- (a) to (d) are perspective views of homologous models in which a plurality of aging impression axes are combined.
- A table showing principal component analysis results for the population analysis model according to Example 2, and a table showing the characteristic shape changes carried by the first through twentieth principal components corresponding to the analysis results of FIG. 36.
- A table showing the correlation coefficient between the weighting coefficient for each basis order and apparent age according to Example 2.
- (a) is a front view of the average face shape of the type I young group, and (b) is a front view of the average face shape of the type I elderly group.
- (c) and (d) are front views of the average face shapes of the type II young group and the type II elderly group, respectively.
- (e) and (f) are front views of the average face shapes of the type III young group and the type III elderly group, respectively.
- (g) and (h) are front views of the average face shapes of the type IV young group and the type IV elderly group, respectively.
- (a) to (d) are tables showing the average principal component score for each aging factor for subjects of types I to IV.
- (a) to (d) are tables showing partial regression coefficients and constant terms for each significant aging factor for subjects of types I to IV.
- (a) is a perspective view of the average face shape model of all subjects belonging to type I, (b) is a perspective view of (a) rejuvenated to age 30, and (c) is a perspective view of (a) aged to age 60.
- (a) is a front view of the average face of the 20 people belonging to the type I young group, and (b) is a front view of the average face of the 19 people belonging to the type I elderly group.
- (a) is a perspective view of the average face shape model of all subjects belonging to type II, (b) is a perspective view of (a) rejuvenated to age 30, and (c) is a perspective view of (a) aged to age 60.
- (a) is a perspective view of the average face shape model of all subjects belonging to type III, (b) is a perspective view of (a) rejuvenated to age 30, and (c) is a perspective view of (a) aged to age 60.
- (a) is a perspective view of the average face shape model of all subjects belonging to type IV, (b) is a perspective view of (a) rejuvenated to age 30, and (c) is a perspective view of (a) aged to age 60.
- A table showing the correlation coefficient between the weighting coefficient for each basis order and the degree of adult face according to Example 3.
- (a) is the average face of the 10 subjects who looked most adult-faced in the whole population, and (b) is the average face of the 10 subjects who looked most baby-faced. A table showing cluster classification results according to Example 3.
- Regarding Example 3, (a) to (d) are the average faces of the subjects belonging to clusters 1 to 4, respectively.
- A table showing the correlation coefficient between the weighting coefficient for each basis order and the degree of small-face impression according to Example 4.
- (a) and (b) are the average faces of the 10 subjects with, respectively, the weakest and the strongest small-face impression in the whole population.
- A table showing cluster classification results according to Example 4. Regarding Example 4, (a) to (d) are the average faces of the subjects belonging to clusters 1 to 4, respectively.
- A table showing the correlation coefficient between the weighting coefficient for each basis order and the degree of eye-size impression according to Example 5.
- (a) is the average face of the 10 subjects with the largest-eye impression in the whole population, and (b) is the average face of the 10 subjects with the smallest-eye impression.
- A table showing cluster classification results according to Example 5. Regarding Example 5, (a) to (d) are the average faces of the subjects belonging to clusters 1 to 4, respectively.
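The t-test tables listed above can be reproduced in outline: for each basis order, compare the weighting coefficients of a young group against an elderly group to find which components shift significantly with age. The sketch below computes the Welch t statistic directly in NumPy and flags orders above a rough 5% critical value of about 2.0; the threshold and names are assumptions, not the patent's exact procedure.

```python
import numpy as np

def significant_aging_components(young_coeffs, elderly_coeffs, t_crit=2.0):
    """Welch t statistic per basis order between two groups.

    young_coeffs, elderly_coeffs: (n_subjects, n_basis) weighting
    coefficients for each group.
    Returns the basis orders whose group means differ with |t| > t_crit.
    """
    ny, ne = len(young_coeffs), len(elderly_coeffs)
    mean_diff = young_coeffs.mean(axis=0) - elderly_coeffs.mean(axis=0)
    var_y = young_coeffs.var(axis=0, ddof=1)
    var_e = elderly_coeffs.var(axis=0, ddof=1)
    t = mean_diff / np.sqrt(var_y / ny + var_e / ne)  # Welch statistic
    return [int(i) for i in np.nonzero(np.abs(t) > t_crit)[0]]
```

Orders flagged here correspond to the "aging factors" whose per-type averages and regression coefficients appear in the tables above.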
- The method described in Patent Document 1 predicts how the face will change with aging based on the form of a specific part of the subject's face, and therefore lacks accuracy: because face shapes differ subtly from subject to subject, aging is difficult to predict universally from the form of a specific part. The method of Patent Document 2 likewise has difficulty quantifying the impression of the subject's whole face accurately.
- The inventors acquired the three-dimensional shapes of the face surfaces of a population of a plurality of people to generate population face information, and extracted basis vectors by applying multivariate analysis, a statistical analysis technique, to that population face information.
- The inventors found that the feature quantities (basis vectors) of certain orders correlate highly with the impression tendency of facial features, and realized that the subject's impression tendency can be expressed objectively by quantitatively calculating how strongly those feature quantities are expressed in the subject's face shape information.
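Identifying the correlated orders can be sketched as computing, for each basis order, the Pearson correlation between the subjects' weighting coefficients and sensory-evaluation scores of the impression (e.g. degree of adult face). The score data and the 0.5 cutoff below are placeholders, not values from the patent.

```python
import numpy as np

def correlated_orders(weight_coeffs, impression_scores, threshold=0.5):
    """Basis orders whose weighting coefficients track an impression score.

    weight_coeffs:     (n_subjects, n_basis) per-subject coefficients.
    impression_scores: (n_subjects,) sensory-evaluation scores.
    Returns (order, r) pairs whose |Pearson r| meets the threshold.
    """
    results = []
    for order in range(weight_coeffs.shape[1]):
        r = np.corrcoef(weight_coeffs[:, order], impression_scores)[0, 1]
        if abs(r) >= threshold:
            results.append((order, float(r)))
    return results
```

The orders returned here play the role of the "feature quantities highly correlated with the impression tendency" used by the analysis and counseling methods.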
- According to the face impression analysis method of the present invention, the visual impression of the subject's face is evaluated objectively, yielding information useful for performing counseling, such as makeup advice, accurately. The beauty counseling method using this face impression analysis method therefore provides each subject with objective beauty information.
- According to the face impression analysis system of the present invention, the subject can obtain an objective evaluation of the appearance of his or her face by transmitting face shape information through the network.
- In the following description, a plurality of steps may be described in order, but that description order does not necessarily limit the order or timing in which the steps are performed.
- The order of the steps can be changed within a technically acceptable range, and the execution timings of the steps may partly or wholly overlap.
- The specific hardware configuration of the components realizing the face impression analysis apparatus and the face impression analysis system of the present invention is not particularly limited as long as these functions are realized.
- The components of the invention can be realized by dedicated hardware performing predetermined functions, a data processing device given predetermined functions by a computer program, predetermined functions realized in such a data processing device by a computer program, or any combination of these.
- More specifically, the present invention can be implemented using hardware built from general-purpose devices, such as a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), and an I/F (interface) unit, configured so that a computer program can be read and the corresponding processing executed; dedicated logic circuits built to execute predetermined processing; or combinations thereof.
- The statement that a component of the present invention stores data means that the device realizing the present invention has a function of storing that data; the data need not actually be held in the device itself.
- FIG. 1 is a functional block diagram showing a face impression analyzer 100 according to the first embodiment of the present invention.
- the face impression analysis device 100 includes a face shape acquisition unit 10, a face component analysis unit 50, a face impression determination unit 60, and a storage unit 70.
- the face shape acquisition unit 10 is means for acquiring face shape information representing the shape of the face surface of the subject.
- The storage unit 70 stores one or more feature quantities (basis vectors) obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of a plurality of people, together with tendency information representing the impression tendencies of facial features associated with those feature quantities.
- the storage unit 70 of the present embodiment includes a base storage unit 72 and a trend information storage unit 74.
- The base storage unit 72 stores one or more feature quantities extracted from the population face information and their respective weighting factors (eigenvalues).
- the trend information storage unit 74 is a means for storing trend information representing the impression tendency of facial features.
- The face component analysis unit 50 calculates the expression amount of each feature quantity in the subject's face from the subject's face shape information and the feature quantities extracted from the population face information.
- The face impression determination unit 60 refers to the storage unit 70 (trend information storage unit 74) and acquires the impression tendency of the subject's face, or its degree, based on the feature quantities and their expression amounts in the subject's face.
- Hereinafter, the face impression analysis apparatus 100 of the present embodiment, and the face impression analysis method performed using it (hereinafter sometimes called the first method), are described in detail.
- The first method analyzes the tendency of the impression received from the appearance of the subject's facial features.
- In the first method, the degree of the impression tendency of the subject's facial features is obtained from the subject's face shape information and statistically obtained basis vectors.
- The face shape information is information representing the shape of the subject's face surface, specifically coordinate information of the three-dimensional shape of the face surface of the subject, who is the customer.
- A basis vector is a feature quantity (eigenvector) obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of a plurality of people.
- The feature quantities include at least one basis vector of second order or higher that is extracted from the plurality of basis vectors obtained by the multivariate analysis and correlates highly with the impression tendency of the subject's facial features.
- The expression amount of each basis vector in the subject's face is calculated from the subject's face shape information and the one or more basis vectors, and the degree of the impression tendency is obtained based on the expression amounts.
- FIG. 2 is a flowchart of the first method.
- the face impression analyzer 100 and the first method will be described in detail with reference to FIGS. 1 and 2.
- The face impression analysis device 100 further includes a normalization unit 20, a condition input unit 30, a model storage unit 40, and a beauty information output unit 80. These are connected to each other by a bus 95.
- the face impression analysis apparatus 100 is installed in a cosmetic store or a beauty counseling service store.
- the face shape acquisition unit 10 acquires coordinate information of the three-dimensional shape of the face of the test subject serving as a customer (FIG. 2: Step S10).
- the face shape acquisition unit 10 includes a contact measurement unit 12 and a non-contact measurement unit 14.
- Examples of the contact-type measuring unit 12 include a contact-type three-dimensional digitizer that acquires three-dimensional coordinates at arbitrary positions on the scalp surface by bringing a probe needle into contact with the scalp.
- An example of the non-contact measurement unit 14 is a three-dimensional laser scanner.
- a lens focus method, a stereo method, or the like can be used.
- These non-contact type measuring devices are commercially available.
- an arithmetic device that calculates a three-dimensional coordinate value from a plurality of two-dimensional images may be used.
- In the present embodiment, the face shape acquisition unit 10 acquires three-dimensional coordinate values of the surface of the head including the face of the subject as the face shape information. More specifically, the face shape acquisition unit 10 acquires three-dimensional coordinate values of a plurality of feature points on the surface of the head using the contact-type three-dimensional digitizer, and acquires three-dimensional coordinate values of the other points on the surface of the subject's head using a non-contact-type three-dimensional measurement device.
- The non-contact measurement unit 14 may be used when a short measurement time is desired.
- the subject's hair may be covered with a protective member such as a net cap.
- In that case, the three-dimensional coordinates of the scalp surface can be measured with only an optical measuring device such as a three-dimensional laser scanner.
- By using the non-contact measurement unit 14, it is possible to measure the three-dimensional coordinates of many points, exceeding 100,000 points as an example, on the head surface of the subject. Furthermore, for some feature points on the face of the subject, the three-dimensional coordinates may be measured with high accuracy using the contact measurement unit 12.
- Such feature points can include feature points on the skull surface (anatomical feature points) and feature points on the skin surface. Examples of the anatomical feature point include an orbital lower point, an orbital upper edge center, an orbital inner edge point, an ear canal upper edge point, a nasal root point, and a zygomatic arch point.
- Examples of skin surface feature points include eye points, eye corner points, upper tragus edge points, upper ear base points, lower ear base points, lower nose points, nose apex points, mouth corner points, mouth points or jaw angle points.
- Here, the term subject includes both customers who are targets of the face impression analysis method of the present embodiment and data providers who provide three-dimensional coordinate values for generating the population face information.
- By using the three-dimensional coordinates of feature points common to a large number of subjects, it is possible to normalize their three-dimensional shape models relative to one another and generate homologous models.
- the three-dimensional coordinates of the scalp surface can be obtained with high accuracy.
- the three-dimensional coordinates of the surface of the head including the face may be acquired using only one of the non-contact type or the contact type measuring device.
- the face impression analysis apparatus 100 measures a three-dimensional shape model (face shape information) of the subject's face with the face shape acquisition unit 10 (contact type measurement unit 12, non-contact type measurement unit 14).
- The normalization unit 20 is an arithmetic unit for converting the high-resolution three-dimensional shape models acquired by the contact-type measurement unit 12 and the non-contact-type measurement unit 14 into homologous models composed of a smaller number of points.
- a specific method for generating a homology model is described in Patent Document 3.
- the face shape acquisition unit 10 measures a three-dimensional shape model representing the shape of the head including the face surfaces of many subjects.
- the normalization unit 20 generates population face information by converting the face shape information into a homologous model in which the number of data points (number of nodes) and topology are unified (FIG. 2: step S20).
- Population face information (population analysis model), which is a generated homologous model, is stored in the model storage unit 40.
- the normalization unit 20 converts the face shape information of the subject into a homologous model having the same number of data points (number of nodes) and topology as the population analysis model.
- the model storage unit 40 is means for storing human population face information of a population used for multivariate analysis.
- the model storage unit 40 also stores the face shape information of the subject acquired by the face shape acquisition unit 10.
- the three-dimensional shape model representing the surface shape of the head including the face of the subject is referred to as a subject analysis model.
- the model storage unit 40 may store a population analysis model for each attribute such as gender, age, or region of origin.
- the condition input unit 30 is means for receiving various inputs from the subject or the operator of the face impression analyzer 100. For example, designation of attributes such as the age and sex of the subject and an impression tendency pattern that the subject desires to analyze is accepted (FIG. 2: step S30).
- The face impression analysis apparatus 100 obtains the degree of the impression tendency of the subject's facial features based on the orders of the basis vectors and the weighting coefficients (eigenvalues) common to the subject analysis model and the population analysis model (FIG. 2: step S400).
- The face component analysis unit 50 selects a population according to the conditions received by the condition input unit 30 and performs multivariate analysis. For example, when the subject is a woman, the face component analysis unit 50 extracts the homologous models whose sex classification is female from the model storage unit 40 and generates a population analysis model. This makes it possible to analyze the degree of the impression tendency of the subject's facial features within the whole female population. Further, when analyzing a more specific population, a population analysis model may be generated by extracting, in addition to the subject's gender, only the homologous models of subjects of a similar age (for example, the subject's age ± 5 years). In this case, it is possible to analyze the degree of the impression tendency of the subject's facial features within the population of women of about the same age as the subject.
- A plurality of methods can be used to calculate the orders of the basis vectors and the weighting coefficients from the face shape information of the subject.
- In the following, a method is described in which the feature amounts (basis vectors) are calculated by multivariate analysis of population face information that includes the face shape information of the subject.
- In this case, the population includes both the face-shape sample providers and the subject to whom the first method is provided.
- Alternatively, the basis vectors may be calculated in advance by multivariate analysis of a population that does not include the subject, and the weighting coefficients may then be obtained by reproducing the face shape information of the subject with those basis vectors.
- a homology model (population analysis model) of a population including a subject is statistically processed to extract a plurality of base vectors.
- a principal component analysis (PCA) is given as an example of a specific multivariate analysis.
- The face component analysis unit 50 performs principal component analysis on the face shapes of the population analysis model, and calculates multi-order basis vectors e i (i is a natural number representing the order; the same applies hereinafter) (FIG. 2: step S40).
- the basis vector e i is obtained by eigenvector analysis of the covariance matrix of the population analysis model.
- The basis vectors of different orders are mutually orthogonal.
- the maximum order n of the basis vectors is not particularly limited.
- For example, the maximum order n may be set so that all basis vectors having a contribution ratio of at least a predetermined value (for example, 0.1% or more) are extracted, or so that the cumulative contribution ratio from the first principal component to the n-th principal component is greater than or equal to a predetermined value (for example, 95% or more).
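As a minimal sketch of the principal component analysis of step S40, the following assumes that each homologous model is flattened into one row vector; the function name, array layout, and the 95% cumulative-contribution cutoff are illustrative assumptions, not values fixed by the embodiment:

```python
import numpy as np

def extract_basis(population, cum_ratio=0.95):
    """Extract basis vectors e_i from a population analysis model.

    population: (m, 3p) array -- m homologous models, each flattened
    from p nodes of (x, y, z) coordinates.
    Returns the average face shape, the basis vectors (one per row),
    and the contribution ratio of each retained order.
    """
    mean_face = population.mean(axis=0)
    centered = population - mean_face
    # Eigen-decomposition of the covariance matrix of the population.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]          # descending eigenvalues
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    ratios = eigvals / eigvals.sum()
    # Keep orders up to a cumulative contribution ratio (e.g. 95%).
    n = int(np.searchsorted(np.cumsum(ratios), cum_ratio)) + 1
    return mean_face, eigvecs[:, :n].T, ratios[:n]
```

For a real homologous model with more than 100,000 points, the explicit covariance matrix would be impractically large, and an SVD of the centered data matrix would be the usual substitute; the eigen-decomposition above is only for illustration.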
- FIG. 3 is a table showing an example of principal component analysis results for a population analysis model composed of a total of 50 Japanese women in their 20s to 60s.
- FIG. 3 shows a case where 15 basis vectors having a contribution ratio of 1% or more exist from the first order (pca01) to the 15th order (pca15).
- In other words, the head forms of Japanese women in their 20s to 60s (hereinafter referred to as Japanese adult women) can be almost entirely explained by these 15 axes.
- These basis vectors and contribution rates are stored in the basis storage unit 72.
- In more elderly people, the bones of the head itself tend to become thin, and the facial impression tends to change for that reason.
- Such a change in face impression due to a bone change is unlikely to appear in the principal component analysis of the three-dimensional shape of the head surface, and is therefore excluded in this embodiment.
- the population was from the 20s to 60s. Thereby, the progress of aging due to the influence of muscles and fats is extracted with statistical accuracy.
- The subject analysis model is decomposed, as shown in Expression (1), using the basis vectors obtained from the population analysis model: (subject analysis model) = (average face shape) + Σ i b i e i .
- Here, b i is the weighting factor for each order of the basis vector e i .
- the average face shape is a three-dimensional shape obtained by averaging the face shapes of the population analysis model.
- the face component analysis unit 50 decomposes the subject analysis model into a linear sum of basis vectors (principal components) common to many population analysis models (FIG. 2: step S41).
- the weight coefficient b i for each order of the subject analysis model is stored in the base storage unit 72.
- The face component analysis unit 50 according to the present embodiment can reproduce a subject analysis model in which an arbitrary weighting coefficient b i has been changed. In other words, by changing the weighting coefficient b i of any order and performing the product-sum operation with the basis vectors e i , the influence of a basis vector e i can be increased or decreased while maintaining a natural face shape for the subject.
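The decomposition into the linear sum of Expression (1) and the reconstruction with changed weighting coefficients (step S41) can be sketched as follows, assuming the orthonormal basis vectors produced by the principal component analysis; the function names are hypothetical:

```python
import numpy as np

def decompose(subject, mean_face, basis):
    """Weighting coefficients b_i of the subject analysis model.

    Because the basis vectors (rows of `basis`) are orthonormal,
    projecting the mean-centered subject onto each basis vector
    gives b_i directly.
    """
    return basis @ (subject - mean_face)

def reconstruct(b, mean_face, basis):
    """Reproduce a face shape as mean + sum_i b_i * e_i (Expression (1))."""
    return mean_face + b @ basis

# Increasing or decreasing any b_i changes the influence of e_i
# while keeping a natural face shape, e.g.:
#   b_mod = b.copy(); b_mod[6] += 1.0   # strengthen the 7th-order basis
```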
- FIG. 4 is a table showing the features of the shape changes to which the first to fifteenth principal components, corresponding to the analysis results of FIG. 3, contribute. A specific description of each principal component will be given later in the Examples.
- FIG. 5 shows an example of a table of trend information PI stored in the trend information storage unit 74.
- the trend information PI is information in which one or a plurality of impression tendency patterns are associated with base orders having a high correlation with the impression tendency.
- the trend information PI of the present embodiment includes at least four patterns 1 to 4.
- the trend information PI is information representing an impression factor that governs the impression tendency of facial features.
- FIG. 5 exemplifies the apparent age, actual age, degree of adult face or baby face, or degree of small face impression as the impression factor.
- Pattern 1 shows that the base orders (aging factors) that are highly correlated with the apparent age of the subject are the second, seventh, ninth, and eleventh orders. As will be described in detail later, the inventors' study has revealed that the apparent age of the subject increases or decreases when the weighting factors of the basis vectors of these orders are increased or decreased.
- Pattern 2 shows that the actual age of the subject has a high correlation with the weight coefficients of the second-order, seventh-order, ninth-order, and eleventh-order basis vectors. That is, patterns 1 and 2 indicate that the apparent age tendency and the actual age tendency are correlated with a common order basis vector.
- Pattern 3 represents that the impression tendency of whether the face of the subject appears adult (adult face) or childish (child face) correlates with the weight coefficient of the third-order basis vector.
- Pattern 4 represents that the impression tendency that the subject's face appears small (small face) correlates with the weighting coefficients of the third and twelfth basis vectors.
- When analyzing the apparent age of the subject, the subject or the user operates the condition input unit 30 to select and input pattern 1.
- the condition input unit 30 receives a pattern 1 selection input.
- Alternatively, instead of accepting designation of an arbitrary impression tendency pattern from the subject or the user, one or more impression tendencies to which the subject's face belongs may be extracted from all of the patterns preset in the tendency information storage unit 74.
- the condition acquisition step S30 is illustrated before the principal component analysis step S40, but the present invention is not limited to this.
- the condition acquisition step S30 may be executed between the base calculation step S41 and the impression tendency determination step S42.
- In FIG. 5, a plus 1σ value is stored in association with each base order of each pattern.
- The plus 1σ value is the score at the boundary of the upper one-sigma range in the population when the weighting coefficients of that base order are ranked in the order in which the tendency of the pattern (for example, increase in apparent age) becomes more prominent.
- Among the aging factors, for the weighting coefficient of the seventh-order basis vector, the apparent age tends to increase as the score becomes more positive with a larger absolute value (details will be described later). In other words, the weighting coefficient of the seventh-order basis and the apparent age have a positive correlation.
- That is, the seventh-order basis changes the face shape in the aging direction as its weighting factor increases in the positive direction. Therefore, the positive score of the average value + 1σ (standard deviation) obtained from the score distribution of the weighting coefficients of the seventh-order basis in the population is set as the plus 1σ value for the seventh-order basis in pattern 1.
- The score of the average value + 1σ corresponds to a rank of about the top one third (more precisely, the top 31.7%) when the scores are ranked from the most positive to the most negative.
- Conversely, among the aging factors, for the weighting coefficient of the second-order basis vector, the apparent age tends to increase as the score becomes more negative with a larger absolute value (details will be described later). That is, the weighting coefficient of the second-order basis and the apparent age have a negative correlation.
- Therefore, the negative score of the average value − 1σ (standard deviation) obtained from the score distribution of the weighting coefficients of the second-order basis in the population is set as the plus 1σ value for the second-order basis in pattern 1.
- The score of the average value − 1σ corresponds to a rank of about the bottom one third (more precisely, the bottom 31.7%) when the scores are ranked from the most positive to the most negative.
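A sketch of how the plus 1σ value of one base order could be computed from the population's weighting-coefficient distribution, following the sign convention described above (the function name and the `correlation_sign` parameter are assumptions introduced for illustration):

```python
import numpy as np

def plus_one_sigma(pop_coeffs, correlation_sign):
    """Plus 1 sigma value for one base order of a pattern.

    pop_coeffs: weighting coefficients of this base order over the
    population.  correlation_sign: +1 when a more positive coefficient
    makes the tendency more prominent (e.g. the 7th order for apparent
    age), -1 for a negative correlation (e.g. the 2nd order).
    """
    mu, sigma = pop_coeffs.mean(), pop_coeffs.std()
    # mean + 1 sigma for a positive correlation, mean - 1 sigma otherwise
    return mu + correlation_sign * sigma
```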
- The face impression determination unit 60 refers to the trend information storage unit 74 and obtains the pattern of the subject's facial impression tendency and the degree of its expression based on the orders of the basis vectors in the subject analysis model and their weighting coefficients (expression amounts).
- The degree of expression is obtained by comparing the weighting coefficients stored in the base storage unit 72 for the subject analysis model with the plus 1σ values set in the trend information PI of the trend information storage unit 74.
- Specifically, the face impression determination unit 60 refers to the trend information storage unit 74 and reads out, for the base orders (second, seventh, ninth, and eleventh) associated with pattern 1 by the trend information PI, the plus 1σ value of each base order.
- Next, the face impression determination unit 60 refers to the base storage unit 72 and reads out the weighting coefficients of those base orders for the subject analysis model.
- The face impression determination unit 60 then divides the weighting coefficient of each base order of the subject analysis model by the corresponding plus 1σ value to obtain the degree of expression of the basis vector of that order.
- The face impression determination unit 60 compares the degree of expression with a predetermined positive threshold (for example, +1.0). When the degree of expression is equal to or greater than the threshold, it is determined that the subject analysis model possesses the tendency of that base order.
- Taking into account how many of the base orders associated with pattern 1 (second, seventh, ninth, and eleventh) the subject analysis model possesses, and the magnitude of the degrees of expression, the face impression determination unit 60 determines the degree of the impression tendency of the subject's facial features (FIG. 2: step S42). Specifically, as a simple determination method, the ratio of the number of base orders for which the subject analysis model possesses the tendency (0 to 4) to the number of base orders associated with the pattern (4 for pattern 1) can be used as the degree of the impression tendency of the pattern.
- Alternatively, the face impression determination unit 60 may use the sum of the degrees of expression obtained by dividing by the plus 1σ value for each base order associated with the pattern as the degree of the impression tendency of the pattern.
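The determination of step S42 described above (dividing each weighting coefficient by the corresponding plus 1σ value, thresholding, and taking the ratio of possessed base orders) can be sketched as follows; the function name and the 1-based order indexing are assumptions:

```python
import numpy as np

def impression_degree(b, pattern_orders, plus_1sigma, threshold=1.0):
    """Degree of an impression tendency pattern (cf. step S42).

    b: weighting coefficients of the subject analysis model, indexed
    by base order (orders are 1-based, so order o maps to b[o - 1]).
    pattern_orders: base orders associated with the pattern
    (e.g. (2, 7, 9, 11) for pattern 1).
    plus_1sigma: plus 1 sigma value for each of those orders; dividing
    by a negative value handles negatively correlated orders.
    Returns (expression degrees, ratio of orders whose degree meets
    the threshold).
    """
    degrees = np.array([b[o - 1] / s
                        for o, s in zip(pattern_orders, plus_1sigma)])
    possessed = int(np.count_nonzero(degrees >= threshold))
    return degrees, possessed / len(pattern_orders)
```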
- the beauty information output unit 80 is a means for outputting the result obtained by the face impression determination unit 60 to the subject.
- An example is a display or printer.
- the beauty information output unit 80 of the present embodiment outputs a pattern to which the impression tendency of the subject's facial feature belongs and a quantitative degree of the impression tendency of the pattern.
- the impression change image generation unit 90 is a means for generating an impression change image by changing the weighting coefficient b i in the subject analysis model.
- the impression change image generation unit 90 refers to the trend information storage unit 74 and extracts one or a plurality of impression factors from the trend information PI.
- The condition input unit 30 receives, from the subject, age range information representing the direction and range of change in the degree of the impression tendency (for example, apparent age). Specifically, the age range information is minus 5 years (in the rejuvenation direction), plus 10 years (in the aging direction), or the like.
- the face impression analyzer 100 stores an aging coefficient for each aging factor in the trend information storage unit 74 as described later in the first embodiment.
- The impression change image generation unit 90 reads the aging factors and the aging coefficients with reference to the trend information storage unit 74, and increases or decreases the weighting coefficients of the aging factors by the amounts of change corresponding to the age range information input from the condition input unit 30.
- the impression change image generation unit 90 generates a subject analysis model reconstructed with such weighting coefficients and basis vectors as an impression change image (FIG. 2: step S45).
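A minimal sketch of the weighting-coefficient shift behind the impression change image, assuming a per-year aging coefficient for each aging-factor order (the function name and the coefficient values are illustrative; the shifted coefficients would then be recombined with the basis vectors as in Expression (1)):

```python
import numpy as np

def age_shift(b, aging_orders, aging_coeffs, years):
    """Shift weighting coefficients toward aging (+years) or
    rejuvenation (-years).

    aging_coeffs[k] is the assumed change of the coefficient per year
    for aging-factor order aging_orders[k] (1-based orders).  The
    impression change image is then the model reconstructed from the
    returned coefficients and the basis vectors.
    """
    b = np.asarray(b, dtype=float).copy()
    for order, coeff in zip(aging_orders, aging_coeffs):
        b[order - 1] += coeff * years
    return b
```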
- FIG. 6 is a functional block diagram showing a face impression analysis system 1000 including the face impression analysis apparatus 100 according to the second embodiment of the present invention.
- FIG. 7 is a flowchart of a face impression analysis method (hereinafter sometimes referred to as a second method) performed using the face impression analysis system 1000 of the present embodiment.
- the face impression analysis system 1000 includes a face impression analysis device 100 and a subject terminal 110 connected to each other through a network.
- the face impression analysis apparatus 100 is a WEB server.
- the subject terminal 110 is a portable terminal operated by a subject (user).
- the network may be the Internet or a local area network (LAN), and may be a wireless network or a wired network.
- a mobile phone network is illustrated.
- the face impression analyzer 100 displays the WEB application site on the display of the subject terminal 110 based on the connection request from the subject terminal 110.
- the face impression analysis device 100 includes a face shape acquisition unit 10, a normalization unit 20, a model storage unit 40, a face component analysis unit 50, a face impression determination unit 60, a storage unit 70, and a beauty information transmission unit 82.
- the storage unit 70 includes a base storage unit 72 and a trend information storage unit 74.
- The base storage unit 72 stores one or a plurality of feature amounts obtained by multivariate analysis of population face information (population analysis model) representing the three-dimensional shapes of the face surfaces of a plurality of people forming a population.
- the base storage unit 72 may store feature values for each population extracted by sex and age.
- The trend information storage unit 74 stores trend information PI (see FIG. 5) representing the impression tendency of facial features, associated with these feature amounts.
- The model storage unit 40 optionally stores three-dimensional homologous models representing the shapes of the heads, including the face surfaces, of a large number of subjects. However, after the face component analysis unit 50 performs multivariate analysis of the homologous models of the population and calculates the basis vectors, the homologous model data may be deleted from the model storage unit 40.
- the face shape acquisition unit 10 of this embodiment is different from that of the first embodiment in that it includes a reception unit 16 and a three-dimensional shape estimation unit 18.
- the receiving unit 16 is means for receiving face shape information representing the shape of the face surface of the subject through a network.
- Specifically, the receiving unit 16 receives, from the subject terminal 110, a plurality of two-dimensional images of the head including the subject's face, taken at different photographing angles.
- The imaging target in the second method is the user's head including the face.
- the subject uses the camera function of the subject terminal 110 to photograph the subject's head image from the front direction and the diagonal direction.
- the receiving unit 16 receives a two-dimensional image from the subject terminal 110 through the network (FIG. 7: step S11). Along with such reception processing, the receiving unit 16 receives various inputs from the subject terminal 110. The receiving unit 16 receives, for example, designation of attributes such as the age and sex of the subject and an impression tendency pattern desired by the subject from the subject terminal 110 (FIG. 7: step S30).
- The three-dimensional shape estimation unit 18 is a means for calculating three-dimensional coordinate values of the subject's head based on the plurality of two-dimensional images, taken at different photographing angles, received by the receiving unit 16. That is, the second method differs from the first method in that the subject takes a plurality of two-dimensional images of the head including the face at different photographing angles, and the face shape acquisition unit 10 calculates the three-dimensional coordinate values of the head surface as the face shape information based on these two-dimensional images.
- The three-dimensional shape estimation unit 18 calculates the three-dimensional coordinate values of the head including the face surface of the subject by coordinate calculations that include aligning the feature points of the face surface appearing in common in the plurality of two-dimensional images (FIG. 7: step S12).
- When the received two-dimensional images of the head are insufficient, the face shape acquisition unit 10 may transmit to the subject terminal 110 a message prompting additional shooting and additional transmission, together with information specifying the required shooting angle.
- the normalization unit 20 converts the three-dimensional face shape information generated by the three-dimensional shape estimation unit 18 into a homologous model (FIG. 7: step S20).
- The second method differs from the first method in that multivariate analysis is performed on population face information (population analysis model) that does not include the subject's face shape information (subject analysis model).
- That is, a population analysis model that does not include the subject analysis model is subjected to multivariate analysis to obtain a plurality of continuous feature amounts from the first order to a predetermined order.
- The face shape information of the subject is then reproduced by the product-sum operation of these feature amounts and their weighting coefficients.
- The weighting coefficients constituting the subject analysis model are calculated as the expression amounts of the impression tendency.
- In the second method, it is not necessary to perform multivariate analysis (principal component analysis) again after generating the subject analysis model. For this reason, the analysis result can be returned to the subject quickly.
- As verified in the Examples described later, the subject analysis model can be reproduced with high accuracy by formula (1) even when the basis vectors calculated by multivariate analysis of a population analysis model that does not include the subject are used.
- The face component analysis unit 50 calculates the expression amounts of the feature amounts in the subject's face from the face shape information of the subject (subject analysis model) and the feature amounts of the population face information (population analysis model) (FIG. 7: step S410).
- the face impression determination unit 60 refers to the storage unit 70 (trend information storage unit 74), and acquires an impression tendency or a degree thereof based on the feature amount and its expression amount in the subject's face.
- The beauty information transmission unit 82 transmits output information indicating the acquired impression tendency, or its degree, to the subject terminal 110 through the network.
- selection of impression tendency may be accepted from the subject.
- the receiving unit 16 may receive a subject selection regarding the impression tendency from the subject terminal 110.
- the face component analysis unit 50 refers to the storage unit 70 and extracts a feature amount associated with the impression tendency selected by the subject. Then, the face component analysis unit 50 calculates the degree (expression amount) that the extracted feature amount is expressed in the subject analysis model.
- the face impression determination unit 60 acquires the degree of feature amount associated with the selected impression tendency.
- the face component analysis unit 50 first reads the basis vector with reference to the basis storage unit 72 to generate an eigenvector matrix (FIG. 7: step S43).
- the face impression determination unit 60 performs a matrix operation on the eigenvector matrix and the subject analysis model stored in the model storage unit 40 to calculate a weight coefficient for each base order. Thereby, the expression level of the basis vector is calculated.
- the subject analysis model is reproduced by the basis vector and the weighting coefficient (FIG. 7: Step S44).
- The face impression determination unit 60 refers to the trend information storage unit 74 and acquires, from the trend information PI (see FIG. 5), the plus 1σ value for each base order associated with some or all of the impression tendency patterns selected by the subject.
- Then, the face impression determination unit 60 uses the plus 1σ values and the weighting coefficients calculated in step S44 to determine the degree of the impression tendency of the subject's facial features, as in the first method (FIG. 7: step S42).
- The pattern and degree of the impression tendency of the subject's face determined as described above are transmitted from the beauty information transmission unit 82 to the subject terminal 110.
- the above embodiment allows various modifications.
- In the first and second embodiments, the case where the normalization unit 20 converts the face shape information of the subject and the population into three-dimensional homologous models was illustrated. This makes it possible to perform multivariate analysis on a population analysis model with a small number of coordinate points.
- the subject analysis model can be reproduced with high accuracy by a linear sum of basis vectors extracted from the population analysis model.
- the present invention is not limited to the case where a three-dimensional homologous model is used as the subject analysis model.
- That is, the face component analysis unit 50 may obtain the expression levels of the basis vectors from the high-resolution face shape information of the subject measured by the face shape acquisition unit 10 (contact-type measurement unit 12, non-contact-type measurement unit 14).
- In this case, the head size in the subject's face shape information is first normalized to the head size of the average face shape of the population.
- the weight coefficients of the basis vectors may be determined in order from the first basis so that the linear sum of basis vectors up to a predetermined order (for example, the 15th order) regarding the population best approximates the face shape information of the subject. More specifically, among the measurement points constituting the face shape information of the subject, the three-dimensional coordinate value of the point closest to the node of the average face shape of the population is calculated.
- this measurement point is referred to as a “homology model corresponding point”.
- the sum of the distances between the nodes of the average face shape and the corresponding points of the homologous model is defined as “inter-model distance”.
- The weight coefficient b 1 of the first basis is then varied in the positive and negative directions, and the value of b 1 that minimizes the inter-model distance is calculated.
- For the second basis, specifically, "average face shape + b 1 × first basis vector e 1" replaces the average face shape above, and among the measurement points constituting the subject's face shape information, the three-dimensional coordinates of the points closest to the nodes of this new average face shape are calculated. These points become the new homologous model corresponding points.
- Then the weight coefficient b 2 of the second basis is calculated so that the inter-model distance between the nodes of the new average face shape and the homologous model corresponding points is minimized.
- By repeating this procedure for each subsequent order, each weight coefficient b k is calculated in turn.
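- The greedy, order-by-order fitting described above can be sketched as follows (an illustrative sketch, not the patented implementation; the array layouts, the function name `fit_weights_greedy`, and the coarse grid search over each b k are assumptions):

```python
import numpy as np

def fit_weights_greedy(subject_pts, mean_shape, bases, sigma=3.0, steps=61):
    """Fit one weight coefficient b_k at a time, from the first basis onward,
    choosing the b_k that minimizes the inter-model distance: the sum over
    nodes of the distance to the nearest subject measurement point (the
    "homologous model corresponding point")."""
    shape = mean_shape.copy()              # current "average face shape"
    weights = []
    for e_k in bases:                      # bases: sequence of (n_nodes, 3)
        best_b, best_d = 0.0, None
        for b in np.linspace(-sigma, sigma, steps):   # vary b_k +/- around 0
            cand = shape + b * e_k
            d = sum(np.min(np.linalg.norm(subject_pts - node, axis=1))
                    for node in cand)      # inter-model distance
            if best_d is None or d < best_d:
                best_b, best_d = b, d
        weights.append(best_b)
        shape = shape + best_b * e_k       # fold b_k * e_k into the model
    return np.array(weights), shape

# Toy demo: a 4-node "face" displaced by 1.5 x the first basis vector.
mean = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [1., 1., 0.]])
e1 = np.array([[0., 0., 1.], [0., 0., 1.], [0., 0., -1.], [0., 0., -1.]])
subject = mean + 1.5 * e1
w, fitted = fit_weights_greedy(subject, mean, [e1])
```

With a 0.1σ grid, the recovered weight is b 1 ≈ 1.5 and the fitted model coincides with the subject points.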
- The three-dimensional shape estimation unit 18 calculates three-dimensional coordinate values of the subject's head surface from a plurality of two-dimensional images received by the reception unit 16.
- Instead of this, the weight coefficients may be calculated by image processing using, for each base order, a two-dimensional image (base image) that shows the characteristics of the corresponding basis vector.
- The virtual face shape models are generated by individually adding the basis vectors extracted by multivariate analysis of the population analysis model to the average face shape.
- A virtual face shape model is generated individually for each base order.
- The weight coefficient of the basis vector added to the average face shape is, for example, a +1σ value (see FIG. 5).
- Each virtual face shape model generated in this way is rendered as two-dimensional images (hereinafter, base images) viewed from the front and diagonal directions.
- The face component analysis unit 50 determines the weight coefficient of each base image so that the subject's head image received by the reception unit 16 is approximated by a weighted synthesis of the base images. Specifically, the subject's head image is first normalized and its texture discarded. The face component analysis unit 50 then varies the weight coefficient by which the pixel values of each base image are multiplied so that the sum of squared differences from the pixel values of the subject's normalized head image is minimized, and determines a weight coefficient for each base order.
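- The pixel-domain minimization can be sketched as follows (an illustrative sketch; treating the per-order weights as one joint linear least-squares problem over difference images is an assumption, as are the array names):

```python
import numpy as np

def fit_base_image_weights(subject_img, mean_img, base_imgs):
    """Least-squares weights w_k such that
    mean_img + sum_k w_k * (base_imgs[k] - mean_img)
    minimizes the sum of squared pixel differences from the (normalized,
    texture-free) subject head image."""
    A = np.stack([(b - mean_img).ravel() for b in base_imgs], axis=1)
    s = (subject_img - mean_img).ravel()
    w, *_ = np.linalg.lstsq(A, s, rcond=None)
    return w

# Toy demo: a 4x4 "image" built from two known base-image differences.
mean_img = np.zeros((4, 4))
base1 = mean_img + 1.0                              # uniform brightening
base2 = mean_img + np.arange(16.).reshape(4, 4)     # gradient pattern
subject_img = 0.8 * (base1 - mean_img) - 0.3 * (base2 - mean_img)
w = fit_base_image_weights(subject_img, mean_img, [base1, base2])
```

Because the toy subject lies exactly in the span of the difference images, the solver recovers the weights 0.8 and -0.3.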
- In the above embodiments, a face impression analysis method has been described that uses the face shape information, which is the coordinate information of the three-dimensional shape of the subject's face, and the basis vectors (feature quantities) obtained by multivariate analysis of the population face information, calculates the expression level of each basis vector in the subject's face, and obtains the degree of impression tendency based on those expression levels.
- Additionally, the subject's face impression may be analyzed by changing the degree of impression tendency obtained by this face impression analysis method on an image. That is, the present invention further provides a face image generation method for generating an impression change image in which the impression tendency of the subject's facial features is changed.
- In this face image generation method, the expression levels in the subject's face of one or more feature quantities, obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a plurality of persons, are calculated from face shape information representing the shape of the subject's face surface; those expression levels in the face shape information are then changed, and an impression change image in which the impression tendency of the subject's facial features is altered is generated based on the changed face shape information.
- Specifically, the weight coefficient of a basis vector of a predetermined order in the subject analysis model is increased or decreased, and an impression change image in which the subject's apparent age is aged or rejuvenated is generated.
- The order whose weight coefficient is changed is that of at least one of the feature quantities (basis vectors) having a high correlation with the impression tendency (apparent age).
- In other words, the feature quantities obtained by multivariate analysis of the population face information include at least one basis vector of second or higher order having a high correlation with the impression tendency.
- The weight coefficient of that base order in the subject analysis model is changed so that the apparent age, which is the degree of impression tendency, changes by a predetermined amount in the plus (aging) or minus (rejuvenation) direction.
- The expression level in the subject's face shape information is thereby changed, and an image showing the subject's face shape after aging or rejuvenation is generated.
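- Generating the aged or rejuvenated shape then amounts to shifting the expression level along an aging impression axis; a minimal sketch (the `aging_sign` argument and the function name are assumptions, reflecting that e.g. the ninth-order component advances negatively with age):

```python
import numpy as np

def impression_change_shapes(subject_shape, basis, aging_sign, sigma_steps):
    """Face shapes with the expression level of one impression factor shifted
    step by step in the aging progression direction. aging_sign is +1 when a
    positive weight change ages the face and -1 when a negative change does
    (e.g. -1 for the ninth-order factor)."""
    return [subject_shape + aging_sign * s * basis for s in sigma_steps]

# Toy demo: age a 3-node shape by 1, 2, 3 sigma along a ninth-order-like axis.
subj = np.zeros((3, 3))
e9 = np.array([[0., 0., 1.], [0., 1., 0.], [1., 0., 0.]])
frames = impression_change_shapes(subj, e9, aging_sign=-1, sigma_steps=[1, 2, 3])
```

Rendering the returned shapes in sequence would give a morphing animation from the current face toward the aged face; a still image uses a single step.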
- Thereby, the subject can visually grasp how aging or rejuvenation progresses as the expression level of an impression factor is changed, and can appreciate that cosmetically changing the expression level of that impression factor is effective for rejuvenating the apparent age.
- This impression change image is a still image or a moving image of the subject analysis model.
- The impression change image has face shape information representing the three-dimensional shape of the subject's face.
- Texture data representing the texture of the skin may be synthesized with the impression change image.
- The texture representing the subject's current skin may be combined with the impression change image, or texture data representing the typical skin texture of a person of the post-change apparent age may be combined instead.
- The impression change image generation unit 90 refers to the trend information storage unit 74 to acquire the aging coefficient for each aging factor and the aging width information, which is the desired change width of the apparent age, and calculates the change amount of the weight coefficient for each aging factor on that basis (see FIG. 1).
- The aging factors changed by the impression change image generation unit 90 may be all of the aging factors (the second, seventh, ninth and eleventh orders) defined as pattern 1 in the trend information PI (see FIG. 5), or only some of the aging factors that are significantly expressed in the subject analysis model.
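- One simple way the change amount of each weight coefficient could be derived from the stored aging coefficients and the aging width is sketched below (the equal split of the age change across factors, the coefficient values and the function name are illustrative assumptions, not taken from the patent):

```python
def weight_changes(aging_coeffs, delta_age):
    """Split the desired apparent-age change (the aging width) equally across
    the selected aging factors; each factor's aging coefficient (years of
    apparent age per 1-sigma weight change, the slope of FIG. 32) converts
    its share of years back into a weight-coefficient change in sigma."""
    share = delta_age / len(aging_coeffs)
    return {order: share / coef for order, coef in aging_coeffs.items()}

# Hypothetical aging coefficients (years per sigma) for the four aging factors.
coeffs = {2: 2.0, 7: 1.5, 9: 2.5, 11: 1.0}
changes = weight_changes(coeffs, delta_age=6.0)   # age the face by 6 years
```

With these invented numbers, each factor contributes 1.5 years, so e.g. the second order shifts by 0.75σ and the eleventh by 1.5σ.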
- When deciding which aging factors' weight coefficients to change for each subject, the population may be classified into a plurality of groups, and the weight coefficient of a basis vector highly correlated with the impression tendency common to the group to which the subject belongs may be changed.
- Specifically, the population may be classified into a plurality of groups based on the degree of matching of the tendencies of the weight coefficients of the impression factors (the second, seventh, ninth and eleventh orders in the above embodiment), which are the plural basis vectors highly correlated with the impression tendency (apparent age).
- The group to which the subject belongs may then be selected based on the impression factors significantly expressed in the subject analysis model. A more specific face image generation method will be described later in Example 2.
- In addition to determining the degree of impression tendency of the subject's facial features, beauty counseling using the results calculated by the face impression determination unit 60 may be provided.
- This beauty counseling method (hereinafter "the present method") outputs beauty information associated in advance with any feature quantity whose expression level, calculated using the face impression analysis method of the first or second embodiment, is equal to or greater than a predetermined value.
- The beauty information used in this method is information representing a beauty treatment method including any of a cosmetic molding method, a beauty massage method, a hair-styling method and a makeup method, or representing a hair cosmetic or a makeup cosmetic.
- FIG. 8 is a table representing beauty information.
- The beauty information may indicate the amount of cosmetics to use or the usage method as well as the choice of beauty means. Beauty information is associated with each aging factor and stored in the trend information storage unit 74.
- A text message or the like is used to convey to the subject a beauty treatment method or cosmetics (collectively, beauty means) that alleviate or promote the impression tendency to which the subject's face has been found to belong. If the impression tendency is not favorable for the subject, a beauty means for alleviating it is transmitted to the subject.
- The cosmetic treatment methods shown in FIG. 8 will be described in detail in the following examples.
- Hereinafter, the present invention will be described in detail through examples.
- The aging points and aging tendencies of the person are clarified by analyzing the three-dimensional data of the head of the subject, who is the customer.
- Objective and effective beauty counseling information is provided by outputting a beauty treatment method according to the aging tendency.
- The priority order of makeup sites and the like is also clarified, giving the customer confidence that makeup can accurately produce a "youthful" impression.
- FIG. 9A shows three-dimensional optical data (high-resolution data) of the entire head, obtained by measuring the head including the face of the subject with a non-contact three-dimensional laser scanner. There are about 180,000 measurement points. This high-resolution data has a different number of nodes and a different topology for each subject.
- FIG. 9B is a diagram showing 13 feature points of the face and scalp of the subject. The three-dimensional coordinates of these points were measured with a contact-type three-dimensional digitizer.
- FIG. 9C shows a generic model.
- The generic model is a model in which the node density around the eyes and mouth is high and the node density over the scalp is low. The number of nodes is 4703.
- FIG. 10 is a perspective view showing a homology model in which the number of nodes and topology of high-resolution data for each subject are homogenized.
- A homology model is created for each subject in this way. Since the number of nodes and the topology are standardized in the homologous models, multivariate analysis can be performed on data collected from a large number of subjects. A total of 50 homologous models were created, 10 from each age group.
- FIG. 11(a) is a diagram showing the average face shape (average face shape model) of the homologous models of subjects in their 20s.
- FIG. 11B is a diagram showing an average face shape model of a homology model of subjects in their 30s.
- FIG. 11(c) is a diagram showing the average face shape model of the homologous models of subjects in their 40s.
- FIG. 11D is a diagram showing an average face shape model of a homology model of subjects in their 50s.
- FIG. 11E is a diagram showing the average face shape model of the homologous models of subjects in their 60s.
- For each homologous model, the three-dimensional coordinates are extracted and the texture is discarded. Averaging the coordinates of these homologous models then removes the influence of individual differences between subjects' faces, so the features of face shape that change with age appear in the average face shape of each age group.
- FIG. 12 (a) shows the average face of the homologous model of a total of 20 young female subjects in their 20s and 30s.
- FIG. 12B is an average face (hereinafter sometimes referred to as an overall average face) of a homologous model of 50 female subjects of all ages in their 20s to 60s.
- FIG. 12C is an average face of a homology model of a total of 20 elderly female subjects in their 50s and 60s.
- When FIGS. 12(a) and 12(c) are compared, it can be seen that as age progresses, the nasolabial folds become deeper and the cheeks loosen and sag.
- Such features are made objective and quantified by principal component analysis of the homologous models.
- The basis vectors (principal components) extracted by principal component analysis of the population analysis model, with the 50 subjects in their 20s to 60s as the population, and their contribution rates are shown in FIG. 3.
- When the population differs, the extracted basis vectors and their contribution rates also change (see Example 2 described later).
- The inventor performed principal component analysis on the population analysis model and, starting from the lower-order bases, examined how the facial impression changes when each principal component is varied individually from the average face. It was found that specific principal components have a high correlation with aging, and that other specific principal components give an adult-face or baby-face impression and contribute to a small-face impression.
- FIGS. 13 to 27 are perspective views of virtual shapes in which the first principal component (pca01) through the fifteenth principal component (pca15) are individually varied.
- FIG. 13(b), FIG. 14(b), ..., FIG. 27(b) are perspective views showing the shape of the average face (overall average face) of the homologous models of all ages from the 20s to the 60s; these are the same as FIG. 12(b).
- FIG. 13(c) is a perspective view of the virtual shape when the weight coefficient (b 1) of the first basis vector in formula (1) above is set to +3 times the standard deviation of the population (+3σ) and the weight coefficients (b 2 to b n) of the other basis vectors are set to zero.
- FIG. 13(a) is a perspective view of the virtual shape when b 1 is set to -3 times the standard deviation of the population (-3σ) and the other weight coefficients are zero.
- FIG. 14(c) is a perspective view of the virtual shape when the weight coefficient (b 2) of the second basis vector is set to +3σ and the other weight coefficients (b 1, b 3 to b n) are zero; FIG. 14(a) is the corresponding view when b 2 is set to -3σ.
- Similarly, FIG. 15(c), FIG. 16(c), ..., FIG. 27(c) are perspective views of the virtual shapes when the weight coefficients of the third through fifteenth basis vectors are in turn set to +3σ with all other weight coefficients zero, and FIG. 15(a), FIG. 16(a), ..., FIG. 27(a) are the corresponding views when they are set to -3σ.
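- Each of these virtual shapes is an instance of formula (1) with a single weight set to ±3σ; a minimal sketch (array shapes and names are assumptions):

```python
import numpy as np

def virtual_shape(mean_shape, bases, stds, order, n_sigma):
    """Virtual shape with the weight of basis `order` set to n_sigma times
    that component's population standard deviation and all other weights
    zero, i.e. mean_shape + b_k * e_k with b_k = n_sigma * sigma_k."""
    b = np.zeros(len(bases))
    b[order] = n_sigma * stds[order]
    return mean_shape + np.tensordot(b, bases, axes=1)

# Toy demo: two bases over a 2-node shape; vary the first at +3 sigma.
mean = np.zeros((2, 3))
bases = np.array([[[0., 0., 1.], [0., 0., 1.]],
                  [[1., 0., 0.], [0., 1., 0.]]])
stds = np.array([2.0, 0.5])
plus3 = virtual_shape(mean, bases, stds, order=0, n_sigma=3)
```

Calling it with n_sigma=-3 gives the corresponding (a)-side shape.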
- The first principal component extracted in Example 1 was a factor contributing to the overall size of the face. As can be seen by comparing FIG. 13(c) and FIG. 13(a), when the weight coefficient of the first principal component increases in the positive direction, the face becomes slimmer and smaller overall. Whether the weight coefficient of the first principal component is large or small can be determined relatively easily by visually observing the subject's face.
- The second-order and higher bases were principal components contributing to the shapes of local parts of the face. For this reason, it is difficult to judge the magnitude of a higher-order basis weight coefficient simply by visually observing the subject's face.
- In contrast, as in this embodiment, the expression level of each feature quantity (principal component) in the subject's face can be made accurately objective by performing principal component analysis on the subject analysis model and quantifying the weight coefficient of each order.
- The second principal component was a factor contributing to the degree of fullness of the face, swelling on both sides of the nose, and sagging under the nose.
- As can be seen by comparing FIG. 14(c) and FIG. 14(a), when the weight coefficient of the second principal component increases in the positive direction, the face tightens and the nasolabial folds become shallower. Conversely, when the second principal component advances in the minus direction, the apparent age and the actual age progress in the aging direction.
- The third principal component was a factor contributing to the height of the jaw. When the weight coefficient of the third principal component increases in the positive direction, the temporomandibular joint develops and the face looks more adult; that is, the adult-face impression progresses. Conversely, when the weight coefficient of the third principal component increases in the minus direction, the face becomes more baby-faced, and the small-face impression progresses.
- The fourth principal component was a factor contributing to the elongation of the back of the head and the distance between the orbits. When its weight coefficient increases in the positive direction, the back of the head shortens and the distance between the eyes decreases; that is, the close-set-eye tendency progresses.
- The fifth principal component was a factor contributing to the forward protrusion of the forehead and of the mouth. As can be seen by comparing FIG. 17(c) and FIG. 17(a), when the weight coefficient of the fifth principal component increases in the positive direction, the forehead and the mouth protrude forward.
- The sixth principal component was a factor contributing to the height of the whole head. As can be seen by comparing FIG. 18(c) and FIG. 18(a), the length above the eyes decreases when the weight coefficient of the sixth principal component increases in the plus direction; conversely, when it increases in the minus direction, the forehead lengthens.
- The seventh principal component was a factor contributing to the vertical position of the outer corners of the eyes and the elongation of the area under the nose. As can be seen by comparing FIG. 19(c) and FIG. 19(a), when the weight coefficient of the seventh principal component increases in the positive direction, the outer corners of the eyes recede and the three-dimensionality of the eyes decreases, the area under the nose lengthens, and the nasolabial folds deepen. For this reason, when the seventh principal component advances in the positive direction, the face as a whole appears deflated, and the apparent age and the actual age progress in the aging direction. Conversely, when the seventh principal component advances in the minus direction, the apparent age and the actual age progress in the younger direction.
- The eighth principal component was a factor contributing to the width between the orbits. As can be seen by comparing FIG. 20(c) and FIG. 20(a), when the weight coefficient of the eighth principal component increases in the positive direction, the outer edges of the orbits come closer together; as a result, the close-set-eye tendency progresses.
- The ninth principal component was a factor contributing to inward drooping of the outer upper corners of the eyes and the degree of recession of the corners of the mouth. As can be seen by comparing FIG. 21(c) and FIG. 21(a), when the weight coefficient of the ninth principal component increases in the minus direction, the corners of the eyes droop toward the inside of the face and the corners of the mouth recede, so the recesses at the mouth corners deepen. For this reason, the apparent age and the actual age progress in the aging direction. Conversely, when the weight coefficient of the ninth principal component increases in the plus direction, the corners of the eyes lift and the recesses at the mouth corners become shallower, giving a youthful impression; that is, the apparent age and the actual age progress in the younger direction.
- The tenth principal component was a factor contributing to facial distortion. As can be seen by comparing FIG. 22(c) and FIG. 22(a), the tenth principal component contributes to the left-right twist of the upper part of the face.
- The eleventh principal component was a factor contributing to the forward protrusion of the center of the mouth and the flattening of the area below the cheekbones. As can be seen by comparing FIG. 23(c) and FIG. 23(a), when the weight coefficient of the eleventh principal component increases in the minus direction, the cheeks thin, the center of the mouth protrudes forward, and the corners of the mouth become recessed. Thereby, the apparent age and the actual age progress in the aging direction. In addition, when the weight coefficient of the eleventh principal component increases in the minus direction, a so-called "duck mouth" impression tendency progresses.
- The twelfth principal component was a factor contributing to the bulge of the lower jaw. As can be seen by comparing FIG. 24(c) and FIG. 24(a), when the weight coefficient of the twelfth principal component increases in the minus direction, the area around the lower jaw tightens and the small-face impression progresses.
- The thirteenth principal component was a factor contributing to the fullness below the ears and of the lower jaw. When its weight coefficient increases in the minus direction, the bulge increases as if fat were attached around the ears and chin.
- The fourteenth principal component was a factor contributing to head distortion. As can be seen by comparing FIG. 26(c) and FIG. 26(a), the fourteenth principal component contributes to the left-right asymmetry of the lower part of the face.
- The fifteenth principal component was a factor contributing to facial distortion. As can be seen by comparing FIG. 27(c) and FIG. 27(a), the fifteenth principal component contributes to distortion of the face due to left-right twisting of its lower part.
- When determining the degree of the aging tendency, the expression levels of the principal components of some or all of these orders may be evaluated. Specifically, it is only necessary to examine whether the weight coefficients of the basis vectors of these orders are equal to or greater than a predetermined amount (for example, a +1σ value in the aging progression direction).
- When determining the degree of the adult-face impression, the expression level of the third principal component may be evaluated similarly; and when determining the degree of the small-face impression, the expression levels of the third and twelfth principal components may be evaluated similarly.
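- The threshold check itself is simple; a sketch (the sign-convention dictionary and the example numbers are assumptions; the signs follow FIG. 28, where the second, ninth and eleventh orders age in the minus direction and the seventh in the plus direction):

```python
def expressed_factors(weights, stds, aging_signs, threshold_sigma=1.0):
    """Orders whose weight coefficient lies at least threshold_sigma standard
    deviations into the aging progression direction (the "+1 sigma value in
    the aging progression direction")."""
    return [k for k, w in weights.items()
            if aging_signs[k] * w >= threshold_sigma * stds[k]]

# Hypothetical subject: weight coefficients of the four aging factors in sigma.
weights = {2: -1.2, 7: 0.4, 9: -0.9, 11: -2.0}
stds = {2: 1.0, 7: 1.0, 9: 1.0, 11: 1.0}        # already sigma-normalized
aging_signs = {2: -1, 7: +1, 9: -1, 11: -1}
factors = expressed_factors(weights, stds, aging_signs)
```

For this invented subject, only the second and eleventh orders exceed the +1σ aging threshold.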
- A table indicating the relationship between impression tendencies and base orders is the trend information PI shown in FIG. 5.
- In the present embodiment, the base orders correlated with the impression tendency relating to apparent or actual age are the second, seventh, ninth and eleventh, as derived from the examples below.
- A base order having a high correlation with an impression tendency (aging) may also be obtained in advance by sensory evaluation.
- The degree of impression tendency may then be determined based on the expression level of the principal component of that base order in the subject model.
- The degree of impression tendency in the aging determination is the apparent age or actual age of the subject.
- The feature quantity whose expression level is determined is at least one of the basis vectors having a high correlation with the impression tendency (aging), as described above.
- A basis vector having a high correlation with the impression tendency is one that is statistically determined to have a high correlation coefficient with the impression tendency; that is, its correlation coefficient is larger than the critical value at the 5% significance level for the sample size of the population.
- In Example 1, principal component analysis was performed on the homologous models of the subjects, with a population of 50 Japanese women covering each age group from the 20s to the 60s.
- Basis vectors up to the fifteenth order, each with a contribution rate of 1% or more, were obtained from the population (see FIG. 3). Then, for each subject, the weight coefficients (eigenvalues) of the first through fifteenth bases were calculated.
- FIG. 28 is a table showing the correlation coefficients between the weight coefficient of each base order and the apparent age. There was a high correlation with apparent age for the four base orders: second, seventh, ninth and eleventh. The second, ninth and eleventh orders have negative correlation coefficients, and the seventh order has a positive correlation coefficient. From this it was found that, as aging progresses, the principal component advances in the minus direction for the second, ninth and eleventh orders and in the plus direction for the seventh order.
- The critical value at the 5% significance level when the sample size (N) of the population is 50 is 0.279. The absolute values of the correlation coefficients of the seventh and ninth orders are both larger than this critical value. That is, the seventh- and ninth-order feature quantities are basis vectors (principal components) with a contribution rate of 1% or more whose correlation coefficients with the impression tendency exceed the critical value at the 5% significance level for the population sample size. Therefore, it is statistically sound to determine aging using the seventh- and ninth-order basis vectors.
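- The 0.279 critical value can be reproduced from the standard relation between a Pearson correlation and Student's t (the t quantile 2.0106 for 48 degrees of freedom is a standard table value, not taken from the patent):

```python
import math

def critical_r(n, t_crit):
    """Two-tailed critical Pearson correlation for sample size n:
    r_crit = t / sqrt(t^2 + df) with df = n - 2."""
    df = n - 2
    return t_crit / math.sqrt(t_crit ** 2 + df)

# Two-sided 5% t quantile for df = 48 (standard table value).
r50 = critical_r(50, 2.0106)
```

This gives r50 ≈ 0.279, matching the critical value used above; any basis order whose |correlation| exceeds it would pass the 5% test.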
- The population was also divided into two groups based on actual age, and it was tested whether there was a significant difference between the groups in the weight coefficient of each base order.
- The t-test was performed by dividing the population into two groups: 20 subjects aged 20 to 40 and 30 subjects aged 41 to 69.
- FIG. 29 is a table showing the results.
- A t-test value (p-value) of less than 0.05 was taken to indicate a significant difference.
- The seventh- and ninth-order t-test values were less than 0.05, while those of the other base orders were 0.05 or more. Therefore, it was found that there is a significant age-related difference for the seventh and ninth principal components.
- The t-test values for the second and eleventh orders were relatively small, less than 0.2, indicating a slight age-related difference.
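- The group comparison can be sketched with Welch's two-sample t statistic (an illustrative sketch with invented weight values; the patent reports p-values, which additionally require the t distribution, so here |t| is compared against a tabulated critical value instead):

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom,
    e.g. for testing whether a basis weight coefficient differs between a
    younger and an older age group."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical seventh-order weight coefficients for two small age groups.
young = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2]
old = [0.9, 1.1, 0.7, 1.3, 0.8, 1.2]
t, df = welch_t(young, old)
```

Here |t| far exceeds the two-sided 5% critical value (about 2.26 at roughly 9 degrees of freedom), so the difference would be judged significant.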
- FIG. 30 is a table showing the correlation between the weight coefficient of each base order and the actual age for the entire population. Comparing FIG. 28 and FIG. 30 shows that, for actual age as well, there is a high correlation with the four base orders: second, seventh, ninth and eleventh. As with apparent age, it was found that as the actual age progresses, the principal component advances in the minus direction for the second, ninth and eleventh orders and in the plus direction for the seventh order. When the population was divided into two groups by actual age and tested for a significant difference in the weight coefficient of each base order, a tendency similar to the results shown in FIG. 29 was observed (not shown).
- From the above, the apparent age and the actual age of the subject can be treated in the same way for the purpose of aging determination. That is, it was found that the present invention can be used both to determine the apparent age of a subject and to estimate the actual age of a subject whose age is unknown. In beauty applications, since the apparent age as seen by the person or by others matters more than the actual age, the apparent age of the subject is used as the degree of impression tendency in Example 1 and in Example 2 described later.
- FIGS. 31(a) to 31(f) show six images in which the weight coefficient of the ninth-order basis vector is changed in 1σ steps up to ±3σ.
- FIG. 31(a) is a diagram in which the weight coefficient of the ninth-order basis vector is set to +1σ from the overall average face shape, with the aging progression direction (the direction in which the aging tendency advances) taken as positive.
- As described above, the aging tendency progresses as the ninth-order principal component advances in the minus direction; therefore, changing the weight coefficient of the ninth-order basis vector by -1σ changes the face shape by +1σ in the aging progression direction.
- FIG. 31(b) is a diagram in which the weight coefficient of the ninth-order basis vector is set to +2σ in the aging progression direction, and FIG. 31(c) to +3σ; FIG. 31(c) is the same as FIG. 21(a).
- Likewise, FIGS. 31(d), 31(e) and 31(f) are diagrams in which the weight coefficient is set to -1σ, -2σ and -3σ in the aging progression direction, respectively; FIG. 31(f) is the same as FIG. 21(c).
- FIG. 32 is a graph showing changes in the age impression of the virtual shape when each aging impression factor (aging factor) is changed.
- When the weight coefficients of the second-, seventh-, ninth- and eleventh-order basis vectors were each changed from +1σ to +3σ in the aging direction, the sensory value of apparent age increased linearly.
- Conversely, when the weight coefficients were changed in the rejuvenation direction, the seventh and ninth orders rejuvenated almost linearly.
- From this, it was found that the seventh and ninth bases function as aging factors in both the aging and rejuvenation directions, whereas the second and eleventh bases function as aging factors only in the aging direction.
- The slope of each graph in FIG. 32 represents the amount of change in apparent age when the weight coefficient of the basis vector of that aging factor is changed by a predetermined amount (for example, 1σ). This slope is called the aging coefficient.
- The aging coefficient is stored in the trend information storage unit 74 in association with each order of aging factor.
- As described above, for the second-, seventh-, ninth- and eleventh-order principal components (the aging impression axes), it was found that the weight coefficient of the basis vector and the progression of apparent age change linearly with respect to the aging progression direction. In addition, for the seventh and ninth orders, it was found that the apparent age can be rejuvenated by setting the weight coefficient of the basis vector to the sign opposite to the aging tendency.
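- The aging coefficient is just the slope of this line; a least-squares sketch with hypothetical sensory values (the data points are invented for illustration):

```python
def aging_coefficient(sigma_shifts, apparent_ages):
    """Ordinary least-squares slope of apparent age versus weight shift
    (in sigma units), i.e. the slope of one curve in FIG. 32: years of
    apparent-age change per 1-sigma change of the weight coefficient."""
    n = len(sigma_shifts)
    mx = sum(sigma_shifts) / n
    my = sum(apparent_ages) / n
    num = sum((x - mx) * (y - my) for x, y in zip(sigma_shifts, apparent_ages))
    den = sum((x - mx) ** 2 for x in sigma_shifts)
    return num / den

# Hypothetical sensory evaluations: apparent age rises ~2 years per +1 sigma.
coef = aging_coefficient([0, 1, 2, 3], [40.0, 42.1, 43.9, 46.0])
```

The resulting slope (about 1.98 years per σ here) is what would be stored in the trend information storage unit 74 as that factor's aging coefficient.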
- For the eleventh-order principal component, cosmetic treatment methods such as (i) concealing the recesses at the corners of the mouth, (ii) drawing the outline of the lips clearly, and (iii) tightening the facial muscles through facial exercises are effective.
- The beauty treatment methods described above may be output as the beauty information in the beauty counseling method of the embodiment.
- Appropriate types of hair cosmetics and makeup cosmetics, their product names, and their usage methods and amounts may also be output as beauty information.
- The beauty information output unit 80 may output this information.
- FIGS. 33(a) to 33(d) are perspective views of homologous models in which a plurality of aging impression axes are combined. From these figures, it was found that when a plurality of aging impression axes are combined, the aging tendency progresses more remarkably.
- FIG. 33(a) is a perspective view in which the second- and ninth-order weight coefficients are both set to +3σ in the aging direction.
- FIG. 33(b) is a perspective view in which the seventh- and ninth-order weight coefficients are both set to +3σ in the aging direction.
- FIG. 33(c) is a perspective view in which the second- and eleventh-order weight coefficients are both set to +3σ in the aging direction.
- FIG. 33(d) is a perspective view in which the second-, seventh-, ninth- and eleventh-order weight coefficients are all set to +3σ in the aging direction.
- FIGS. 33(a) to 33(c) looked older than FIGS. 14(a), 19(c), 21(a) and 23(a).
- Furthermore, FIG. 33(d) looked older than FIGS. 33(a) to 33(c). Accordingly, it was found appropriate to determine that the aging tendency is strong when a subject has a plurality of aging impression axes.
- FIG. 34 is a table showing the number of aging impression axes possessed by the subjects of a population of 10 people in each age group from their 20s to their 60s.
- FIG. 34 shows how many of the second-, seventh-, ninth-, and eleventh-order principal components (aging impression axes) each subject in each age group possesses.
- Three subjects in their 20s had only one aging impression axis (factor). None had more than one factor.
- Two subjects in their 30s had only one aging impression axis (factor).
- Two subjects in their 40s had only one aging impression axis (factor).
- Seven of the subjects in their 50s had only one aging impression axis (factor).
- Three of the subjects in their 60s had only one aging impression axis (factor).
- Based on FIG. 34, cluster analysis was performed on the degrees of the aging impression axes possessed by the subjects in their 40s and over.
- the Ward method was used for the analysis.
- FIG. 35 is an example of a table showing groups of aging tendencies.
- In the face impression analyzer 100, such a table may be stored in the trend information storage unit 74 (see FIGS. 1 and 6).
- The face impression determination unit 60 may determine the aging tendency group to which the subject belongs based on the expression amount of each feature amount calculated by the face component analysis unit 50, and this determination result may be output by the beauty information output unit 80.
- Alternatively, information representing the group of impression tendencies to which the subject belongs, selected based on the magnitude of the contribution rates of the feature amounts (basis vectors), may be used.
- Type I is a group into which subjects having an aging impression axis other than the second order expressed by 1σ or more are classified. In other words, subjects in whom one or more of the seventh-, ninth-, and eleventh-order principal components are expressed in the aging direction by 1σ or more are classified as type I.
- Type II is a group into which subjects in whom only the second and/or ninth orders are expressed in the aging direction by 1σ or more are classified.
- Type III is a group into which subjects in whom only the second and/or seventh orders are expressed in the aging direction by 1σ or more are classified.
- Type IV is a group into which subjects in whom only the second and/or eleventh orders are expressed in the aging direction by 1σ or more are classified.
- Example 2: Principal component analysis of the population analysis model was performed in the same manner as in Example 1, except that the number of homologous models was increased.
- The population consisted of 148 Japanese women in total: 29 each in their 20s and 30s, and 30 each in their 40s, 50s, and 60s.
- Basis vectors up to the 147th order were obtained by principal component analysis.
- FIG. 36 is a table showing the contribution rate and cumulative contribution rate up to the 20th order base (partially omitted).
- The cumulative contribution ratio of the principal components up to the 20th-order base exceeded 80%, at 87.3% specifically. Further, the contribution ratio of each individual principal component around the 20th order (specifically, from the 18th order onward) was less than 1%.
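The contribution rate of each base order is the share of the total variance carried by that principal component, and the cumulative rate is its running sum. A minimal sketch of how a table like FIG. 36 could be computed, using a toy eigenvalue spectrum rather than the patent's measurements:

```python
import numpy as np

def contribution_rates(eigenvalues):
    """Contribution rate of each principal component and the
    cumulative contribution rate, from the PCA eigenvalues."""
    ev = np.asarray(eigenvalues, dtype=float)
    rates = ev / ev.sum()
    return rates, np.cumsum(rates)

# Toy eigenvalue spectrum (NOT the patent's data): geometric decay
# over 147 orders, matching the number of bases in Example 2.
ev = 100.0 * 0.7 ** np.arange(147)
rates, cum = contribution_rates(ev)

# Lowest order whose cumulative contribution reaches 80%
order_80 = int(np.argmax(cum >= 0.80)) + 1  # +1: orders are 1-based
print(order_80, round(float(cum[order_80 - 1]), 3))
```

With real homologous-model data, `ev` would come from the eigendecomposition of the shape covariance matrix.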
- FIG. 37 shows the face shapes corresponding to the first- to 36th-order principal components (partially omitted).
- FIG. 38 is a table showing a single correlation coefficient between the weight coefficient for each base order and the apparent age.
- A base whose single correlation coefficient had an absolute value of 0.2 or more was determined to be a significant aging factor.
- In Example 2, a high correlation with apparent age was found for five base orders: the first, ninth, tenth, twelfth, and twentieth. The ninth and tenth orders had negative correlation coefficients, and the first, twelfth, and twentieth orders had positive correlation coefficients.
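The screening step above — keep a base order as an aging factor when the absolute value of its single correlation coefficient with apparent age is 0.2 or more — can be sketched as follows. The data here is synthetic; only the 0.2 threshold is taken from the text:

```python
import numpy as np

def significant_aging_factors(weights, apparent_age, threshold=0.2):
    """Return (1-based base order, correlation) pairs for base orders
    whose per-subject weighting coefficients have |single correlation|
    >= threshold with apparent age."""
    factors = []
    for k in range(weights.shape[1]):
        r = np.corrcoef(weights[:, k], apparent_age)[0, 1]
        if abs(r) >= threshold:
            factors.append((k + 1, round(float(r), 3)))
    return factors

# Toy data (not the patent's measurements): base 1 correlates
# positively with age, base 2 negatively, base 3 is pure noise.
rng = np.random.default_rng(0)
age = rng.uniform(20, 70, size=148)
noise = rng.normal(size=(148, 3))
W = np.column_stack([0.05 * age + noise[:, 0],
                     -0.05 * age + noise[:, 1],
                     noise[:, 2]])
print(significant_aging_factors(W, age))
```

Bases 1 and 2 are flagged with opposite signs, mirroring how the ninth/tenth orders had negative and the first/twelfth/twentieth positive correlations.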
- By changing the facial feature amounts in the subject analysis model and the average face model, an impression change image in which the impression tendency was changed was generated.
- In Example 2, the population was classified into a plurality of groups based on the facial impression tendency, and the group to which the subject belonged was determined based on the tendencies of the expression levels (weighting coefficients) of the five aging factors. Specifically, the population was classified into a plurality of groups based on the degree of coincidence of the tendencies of the weighting coefficients relating to the aging factors.
- The specific method of classifying into groups is not particularly limited. As an example, the population can be classified into a predetermined number of groups using a cluster analysis method such as the Ward method, with the principal component scores of the five aging factors as the distance function.
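As a sketch of this grouping step, the following implements a minimal agglomerative clustering with Ward's minimum-variance criterion over per-subject scores of five factors. The data and the choice of two groups are illustrative; in practice a library implementation (e.g. `scipy.cluster.hierarchy`) would normally be used:

```python
import numpy as np

def ward_cluster(X, n_clusters):
    """Minimal agglomerative clustering with Ward's criterion:
    repeatedly merge the pair of clusters whose union gives the
    smallest increase in within-cluster sum of squares."""
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                ca, cb = X[clusters[a]], X[clusters[b]]
                na, nb = len(ca), len(cb)
                # Ward increase: na*nb/(na+nb) * ||mean_a - mean_b||^2
                d = na * nb / (na + nb) * float(
                    np.sum((ca.mean(axis=0) - cb.mean(axis=0)) ** 2))
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return clusters

# Toy principal-component scores of five aging factors for 12 subjects
# (not the patent's data): two well-separated blobs.
rng = np.random.default_rng(1)
scores = np.vstack([rng.normal(-2.0, 0.3, size=(6, 5)),
                    rng.normal(+2.0, 0.3, size=(6, 5))])
groups = ward_cluster(scores, 2)
print(sorted(sorted(g) for g in groups))
```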
- Type I included 39 subjects (26%) of the population of 148, with an average apparent age of 44.7 years. Type I subjects were characterized by strong expression of the ninth- and twelfth-order principal components.
- FIG. 39(a) is a front view of the average face shape model of 20 type I subjects (younger group) aged 24 to under 46.
- FIG. 39(b) is a front view of the average face shape model of 19 type I subjects (older group) aged 46 to under 65.
- FIG. 39(c) is a front view of the average face shape model of 20 type II subjects (younger group) aged 22 to under 46.
- FIG. 39(d) is a front view of the average face shape model of 19 type II subjects (older group) aged 46 to under 68.
- FIG. 39(e) is a front view of the average face shape model of 20 type III subjects (younger group) aged 21 to under 47.
- FIG. 39(f) is a front view of the average face shape model of 19 type III subjects (older group) aged 47 to under 63.
- FIG. 39(g) is a front view of the average face shape model of 20 type IV subjects (younger group) aged 23 to under 46.
- FIG. 39(h) is a front view of the average face shape model of 19 type IV subjects (older group) aged 46 to under 65.
- FIGS. 40(a) to (d) are tables showing the averages of the principal component scores of each aging factor for the subjects of types I to IV.
- FIG. 40(a) is a table showing the average principal component scores of the five aging factors for all 39 subjects belonging to type I (younger and older combined), together with the average values for the younger group and for the older group separately.
- The variances of the principal component scores of these aging factors can be regarded as statistically equal. Under that assumption, t-tests were conducted on the two groups (young and old); the ninth-order test result was less than 0.01, indicating a significant difference between the young and old groups. On the other hand, the twelfth-order test result was 0.01 or more, and no significant difference was observed.
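The equal-variance (pooled) two-sample t statistic used for this comparison can be sketched as follows. The group scores are invented for illustration, and the 3.169 cutoff is the two-tailed 1% t-table value for the toy degrees of freedom (df = 10), not the patent's group sizes:

```python
import math

def pooled_t_statistic(x, y):
    """Two-sample t statistic assuming equal variances (the
    assumption stated for the young vs. old comparison)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    # Pooled variance over both groups
    sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)
    return (mx - my) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# Toy ninth-order principal-component scores (not the patent's data)
young = [-0.9, -0.7, -0.8, -0.6, -0.7, -0.75]
old = [0.1, 0.3, 0.2, 0.15, 0.25, 0.2]
t = pooled_t_statistic(young, old)
# For df = 10, two-tailed p < 0.01 requires |t| > 3.169 (t-table value)
print(round(t, 2), abs(t) > 3.169)
```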
- The positive (+) or negative (−) sign in the margin of FIG. 40(a) indicates the aging direction from the younger group toward the older group.
- For factors with a negative sign, the aging direction from the younger group to the older group is expressed by multiplying the average principal component score by a negative sign.
- It was found that subjects belonging to type I have the twelfth-order principal component in common at all ages, and that the negatively-signed ninth-order principal component score changes from a small value (−0.708) to a medium value (+0.206) with aging from the younger to the older group.
- Here, a small principal component score means a negative value with an absolute value of 0.25 or more.
- A medium principal component score means a value with an absolute value of less than 0.25.
- A large principal component score means a positive value with an absolute value of 0.25 or more.
- For type III subjects (FIG. 40(c)), it was found that the negatively-signed ninth-order principal component score changes from a small value (−1.004) to a medium value (−0.151) with aging, and that the negatively-signed tenth-order principal component score changes from a large value (+0.456) to a still larger value (+1.208). From the results of FIG. 40(d), it was found that subjects belonging to type IV have a notably large ninth-order principal component score: the negatively-signed ninth-order principal component score changes from a fairly large value (+0.440) to a still larger value (+1.335) with aging.
- FIG. 41(a) is a table showing the partial regression coefficient and constant term for each statistically significant aging factor (hereinafter, significant aging factor) relating to type I subjects.
- The significant aging factors for type I subjects are the ninth- and twelfth-order principal components.
- FIGS. 41(b) to (d) are tables showing the partial regression coefficients and constant terms for each significant aging factor relating to types II to IV.
- The significant aging factor for type IV subjects was only the ninth order; in this specification, the (single) regression coefficient and the partial regression coefficient are not particularly distinguished.
- The number of significant digits of the partial regression coefficients may be set appropriately according to conditions such as the size of the population.
- The apparent age, as the objective variable, can be expressed by a multiple regression equation using these partial regression coefficients and constant terms, with the principal component scores of the significant aging factors as the explanatory variables.
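Obtaining such partial regression coefficients and the constant term amounts to an ordinary least-squares fit of apparent age on the factor scores. A minimal sketch with synthetic data (the coefficients 8, −5 and constant 45 are invented, not the patent's values):

```python
import numpy as np

def fit_age_regression(scores, apparent_age):
    """Least-squares fit of apparent age (objective variable) on the
    principal-component scores of the significant aging factors
    (explanatory variables). Returns (partial regression
    coefficients, constant term)."""
    X = np.column_stack([scores, np.ones(len(scores))])
    beta, *_ = np.linalg.lstsq(X, apparent_age, rcond=None)
    return beta[:-1], beta[-1]

# Toy data: age generated exactly from two factor scores with known
# coefficients 8 and -5 and constant 45 (hypothetical values).
rng = np.random.default_rng(2)
S = rng.normal(size=(148, 2))
age = 8.0 * S[:, 0] - 5.0 * S[:, 1] + 45.0
coefs, const = fit_age_regression(S, age)
print(np.round(coefs, 3), round(float(const), 3))
```

With noiseless toy data the fit recovers the generating coefficients exactly; real data would of course leave residual error.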
- the face impression analysis apparatus 100 may store the numerical values (aging calculation coefficients) shown in these tables in the trend information storage unit 74 in association with significant aging factors.
- The impression change image generation unit 90 computes a distance function from the principal component scores corresponding to the expression levels of the five aging factors extracted from the subject analysis model, and selects, from among types I to IV, the group for which the distance function is minimal.
- the impression change image generation unit 90 acquires an aging calculation coefficient of a significant aging factor corresponding to this group from the trend information storage unit 74.
- the impression change image generation unit 90 receives an input of age range information from the user via the condition input unit 30.
- This age range information is information indicating an age change range for increasing or decreasing the apparent age of the subject from the current apparent age.
- the age range information takes a positive value when the apparent age is aged, and takes a negative value when the age is rejuvenated.
- the impression change image generation unit 90 adds the age range information to the apparent age of the subject, and calculates the age after age change (age after change).
- The impression change image generation unit 90 obtains the explanatory variables (significant aging factors) corresponding to the post-change age based on the (multiple) regression equation represented by the aging calculation coefficients.
- the significant aging factors or combinations thereof are different for each group from type I to type IV into which subjects are classified.
- The significant aging factors are the ninth and twelfth orders for type I, the tenth and twentieth for type II, the ninth and tenth for type III, and only the ninth for type IV.
- The impression change image generation unit 90 calculates, using the multiple regression equation whose explanatory variables are the principal component scores of the significant aging factors corresponding to the group to which the subject belongs (for type I, the two scores of the ninth and twelfth orders), the value of each explanatory variable that yields the desired post-change age. Specifically, each explanatory variable may be calculated as the weighting coefficient b_i of the corresponding significant aging factor, expressed as a multiple of its standard deviation.
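With two or more significant factors, a single target age does not determine the scores uniquely. The sketch below takes the minimum-norm shift of the subject's current scores along the regression coefficient vector — one possible choice of ours for illustration, not a rule stated in the patent — and the coefficients themselves are hypothetical:

```python
import numpy as np

def scores_for_target_age(current_scores, coefs, const, target_age):
    """Shift the subject's significant-factor principal-component
    scores so the regression-predicted apparent age equals
    target_age. The shift chosen here is the minimum-norm one,
    i.e. along the coefficient vector (an illustrative assumption;
    the underdetermined system admits other solutions)."""
    s = np.asarray(current_scores, dtype=float)
    c = np.asarray(coefs, dtype=float)
    predicted = float(c @ s + const)
    step = (target_age - predicted) / float(c @ c)
    return s + step * c

# Hypothetical type I regression: age = 6*s9 - 4*s12 + 45
coefs, const = np.array([6.0, -4.0]), 45.0
s_now = np.array([0.2, 0.5])   # predicted age: 45 + 1.2 - 2 = 44.2
s_new = scores_for_target_age(s_now, coefs, const, 60.0)
print(np.round(s_new, 3), round(float(coefs @ s_new + const), 1))
```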
- The impression change image is generated by reconstruction, applying the weighting coefficients obtained in this way to the corresponding basis vectors.
- In other words, the impression change image is obtained by changing the principal component scores of the significant aging factors to the desired values.
- The calculation of the weighting coefficients b_i for the post-change age has been described above using only the significant aging factors.
- Alternatively, the principal component scores and weighting coefficients for the post-change age may be calculated based on a multiple regression equation whose explanatory variables are all the aging factors, including the significant aging factors (in Example 2, the five factors of the first, ninth, tenth, twelfth, and twentieth orders).
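The reconstruction step itself — adding each basis vector, scaled by its weighting coefficient expressed as a multiple of that order's standard deviation, onto the average shape — can be sketched as follows (a toy two-basis model; the reading of b_i as a multiple of σ follows the description above):

```python
import numpy as np

def reconstruct_shape(mean_shape, basis, stddevs, b):
    """Reconstruct a face shape vector from the homologous-model PCA:
    mean shape plus each basis vector scaled by b_i * sigma_i, where
    b_i is the weighting coefficient in units of the per-order
    standard deviation sigma_i."""
    # mean_shape: (n_coords,), basis: (n_bases, n_coords)
    return mean_shape + (np.asarray(b) * np.asarray(stddevs)) @ np.asarray(basis)

# Tiny toy model: 2 basis vectors over a 6-component shape vector.
mean = np.zeros(6)
basis = np.array([[1.0, 0, 0, 0, 0, 0],
                  [0, 1.0, 0, 0, 0, 0]])
sigma = np.array([2.0, 0.5])
shape = reconstruct_shape(mean, basis, sigma, b=[+3.0, -1.0])  # e.g. +3 sigma aging
print(shape)
```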
- FIG. 42(a) is a perspective view of the average face shape model of all type I subjects (average apparent age: 44.7 years).
- FIG. 42(a) represents the average face shape, at an apparent age of approximately 45, of subjects having an impression tendency with both the ninth- and twelfth-order principal components.
- FIG. 42B is a perspective view showing a state in which the average face shape model of FIG. 42A is rejuvenated until the apparent age reaches about 30 years.
- FIG. 42 (c) is a perspective view showing a state in which the average face shape model of FIG. 42 (a) is aged until the apparent age becomes about 60 years old.
- FIG. 43(a) is a front view of the average face of the 20 subjects belonging to the type I younger group, a composite photographic image corresponding to the average face shape model shown in FIG. 39(a).
- FIG. 43(b) is a front view of the average face of the 19 subjects belonging to the type I older group, a composite photographic image corresponding to the average face shape model shown in FIG. 39(b).
- the apparent average age of the younger age group was 30 years old, and the apparent average age of the older age group was 57 years old.
- FIG. 43(a) shows the average face shape of real young subjects, and FIG. 43(b) shows the average face shape of real old subjects. Comparing FIG. 42(b) with FIG. 43(a), and FIG. 42(c) with FIG. 43(b), the rejuvenated and aged models were found to correspond well to the real average faces.
- FIG. 44 (a) is a perspective view of an average face shape model of all type II subjects (average apparent age is 42.9 years).
- FIG. 44(a) represents the average face shape, at an apparent age corresponding to the group average, of subjects having an impression tendency with both the tenth- and twentieth-order principal components.
- FIG. 44 (b) is a perspective view showing a state where the average face shape model of a type II subject is rejuvenated until the apparent age is about 30 years old.
- FIG. 44 (c) is a perspective view showing a state in which an average face shape model of a type II subject is aged until the apparent age becomes about 60 years old.
- FIG. 45A is a perspective view of an average face shape model of all type III subjects (average apparent age is 46.3 years old).
- FIG. 45(a) represents the average face shape, at an apparent age corresponding to the group average, of subjects having an impression tendency with both the ninth- and tenth-order principal components.
- FIG. 45(b) is a perspective view showing a state in which the average face shape model of the type III subjects is rejuvenated until the apparent age reaches about 30 years.
- FIG. 45 (c) is a perspective view showing a state in which the average face shape model of the type III subject is aged until the apparent age is about 60 years old.
- FIG. 46A is a perspective view of an average face shape model of all IV type subjects (average apparent age is 43.6 years old).
- FIG. 46(a) represents the average face shape, at an apparent age corresponding to the group average, of subjects having an impression tendency with only the ninth-order principal component among the aging factors.
- FIG. 46B is a perspective view showing a state in which the average facial shape model of the IV type test subject is rejuvenated until the apparent age becomes about 30 years old.
- FIG. 46C is a perspective view showing a state in which the average face shape model of the IV type test subject is aged until the apparent age becomes about 60 years old.
- By comparing the average faces (not shown) of the young and old subjects of the other types in the same manner as in FIG. 43, it was confirmed that the expression tendencies of the aging factors were well simulated for each group. From the above, it was found that the face image generation method according to Example 2 can generate impression change images that simulate the aging and rejuvenation of real subjects with high accuracy. In Example 2 the apparent age was used as the impression tendency, but it was confirmed that the same results were obtained when the actual age of the subjects was used instead (not shown).
- the generated impression change image may be displayed and output by the beauty information output unit 80 (see FIG. 1) and presented to the subject.
- The degree of the impression tendency here is the apparent or actual age of the subject; based on this age, the population is divided into a first group including the subject (for example, the young group) and a second group not including the subject (for example, the old group). A first weight is then assigned to an aging factor biased toward one of the first and second populations (for type I, the ninth-order basis vector), and a second weight smaller than the first weight is assigned to the aging factors expressed in both the first and second populations (for type I, the first-, tenth-, and twentieth-order basis vectors), thereby changing the principal component scores (weighting coefficients).
- The apparent or actual age after a change by a predetermined amount is preferably between the average age of the first population (e.g., the young group) and the average age of the second population (e.g., the old group).
- In this case, the principal component scores of the aging factors after an age change within the predetermined range can be obtained by interpolation between the average value of the first group and the average value of the second group.
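A minimal sketch of this interpolation, using the type I ninth-order group averages quoted earlier (−0.708 for the younger group, +0.206 for the older group) together with the groups' apparent average ages of 30 and 57:

```python
import numpy as np

def interpolate_scores(age, young_age, young_mean, old_age, old_mean):
    """Linearly interpolate an aging factor's principal-component
    score between the young-group average and the old-group average,
    parameterised by the (apparent or actual) target age."""
    t = (age - young_age) / (old_age - young_age)
    return (1 - t) * np.asarray(young_mean) + t * np.asarray(old_mean)

# Type I ninth-order averages from the text: -0.708 (young, mean
# apparent age 30) and +0.206 (old, mean apparent age 57).
s45 = interpolate_scores(45, 30, [-0.708], 57, [0.206])
print(np.round(s45, 3))
```

Values of `age` between the two group means interpolate; values outside that range would extrapolate, which the text discourages.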
- In Example 2, after classifying the population of subjects into the groups of types I to IV, the bias in the expression levels of the aging factors was further analyzed by dividing each group into young and old subgroups; however, the method is not limited to this. The entire population may be divided into young and old groups without classification into groups such as types I to IV, and the bias in the expression of the aging factors may be analyzed to identify the dominant aging factors.
- Cluster analysis based on the degree of coincidence of the aging factors was used in the above example, but the present invention is not limited to this.
- the population may be classified into a plurality of groups based on measurable shape characteristics such as the size and relative position of a specific part in the face shape of the subject.
- Some subjects belonging to the population may belong to both the first group and the second group. That is, in Example 2 the subjects were classified into the young group and the old group at age 46 without omission or duplication, but the present invention is not limited to this.
- For example, the first group may be aged 24 to 50 and the second group aged 40 to 60. Even if some subjects belong to both groups, the method of the present invention, which analyzes the aging tendency from the tendencies of the feature amounts in the average face shape of each group, does not lose its validity.
- The population may also be divided into three or more groups, such as a young group, a middle-aged group, and an old group.
- In this case, a factor whose expression level increases progressively from the young group to the middle-aged group, and further from the middle-aged group to the old group, can be identified as a dominant aging factor, and a cosmetic treatment method and cosmetics that suppress the expression of such a factor may be provided to the subject.
- Alternatively, an aging factor that is not expressed in the young and middle-aged groups but is expressed only in the old group may be identified, and a cosmetic treatment method or cosmetics that suppress the expression of that factor may be provided to the subject.
- In the face impression analysis method, the population of subjects may be classified into a plurality of groups.
- Beauty information, such as a beauty treatment method, may then be provided based on the degree of the impression tendency of the subject's facial features.
- That is, it is preferable to classify the population of subjects into a plurality of groups and determine the group to which the subject belongs based on the expression amounts of the feature amounts in the subject's face.
- The population may be classified into a plurality of groups based on the degree of coincidence of the tendencies of a plurality of weighting coefficients relating to a plurality of basis vectors (aging factors) highly correlated with the impression tendency (apparent age).
- In the above embodiment, the cosmetic treatment methods are associated with the individual aging factors in advance and stored as a table (see FIG. 8); instead, based on the tendencies of the expression levels of the aging factors, a cosmetic treatment method may be associated with each pre-classified group and stored as a table (not shown).
- Specifically, a beauty treatment method including any of a beauty molding method, a beauty massage method, a hair makeup method, or a makeup method, or information indicating hair cosmetics or makeup cosmetics, may be stored in the trend information storage unit 74 in association with each of the groups of types I to IV.
- The face impression determination unit 60 may determine the group to which the subject belongs based on the weighting coefficients of the aging factors included in the subject analysis model, then refer to the trend information storage unit 74, acquire the beauty information corresponding to the determined group, and output it via the beauty information output unit 80. In this way, beauty information suited to the group to which the subject belongs is provided.
- Comparing FIG. 42(a), FIG. 44(a), FIG. 45(a), and FIG. 46(a), it was found that there is no significant difference in face shape at around age 45, which is close to the average age of each group. This means that the subjects are distributed evenly enough that, for types I to IV classified by the degree of coincidence of the expression tendencies of the aging factors, there is no significant difference in the present average face shapes.
- When the rejuvenated images in each figure (b) are compared, differences appear in the face shapes.
- In the aged images in each figure (c), the differences in face shape are conspicuous. Specifically, in the type I aged image of FIG. 42(c), drooping of the cheeks is noticeable.
- The type III aged image of FIG. 45(c) shows a tendency for the cheeks to sag, although the nasolabial folds are shallow.
- Accordingly, beauty information for the middle part of the face, such as highlights on the cheeks, may be stored in association with type III.
- The type IV aged image of FIG. 46(c) shows a tendency for the corners of the mouth to lower, the nasolabial folds to deepen, and the outer corners of the eyes to droop.
- Accordingly, beauty information for the entire face, such as information for the vicinity of the lips (e.g., a lip liner), a concealer to hide the nasolabial folds, and eyebrow makeup to lift the eyebrows, may be stored in association with type IV.
- Types I and III share a tendency for the shape of the entire face outline to deform with age, namely cheek drooping. To suppress such an aging impression, base makeup cosmetics are more effective than makeup cosmetics.
- Types II and IV share a tendency for the shape of a facial part (partial element), namely drooping of the outer corners of the eyes, to deform markedly with age. To suppress such an aging impression, makeup cosmetics are more effective than base makeup cosmetics.
- Accordingly, the base makeup cosmetics may be stored in association with the groups in which the shape of the entire face contour tends to deform with age, and the makeup cosmetics may be stored in association with the groups in which the shape of the facial partial elements tends to deform markedly with age.
- In Example 2, the population was divided into the first group and the second group based on age, but the present invention is not limited to this.
- the population may be divided into a first group to which the subject belongs and a second group to which the subject does not belong, depending on the region of origin.
- By extracting the feature amounts (basis vectors) that govern the impression tendencies of the facial features attributable to the region of origin and generating an impression change image based on them, it is possible, for example, to estimate the face shape of a subject's offspring.
- Example 3: Using the population analysis model and principal component analysis results common to Example 2, the degree of an impression tendency of facial features other than aging was analyzed. Five beauty specialists looked at the photographs of the 148 subjects and graded them from adult face to baby face on a scale of 0 to 6. The stronger the adult-face impression, the higher the evaluation value; the stronger the baby-face impression, the lower the evaluation value.
- FIG. 47 is a table showing a single correlation coefficient between the weight coefficient for each base order and the degree of the adult face.
- In FIG. 47, base orders whose weighting coefficients exceed the limit value of the 5% significance level are indicated by "*", and base orders whose weighting coefficients exceed the limit value of the 1% significance level are indicated by "**".
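Such "*"/"**" flags can be reproduced by converting an observed correlation coefficient into a t statistic. A sketch follows; the critical values 1.976 and 2.609 are approximate two-tailed t-table values for 146 degrees of freedom (n = 148 subjects) and are our additions, not figures from the patent:

```python
import math

def correlation_significance(r, n):
    """Convert a single correlation coefficient into its t statistic,
    t = r * sqrt((n - 2) / (1 - r^2)), and flag the two-tailed 5% and
    1% levels using approximate t-table critical values for
    n - 2 = 146 degrees of freedom (1.976 and 2.609)."""
    t = r * math.sqrt((n - 2) / (1 - r * r))
    return abs(t) > 1.976, abs(t) > 2.609

# r = 0.25 with n = 148 subjects: significant at both levels ("**")
print(correlation_significance(0.25, 148))
# r = 0.18: significant at the 5% level only ("*")
print(correlation_significance(0.18, 148))
```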
- FIG. 48(a) shows the average face of the 10 subjects with the highest adult-face evaluation values in the entire population.
- FIG. 48(b) shows the average face of the 10 subjects with the lowest adult-face evaluation values in the entire population, that is, those evaluated as most baby-faced. Comparing FIGS. 48(a) and 48(b), it was found that a face with a higher eye position, a longer chin, and a longer overall face gives a stronger adult-face impression.
- FIG. 49 is a comparison table of the average apparent age, the degree of adult face (evaluation value), and the average principal component scores of each order of the adult-face factors, for the subjects classified into clusters 1 to 4.
- FIG. 50A shows an average face of approximately 30% of subjects who belong to cluster 1 in the population.
- the average face of cluster 1 was a round face, and the eye position was equivalent to the overall average face.
- the degree of adult face was rated as normal.
- FIG. 50B is an average face of approximately 20% of subjects belonging to cluster 2 in the population.
- The average face of cluster 2 was a long face, and the eye position was higher than that of the overall average face.
- the average face of cluster 2 was rated as the most adult face.
- FIG. 50C shows an average face of approximately 24% of subjects belonging to cluster 3 in the population.
- The average face of cluster 3 was a long face, and the eye position was intermediate.
- the degree of adult face was rated as slightly strong.
- FIG. 50D is an average face of approximately 16% of subjects who belong to cluster 4 in the population.
- the average face of cluster 4 was a round face, and the position of the eyes was below the overall average face.
- The average face of cluster 4 was evaluated as the least adult-like, that is, the most baby-faced.
- As described above, the degree of the adult-face impression can be quantitatively analyzed as a degree of a facial impression tendency. Then, as in this example, the population is classified into a plurality of clusters based on the expression levels of the adult-face factors, and the cluster to which an arbitrary subject belongs is determined.
- This realizes a beauty counseling method in which beauty information for strengthening or weakening the adult-face tendency (strengthening the baby-face tendency) is presented according to the cluster to which the subject belongs.
- As the beauty information, for example, information on the selection of cheek color and its application method can be given.
- In particular, the eighth base has a high principal component score in common between clusters 2 and 3, which have relatively strong adult-face degrees. Therefore, by assigning a high weight to certain factors (specifically, the eighth base) among the plurality of adult-face factors and a lower weight to the other factors, the degree of the adult face or the baby face may be quantitatively evaluated.
- the impression change image may be generated by changing the expression level (weighting coefficient) of the adult face factor in the face image of the subject.
- Specifically, the population is divided into a first group (the cluster including the subject) and a second group (the other clusters). Then, the weighting coefficients of the basis vectors may be changed by assigning a first weight to the basis vectors biased toward one of the first and second populations, and a second weight smaller than the first weight to the basis vectors expressed in both the first and second populations.
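The two-tier weighting just described can be sketched as follows; the weight values and factor indices are illustrative assumptions, not figures from the patent:

```python
import numpy as np

def adjust_weights(b, biased_idx, shared_idx,
                   w_biased=1.0, w_shared=0.3, step=1.0):
    """Change the weighting coefficients b_k toward a stronger (or,
    with a negative step, weaker) impression: basis vectors biased
    toward one group move by w_biased * step, while vectors expressed
    in both groups move by the smaller w_shared * step.
    Weight values are illustrative, not from the patent."""
    b = np.array(b, dtype=float)
    b[list(biased_idx)] += w_biased * step
    b[list(shared_idx)] += w_shared * step
    return b

# Hypothetical coefficients over five adult-face factors; factor 0 is
# biased toward the subject's cluster, factors 1 and 2 are shared.
b_new = adjust_weights([0.0, 0.0, 0.0, 0.0, 0.0],
                       biased_idx=[0], shared_idx=[1, 2], step=2.0)
print(b_new)
```

The biased factor moves the most, so the resulting face model emphasizes the traits that distinguish the subject's cluster.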
- Example 4: Using the population analysis model and principal component analysis results common to Examples 2 and 3, the degree of an impression tendency of facial features other than aging was analyzed. Five beauty professionals looked at the photographs of the 148 subjects and evaluated the degree from large-face impression to small-face impression on a scale of 0 to 6. The stronger the large-face impression, the higher the evaluation value; the stronger the small-face impression, the lower the evaluation value.
- FIG. 51 is a table showing a single correlation coefficient between the weight coefficient for each base order and the degree of large face impression.
- In FIG. 51, base orders whose weighting coefficients exceed the limit value of the 5% significance level are indicated by "*", and base orders whose weighting coefficients exceed the limit value of the 1% significance level are indicated by "**".
- FIG. 52(a) shows the average face of the 10 subjects with the highest large-face-impression evaluation values in the entire population.
- FIG. 52(b) shows the average face of the 10 subjects with the lowest large-face-impression evaluation values in the entire population, that is, those evaluated as giving the strongest small-face impression.
- In the small-face-impression average face of FIG. 52(b), the chin was thin, the bulge of the lower cheeks was small, and the height position at which the face is widest was relatively high.
- the average face of the large face impression in FIG. 52 (a) had a large jaw, a swollen lower cheek, and a height position where the face had the maximum width was lower than the eyes.
- FIG. 53 is a comparison table of the average apparent age, the degree of large-face impression (evaluation value), and the average principal component scores of each order of the small-face factors, for the subjects classified into clusters 1 to 4.
- FIG. 54(a) shows the average face of the approximately 30% of subjects in the population belonging to cluster 1.
- The average face of cluster 1 was comparable in size to the overall average face,
- and its large-face impression was rated as normal.
- FIG. 54(b) shows the average face of the approximately 26% of subjects in the population belonging to cluster 2.
- The average face of cluster 2 had slightly more swollen lower cheeks than the overall average face,
- and was rated as giving a slightly large-face impression.
- FIG. 54(c) shows the average face of the approximately 24% of subjects in the population belonging to cluster 3.
- The average face of cluster 3 had a smaller chin than the overall average face, and the bulge of its lower cheeks was also small.
- The average face of cluster 3 was rated as giving a small-face impression.
- FIG. 54(d) shows the average face of the approximately 20% of subjects in the population belonging to cluster 4.
- The average face of cluster 4 differed from the overall average face in the bulge of the lower cheeks,
- and its large-face impression was rated accordingly.
- In this example, it was found that by quantifying the expression amount of the small-face factor, the degree of large-face or small-face impression can be quantitatively analyzed as the degree of a facial impression tendency. Furthermore, as in this embodiment, the population can be classified into a plurality of clusters based on the expression level of the small-face factor, and the cluster to which an arbitrary subject belongs can be obtained.
- This realizes a beauty counseling method in which beauty information for enhancing the small-face impression is presented according to the cluster to which the subject belongs.
- Examples of such beauty information include information on dark-toned (shading) makeup cosmetics and application methods for the cheeks and jaw that sharpen the jaw line and enhance the small-face impression.
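Classifying the population into clusters by the expression levels (principal component scores) of a factor can be done with standard clustering. The patent does not specify the algorithm; the sketch below uses a plain k-means over score vectors, purely to illustrate the grouping step:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Group score vectors (tuples of floats) into k clusters."""
    rnd = random.Random(seed)
    centers = rnd.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # Move each center to the mean of its group (keep it if empty).
        centers = [[sum(d) / len(g) for d in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Two well-separated toy score clusters:
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
       (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
centers, groups = kmeans(pts, 2)
print(sorted(len(g) for g in groups))  # [3, 3]
```

An arbitrary subject's cluster is then simply the center nearest to the subject's own score vector, after which the beauty information associated with that cluster can be presented.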
- Example 5: Using the population analysis model and principal component analysis results shared with Examples 2 to 4, the degree of another impression tendency of facial features other than aging was analyzed. Five beauty professionals looked at photographs of the 148 subjects and rated the size of their eyes on a seven-point scale of 0 to 6. The higher the score, the stronger the impression that the eyes are large; the lower the score, the stronger the impression that the eyes are small.
- FIG. 55 is a table showing the single correlation coefficient between the weighting coefficient of each basis order and the degree of eye-size impression.
- Basis orders whose correlation coefficient exceeds the limit value of the 5% significance level are marked with “*”,
- and basis orders whose correlation coefficient exceeds the limit value of the 1% significance level are marked with “**”.
- FIG. 56(a) shows the average face of the 10 subjects in the entire population with the highest eye-size ratings.
- FIG. 56(b) shows the average face of the 10 subjects in the entire population with the lowest eye-size ratings.
- When FIGS. 56(a) and 56(b) are compared, the average face in FIG. 56(a), rated as having large eyes, was found to have not only large eyes but also a small distance from the position of the face's maximum width to the outer corners of the eyes.
- FIG. 57 is a table comparing, for the subjects classified into clusters 1 to 4, the average apparent age, the impression that the eyes are large (rating), and the mean principal component score of each order of the eye-size factor.
- FIG. 58(a) shows the average face of the approximately 30% of subjects in the population belonging to cluster 1.
- The average face of cluster 1 was a round face that gave the impression of having smaller eyes than the overall average face.
- FIG. 58(b) shows the average face of the approximately 28% of subjects in the population belonging to cluster 2.
- The average face of cluster 2 was a long (oblong) face that gave the impression of having larger eyes than the overall average face.
- FIG. 58(c) shows the average face of the approximately 22% of subjects in the population belonging to cluster 3.
- The average face of cluster 3 was a long (oblong) face that gave the impression of having smaller eyes than the overall average face.
- FIG. 58(d) shows the average face of the approximately 20% of subjects in the population belonging to cluster 4.
- The average face of cluster 4 was a small face that gave the impression of having larger eyes than the overall average face.
- In this example, it was found that by quantifying the expression amount of the eye-size factor, the degree of the impression that the eyes are large can be quantitatively analyzed as the degree of a facial impression tendency. Furthermore, as in this embodiment, the population can be classified into a plurality of clusters based on the expression amount of the eye-size factor, and the cluster to which an arbitrary subject belongs can be obtained.
- This realizes a beauty counseling method in which beauty information for giving the impression that the eyes are large is presented according to the cluster to which the subject belongs.
- Examples of such beauty information include the selection of eye shadow and inner-eyeliner colors and methods for applying them.
- Various other impression tendencies regarding facial features may also be quantitatively evaluated.
- For example, the degree of a round face or a long face may be evaluated.
- “The degree of masculine or feminine appearance”, “the degree of Oriental or Western appearance”, “the degree of the impression that the nose is straight and well defined”, and the like may also be evaluated.
- The degree of sensory impression tendencies related to the look of the face, such as “the degree to which the face shape looks healthy”, “the degree to which the face shape looks attractive”, and “the degree to which the face shape looks good”, may also be quantified and evaluated.
- One or more feature quantities (basis orders) associated with each impression tendency are stored in the trend information storage unit 74 (FIG. 1) as trend information PI (FIG. 5).
- For example, a face that gives a strong small-face impression and a strong impression that the eyes are large tends to score highly for an attractive face shape. Therefore, the degree of another impression tendency may be quantitatively evaluated based on the expression amounts (weighting coefficients) of feature quantities (basis orders) common to a plurality of impression tendencies.
- The degree of the impression tendency may be the degree of adult face or baby face of the subject, the degree of small-face impression, the degree of round face or long face, or the degree of the impression of eye size,
- and the degree of that impression is output.
- The beauty information is information representing a beauty treatment method including any one of a beauty molding method, a beauty massage method, a hair makeup method, and a makeup application method, a hair cosmetic, or a makeup cosmetic.
- The aging factor of Example 2, the adult face factor of Example 3, the small-face factor of Example 4, and the eye-size factor of Example 5 are each composed of a plurality of basis vectors, and at least one basis vector constituting each factor differs from those of the other factors. Specifically, the 9th and 20th bases in the aging factor, the 7th and 8th bases in the adult face factor, the 11th and 16th bases in the small-face factor, and the 26th and 36th bases in the eye-size factor are unique basis vectors not included in the other factors.
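The uniqueness of certain basis orders to one factor can be checked mechanically as a set difference. In the sketch below, the full membership of each factor is hypothetical; only the unique bases asserted above (9/20, 7/8, 11/16, 26/36) are taken from the text:

```python
def unique_bases(factors):
    """For each factor, return the basis orders appearing in no other factor."""
    return {
        name: sorted(bases - set().union(*(b for n, b in factors.items() if n != name)))
        for name, bases in factors.items()
    }

# Hypothetical factor memberships; only the resulting unique bases
# (plus aging's 2nd/shared 3rd, mentioned elsewhere) are from the text.
factors = {
    "aging":      {2, 3, 9, 20},
    "adult_face": {3, 7, 8},
    "small_face": {3, 11, 12, 16},
    "eye_size":   {12, 26, 36},
}
print(unique_bases(factors))
```

Under these assumed memberships, the 7th/8th, 11th/16th, and 26th/36th bases come out as unique to their factors, matching the text.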
- The beauty counseling method provided by the above-described embodiment and its modifications may include displaying and outputting an impression change image of the subject to the subject.
- The current face image of the subject and the impression change image may be displayed side by side for comparison.
- Alternatively, a beauty simulation image may be generated, displayed, and output that imitates a state in which a cosmetic treatment method that does not change the shape of the subject's face, such as a hair makeup method or a makeup application method, has been applied to the subject's face.
- In this way, the subject can visually confirm hair makeup and makeup application methods that change the impression tendency in substantially the same way as changing the subject's face shape toward an adult face or a small face.
- In the examples above, the apparent age or actual age of the subject, the degree of adult face or baby face, the degree of small-face impression, the degree of round face or long face, and the degree of the eye-size impression were each evaluated individually. A plurality of these impression tendencies may instead be evaluated together. That is, the degree of the impression tendency targeted by the face impression analysis method and the beauty counseling method provided by the above examples may be any two or more selected from the apparent age or actual age of the subject, the degree of adult face or baby face, the degree of small-face impression, the degree of round face or long face, and the eye-size impression. One or more feature quantities corresponding to each of these two or more impression tendencies may then include basis vectors of mutually different orders.
- Furthermore, the face impression analysis method and the beauty counseling method provided by the above embodiments may analyze two or more impression tendencies, and the one or more feature quantities corresponding to at least two of the impression tendencies may each be composed only of mutually different basis vectors.
- a face impression analysis method comprising: calculating the expression level of the feature quantity in the face of the subject from the above, and obtaining the degree of the impression tendency of the facial features of the subject based on the expression level;
- The face impression analysis method, wherein the face shape information of the subject is also the homologous model, and the feature quantities related to the subject are calculated by performing multivariate analysis on the population face information including that face shape information;
- <4> The face impression analysis method, wherein multivariate analysis is performed on population face information that does not include the face shape information of the subject to obtain a plurality of consecutive feature quantities from the first order to a predetermined order, and the weighting coefficients are calculated as the expression levels by reproducing the face shape information of the subject through a product-sum operation of the feature quantities and their weighting coefficients;
- <5> The face impression analysis method according to any one of <1> to <4>, wherein the degree of the impression tendency is the apparent age, the actual age, the degree of adult face or baby face, or the degree of small-face impression of the subject;
- <6> The face impression analysis method according to <5> above, wherein the degree of the impression tendency is the apparent age or actual age of the subject, and the feature quantity includes at least one of the basis vectors highly correlated with the impression tendency; <7>
- the feature quantity is a basis vector having a contribution ratio of 1% or more and whose correlation coefficient with the impression tendency is larger than the limit value of the 5% significance level for the number of samples of the population, the face impression analysis method according to any one of <1> to <6> above;
- <8> The face impression analysis method according to any one of <1> to <7>, wherein three-dimensional coordinate values of the surface of the head including the face of the subject are acquired as the face shape information using a contact-type three-dimensional digitizer;
- <9> The face impression analysis method according to <8> above, wherein three-dimensional coordinate values of a plurality of feature points on the surface of the head are acquired using the contact-type three-dimensional digitizer, and three-dimensional coordinate values of other points on the surface of the head are acquired using a non-contact three-dimensional measurement device;
- <10> The face impression analysis method according to any one of <1> to <7>, wherein a plurality of two-dimensional images taken at different photographing angles are photographed of the head including the face of the subject, and three-dimensional coordinate values of the surface of the head are calculated as the face shape information based on the two-dimensional images;
- <11> The face impression analysis method according to any one of <1> to <10>, wherein selection of the impression tendency is received from the subject;
- <12> A beauty counseling method using the face impression analysis method according to any one of <1> to <11>, wherein beauty information previously associated with the feature quantity whose calculated expression level is greater than or equal to a predetermined value is output;
- <13> The beauty counseling method according to <12> above, wherein the beauty information is information representing a beauty treatment method including any one of a beauty molding method, a beauty massage method, a hair makeup method, and a makeup application method, a hair cosmetic, or a makeup cosmetic;
- <14> The beauty counseling method according to <12> or <13>, wherein the beauty information is information representing the impression-tendency group to which the subject belongs, selected based on the contribution ratio of the feature quantity;
- <15> A face impression analysis apparatus comprising: face shape acquisition means for acquiring face shape information representing the shape of the face surface of the subject; storage means for storing one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shape of the face surfaces of a population of a plurality of persons, together with trend information representing the impression tendencies of facial features associated with the feature quantities; face component analysis means for calculating the expression levels of the feature quantities in the face of the subject from the face shape information and the feature quantities; and face impression determination means for acquiring the impression tendency or its degree based on the feature quantities and the expression levels with reference to the storage means;
- <16> A face impression analysis system comprising: receiving means for receiving, over a network, face shape information representing the shape of the face surface of the subject; storage means for storing one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shape of the face surfaces of a population of a plurality of persons, together with trend information representing the impression tendencies of facial features associated with the feature quantities; face component analysis means for calculating the expression levels of the feature quantities in the face of the subject from the face shape information and the feature quantities; face impression determination means for acquiring the impression tendency or its degree based on the feature quantities and the expression levels with reference to the storage means; and transmission means for transmitting, over the network, output information indicating the acquired impression tendency or its degree;
- <17> The face impression analysis system according to <16>, further comprising three-dimensional shape estimation means for calculating three-dimensional coordinate values of the subject based on a plurality of two-dimensional images of the subject taken at different photographing angles, wherein the receiving means receives, from a subject terminal, a plurality of two-dimensional images of the head including the face of the subject taken at different photographing angles, the three-dimensional shape estimation means calculates the three-dimensional coordinate values of the surface of the subject's head as the face shape information based on the two-dimensional images received by the receiving means, the face component analysis means calculates the expression levels based on the calculated face shape information, and the transmission means transmits the output information to the subject terminal;
- <18> The face impression analysis system according to <16> or <17>, wherein the receiving means receives selection of the impression tendency from the subject terminal, the face component analysis means extracts the feature quantity associated with the selected impression tendency with reference to the storage means and calculates the expression level of the extracted feature quantity, and the face impression determination means acquires the degree of the selected impression tendency.
- <1a> A face impression analysis method comprising: calculating, from one or more feature quantities obtained by multivariate analysis of face shape information representing the shape of the face surface of the subject and population face information representing the three-dimensional shape of the face surfaces of a population of a plurality of persons, the expression levels of the feature quantities in the face of the subject; and obtaining the degree of the impression tendency of the facial features of the subject based on the expression levels;
- <2a> The face impression analysis method according to <1a>, wherein the population face information is a homologous model in which the number of data points and the topology are unified;
- <3a> The face impression analysis method according to <1a> or <2a> above, wherein the population is classified into a plurality of groups and the group to which the subject belongs is obtained based on the expression levels of the subject; <4a> The face impression analysis method according to <3a>, wherein the population is classified into the plurality of groups based on the degree of coincidence of the tendencies of a plurality of weighting coefficients related to a plurality of basis vectors having a high correlation with the impression tendency;
- the face impression analysis method according to any one of <1a> to <9a>; <11a>
- The face impression analysis method according to <2a> above, wherein the face shape information of the subject is also the homologous model, and the feature quantities related to the subject are calculated by performing multivariate analysis on the population face information including that face shape information;
- <12a> The face impression analysis method according to any one of <1a> to <10a>, wherein multivariate analysis is performed on population face information that does not include the face shape information of the subject to obtain a plurality of consecutive feature quantities from the first order to a predetermined order,
- and the weighting coefficients are calculated as the expression levels by reproducing the face shape information of the subject through a product-sum operation of the feature quantities and their weighting coefficients;
- <13a> The face impression analysis method according to any one of <1a> to <12a>, wherein selection of the impression tendency is received from the subject;
- <14a> A beauty counseling method using the face impression analysis method according to any one of <1a> to <13a>, wherein beauty information previously associated with the feature quantity whose calculated expression level is greater than or equal to a predetermined amount
- is output;
- A beauty counseling method wherein the population is classified into a plurality of groups based on the degree of coincidence of the weighting-coefficient tendencies, the group to which the subject belongs is obtained based on the expression levels of the subject,
- and the beauty information previously associated with the group to which the subject belongs is output; <17a> The beauty counseling method according to any one of <14a> to <16a> above, wherein the beauty information is information representing a beauty treatment method including any one of a beauty molding method, a beauty massage method, a hair makeup method, and a makeup application method, a hair cosmetic, or a makeup cosmetic;
- the beauty counseling method according to any one of <14a> to <16a>;
- <18a> A face image generation method comprising: calculating, from one or more feature quantities obtained by multivariate analysis of face shape information representing the shape of the face surface of the subject and population face information representing the three-dimensional shape of the face surfaces of a population of a plurality of persons, the expression levels of the feature quantities in the face of the subject; changing the expression levels in the face shape information; and
- generating an impression change image in which the impression tendency of the facial features of the subject is changed based on the changed face shape information; <19a>
- the expression levels in the face shape information are changed by changing the weighting coefficients of the basis vectors, the feature quantity including at least one of the basis vectors;
- the face image generation method according to <18a>; <20a>
- the degree of the impression tendency is the apparent age or actual age of the subject, and based on the age, the population is divided into a first population including the subject and a second population not including the subject;
- a first weight is given to a basis vector that is expressed in a biased manner in one of the first population and the second population, and a second weight smaller than the first weight is given to a basis vector that is expressed in both the first population and the second population,
- whereby the weighting coefficients are changed; the face image generation method according to <19a> above; <21a>
- the face image generation method according to any one of <19a> to <21a>, wherein the group to which the subject belongs is obtained, and the weighting coefficient of a basis vector having a high correlation with the impression tendency is changed within the group to which the subject belongs;
- the receiving means receives, from a subject terminal, a plurality of two-dimensional images taken at different photographing angles, and the three-dimensional shape estimation means calculates the three-dimensional coordinate values of the surface of the subject's head as the face shape information based on the two-dimensional images received by the receiving means,
- the face component analysis means calculates the expression levels based on the calculated face shape information, and the transmission means transmits the output information to the subject terminal;
- the face impression analysis system according to <24a>; <26a> The receiving means receives selection of the impression tendency from the subject terminal, the face component analysis means extracts the feature quantity associated with the selected impression tendency with reference to the storage means and calculates the expression level of the extracted feature quantity, and the face impression determination means acquires the degree of the selected impression tendency; the face impression analysis system according to <24a> or <25a>.
- the face impression analysis method according to <4b> above, wherein one or more of the feature quantities corresponding to each of the at least two impression tendencies are composed only of basis vectors of mutually different orders;
- The degree of the impression tendency is the degree of adult face or baby face of the subject, the degree of small-face impression, the degree of round face or long face, or the degree of the impression of eye size,
- and beauty information previously associated with the group to which the subject belongs and with another group to which the subject does not belong is output; the beauty counseling method according to <15a> or <16a> above;
- <7b> The face image generation method according to <18a>, wherein the feature quantity includes at least one of the second- or higher-order basis vectors highly correlated with the impression tendency, extracted from the plurality of basis vectors obtained by the multivariate analysis,
- and the expression level in the face shape information is changed by changing the weighting coefficient of the basis vector so that the degree of the impression tendency changes;
- <8b> The face impression analysis device according to <23a>, wherein the feature quantity includes at least one of the second- or higher-order basis vectors highly correlated with the impression tendency, extracted from the plurality of basis vectors obtained by the multivariate analysis,
- and the expression amount in the face shape information is changed by changing the weighting coefficient of the basis vector so that the degree of the impression tendency changes by a predetermined amount; <9b> The face impression analysis system according to any one of <24a> to <26a>, wherein the feature quantity includes at least one of the second- or higher-order basis vectors highly correlated with the impression tendency, extracted from the plurality of basis vectors obtained by the multivariate analysis,
- and the expression amount in the face shape information is changed by changing the weighting coefficient of the basis vector so that the degree of the impression tendency changes by a predetermined amount; <10b> The face image generation method according to any one of <18a> to
- <22a> and <7b>, wherein the degree of the impression tendency is the degree of adult face or baby face of the subject, the degree of small-face impression, the degree of round face or long face, or the degree of the impression of eye size; <11b>
- the degree of the impression tendency is two or more selected from the apparent age or actual age of the subject, the degree of adult face or baby face, the degree of small-face impression, the degree of round face or long face, and the degree of the eye-size impression,
- and one or more of the feature quantities corresponding to each of the two or more impression tendencies include basis vectors of mutually different orders; the face image generation method according to <7b>;
- <12b> The face image generation method according to <11b> above, wherein one or more of the feature quantities corresponding to each of the at least two impression tendencies are composed only of basis vectors of mutually different orders; <13b> The face impression analysis device according to <23a>, wherein the population face information is a homologous model in which the number of data points and the topology are unified;
- <14b> The face impression analysis device according to <23a> or <13b>, wherein the storage means classifies and stores the population into a plurality of groups, and the face impression determination means obtains the group to which the subject belongs based on the expression levels of the subject; <15b> The face impression analysis device, wherein the storage means stores the plurality of groups classified based on the degree of coincidence of the tendencies of a plurality of weighting coefficients related to a plurality of basis vectors having a high correlation with the impression tendency;
- <16b> The face impression analysis device according to any one of <23a> or <13b> to <15b>, wherein the degree of the impression tendency is how high or low the apparent age or actual age of the subject is; <17b> The face impression analysis device according to any one of <23a> or <13b> to <15b>, wherein the degree of the impression tendency is the degree of adult face or baby face of the subject, the degree of small-face impression, the degree of round face or long face, or the degree of the impression of eye size;
- <18b> The face impression analysis device, wherein the degree of the impression tendency is two or more selected from the apparent age or actual age of the subject, the degree of adult face or baby face, the degree of small-face impression, the degree of round face or long face, and the degree of the eye-size impression;
- <19b> The face impression analysis device according to <18b> above, wherein one or more of the feature quantities corresponding to each of the at least two impression tendencies are composed only of basis vectors of mutually different orders; <20b> The face impression analysis device according to any one of <23a> or <13b> to <19b>, wherein the face impression determination means acquires the impression tendency or its degree based on the feature quantity whose correlation coefficient with the impression tendency is larger than the limit value of the 5% significance level for the number of samples of the population;
- <21b> The face impression analysis device according to any one of <23a> or <13b> to <20b>, further comprising a contact-type three-dimensional digitizer that acquires three-dimensional coordinate values of a plurality of feature points on the surface of the head including the face of the subject as the face shape information;
- <22b> The face impression analysis device according to <21b>, further comprising a non-contact three-dimensional measurement device that acquires three-dimensional coordinate values of other points on the surface of the head; <23b> The face impression analysis device according to any one of <23a> or <13b> to <22b>, further comprising three-dimensional shape estimation means for calculating three-dimensional coordinate values of the surface of the head as the face shape information, based on a plurality of two-dimensional images of the head including the face of the subject taken at different photographing angles; <24b>
- the face shape information of the subject is also the homologous model, and the face component analysis means calculates the feature quantities related to the subject by performing multivariate analysis on the population face information including that face shape information;
- the face impression analysis device according to <13b> above; <25b>
- the face component analysis means performs multivariate analysis on population face information that does not include the face shape information of the subject to obtain a plurality of consecutive feature quantities from the first order to a predetermined order, and calculates the weighting coefficients as the expression levels by reproducing the face shape information of the subject through a product-sum operation of the feature quantities and their weighting coefficients; the face impression analysis device according to any one of <23a> or <13b> to <23b>;
- <26b> The face impression analysis device according to any one of <23a> or <13b> to <25b>, further comprising condition input means for receiving selection of the impression tendency from the subject.
- the population is divided into a first population including the subject and a second population not including the subject,
- a first weight is given to a basis vector that is expressed in a biased manner in the first population or the second population, and a second weight smaller than the first weight is given to a basis vector that is expressed in both the first population and the second population,
- whereby the weighting coefficients are changed; the face image generation method according to any one of <10b> to <12b>; <28b>
- a beauty counseling method using the face image generation method according to <28b> above, wherein a beauty simulation image is generated that imitates a state in which a cosmetic treatment method that does not change the shape of the subject's face surface, such as a hair makeup method or a makeup application method, has been applied
- to the subject's face, and the beauty simulation image and the impression change image are displayed and output.
Abstract
Description
FIG. 1 is a functional block diagram showing a face impression analysis device 100 according to a first embodiment of the present invention.
The face impression analysis device 100 includes a face shape acquisition unit 10, a face component analysis unit 50, a face impression determination unit 60, and a storage unit 70.
The face shape acquisition unit 10 is a means for acquiring face shape information representing the shape of the subject's face surface.
The storage unit 70 stores one or more feature quantities (basis vectors) obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, together with tendency information representing the impression tendencies of facial features associated with those feature quantities. In this embodiment the storage unit 70 includes a basis storage unit 72 and a tendency information storage unit 74. The basis storage unit 72 stores the one or more feature quantities extracted from the population face information and the weight coefficient (eigenvalue) of each order. The tendency information storage unit 74 stores the tendency information representing the impression tendencies of facial features.
The face component analysis unit 50 is a means for calculating, from the subject's face shape information and the feature quantities extracted from the population face information, the expression amount of each feature quantity in the subject's face.
The face impression determination unit 60 is a means for obtaining the impression tendency of the subject's face, or its degree, based on the feature quantities and their expression amounts, with reference to the storage unit 70 (the tendency information storage unit 74).
In the first method, the expression amounts of one or more basis vectors in the subject's face are calculated from the subject's face shape information and those basis vectors, and the degree of the impression tendency is obtained from the expression amounts.
The normalization unit 20 is a computation unit that converts the high-resolution, many-point three-dimensional shape models acquired by the contact measurement unit 12 and the non-contact measurement unit 14 into homologous models composed of fewer points. A specific method for generating homologous models is described in Patent Document 3. Prior to carrying out the first method, three-dimensional shape models representing the head shapes, including the face surfaces, of many subjects are measured with the face shape acquisition unit 10. The normalization unit 20 generates population face information by converting these pieces of face shape information into homologous models with a unified number of data points (nodes) and unified topology (FIG. 2: step S20). The generated homologous models, i.e. the population face information (population analysis models), are stored in the model storage unit 40. The normalization unit 20 likewise converts the subject's face shape information into a homologous model with the same number of data points (nodes) and the same topology as the population analysis models.
Note that from one's eighties onward the bones of the head themselves are generally said to shrink, so the facial impression changes readily; through the early teens the skeleton is generally still growing, so the facial impression is likewise prone to change. Because such bone-driven changes in facial impression are difficult to capture with a principal component analysis of the three-dimensional head surface shape, this embodiment excludes them: the population is restricted to the twenties through the sixties. This allows the progression of aging driven by muscle and fat to be extracted statistically with good accuracy.
Subject analysis model = mean face shape + b1 × 1st basis vector e1 + b2 × 2nd basis vector e2 + b3 × 3rd basis vector e3 + … + bk × k-th basis vector ek + … + bn × n-th basis vector en   (1)
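Equation (1) can be sketched numerically as below. This is a minimal illustration, not the patent's implementation: the arrays are random stand-ins for measured faces, and only the node count (4,703, matching the generic model described later) follows the text.

```python
import numpy as np

# Synthetic stand-ins for the homologous-model data (assumption: a model is
# stored as a flat array of x, y, z coordinates, one triple per node).
rng = np.random.default_rng(0)
n_nodes, n_basis = 4703, 15
mean_face = rng.normal(size=n_nodes * 3)          # population mean face shape
bases = rng.normal(size=(n_basis, n_nodes * 3))   # basis vectors e1..en
b = rng.normal(size=n_basis)                      # weight coefficients b1..bn

def reconstruct(mean_face, bases, b):
    # subject analysis model = mean face shape + sum_k b_k * e_k
    return mean_face + b @ bases

model = reconstruct(mean_face, bases, b)
```

With all weight coefficients set to zero, the reconstruction is exactly the mean face shape, which is the baseline against which the +3σ/−3σ virtual shapes in the figures are generated.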
Pattern 3 indicates that the impression tendency of whether the subject's face looks adult (adult face) or childlike (baby face) correlates with the weight coefficient of the 3rd basis vector.
Pattern 4 indicates that the impression tendency of the subject's face looking small (small face) correlates with the weight coefficients of the 3rd and 12th basis vectors.
For example, among the aging factors, the present inventors have found that for the weight coefficient of the 7th basis vector, the larger its positive score in absolute value, the higher the apparent age tends to be (details below). In other words, the 7th-basis weight coefficient and apparent age are positively correlated: increasing the coefficient in the positive direction shifts the face shape in the aging direction. Accordingly, the plus-1σ value for the 7th basis in Pattern 1 is set to the positive score of mean + 1σ (one standard deviation) obtained from the score distribution of the 7th-basis weight coefficients in the population. With positive scores ranked high and negative scores ranked low, a score of mean + 1σ corresponds to a rank in roughly the top third (precisely, the top 31.7 percent).
Conversely, among the aging factors, it has also been found that for the weight coefficient of the 2nd basis vector, the larger its negative score in absolute value, the higher the apparent age tends to be (details below). That is, the 2nd-basis weight coefficient and apparent age are negatively correlated: decreasing the coefficient in the negative direction shifts the face shape in the aging direction. Accordingly, the plus-1σ value for the 2nd basis in Pattern 1 is set to the negative score of mean − 1σ obtained from the score distribution of the 2nd-basis weight coefficients in the population. With positive scores ranked high and negative scores ranked low, a score of mean − 1σ corresponds to a rank in roughly the bottom third (precisely, the bottom 31.7 percent).
The face impression determination unit 60 compares this expression degree against a predetermined positive threshold (for example, +1.0). If the expression degree is at or above the threshold, it determines that the subject analysis model possesses the tendency of that basis order.
Taking into account how many of the basis orders associated with Pattern 1 (the 2nd, 7th, 9th, and 11th) the subject analysis model possesses, the magnitudes of the expression degrees, and so on, the face impression determination unit 60 determines the degree of the impression tendency of the subject's facial features (FIG. 2: step S42). As a simple determination method, the degree of the impression tendency for a pattern can be taken as the ratio of the number of basis orders whose tendency the subject analysis model possesses (0 to 4) to the number of basis orders associated with the pattern (4 for Pattern 1).
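The simple determination just described, thresholding each basis order's expression degree and taking the ratio of held orders, can be sketched as follows. The pattern table, the correlation signs, the threshold value, and the example scores are illustrative assumptions; only the basis orders (2, 7, 9, 11) and the sign of their correlation with aging come from the text.

```python
# Pattern 1: basis order -> sign of its correlation with the aging direction
# (2nd, 9th, 11th negative; 7th positive, per the text).
PATTERN_1 = {2: -1, 7: +1, 9: -1, 11: -1}
THRESHOLD = 1.0  # illustrative positive threshold (the text gives +1.0 as an example)

def impression_degree(expression, pattern, threshold=THRESHOLD):
    """Fraction of the pattern's basis orders whose sign-adjusted expression
    degree reaches the threshold, i.e. held orders / associated orders."""
    held = [order for order, sign in pattern.items()
            if sign * expression.get(order, 0.0) >= threshold]
    return len(held) / len(pattern)

# A hypothetical subject expressing the aging tendency on the 7th and 9th bases:
degree = impression_degree({2: 0.3, 7: 1.4, 9: -1.2, 11: 0.1}, PATTERN_1)  # -> 0.5
```

Here two of the four associated orders reach the threshold, giving a tendency degree of 0.5.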
FIG. 6 is a functional block diagram showing a face impression analysis system 1000 including a face impression analysis device 100 according to a second embodiment of the present invention. FIG. 7 is a flowchart of the face impression analysis method carried out using the face impression analysis system 1000 of this embodiment (hereinafter sometimes called the second method). This embodiment is described below with reference to FIGS. 6 and 7; explanations overlapping the first embodiment are omitted where appropriate.
The face impression analysis system 1000 consists of a face impression analysis device 100 and a subject terminal 110 connected to each other over a network. The face impression analysis device 100 is a web server, and the subject terminal 110 is a mobile terminal operated by the subject (user). The network may be the Internet or a local area network (LAN), and may be wireless or wired; this embodiment uses a mobile phone network as an example. In response to a connection request from the subject terminal 110, the face impression analysis device 100 displays a web application site on the display of the subject terminal 110.
More specifically, among the measurement points constituting the subject's face shape information, the three-dimensional coordinates of the point closest to each node of the population's mean face shape are computed. These measurement points are hereinafter called "homologous-model corresponding points". The sum of the distances between each node of the mean face shape and its corresponding point is called the "inter-model distance". The weight coefficient b1 of the 1st basis is then varied in the positive or negative direction to find the b1 that minimizes the inter-model distance.
The same applies to the 2nd basis. Specifically, reading "mean face shape + b1 × 1st basis vector e1" as the mean face shape above, the three-dimensional coordinates of the points in the subject's face shape information closest to the nodes of this new mean face shape are computed; these become the new homologous-model corresponding points. The weight coefficient b2 of the 2nd basis that minimizes the inter-model distance between the nodes of this new mean face shape and the corresponding points is then computed. Thereafter, in the same manner, with "mean face shape + b1 × e1 + … + b(k−1) × e(k−1)" as the new mean face shape, the weight coefficient bk of the k-th basis vector is computed in turn.
In this way, the subject's face shape information can be reproduced as a linear sum of the 1st through n-th basis vectors, as in equation (1) above.
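The order-by-order fit described above can be sketched as follows. This is a simplified illustration: the nearest-node matching step is replaced by a fixed one-to-one correspondence, the data are synthetic, and the one-dimensional minimization of the squared distance is solved in closed form (for a quadratic cost, the minimizing b is the projection of the residual onto the basis vector).

```python
import numpy as np

# Synthetic subject built from known weights over orthonormal basis vectors.
rng = np.random.default_rng(1)
dim, n_basis = 300, 5
mean_face = rng.normal(size=dim)
bases = np.linalg.qr(rng.normal(size=(dim, n_basis)))[0].T  # orthonormal e1..e5
true_b = np.array([2.0, -1.0, 0.5, 0.0, 1.5])
subject = mean_face + true_b @ bases

def fit_weights(subject, mean_face, bases):
    """Fit b1, b2, ... in turn, each minimizing the squared inter-model
    distance with all earlier weights held fixed."""
    current = mean_face.copy()
    weights = []
    for e in bases:
        # Minimizer of ||subject - (current + b*e)||^2 over b.
        b = e @ (subject - current) / (e @ e)
        weights.append(b)
        current = current + b * e
    return np.array(weights)

fitted = fit_weights(subject, mean_face, bases)
```

Because the basis vectors here are orthonormal, the greedy order-by-order fit recovers the generating weights exactly; with the patent's nearest-node correspondences the fit would be iterative rather than closed-form.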
Next, the face component analysis unit 50 determines the weight coefficients of the basis images so that the weighted combination of the basis images of each order approximates the subject's head image received by the receiving unit 16. Specifically, the subject's head image is first normalized and its texture discarded. The face component analysis unit 50 then varies the weight coefficient multiplying each basis image's pixel values and determines, order by order, the coefficients that minimize the sum of squared differences from the pixel values of the subject's (normalized) head image.
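For a linear combination of basis images, minimizing the sum of squared pixel differences is an ordinary least-squares problem, which the step above can be reduced to. A minimal sketch with synthetic data follows; the image size, basis count, and the use of a single joint least-squares solve (rather than an order-by-order sweep) are illustrative assumptions.

```python
import numpy as np

# Synthetic flattened images standing in for the mean head image and the
# basis images of each order.
rng = np.random.default_rng(2)
h, w, n_basis = 32, 32, 6
mean_img = rng.normal(size=h * w)
basis_imgs = rng.normal(size=(n_basis, h * w))
true_w = rng.normal(size=n_basis)
head_img = mean_img + true_w @ basis_imgs   # the normalized, texture-free target

# Solve head_img - mean_img ≈ sum_k w_k * basis_k in the least-squares sense.
weights, *_ = np.linalg.lstsq(basis_imgs.T, head_img - mean_img, rcond=None)
```

The recovered `weights` minimize the squared pixel residual jointly over all orders; an order-by-order sweep converges to the same solution when the basis images are orthogonal.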
In the first and second methods, in addition to determining the degree of the impression tendency of the subject's facial features, cosmetic counseling using the results computed by the face impression determination unit 60 may also be provided.
The present invention is described concretely below through examples. In Example 1, three-dimensional data of the head of a subject (a customer) is analyzed to reveal the customer's own aging points and aging tendencies. Furthermore, by outputting cosmetic treatment methods matched to those aging tendencies, objective and effective cosmetic counseling information is provided. This also clarifies, for example, which makeup areas to prioritize, so that the customer's own makeup can reliably convey a "youthful" impression.
FIG. 9(a) shows three-dimensional optical data (high-resolution data) of the entire head, including the subject's face, measured with a non-contact three-dimensional laser scanner. There are about 180,000 measurement points. The node count and topology of this high-resolution data differ from subject to subject.
FIG. 9(b) shows 13 feature points on the subject's face and scalp. The three-dimensional coordinates of these points were measured with a contact three-dimensional digitizer.
FIG. 9(c) shows the generic model. The generic model has a high node density around the eyes and mouth and a low node density over the scalp, and has 4,703 nodes.
FIGS. 13(b), 14(b), …, 27(b) are perspective views showing the shape of the mean face of the homologous models across all age groups from the twenties through the sixties (the overall mean face); each is the same view as FIG. 12(b).
FIG. 13(c) is a perspective view of the virtual shape obtained when the weight coefficient (b1) of the 1st basis vector in equation (1) is set to +3 times the population standard deviation (+3σ) and the weight coefficients of the other basis vectors (b2 to bn) are set to zero.
FIG. 13(a) is a perspective view of the virtual shape obtained when the weight coefficient (b1) of the 1st basis vector is set to −3 times the population standard deviation (−3σ) and the weight coefficients of the other basis vectors (b2 to bn) are set to zero.
FIGS. 15(c), 16(c), …, 27(c) are, in order, perspective views of the virtual shapes obtained when the weight coefficient of each of the 3rd through 15th basis vectors is set to +3σ of the population standard deviation and the other weight coefficients are set to zero.
FIG. 14(a) is a perspective view of the virtual shape obtained when the weight coefficient (b2) of the 2nd basis vector is set to −3σ of the population standard deviation and the other weight coefficients (b1, b3 to bn) are set to zero.
FIGS. 15(a), 16(a), …, 27(a) are, in order, perspective views of the virtual shapes obtained when the weight coefficient of each of the 3rd through 15th basis vectors is set to −3σ of the population standard deviation and the other weight coefficients are set to zero.
The table relating these impression tendencies to basis orders is the tendency information PI shown in FIG. 5. Note, however, that when the number or attributes of the subjects in the population are changed, as in Example 2 described later, the basis orders with which the apparent-age or actual-age impression tendency correlates change from the 2nd, 7th, 9th, and 11th. In that case, the basis orders highly correlated with the impression tendency (aging) should be determined sensorially in advance, and the degree of the impression tendency judged from the expression amounts of the principal components of those basis orders in the subject model.
High correlations with apparent age were found for four basis orders: the 2nd, 7th, 9th, and 11th. The correlation coefficients were negative for the 2nd, 9th, and 11th and positive for the 7th.
This showed that as aging progresses, the principal components of the 2nd, 9th, and 11th bases move in the negative direction while that of the 7th basis moves in the positive direction.
FIG. 31(b) shows the 9th basis vector's weight coefficient set to +2σ in the aging direction. FIG. 31(c) likewise shows +3σ and is the same view as FIG. 21(a). FIG. 31(d) shows −1σ, FIG. 31(e) shows −2σ, and FIG. 31(f) shows −3σ and is the same view as FIG. 21(c).
Because the 2nd principal component contributes to lower-face fullness, bulging beside the nose, and sagging under the nose, creating an impression of lifted cheeks is effective. Besides actually lifting the cheeks with cosmetic massage, it is therefore suitable to create a lifted-cheek impression with makeup that emphasizes the upper face or with a hairstyle that gathers the hair up, and concealing the nasolabial folds with concealer is also effective. Accordingly, as shown earlier in FIG. 8, five cosmetic treatments are given as ways to mitigate the aging tendency of the 2nd basis: (i) highlighting the upper cheeks, (ii) applying blusher to the cheeks, (iii) concealing the nasolabial folds, (iv) adding volume to the top of the hair, and (v) lifting the cheeks with facial exercises.
Because the 9th principal component contributes to inward drooping above the outer eye corners and to receding of the mouth corners, cosmetic treatments such as (i) makeup that lifts the outer eye corners and (ii) concealing the nasolabial folds are effective.
To mitigate the hollows at the mouth corners produced by the 11th principal component, cosmetic treatments such as (i) concealing the hollows at the mouth corners, (ii) drawing a crisp lip outline, and (iii) firming the facial expression muscles with facial exercises are effective.
Among the subjects in their twenties, three possessed exactly one aging impression axis (factor) each; none possessed two or more factors.
Among the subjects in their thirties, two possessed exactly one factor each, and one possessed two factors.
Among the subjects in their forties, two possessed exactly one factor each, and two possessed two factors.
Among the subjects in their fifties, seven possessed exactly one factor each, and one possessed two factors.
Among the subjects in their sixties, three possessed exactly one factor each, and six possessed two factors.
None of the 50 subjects in the population possessed three or more factors.
Principal component analysis of the population analysis models was carried out as in Example 1 except that the number of homologous models was increased. The population consisted entirely of Japanese women: 29 each in their twenties and thirties, and 30 each in their forties, fifties, and sixties, for a total of 148. The principal component analysis yielded basis vectors up to the 147th order. FIG. 36 is a table of the contribution rates and cumulative contribution rates up to the 20th basis (some orders omitted). The cumulative contribution rate of the principal components up to the 20th basis exceeded 80%, specifically 87.3%, and the contribution rates of the individual principal components from around the 20th order (in fact, from the 18th order) were each less than 1%.
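The contribution-rate and cumulative-contribution-rate computation behind a table like FIG. 36 can be sketched as follows. The data matrix here is synthetic (the real analysis uses the 148 homologous models, with each row a flattened set of node coordinates); the dimensions are illustrative.

```python
import numpy as np

# Synthetic data matrix: rows are subjects, columns are shape variables, with
# decaying per-variable scale so the spectrum is not flat.
rng = np.random.default_rng(3)
n_subjects, dim = 148, 600
X = rng.normal(size=(n_subjects, dim)) * np.linspace(3.0, 0.1, dim)
Xc = X - X.mean(axis=0)                      # center on the mean face

# Singular values of the centered matrix give the component variances.
s = np.linalg.svd(Xc, compute_uv=False)
variances = s ** 2
contribution = variances / variances.sum()   # contribution rate of each order
cumulative = np.cumsum(contribution)         # cumulative contribution rate
```

The contribution rates are non-increasing with order, so truncating the expansion at the order where the cumulative rate passes a target (80% here) keeps the dominant shape variation while discarding the sub-1% tail.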
More concretely, as the 1st principal component advances in the positive direction, the face becomes fuller in the lower part, i.e. the lower cheeks tend to bulge. As the 9th principal component advances in the negative direction, the mouth corners recede, the nose widens, and the outer eye corners droop. As the 10th principal component advances in the negative direction, the orbital width and cheek width narrow and the lower jaw protrudes; compared with the other aging factors, varying the 10th principal component deepened the nasolabial folds relatively little. As the 12th principal component advances in the positive direction, the mouth corners pull downward, the nose lowers, and the area under the chin sags. As the 20th principal component advances in the positive direction, the area under the nose lengthens, the upper-lip tubercle descends, and the outer eye corners droop. All of these are changes in the facial impression tendency of Japanese women as aging progresses.
Type I included 39 of the 148 subjects in the population (26%), with a mean apparent age of 44.7 years. Type I subjects were characterized by strongly expressing both the 9th and 12th principal components. FIG. 39(a) is a front view of the mean face shape model of the 20 Type I subjects aged 24 to under 46 (younger group); FIG. 39(b) is a front view of that of the 19 Type I subjects aged 46 to under 65 (older group).
FIG. 39(e) is a front view of the mean face shape model of the 20 Type III subjects aged 21 to under 47 (younger group), and FIG. 39(f) of the 19 Type III subjects aged 47 to under 63 (older group).
FIG. 39(g) is a front view of the mean face shape model of the 20 Type IV subjects aged 23 to under 46 (younger group), and FIG. 39(h) of the 19 Type IV subjects aged 46 to under 65 (older group).
The results in FIG. 40(c) showed that subjects belonging to Type III had significantly large 9th and 10th principal component scores, and that with aging the sign-inverted 9th principal component score changed from a small value (−1.004) to a moderate value (−0.151) while the sign-inverted 10th principal component score changed from a large value (+0.456) to an even larger value (+1.208).
The results in FIG. 40(d) showed that subjects belonging to Type IV had significantly large 9th principal component scores, and that with aging the sign-inverted 9th principal component score changed from a fairly large value (+0.440) to an even larger value (+1.335).
The impression change image generation unit 90 computes, for the multiple regression equation whose explanatory variables are the principal component scores of the significant aging factors corresponding to the subject's group (for Type I, the two factors of the 9th and 12th orders), the values of the explanatory variables for which the solution equals the desired post-change age. Specifically, each explanatory variable is preferably computed as a multiple of the standard deviation of the corresponding significant aging factor's weight coefficient bi.
The impression change image generation unit 90 generates the impression change image by rebuilding the subject analysis model with the new weight coefficients bi obtained above for the significant aging factors (for Type I, i = 9, 12; see equation (1) above).
The above describes computing the weight coefficients bi for a desired post-change age by varying the principal component scores of the one or two significant aging factors of each of Types I through IV, based on a multiple regression equation using only those factors as explanatory variables. The present invention is not limited to this, however. The principal component scores and weight coefficients that solve for the post-change age may instead be computed from a multiple regression equation whose explanatory variables are all the aging factors, including the significant ones (in Example 2, the five factors of the 1st, 9th, 10th, 12th, and 20th orders). Rebuilding the subject analysis model with the changed aging-factor weight coefficients then yields an impression change image in which the subject's face shape has been aged (or rejuvenated) as desired. That said, changing only the weight coefficients of the significant aging factors, as in Example 2, is preferable because it yields an impression change image that better preserves the subject's distinctive, natural face shape.
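One plausible reading of the regression step described above can be sketched as follows: fit a multiple regression from significant-aging-factor scores to apparent age, then adjust a subject's scores so that the regression predicts the desired post-change age. The regression data, the coefficients, and the minimum-norm adjustment rule are all illustrative assumptions; only the use of the 9th and 12th factors for "Type I" follows the text.

```python
import numpy as np

# Synthetic Type-I-like data: 39 subjects, two factor scores, noisy ages.
rng = np.random.default_rng(4)
n = 39
scores = rng.normal(size=(n, 2))                 # 9th- and 12th-factor scores
age = 45.0 + scores @ np.array([6.0, -4.0]) + rng.normal(scale=1.0, size=n)

# Ordinary least squares: age ≈ intercept + scores @ beta
A = np.column_stack([np.ones(n), scores])
intercept, *beta = np.linalg.lstsq(A, age, rcond=None)[0]
beta = np.array(beta)

def scores_for_target_age(current_scores, target_age):
    """Minimum-norm adjustment of the factor scores along the regression
    gradient so that the predicted age equals target_age."""
    predicted = intercept + current_scores @ beta
    step = (target_age - predicted) / (beta @ beta)
    return current_scores + step * beta

# Rejuvenate one subject's scores to a predicted apparent age of 30.
new_scores = scores_for_target_age(scores[0], target_age=30.0)
```

The adjusted scores would then be converted back to weight coefficients bi (e.g. as multiples of each factor's standard deviation) and substituted into equation (1) to rebuild the model.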
FIG. 42(b) is a perspective view of the mean face shape model of FIG. 42(a) rejuvenated until the apparent age is about 30; FIG. 42(c) is a perspective view of the same model aged until the apparent age is about 60.
FIG. 44(b) is a perspective view of the mean face shape model of the Type II subjects rejuvenated until the apparent age is about 30; FIG. 44(c) shows the same model aged until the apparent age is about 60.
FIG. 45(b) is a perspective view of the mean face shape model of the Type III subjects rejuvenated until the apparent age is about 30; FIG. 45(c) shows the same model aged until the apparent age is about 60.
FIG. 46(b) is a perspective view of the mean face shape model of the Type IV subjects rejuvenated until the apparent age is about 30; FIG. 46(c) shows the same model aged until the apparent age is about 60.
Specifically, the Type I aged image of FIG. 42(c) shows the lower cheeks bulging into a bottom-heavy shape with aging, and the mouth corners receding. For Type I, therefore, cosmetic information for the middle to lower face should be stored in association with the type, such as applying a bright highlight to the cheeks above the mouth or a high-saturation lip liner so that the mouth corners look firm.
The Type II aged image of FIG. 44(c) shows a tendency for the outer eye corners to droop and the upper-lip tubercle to descend. For Type II, therefore, cosmetic information for around the eyes, such as shaping the eyebrows so the tails lift, and cosmetic information for around the lips, such as lip liner, should be stored in association with the type.
The Type III aged image of FIG. 45(c) shows shallow nasolabial folds but a tendency toward upturned eyes. For Type III, therefore, cosmetic information for the middle of the face, such as cheek highlighting, should be stored in association with the type.
The Type IV aged image of FIG. 46(c) shows a tendency for the mouth corners to lower, the nasolabial folds to deepen, and the outer eye corners to droop. For Type IV, therefore, cosmetic information covering the whole face should be stored in association with the type: lip liner for around the lips, concealer to hide the nasolabial folds, eyebrow makeup that lifts the brow tails, and so on.
Using the same population analysis models and principal component analysis results as Example 2, the degree of an impression tendency of facial features other than aging was analyzed.
Five beauty experts viewed photographs of the 148 subjects and rated each on a seven-point scale from 0 to 6 running from baby-faced to adult-looking: the more adult-looking, the higher the rating; the more baby-faced, the lower.
FIG. 50(b) shows the mean face of the roughly 20% of the population belonging to cluster 2. The cluster 2 mean face was long, with the eyes positioned higher than in the overall mean face, and was rated the most adult-looking.
FIG. 50(c) shows the mean face of the roughly 24% of the population belonging to cluster 3. The cluster 3 mean face was long, with an intermediate eye position, and was rated fairly adult-looking.
FIG. 50(d) shows the mean face of the roughly 16% of the population belonging to cluster 4. The cluster 4 mean face was round, with the eyes positioned lower than in the overall mean face, and was rated the least adult-looking, i.e. baby-faced.
Using the same population analysis models and principal component analysis results as Examples 2 and 3, the degree of an impression tendency of facial features other than aging was analyzed.
Five beauty experts viewed photographs of the 148 subjects and rated the impression from large-faced to small-faced on a seven-point scale from 0 to 6: the stronger the large-face impression, the higher the rating; the stronger the small-face impression, the lower.
FIG. 54(b) shows the mean face of the roughly 26% of the population belonging to cluster 2, whose lower cheeks were slightly fuller than the overall mean face; it was rated as giving a somewhat strong large-face impression.
FIG. 54(c) shows the mean face of the roughly 24% of the population belonging to cluster 3, which had a smaller chin and less lower-cheek fullness than the overall mean face; the cluster 3 mean face was rated as giving a strong small-face impression.
FIG. 54(d) shows the mean face of the roughly 20% of the population belonging to cluster 4, whose lower cheeks were fuller than the overall mean face; it was rated as giving the strongest large-face impression.
Using the same population analysis models and principal component analysis results as Examples 2 through 4, the degree of an impression tendency of facial features other than aging was analyzed.
Five beauty experts viewed photographs of the 148 subjects and rated eye size on a seven-point scale from 0 to 6: the larger the eyes appeared, the higher the rating; the smaller, the lower.
FIG. 58(b) shows the mean face of the roughly 28% of the population belonging to cluster 2: a long face giving the impression of larger eyes than the overall mean face.
FIG. 58(c) shows the mean face of the roughly 22% of the population belonging to cluster 3: a long face giving the impression of smaller eyes than the overall mean face.
FIG. 58(d) shows the mean face of the roughly 20% of the population belonging to cluster 4: a small face giving the impression of larger eyes than the overall mean face.
<1> A face impression analysis method characterized by calculating, from face shape information representing the shape of a subject's face surface and a higher-order feature quantity obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, the expression amount of the feature quantity in the subject's face, and obtaining the degree of an impression tendency of the subject's facial features based on the expression amount;
<2> the face impression analysis method according to <1>, wherein the population face information consists of homologous models with a unified number of data points and unified topology;
<3> the face impression analysis method according to <2>, wherein the subject's face shape information is also such a homologous model, and the feature quantity for the subject is calculated by multivariate analysis of the population face information including the subject's face shape information;
<4> the face impression analysis method according to <1> or <2>, wherein a plurality of consecutive feature quantities from the first to a predetermined order are obtained by multivariate analysis of the population face information not including the subject's face shape information, and the weight coefficients are calculated as the expression amounts by reproducing the subject's face shape information as a product-sum of the feature quantities and their weight coefficients;
<5> the face impression analysis method according to any one of <1> to <4>, wherein the degree of the impression tendency is the subject's apparent age, actual age, degree of adult-looking or baby-faced appearance, or degree of small-face impression;
<6> the face impression analysis method according to <5>, wherein the degree of the impression tendency is the subject's apparent or actual age, and the feature quantities include at least one basis vector highly correlated with the impression tendency;
<7> the face impression analysis method according to any one of <1> to <6>, characterized in that the feature quantity is a basis vector with a contribution rate of 1% or more whose correlation coefficient with the impression tendency exceeds the critical value at the 5% significance level for the population's sample size;
<8> the face impression analysis method according to any one of <1> to <7>, wherein three-dimensional coordinates of the surface of the head, including the subject's face, are acquired as the face shape information using a contact three-dimensional digitizer;
<9> the face impression analysis method according to <8>, wherein three-dimensional coordinates of a plurality of feature points on the head surface are acquired using the contact three-dimensional digitizer, and three-dimensional coordinates of other points on the head surface are acquired using a non-contact three-dimensional measurement device;
<10> the face impression analysis method according to any one of <1> to <7>, wherein a plurality of two-dimensional images of the head, including the subject's face, are captured from different shooting angles, and three-dimensional coordinates of the head surface are calculated as the face shape information from the two-dimensional images;
<11> the face impression analysis method according to any one of <1> to <10>, wherein a selection of the impression tendency is accepted from the subject;
<12> a cosmetic counseling method using the face impression analysis method according to any one of <1> to <11>, characterized by outputting cosmetic information associated in advance with feature quantities whose calculated expression amounts are at or above a predetermined level;
<13> the cosmetic counseling method according to <12>, wherein the cosmetic information is information representing a cosmetic treatment method including any of a cosmetic molding method, a cosmetic massage method, a hair makeup method, and a cosmetic makeup method, or a hair cosmetic or a makeup cosmetic;
<14> the cosmetic counseling method according to <12> or <13>, wherein the cosmetic information is information representing the impression-tendency group to which the subject belongs, selected based on the magnitudes of the feature quantities' contribution rates;
<15> a face impression analysis device comprising: face shape acquisition means for acquiring face shape information representing the shape of a subject's face surface; storage means for storing higher-order feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, and tendency information representing impression tendencies of facial features associated with the feature quantities; face component analysis means for calculating the expression amounts of the feature quantities in the subject's face from the face shape information and the feature quantities; and face impression determination means for obtaining the impression tendency or its degree based on the feature quantities and the expression amounts with reference to the storage means;
<16> a face impression analysis system comprising: receiving means for receiving, over a network, face shape information representing the shape of a subject's face surface; storage means for storing higher-order feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, and tendency information representing impression tendencies of facial features associated with the feature quantities; face component analysis means for calculating the expression amounts of the feature quantities in the subject's face from the face shape information and the feature quantities; face impression determination means for obtaining the impression tendency or its degree based on the feature quantities and the expression amounts with reference to the storage means; and transmission means for transmitting, over the network, output information indicating the obtained impression tendency or its degree;
<17> the face impression analysis system according to <16>, further comprising three-dimensional shape estimation means for calculating three-dimensional coordinates of an object from a plurality of two-dimensional images of the object captured from mutually different shooting angles, wherein the receiving means receives, from a subject terminal, a plurality of two-dimensional images of the head, including the subject's face, captured from different shooting angles; the three-dimensional shape estimation means calculates three-dimensional coordinates of the surface of the subject's head as the face shape information from the received two-dimensional images; the face component analysis means calculates the expression amounts based on the calculated face shape information; and the transmission means transmits the output information to the subject terminal;
<18> the face impression analysis system according to <17>, characterized in that the receiving means accepts a selection of the impression tendency from the subject terminal, the face component analysis means extracts, with reference to the storage means, the feature quantities associated with the selected impression tendency and calculates the expression amounts of the extracted feature quantities, and the face impression determination means obtains the degree of the feature quantities associated with the selected impression tendency.
<1a> A face impression analysis method characterized by calculating, from face shape information representing the shape of a subject's face surface and one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, the expression amounts of the feature quantities in the subject's face, and obtaining the degree of an impression tendency of the subject's facial features based on the expression amounts;
<2a> the face impression analysis method according to <1a>, wherein the population face information consists of homologous models with a unified number of data points and unified topology;
<3a> the face impression analysis method according to <1a> or <2a>, wherein the population is classified into a plurality of groups and the group to which the subject belongs is determined based on the subject's expression amounts;
<4a> the face impression analysis method according to <3a>, wherein the population is classified into the plurality of groups based on the degree of agreement of the tendencies of a plurality of weight coefficients of multi-order basis vectors highly correlated with the impression tendency;
<5a> the face impression analysis method according to any one of <1a> to <4a>, wherein the degree of the impression tendency is the subject's apparent age, actual age, degree of adult-looking or baby-faced appearance, or degree of small-face impression;
<6a> the face impression analysis method according to <4a>, wherein the degree of the impression tendency is the subject's apparent or actual age, and the feature quantities include at least one basis vector highly correlated with the impression tendency;
<7a> the face impression analysis method according to any one of <1a> to <6a>, characterized in that the correlation coefficient between the feature quantity and the impression tendency exceeds the critical value at the 5% significance level for the population's sample size;
<8a> the face impression analysis method according to any one of <1a> to <7a>, wherein three-dimensional coordinates of the surface of the head, including the subject's face, are acquired as the face shape information using a contact three-dimensional digitizer;
<9a> the face impression analysis method according to <8a>, wherein three-dimensional coordinates of a plurality of feature points on the head surface are acquired using the contact three-dimensional digitizer, and three-dimensional coordinates of other points on the head surface are acquired using a non-contact three-dimensional measurement device;
<10a> the face impression analysis method according to any one of <1a> to <9a>, wherein a plurality of two-dimensional images of the head, including the subject's face, are captured from different shooting angles, and three-dimensional coordinates of the head surface are calculated as the face shape information from the two-dimensional images;
<11a> the face impression analysis method according to <2a>, wherein the subject's face shape information is also such a homologous model, and the feature quantities for the subject are calculated by multivariate analysis of the population face information including the subject's face shape information;
<12a> the face impression analysis method according to any one of <1a> to <10a>, wherein a plurality of consecutive feature quantities from the first to a predetermined order are obtained by multivariate analysis of the population face information not including the subject's face shape information, and the weight coefficients are calculated as the expression amounts by reproducing the subject's face shape information as a product-sum of the feature quantities and their weight coefficients;
<13a> the face impression analysis method according to any one of <1a> to <12a>, wherein a selection of the impression tendency is accepted from the subject;
<14a> a cosmetic counseling method using the face impression analysis method according to any one of <1a> to <13a>, characterized by outputting cosmetic information associated in advance with feature quantities whose calculated expression amounts are at or above a predetermined level;
<15a> the cosmetic counseling method according to <14a>, wherein the cosmetic information is information representing the impression-tendency group to which the subject belongs, selected based on the magnitudes of the feature quantities' contribution rates;
<16a> a cosmetic counseling method using the face impression analysis method according to any one of <1a> to <13a>, characterized by classifying the population into a plurality of groups based on the degree of agreement of the tendencies of a plurality of weight coefficients of multi-order basis vectors highly correlated with the impression tendency, determining the group to which the subject belongs based on the subject's expression amounts, and outputting cosmetic information associated in advance with the group to which the subject belongs;
<17a> the cosmetic counseling method according to any one of <14a> to <16a>, wherein the cosmetic information is information representing a cosmetic treatment method including any of a cosmetic molding method, a cosmetic massage method, a hair makeup method, and a cosmetic makeup method, or a hair cosmetic or a makeup cosmetic;
<18a> a face image generation method characterized by calculating, from face shape information representing the shape of a subject's face surface and one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, the expression amounts of the feature quantities in the subject's face, changing the expression amounts in the face shape information, and generating, based on the changed face shape information, an impression change image in which an impression tendency of the subject's facial features has been changed;
<19a> the face image generation method according to <18a>, wherein the feature quantities include at least one basis vector highly correlated with the impression tendency, and the expression amounts in the face shape information are changed by changing the weight coefficient of the basis vector so that the degree of the impression tendency changes by a predetermined amount;
<20a> the face image generation method according to <19a>, characterized in that the degree of the impression tendency is the subject's apparent or actual age; the population is divided, based on the age, into a first population including the subject and a second population not including the subject; and the weight coefficients are changed by assigning a first weight to basis vectors expressed in a biased manner in one of the first and second populations and a second weight smaller than the first weight to basis vectors expressed in both the first and second populations;
<21a> the face image generation method according to <20a>, wherein the age after the change by the predetermined amount lies between the mean age of the first population and the mean age of the second population;
<22a> the face image generation method according to any one of <19a> to <21a>, wherein the population is classified into a plurality of groups based on the degree of agreement of the tendencies of a plurality of weight coefficients of multi-order basis vectors highly correlated with the impression tendency, the group to which the subject belongs is determined based on the subject's expression amounts, and the weight coefficients of the basis vectors highly correlated with the impression tendency in the group to which the subject belongs are changed;
<23a> a face impression analysis device comprising: face shape acquisition means for acquiring face shape information representing the shape of a subject's face surface; storage means for storing one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, and tendency information representing impression tendencies of facial features associated with the feature quantities; face component analysis means for calculating the expression amounts of the feature quantities in the subject's face from the face shape information and the feature quantities; and face impression determination means for obtaining the impression tendency or its degree based on the feature quantities and the expression amounts with reference to the storage means;
<24a> a face impression analysis system comprising: receiving means for receiving, over a network, face shape information representing the shape of a subject's face surface; storage means for storing one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, and tendency information representing impression tendencies of facial features associated with the feature quantities; face component analysis means for calculating the expression amounts of the feature quantities in the subject's face from the face shape information and the feature quantities; face impression determination means for obtaining the impression tendency or its degree based on the feature quantities and the expression amounts with reference to the storage means; and transmission means for transmitting, over the network, output information indicating the obtained impression tendency or its degree;
<25a> the face impression analysis system according to <24a>, further comprising three-dimensional shape estimation means for calculating three-dimensional coordinates of an object from a plurality of two-dimensional images of the object captured from mutually different shooting angles, wherein the receiving means receives, from a subject terminal, a plurality of two-dimensional images of the head, including the subject's face, captured from different shooting angles; the three-dimensional shape estimation means calculates three-dimensional coordinates of the surface of the subject's head as the face shape information from the received two-dimensional images; the face component analysis means calculates the expression amounts based on the calculated face shape information; and the transmission means transmits the output information to the subject terminal;
<26a> the face impression analysis system according to <25a>, characterized in that the receiving means accepts a selection of the impression tendency from the subject terminal, the face component analysis means extracts, with reference to the storage means, the feature quantities associated with the selected impression tendency and calculates the expression amounts of the extracted feature quantities, and the face impression determination means obtains the degree of the feature quantities associated with the selected impression tendency.
<1b> the face impression analysis method according to <1a>, wherein the feature quantities include at least one basis vector of second or higher order highly correlated with the impression tendency;
<2b> the face impression analysis method according to any one of <1a> to <4a> or <1b>, wherein the degree of the impression tendency is how high or low the subject's apparent or actual age is;
<3b> the face impression analysis method according to any one of <1a> to <4a> or <1b>, wherein the degree of the impression tendency is the subject's degree of adult-looking or baby-faced appearance, degree of small-face impression, degree of round-faced or long-faced appearance, or degree of eye-size impression;
<4b> the face impression analysis method according to <1b>, wherein the degree of the impression tendency is any two or more selected from how high or low the subject's apparent or actual age is, the degree of adult-looking or baby-faced appearance, the degree of small-face impression, the degree of round-faced or long-faced appearance, and the degree of eye-size impression, and the one or more feature quantities corresponding to each of the two or more impression tendencies include basis vectors of mutually different orders;
<5b> the face impression analysis method according to <4b>, wherein the one or more feature quantities corresponding to each of at least two of the impression tendencies consist solely of basis vectors of mutually different orders;
<6b> the cosmetic counseling method according to <15a> or <16a>, wherein the degree of the impression tendency is the subject's degree of adult-looking or baby-faced appearance, degree of small-face impression, degree of round-faced or long-faced appearance, or degree of eye-size impression, and cosmetic information associated in advance with the group to which the subject belongs and with another group to which the subject does not belong is output;
<7b> the face image generation method according to <18a>, wherein the feature quantities include at least one basis vector of second or higher order, extracted from the plurality of basis vectors obtained by the multivariate analysis, that is highly correlated with the impression tendency, and the expression amounts in the face shape information are changed by changing the weight coefficient of the basis vector so that the degree of the impression tendency changes by a predetermined amount.
<9b> the face impression analysis system according to any one of <24a> to <26a>, wherein the feature quantities include at least one basis vector of second or higher order, extracted from the plurality of basis vectors obtained by the multivariate analysis, that is highly correlated with the impression tendency, and the expression amounts in the face shape information are changed by changing the weight coefficient of the basis vector so that the degree of the impression tendency changes by a predetermined amount;
<10b> the face image generation method according to any one of <18a> to <22a> or <7b>, wherein the degree of the impression tendency is the subject's degree of adult-looking or baby-faced appearance, degree of small-face impression, degree of round-faced or long-faced appearance, or degree of eye-size impression;
<11b> the face image generation method according to <7b>, wherein the degree of the impression tendency is any two or more selected from how high or low the subject's apparent or actual age is, the degree of adult-looking or baby-faced appearance, the degree of small-face impression, the degree of round-faced or long-faced appearance, and the degree of eye-size impression, and the one or more feature quantities corresponding to each of the two or more impression tendencies include basis vectors of mutually different orders;
<12b> the face image generation method according to <11b>, wherein the one or more feature quantities corresponding to each of at least two of the impression tendencies consist solely of basis vectors of mutually different orders;
<13b> the face impression analysis device according to <23a>, wherein the population face information consists of homologous models with a unified number of data points and unified topology;
<14b> the face impression analysis device according to <23a> or <13b>, wherein the storage means stores the population classified into a plurality of groups, and the face impression determination means determines the group to which the subject belongs based on the subject's expression amounts;
<15b> the face impression analysis device according to <14b>, wherein the storage means stores the plurality of groups classified based on the degree of agreement of the tendencies of a plurality of weight coefficients of multi-order basis vectors highly correlated with the impression tendency;
<16b> the face impression analysis device according to any one of <23a> or <13b> to <15b>, wherein the degree of the impression tendency is how high or low the subject's apparent or actual age is;
<17b> the face impression analysis device according to any one of <23a> or <13b> to <15b>, wherein the degree of the impression tendency is the subject's degree of adult-looking or baby-faced appearance, degree of small-face impression, degree of round-faced or long-faced appearance, or degree of eye-size impression;
<18b> the face impression analysis device according to <8b>, wherein the degree of the impression tendency is any two or more selected from how high or low the subject's apparent or actual age is, the degree of adult-looking or baby-faced appearance, the degree of small-face impression, the degree of round-faced or long-faced appearance, and the degree of eye-size impression, and the one or more feature quantities corresponding to each of the two or more impression tendencies include basis vectors of mutually different orders;
<19b> the face impression analysis device according to <18b>, wherein the one or more feature quantities corresponding to each of at least two of the impression tendencies consist solely of basis vectors of mutually different orders;
<20b> the face impression analysis device according to any one of <23a> or <13b> to <19b>, wherein the face impression determination means obtains the impression tendency or its degree based on feature quantities whose correlation coefficients with the impression tendency exceed the critical value at the 5% significance level for the population's sample size;
<21b> the face impression analysis device according to any one of <23a> or <13b> to <20b>, further comprising a contact three-dimensional digitizer that acquires, as the face shape information, three-dimensional coordinates of a plurality of feature points on the surface of the head including the subject's face;
<22b> the face impression analysis device according to <21b>, further comprising a non-contact three-dimensional measurement device that acquires three-dimensional coordinates of other points on the head surface;
<23b> the face impression analysis device according to any one of <23a> or <13b> to <22b>, further comprising three-dimensional shape estimation means for calculating, as the face shape information, three-dimensional coordinates of the head surface from a plurality of two-dimensional images of the head, including the subject's face, captured from mutually different shooting angles;
<24b> the face impression analysis device according to <13b>, wherein the subject's face shape information is also such a homologous model, and the face component analysis means calculates the feature quantities for the subject by multivariate analysis of the population face information including the subject's face shape information;
<25b> the face impression analysis device according to any one of <23a> or <13b> to <23b>, wherein the face component analysis means obtains a plurality of consecutive feature quantities from the first to a predetermined order by multivariate analysis of the population face information not including the subject's face shape information, and calculates the weight coefficients as the expression amounts by reproducing the subject's face shape information as a product-sum of the feature quantities and their weight coefficients;
<26b> the face impression analysis device according to any one of <23a> or <13b> to <25b>, further comprising condition input means for accepting a selection of the impression tendency from the subject.
<27b> the face image generation method according to any one of <10b> to <12b>, characterized in that, based on the degree of the impression tendency, the population is divided into a first population including the subject and a second population not including the subject, and the weight coefficients are changed by assigning a first weight to basis vectors expressed in a biased manner in one of the first and second populations and a second weight smaller than the first weight to basis vectors expressed in both the first and second populations;
<28b> the face image generation method according to any one of <18a> to <22a>, <7b>, or <10b> to <12b>, wherein the subject's face image and the impression change image are displayed and output in comparison with each other;
<29b> a cosmetic counseling method using the face image generation method according to <28b>, in which a beauty simulation image is generated that imitates a state in which a cosmetic treatment method that includes a hair makeup method or a cosmetic makeup method and does not change the shape of the subject's face surface has been applied to the subject's face, and the beauty simulation image and the impression change image are displayed and output in comparison with each other.
Claims (30)
- A face impression analysis method characterized by calculating, from face shape information representing the shape of a subject's face surface and one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, the expression amounts of the feature quantities in the subject's face, and obtaining the degree of an impression tendency of the subject's facial features based on the expression amounts.
- The face impression analysis method according to claim 1, wherein the population face information consists of homologous models with a unified number of data points and unified topology.
- The face impression analysis method according to claim 1 or 2, wherein the population is classified into a plurality of groups and the group to which the subject belongs is determined based on the subject's expression amounts.
- The face impression analysis method according to claim 3, wherein the population is classified into the plurality of groups based on the degree of agreement of the tendencies of a plurality of weight coefficients of multi-order basis vectors highly correlated with the impression tendency.
- The face impression analysis method according to any one of claims 1 to 4, wherein the feature quantities include at least one basis vector of second or higher order, extracted from the plurality of basis vectors obtained by the multivariate analysis, that is highly correlated with the impression tendency.
- The face impression analysis method according to any one of claims 1 to 5, wherein the degree of the impression tendency is how high or low the subject's apparent or actual age is.
- The face impression analysis method according to any one of claims 1 to 5, wherein the degree of the impression tendency is the subject's degree of adult-looking or baby-faced appearance, degree of small-face impression, degree of round-faced or long-faced appearance, or degree of eye-size impression.
- The face impression analysis method according to claim 5, wherein the degree of the impression tendency is any two or more selected from how high or low the subject's apparent or actual age is, the degree of adult-looking or baby-faced appearance, the degree of small-face impression, the degree of round-faced or long-faced appearance, and the degree of eye-size impression, and the one or more feature quantities corresponding to each of the two or more impression tendencies include basis vectors of mutually different orders.
- The face impression analysis method according to claim 8, wherein the one or more feature quantities corresponding to each of at least two of the impression tendencies consist solely of basis vectors of mutually different orders.
- The face impression analysis method according to any one of claims 1 to 9, characterized in that the correlation coefficient between the feature quantity and the impression tendency exceeds the critical value at the 5% significance level for the population's sample size.
- The face impression analysis method according to any one of claims 1 to 10, wherein three-dimensional coordinates of the surface of the head, including the subject's face, are acquired as the face shape information using a contact three-dimensional digitizer.
- The face impression analysis method according to claim 11, wherein three-dimensional coordinates of a plurality of feature points on the head surface are acquired using the contact three-dimensional digitizer, and three-dimensional coordinates of other points on the head surface are acquired using a non-contact three-dimensional measurement device.
- The face impression analysis method according to any one of claims 1 to 12, wherein a plurality of two-dimensional images of the head, including the subject's face, are captured from different shooting angles, and three-dimensional coordinates of the head surface are calculated as the face shape information from the two-dimensional images.
- The face impression analysis method according to claim 2, wherein the subject's face shape information is also such a homologous model, and the feature quantities for the subject are calculated by multivariate analysis of the population face information including the subject's face shape information.
- The face impression analysis method according to any one of claims 1 to 13, wherein a plurality of consecutive feature quantities from the first to a predetermined order are obtained by multivariate analysis of the population face information not including the subject's face shape information, and the weight coefficients are calculated as the expression amounts by reproducing the subject's face shape information as a product-sum of the feature quantities and their weight coefficients.
- The face impression analysis method according to any one of claims 1 to 15, wherein a selection of the impression tendency is accepted from the subject.
- A cosmetic counseling method using the face impression analysis method according to any one of claims 1 to 16, characterized by outputting cosmetic information associated in advance with feature quantities whose calculated expression amounts are at or above a predetermined level.
- The cosmetic counseling method according to claim 17, wherein the cosmetic information is information representing the impression-tendency group to which the subject belongs, selected based on the magnitudes of the feature quantities' contribution rates.
- A cosmetic counseling method using the face impression analysis method according to any one of claims 1 to 16, characterized by classifying the population into a plurality of groups based on the degree of agreement of the tendencies of a plurality of weight coefficients of multi-order basis vectors highly correlated with the impression tendency, determining the group to which the subject belongs based on the subject's expression amounts, and outputting cosmetic information associated in advance with the group to which the subject belongs.
- The cosmetic counseling method according to claim 18 or 19, wherein the degree of the impression tendency is the subject's degree of adult-looking or baby-faced appearance, degree of small-face impression, degree of round-faced or long-faced appearance, or degree of eye-size impression, and cosmetic information associated in advance with the group to which the subject belongs and with another group to which the subject does not belong is output.
- The cosmetic counseling method according to any one of claims 17 to 20, wherein the cosmetic information is information representing a cosmetic treatment method including any of a cosmetic molding method, a cosmetic massage method, a hair makeup method, and a cosmetic makeup method, or a hair cosmetic or a makeup cosmetic.
- A face image generation method characterized by calculating, from face shape information representing the shape of a subject's face surface and one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, the expression amounts of the feature quantities in the subject's face, changing the expression amounts in the face shape information, and generating, based on the changed face shape information, an impression change image in which an impression tendency of the subject's facial features has been changed.
- The face image generation method according to claim 22, wherein the feature quantities include at least one basis vector of second or higher order, extracted from the plurality of basis vectors obtained by the multivariate analysis, that is highly correlated with the impression tendency, and the expression amounts in the face shape information are changed by changing the weight coefficient of the basis vector so that the degree of the impression tendency changes by a predetermined amount.
- The face image generation method according to claim 23, characterized in that the degree of the impression tendency is the subject's apparent or actual age; the population is divided, based on the age, into a first population including the subject and a second population not including the subject; and the weight coefficients are changed by assigning a first weight to basis vectors expressed in a biased manner in one of the first and second populations and a second weight smaller than the first weight to basis vectors expressed in both the first and second populations.
- The face image generation method according to claim 24, wherein the age after the change by the predetermined amount lies between the mean age of the first population and the mean age of the second population.
- The face image generation method according to any one of claims 23 to 25, wherein the population is classified into a plurality of groups based on the degree of agreement of the tendencies of a plurality of weight coefficients of multi-order basis vectors highly correlated with the impression tendency, the group to which the subject belongs is determined based on the subject's expression amounts, and the weight coefficients of the basis vectors highly correlated with the impression tendency in the group to which the subject belongs are changed.
- A face impression analysis device comprising: face shape acquisition means for acquiring face shape information representing the shape of a subject's face surface; storage means for storing one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, and tendency information representing impression tendencies of facial features associated with the feature quantities; face component analysis means for calculating the expression amounts of the feature quantities in the subject's face from the face shape information and the feature quantities; and face impression determination means for obtaining the impression tendency or its degree based on the feature quantities and the expression amounts with reference to the storage means.
- A face impression analysis system comprising: receiving means for receiving, over a network, face shape information representing the shape of a subject's face surface; storage means for storing one or more feature quantities obtained by multivariate analysis of population face information representing the three-dimensional shapes of the face surfaces of a population of multiple persons, and tendency information representing impression tendencies of facial features associated with the feature quantities; face component analysis means for calculating the expression amounts of the feature quantities in the subject's face from the face shape information and the feature quantities; face impression determination means for obtaining the impression tendency or its degree based on the feature quantities and the expression amounts with reference to the storage means; and transmission means for transmitting, over the network, output information indicating the obtained impression tendency or its degree.
- The face impression analysis system according to claim 28, further comprising three-dimensional shape estimation means for calculating three-dimensional coordinates of an object from a plurality of two-dimensional images of the object captured from mutually different shooting angles, wherein the receiving means receives, from a subject terminal, a plurality of two-dimensional images of the head, including the subject's face, captured from different shooting angles; the three-dimensional shape estimation means calculates three-dimensional coordinates of the surface of the subject's head as the face shape information from the received two-dimensional images; the face component analysis means calculates the expression amounts based on the calculated face shape information; and the transmission means transmits the output information to the subject terminal.
- The face impression analysis system according to claim 29, characterized in that the receiving means accepts a selection of the impression tendency from the subject terminal; the face component analysis means extracts, with reference to the storage means, the feature quantities associated with the selected impression tendency and calculates the expression amounts of the extracted feature quantities; and the face impression determination means obtains the degree of the feature quantities associated with the selected impression tendency.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280033752.7A CN103649987B (zh) | 2011-07-07 | 2012-07-06 | 脸印象分析方法、美容信息提供方法和脸图像生成方法 |
US14/131,374 US9330298B2 (en) | 2011-07-07 | 2012-07-06 | Face impression analyzing method, aesthetic counseling method, and face image generating method |
JP2012558097A JP5231685B1 (ja) | 2011-07-07 | 2012-07-06 | 顔印象分析方法、美容カウンセリング方法および顔画像生成方法 |
EP12806872.3A EP2731072A4 (en) | 2011-07-07 | 2012-07-06 | FINGER-PRINTED PRINT ANALYSIS METHOD, COSMETIC CONSULTATION METHOD, AND FACE IMAGE GENERATION METHOD |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011151289 | 2011-07-07 | ||
JP2011-151289 | 2011-07-07 | ||
JP2012019529 | 2012-02-01 | ||
JP2012-019529 | 2012-02-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013005447A1 true WO2013005447A1 (ja) | 2013-01-10 |
Family
ID=47436809
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/004404 WO2013005447A1 (ja) | 2011-07-07 | 2012-07-06 | 顔印象分析方法、美容カウンセリング方法および顔画像生成方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9330298B2 (ja) |
EP (1) | EP2731072A4 (ja) |
JP (1) | JP5231685B1 (ja) |
CN (1) | CN103649987B (ja) |
WO (1) | WO2013005447A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014122253A3 (en) * | 2013-02-07 | 2014-10-02 | Crisalix Sa | 3d platform for aesthetic simulation |
CN104408417A (zh) * | 2014-11-25 | 2015-03-11 | 苏州福丰科技有限公司 | 基于三维人脸局部特征匹配的超市预付费支付方法 |
WO2015122195A1 (ja) * | 2014-02-17 | 2015-08-20 | Necソリューションイノベータ株式会社 | 印象分析装置、ゲーム装置、健康管理装置、広告支援装置、印象分析システム、印象分析方法、プログラム、及びプログラム記録媒体 |
WO2015125759A1 (ja) * | 2014-02-24 | 2015-08-27 | 花王株式会社 | 加齢分析方法及び加齢分析装置 |
CN105210110A (zh) * | 2013-02-01 | 2015-12-30 | 松下知识产权经营株式会社 | 美容辅助装置、美容辅助系统、美容辅助方法以及美容辅助程序 |
JP2016193175A (ja) * | 2015-03-31 | 2016-11-17 | ポーラ化成工業株式会社 | 顔の見た目印象の決定部位の抽出方法、顔の見た目印象の決定因子の抽出方法、顔の見た目印象の鑑別方法 |
JP2018073273A (ja) * | 2016-11-02 | 2018-05-10 | 花王株式会社 | 加齢分析方法 |
CN109310475A (zh) * | 2016-06-21 | 2019-02-05 | 约翰·G·罗伯森 | 用于自动生成面部修复设计和应用方案以解决可观察的面部偏差的系统和方法 |
WO2023210341A1 (ja) * | 2022-04-25 | 2023-11-02 | 株式会社資生堂 | 顔分類方法、装置、およびプログラム |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9122919B2 (en) | 2013-03-15 | 2015-09-01 | Skin Republic, Inc. | Systems and methods for specifying and formulating customized topical agents |
US9477330B2 (en) * | 2013-11-05 | 2016-10-25 | Microsoft Technology Licensing, Llc | Stylus tilt tracking with a digitizer |
KR102294927B1 (ko) * | 2014-03-31 | 2021-08-30 | 트라이큐빅스 인크. | 가상 성형 sns 서비스 방법 및 장치 |
WO2017125975A1 (ja) * | 2016-01-22 | 2017-07-27 | パナソニックIpマネジメント株式会社 | メイクトレンド分析装置、メイクトレンド分析方法、およびメイクトレンド分析プログラム |
JP6722866B2 (ja) * | 2016-02-29 | 2020-07-15 | パナソニックIpマネジメント株式会社 | 画像処理装置および画像処理方法 |
WO2017165363A1 (en) | 2016-03-21 | 2017-09-28 | The Procter & Gamble Company | Systems and methods for providing customized product recommendations |
US10398389B1 (en) | 2016-04-11 | 2019-09-03 | Pricewaterhousecoopers Llp | System and method for physiological health simulation |
US10621771B2 (en) | 2017-03-21 | 2020-04-14 | The Procter & Gamble Company | Methods for age appearance simulation |
US10614623B2 (en) | 2017-03-21 | 2020-04-07 | Canfield Scientific, Incorporated | Methods and apparatuses for age appearance simulation |
JP6849824B2 (ja) | 2017-05-31 | 2021-03-31 | ザ プロクター アンド ギャンブル カンパニーThe Procter & Gamble Company | セルフィーを撮影するためにユーザをガイドするためのシステム及び方法 |
JP6849825B2 (ja) | 2017-05-31 | 2021-03-31 | ザ プロクター アンド ギャンブル カンパニーThe Procter & Gamble Company | 見掛け肌年齢を判定するためのシステム及び方法 |
CN108399379B (zh) * | 2017-08-11 | 2021-02-12 | 北京市商汤科技开发有限公司 | 用于识别面部年龄的方法、装置和电子设备 |
CN107578625B (zh) * | 2017-08-22 | 2020-11-10 | 青海省交通科学研究院 | 一种基于多维矢量相关性的车型分类方法及装置 |
CN107580251B (zh) * | 2017-09-15 | 2018-09-21 | 南京陶特思软件科技有限公司 | 信息输入方式自适应选择系统 |
CN110096936B (zh) * | 2018-01-31 | 2023-03-03 | 伽蓝(集团)股份有限公司 | 评估眼部表观年龄、眼部衰老程度的方法及其应用 |
US10395436B1 (en) | 2018-03-13 | 2019-08-27 | Perfect Corp. | Systems and methods for virtual application of makeup effects with adjustable orientation view |
US11676157B2 (en) | 2018-07-13 | 2023-06-13 | Shiseido Company, Limited | System and method for adjusting custom topical agents |
JP2020091662A (ja) * | 2018-12-05 | 2020-06-11 | 富士ゼロックス株式会社 | 情報処理装置及びプログラム |
CN110599639B (zh) * | 2019-08-13 | 2021-05-07 | 深圳市天彦通信股份有限公司 | 身份验证方法及相关产品 |
TW202122040A (zh) * | 2019-12-09 | 2021-06-16 | 麗寶大數據股份有限公司 | 臉部肌肉狀態分析與評價方法 |
CN111008971B (zh) * | 2019-12-24 | 2023-06-13 | 天津工业大学 | 一种合影图像的美学质量评价方法及实时拍摄指导系统 |
CN113850708A (zh) * | 2020-06-28 | 2021-12-28 | 华为技术有限公司 | 交互方法、交互装置、智能镜子 |
CN113377020A (zh) * | 2021-05-10 | 2021-09-10 | 深圳数联天下智能科技有限公司 | 设备控制方法、装置、设备及存储介质 |
CN117745036B (zh) * | 2024-02-18 | 2024-04-30 | 四川金投科技股份有限公司 | 一种基于特征识别及近场通信的牲畜信息管理方法及系统 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07152928A (ja) * | 1993-11-30 | 1995-06-16 | Canon Inc | 画像処理方法及び装置 |
JP2001331791A (ja) | 2000-05-23 | 2001-11-30 | Pola Chem Ind Inc | 加齢パターンの鑑別方法 |
JP2004102359A (ja) | 2002-09-04 | 2004-04-02 | Advanced Telecommunication Research Institute International | 画像処理装置、画像処理方法および画像処理プログラム |
JP2006119040A (ja) * | 2004-10-22 | 2006-05-11 | Kao Corp | 顔形状分類方法および顔形状評価方法および顔形状評価装置 |
JP2007128171A (ja) * | 2005-11-01 | 2007-05-24 | Advanced Telecommunication Research Institute International | 顔画像合成装置、顔画像合成方法および顔画像合成プログラム |
JP2008171074A (ja) | 2007-01-09 | 2008-07-24 | National Institute Of Advanced Industrial & Technology | 三次元形状モデル生成装置、三次元形状モデル生成方法、コンピュータプログラム、及び三次元形状モデル生成システム |
JP2009054060A (ja) | 2007-08-29 | 2009-03-12 | Kao Corp | 顔の形状評価方法 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69419291T2 (de) | 1993-09-03 | 1999-12-30 | Canon Kk | Formmessapparat |
US20060020630A1 (en) * | 2004-07-23 | 2006-01-26 | Stager Reed R | Facial database methods and systems |
JP4795718B2 (ja) | 2005-05-16 | 2011-10-19 | 富士フイルム株式会社 | 画像処理装置および方法並びにプログラム |
CN101751551B (zh) | 2008-12-05 | 2013-03-20 | 比亚迪股份有限公司 | 一种基于图像的人脸识别方法、装置、系统及设备 |
US8670597B2 (en) * | 2009-08-07 | 2014-03-11 | Google Inc. | Facial recognition with social network aiding |
JP5905702B2 (ja) | 2011-10-18 | 2016-04-20 | 花王株式会社 | 顔印象判定チャート |
-
2012
- 2012-07-06 US US14/131,374 patent/US9330298B2/en active Active
- 2012-07-06 JP JP2012558097A patent/JP5231685B1/ja active Active
- 2012-07-06 EP EP12806872.3A patent/EP2731072A4/en not_active Ceased
- 2012-07-06 WO PCT/JP2012/004404 patent/WO2013005447A1/ja active Application Filing
- 2012-07-06 CN CN201280033752.7A patent/CN103649987B/zh active Active
Non-Patent Citations (2)
Title |
---|
FUMIYASU NISHINO ET AL.: "Quantitative Analysis for Impression Expression of Facial Features And Its Application to Discrimination of Facial Attributes", IEICE TECHNICAL REPORT, HCS2004-49-59, HUMAN COMMUNICATION KISO, vol. 104, no. 744, 17 March 2005 (2005-03-17), pages 13 - 18, XP008172590 * |
See also references of EP2731072A4 |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105210110A (zh) * | 2013-02-01 | 2015-12-30 | Panasonic Intellectual Property Management Co., Ltd. | Makeup assistance device, makeup assistance system, makeup assistance method, and makeup assistance program |
US10321747B2 (en) | 2013-02-01 | 2019-06-18 | Panasonic Intellectual Property Management Co., Ltd. | Makeup assistance device, makeup assistance system, makeup assistance method, and makeup assistance program |
EP2953090A4 (en) * | 2013-02-01 | 2016-05-25 | Panasonic Ip Man Co Ltd | Makeup assistance device, makeup assistance system, makeup assistance method, and makeup assistance program |
CN105164686B (zh) * | 2013-02-07 | 2020-04-10 | Crisalix SA | 3D platform for aesthetic simulation |
CN105164686A (zh) * | 2013-02-07 | 2015-12-16 | Crisalix SA | 3D platform for aesthetic simulation |
US11501363B2 (en) | 2013-02-07 | 2022-11-15 | Crisalix S.A. | 3D platform for aesthetic simulation |
WO2014122253A3 (en) * | 2013-02-07 | 2014-10-02 | Crisalix Sa | 3d platform for aesthetic simulation |
JPWO2015122195A1 (ja) * | 2014-02-17 | 2017-03-30 | NEC Solution Innovators, Ltd. | Impression analysis device, game device, health management device, advertising support device, impression analysis system, impression analysis method, and program |
WO2015122195A1 (ja) * | 2014-02-17 | 2015-08-20 | NEC Solution Innovators, Ltd. | Impression analysis device, game device, health management device, advertising support device, impression analysis system, impression analysis method, program, and program recording medium |
CN106030659B (zh) * | 2014-02-24 | 2019-01-22 | Kao Corporation | Aging analysis method and aging analysis device |
JP2015172935A (ja) * | 2014-02-24 | 2015-10-01 | Kao Corporation | Aging analysis method and aging analysis device |
CN106030659A (zh) * | 2014-02-24 | 2016-10-12 | Kao Corporation | Aging analysis method and aging analysis device |
TWI716344B (zh) * | 2014-02-24 | 2021-01-21 | Kao Corporation | Aging analysis method, method of assisting anti-aging care using the aging analysis method, aging analysis device, and computer-readable recording medium |
WO2015125759A1 (ja) * | 2014-02-24 | 2015-08-27 | Kao Corporation | Aging analysis method and aging analysis device |
CN104408417A (zh) * | 2014-11-25 | 2015-03-11 | Suzhou Fufeng Technology Co., Ltd. | Supermarket prepaid payment method based on matching of local features of three-dimensional faces |
JP2016193175A (ja) * | 2015-03-31 | 2016-11-17 | Pola Chemical Industries, Inc. | Method for extracting regions that determine facial appearance impression, method for extracting determinants of facial appearance impression, and method for discriminating facial appearance impression |
CN109310475A (zh) * | 2016-06-21 | 2019-02-05 | John G. Roberson | System and method for automatically generating facial correction designs and application protocols for handling observable facial deviations |
JP2018073273A (ja) * | 2016-11-02 | 2018-05-10 | Kao Corporation | Aging analysis method |
JP7074422B2 (ja) | 2016-11-02 | 2022-05-24 | Kao Corporation | Aging analysis method |
WO2023210341A1 (ja) * | 2022-04-25 | 2023-11-02 | Shiseido Company, Limited | Face classification method, device, and program |
Also Published As
Publication number | Publication date |
---|---|
JP5231685B1 (ja) | 2013-07-10 |
CN103649987B (zh) | 2018-05-25 |
JPWO2013005447A1 (ja) | 2015-02-23 |
US9330298B2 (en) | 2016-05-03 |
US20140226896A1 (en) | 2014-08-14 |
EP2731072A4 (en) | 2015-03-25 |
CN103649987A (zh) | 2014-03-19 |
EP2731072A1 (en) | 2014-05-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5231685B1 (ja) | Face impression analysis method, beauty counseling method, and face image generation method | |
JP6985403B2 (ja) | Method for age appearance simulation | |
US10614623B2 (en) | Methods and apparatuses for age appearance simulation | |
JP4809056B2 (ja) | Face classification device for cheek makeup, face classification program, and recording medium on which the program is recorded | |
WO2007063878A1 (ja) | Face classification method, face classification device, classification map, face classification program, and recording medium on which the program is recorded | |
US20160203358A1 (en) | Systems and methods for using curvatures to analyze facial and body features | |
JP5651385B2 (ja) | Face evaluation method | |
Dantcheva et al. | Female facial aesthetics based on soft biometrics and photo-quality | |
JP5035524B2 (ja) | Face image synthesis method and synthesis device | |
JP2004102359A (ja) | Image processing apparatus, image processing method, and image processing program | |
JP2017120595A (ja) | Method for evaluating the application state of cosmetics | |
JP5905702B2 (ja) | Face impression determination chart | |
JP5095182B2 (ja) | Face classification device, face classification program, and recording medium on which the program is recorded | |
JPH11265443A (ja) | Impression evaluation method and device | |
KR20190042493A (ko) | System and method for automatically generating facial correction designs and application protocols for handling identifiable facial deviations | |
JP4893968B2 (ja) | Face image synthesis method | |
CN106030659B (zh) | Aging analysis method and aging analysis device | |
JP2022078936A (ja) | Skin image analysis method | |
JP5650012B2 (ja) | Face image processing method, beauty counseling method, and face image processing device | |
Leta et al. | A study of the facial aging-a multidisciplinary approach | |
JP2021129977A (ja) | Method, device, and program for estimating the degree of facial liveliness | |
WO2022243498A1 (en) | Computer-based body part analysis methods and systems | |
JP5234857B2 (ja) | Face classification method for cheek makeup, classification determination map, face classification program, and recording medium on which the program is recorded | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2012558097 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12806872 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012806872 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14131374 Country of ref document: US |