US20030062058A1 - Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film - Google Patents
- Publication number
- US20030062058A1 (application US10/281,232)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61K—PREPARATIONS FOR MEDICAL, DENTAL OR TOILETRY PURPOSES
- A61K8/00—Cosmetics or similar toiletry preparations
- A61K8/02—Cosmetics or similar toiletry preparations characterised by special physical form
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61Q—SPECIFIC USE OF COSMETICS OR SIMILAR TOILETRY PREPARATIONS
- A61Q1/00—Make-up preparations; Body powders; Preparations for removing make-up
- A61Q1/02—Preparations containing skin colorants, e.g. pigments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/80—2D [Two Dimensional] animation, e.g. using sprites
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B33—ADDITIVE MANUFACTURING TECHNOLOGY
- B33Y—ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
- B33Y80/00—Products made by additive manufacturing
Definitions
- the present invention has been devised to effectively utilize the above-described findings and aims at applying those findings to a technique of modifying, for example, photographed facial images and to a makeup technique.
- facial images picked up by a still camera, a video camera or the like are modified as follows. First, a desirable face is determined, and then an original facial image is subjected to image processing to become closer to this desirable face so that a modified facial image is acquired.
- Model faces can be used in determining a desirable face.
- the faces of preferable talents or actresses or actors can be used as model faces. If one does not have any preferable talent or the like, standard good-looking faces acquired from a plurality of beautiful faces which have previously been prepared corresponding to human races or the like according to the foregoing average theory may be used.
- an original facial image is subjected to image processing to become closer to a desirable face, thereby forming images of a plurality of selectable modified faces with different similarity levels with respect to the desirable face, and then a selection is made from those selectable modified faces to acquire a modified facial image. This further facilitates acquiring a face ideal to the subject person.
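The mixing at different similarity levels can be sketched roughly as follows. This is a minimal illustration, not the patented method itself: a full morph would also warp geometry toward intermediate control-point positions, whereas only a pixel cross-dissolve is shown here, with small hypothetical NumPy arrays standing in for facial images.

```python
import numpy as np

def blend_faces(original, desirable, ratio):
    """Cross-dissolve two aligned face images at the given mixing ratio.

    A full morph would also warp geometry toward intermediate control
    points; only the pixel blend is sketched here.
    """
    return (1.0 - ratio) * original + ratio * desirable

def selectable_modified_faces(original, desirable,
                              ratios=(0.1, 0.2, 0.3, 0.4, 0.5)):
    """Modified faces at several similarity levels (the 10-50% range
    reported in the inventor's data)."""
    return {r: blend_faces(original, desirable, r) for r in ratios}

# Tiny synthetic example: 2 x 2 grayscale "images".
M = np.zeros((2, 2))        # original face, uniformly dark
T = np.full((2, 2), 100.0)  # desirable face, uniformly bright
faces = selectable_modified_faces(M, T)
# At a 30% mixing ratio every pixel moves 30% of the way toward T.
```

A subject person would then be shown these intermediate faces and pick the one felt to be the "ideal face".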
- the key point is to make the states of highlighted areas of a facial image to be modified closer to those of a desirable face. This can provide a more beautiful and preferable face and thus facilitate approaching a desirable face.
- This facial image modifying method is an application of the foregoing theory on the shapes of faces that people think beautiful or preferable. Unlike the conventional photograph modification techniques which mainly depend on a manual work, this method can automate the modification work by electronic processing. What is more, modification results can give greater satisfaction.
- This method of the present invention can be adapted to operational simulation in plastic surgery and cosmetic surgery as well as ordinary photograph modification.
- a desirable face is determined first.
- a desirable face may be determined by asking the subject person about preferences.
- model faces may be used in determining a desirable face.
- the faces of talents, or actresses or actors the subject person likes can be used as model faces. If the subject person has no preference for any talent or the like, averages of good-looking faces which have previously been prepared corresponding to human races or the like may be used as model faces.
- a preferred desirable face can be selected from model faces using the name or the like of the preferable talent or the like as an index.
- a preferred desirable face should be selected from model faces of standard beauty which have previously been input as data.
- a selection is made while sequentially displaying the model faces on a monitor screen or based on the type of the face of the subject person.
- the model faces should not necessarily be shown to the subject person.
- images of imagined post-makeup faces based on the desirable face or post-makeup faces which are imagined as preferable are prepared. This involves a process of mixing the face of the subject person with the desirable face to make the former face closer to the latter by performing image processing like shape-merging on the image of the face of the subject person. Then, an ideal post-makeup face the subject person desires is determined from the imagined post-makeup faces. Specifically, a plurality of imagined post-makeup faces with different mixing ratios or similarity levels with respect to the desirable face are acquired through the foregoing image processing, and a preferable selection is made from those faces as an ideal post-makeup face within the range of the similarity levels which can be obtained by makeup.
- the ideal post-makeup face which is expected to be finally obtained is given in advance. That is, the subject person can know the makeup-finished face in a short time.
- the significant feature of the makeup method embodying the present invention lies in that a final, makeup-finished face can be presented in a short time by determining a desirable face on which a makeup procedure is based and acquiring an ideal post-makeup face with respect to this desirable face. This feature effectively uses the above-described theory on the shapes of faces people think beautiful or preferable.
- a makeup procedure is derived from this ideal post-makeup face.
- a sequence of makeup procedures for achieving the ideal post-makeup face, such as selecting which portions of the eyebrows to shave and where to draw eyebrows, selecting the lines and ranges for eye liners and eye shadows, selecting the colors of eye shadows, setting the drawing range for a lipstick and separately applying different foundations, is derived from a previously set makeup program.
- makeup is applied to the subject person. This makes it possible to accurately realize the ideal post-makeup face or the makeup state the subject person has previously found appropriate, on the face of the subject person. That is, makeup which the subject person desires can be done freely and in a short time.
- the image processing for preparing the images of imagined post-makeup faces on the basis of the highlight theory involves a process of making the shapes of the eyebrows, the eye lines, the lip line, and so forth closer to those of a desirable face within a range where modification by makeup is possible, in addition to the basic process of getting the states of highlighted areas on a face closer to those of the desirable face within a range where modification by makeup is possible.
- a shape-merging scheme or the like can be also used in this case.
- image processing is performed to divide the ideal post-makeup face into a plurality of areas originating from differences between bright and dark areas, like a pattern of contour lines. Then, color data is acquired for each segmented area. Color data in this case consists of hue and brightness, and chroma if necessary; R, G and B, which are used in a television system, or C, M, Y and K, which are basic printing colors in a computer system, are normally used. Based on the color data of the individual areas and a previously set foundation program, foundations to be applied on the individual areas are determined. Finally, the thus determined foundations are applied on the face of the subject person in accordance with the segmented areas.
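The segmentation into bright and dark areas with per-area color data can be sketched as below. The equal-width brightness bins and the tiny 4 x 4 test image are assumptions for illustration only, not the foundation program of the invention.

```python
import numpy as np

def segment_by_brightness(rgb, n_areas=4):
    """Segment an H x W x 3 image into n_areas bands of increasing
    brightness, like a pattern of contour lines; returns a label map
    with values 0 (darkest) .. n_areas - 1 (brightest)."""
    brightness = rgb.mean(axis=2)
    # Equal-width brightness bins; a real system might pick thresholds
    # differently (e.g. by percentile).
    edges = np.linspace(brightness.min(), brightness.max(), n_areas + 1)
    return np.clip(np.digitize(brightness, edges[1:-1]), 0, n_areas - 1)

def area_color_data(rgb, labels):
    """Mean R, G, B per segmented area: the color data from which a
    foundation for that area would be chosen."""
    return {int(a): rgb[labels == a].mean(axis=0) for a in np.unique(labels)}

# Synthetic 4 x 4 "face" image with four flat gray bands.
gray = np.repeat(np.array([0.0, 85.0, 170.0, 255.0]), 4).reshape(4, 4)
rgb = np.stack([gray, gray, gray], axis=2)
labels = segment_by_brightness(rgb)
colors = area_color_data(rgb, labels)
```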
- One makeup simulation method and a makeup method using this simulation method therefore include the following steps: (a) determining a desirable face and performing image processing on an image of a face of a subject person based on the desirable face, thereby forming images of imagined post-makeup faces; (b) displaying the imagined post-makeup faces, prepared in the step (a), on a monitor screen or the like and selectively determining an ideal post-makeup face desired by the subject person from the displayed imagined post-makeup faces; (c) deriving a makeup procedure from an image of the ideal post-makeup face, determined in the step (b), based on a previously set makeup program; and (d) applying makeup on the face of the subject person in accordance with the makeup procedure derived in the step (c).
- Another makeup simulation method embodying the present invention and another makeup method using this simulation method include the following steps: (e) determining a desirable face and performing image processing on an image of a face of a subject person based on the desirable face, thereby forming images of imagined post-makeup faces; (f) displaying the imagined post-makeup faces, prepared in the step (e), on a monitor screen or the like and selectively determining an ideal post-makeup face desired by the subject person from the displayed imagined post-makeup faces; (g) performing image processing to segment an image of the ideal post-makeup face, determined in the step (f), to a plurality of areas in regard to bright areas and dark areas; (h) acquiring color data for the segmented areas obtained in the step (g); (i) determining foundations to be applied on individual areas based on the color data for those areas, acquired in the step (h), and a previously set foundation program.
- the makeup method based on dark and bright states may be modified in such a way as to be able to print the image of the ideal post-makeup face, segmented into a plurality of areas in regard to bright areas and dark areas in the step (g), in an actual size of the face of the subject person, to cut out individual areas from the printed image to prepare a pattern mask for foundation, and to execute separate application of foundations in the step (k) by using this pattern mask.
- This design can allow different foundations to be separately applied on the associated areas easily and accurately. That is, separate application of different foundations associated with an ideal post-makeup face can be carried out easily and accurately.
- the makeup method which is based on the highlight theory may be modified in such a manner as to be able to project the image of the ideal post-makeup face, segmented into a plurality of areas in regard to bright areas and dark areas in the step (g), in an actual size on the face of the subject person, and to execute separate application of foundations in the step (j) based on this projected image.
- This modification can also permit different foundations to be separately applied on the associated areas easily and accurately.
- Another makeup method which is based on dark and bright states includes the following steps: (A) determining a desirable face and performing image processing on an image of a face of a subject person based on the desirable face, thereby forming images of imagined post-makeup faces; (B) displaying the imagined post-makeup faces, prepared in the step (A), on a monitor screen or the like and selectively determining from the displayed imagined post-makeup faces an ideal post-makeup face the subject person desires; (C) performing image processing to segment an image of the ideal post-makeup face, determined in the step (B), to a plurality of areas in regard to bright areas and dark areas; (D) acquiring color data for the segmented areas obtained in the step (C); (E) determining foundations to be applied on individual areas based on the color data for those areas, acquired in the step (D), and a previously set foundation program; (F) printing foundations, determined in the step (E) in association with the plurality of areas segmented in regard to bright areas and dark areas, on a thin base material to prepare a foundation transfer film; and (G) transferring foundations, printed on the foundation transfer film prepared in the step (F), on the face of the subject person to thereby apply the foundations on the face of the subject person.
- a further makeup method which is based on dark and bright states includes the following steps: (H) determining a desirable face and performing image processing on an image of a face of a subject person based on the desirable face, thereby forming images of imagined post-makeup faces; (I) displaying the imagined post-makeup faces, prepared in the step (H), on a monitor screen or the like and selectively determining from the displayed imagined post-makeup faces an ideal post-makeup face the subject person desires; (K) printing an image of the ideal post-makeup face in the bright and dark states, determined in the step (I), on a thin base material using foundations to prepare a foundation transfer film; and (L) transferring foundations, printed on the foundation transfer film prepared in the step (K), on the face of the subject person to thereby apply the foundations on the face of the subject person.
- the processing means in the makeup support apparatus may further perform a process of segmenting the determined ideal post-makeup face to a plurality of areas in regard to bright areas and dark areas, a process of acquiring color data for the segmented areas, and a process of determining foundations to be applied on individual areas based on the acquired color data for those areas.
- FIG. 1 is a diagram for explaining one example of image processing in a facial image modifying method.
- FIG. 2 is a structural diagram of a makeup support apparatus according to one carrying-out mode of the present invention.
- FIG. 3 is an explanatory diagram concerning a highlight theory.
- FIG. 4 is a diagram for explaining segmentation into highlighted areas.
- FIG. 5 is an explanatory diagram of the types of highlighted areas.
- a carrying-out mode of a facial image modifying method will be described.
- a system which includes a data processing apparatus like a personal computer and a monitor connected to the apparatus is used in implementing the facial image modifying method of the present invention.
- An image of a face to be modified or an original facial image is put into the data processing apparatus.
- a desirable face is selected from previously prepared model faces, e.g., multiple standard faces which have been prepared based on the average theory, and the original facial image is subjected to image processing based on this desirable face.
- the image processing is executed by a program which employs an image processing scheme like morphing or twining. Specifically, the coordinate values of facial constituents, such as eyebrows, eyes, a nose and a mouth, of both an original facial image M and a desirable face T, and the coordinate values of bright and dark patterns of each face are acquired as shown in FIG. 1, for example, and the original facial image M is mixed with and transformed toward the desirable face T based on those coordinate values.
- essential portions such as eyebrows, eyes, a nose, a mouth and the lines of highlighted areas are designated as control points; for example, a tip portion of the nose is expressed as “0”, the eye lines as “1-2” and “3-4”, and the eyebrows as “5-6” and “7-8”.
- those points on the original facial image M are shifted toward the corresponding points on the desirable face T by a morphing program.
- the lines of the highlighted areas or the like are shifted or their inclinations are changed by a twining program.
- Such image processing by shifting points or the like includes a plurality of phases according to the degree of such shifting, each phase yielding a selectable modified face (an imagined post-makeup face in the case of a makeup method which will be described later).
- if point shifting is performed every one tenth of the distance between corresponding points on both facial images, for example, there are ten phases and there are thus ten selectable modified faces.
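The phased point shifting above can be sketched as follows; the (x, y) coordinates are invented for illustration, whereas the real control points would be read off the facial images as in FIG. 1.

```python
import numpy as np

# Hypothetical (x, y) control points: nose tip "0", then eye-line points.
original_pts  = np.array([[50.0, 60.0], [30.0, 40.0], [42.0, 40.0]])
desirable_pts = np.array([[50.0, 58.0], [28.0, 39.0], [40.0, 39.0]])

def phase_points(src, dst, n_phases=10):
    """Shift every control point toward its counterpart in n_phases
    equal steps (here, tenths of the distance between corresponding
    points), producing one landmark set per phase."""
    return [src + (dst - src) * (k / n_phases) for k in range(1, n_phases + 1)]

phases = phase_points(original_pts, desirable_pts)
# phases[1] gives the 20%-shifted face, phases[2] the 30% face, and the
# final phase coincides with the desirable face's control points.
```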
- most subject persons found their “ideal faces” among the selectable modified faces (or imagined post-makeup faces) at the second or third phase, i.e., selectable modified faces which had approached their desirable faces by degrees of 20 to 30%. That is, while there is a desirable face as an image, a face too close to this desirable face departs from the actual “ideal face”.
- the makeup method of the present invention is carried out by using a makeup support apparatus.
- the makeup support apparatus of which structure is illustrated in FIG. 2 includes a data processing apparatus 1 , a storage device 2 , a monitor 3 and an image sensing unit 4 .
- the data processing apparatus 1 which executes various kinds of data processing required in the makeup method of the present invention, is a personal computer or the like.
- the storage device 2 is used to store data of model faces which are used in the makeup method of the present invention.
- An internal memory device or an external memory device is used as the storage device 2 .
- the makeup method proceeds as follows. First, the instructor asks the subject person specific questions to determine the impression of a face desired by the subject person or a desirable face.
- Model faces are used to make a desirable face concrete. Image data of faces of popular talents, actresses or actors or the like, or standard good-looking faces prepared in association with human races or the like, which are stored in the storage device 2, are used as model faces.
- a desirable face is selected from the model faces by using an average face for each race, each era, each personality, each work or the like or the name of a specific talent, or is selected from various model faces sequentially displayed on the screen of the monitor 3 .
- the face of the subject person is picked up by the image sensing unit 4 and its image is sent into the storage device 2 .
- the image of the face of the subject person and the image of the desirable face are mixed to make the former image closer to the latter.
- the original facial image of the face of the subject person acquired in the above process is subjected to the same scheme as employed in the above-described modifying method to make the highlighted areas, the shapes of the eyebrows, the eye lines, the lip line and the like on the face approach those of the desirable face step by step.
- FIG. 3 exemplifies highlighted areas.
- (a) in FIG. 3 shows the states of bright and dark areas on the original facial image of the face of the subject person
- (b) in FIG. 3 shows the states of bright and dark areas made close to those of the desirable face in image processing.
- the densely dotted portions represent relatively dark portions or dark areas, and a portion surrounded by the dark portions represents the brightest area (highlighted area).
- the highlighted area in this example belongs to the oval type; the current face of the subject person is off the ideal oval shape, and the face of the subject person can be made closer to the desirable face, which has an ideal oval highlighted area, mainly by modifying the off portions.
- the images of a plurality of imagined post-makeup faces with different similarity levels with respect to a desirable face are formed as described above, and those images are displayed on the screen of the monitor 3 to permit the subject person to make a selection. If the subject person agrees with any of the displayed images, the selected face is set as an ideal post-makeup face.
- for foundation in particular, how to apply foundations is derived by performing the following processes under the control of a foundation program which is included in the makeup program.
- image processing is performed to divide the ideal post-makeup face into a plurality of bright and dark areas, e.g., two to four areas.
- This process is exemplified in FIG. 4.
- the face is segmented into four areas with different brightnesses (black-filled portion, densely-dotted portion, thinly-dotted portion and white portion).
- color data for the segmented areas are acquired.
- the color data in this case are based on R, G and B used in a television system. Based on the obtained color data of each area, a foundation to be applied on that area is determined.
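Determining a foundation from the measured color data could be done, for example, by nearest-color matching; the palette names and RGB values below are invented for illustration and are not part of the disclosed foundation program.

```python
import numpy as np

# Hypothetical foundation palette: shade name -> representative R, G, B.
FOUNDATIONS = {
    "light":  np.array([235.0, 210.0, 190.0]),
    "medium": np.array([205.0, 170.0, 140.0]),
    "dark":   np.array([160.0, 120.0,  95.0]),
}

def nearest_foundation(area_rgb):
    """Foundation whose representative color lies closest (Euclidean
    distance in RGB space) to an area's measured color data."""
    return min(FOUNDATIONS,
               key=lambda name: np.linalg.norm(FOUNDATIONS[name] - area_rgb))

choice = nearest_foundation(np.array([200.0, 168.0, 142.0]))
# → "medium": that sample sits only a few RGB units from the medium shade
```

A production system would more likely compare colors in a perceptual space (e.g. CIELAB) rather than raw RGB, but the selection logic is the same.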
- a pattern mask is used to apply foundations on the face of a subject person.
- a pattern mask is formed like a pattern sheet by cutting out individual areas from a printed image of the face, segmented into a plurality of areas in regard to bright areas and dark areas in the above-described image processing, in the real size of the face of the subject person. Then, the pattern mask is placed on the face of the subject person and foundations for the individual areas are applied over the pattern mask.
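Digitally, a pattern mask of this kind amounts to one cut-out region per segmented area. As a rough sketch, with a toy 3 x 3 label map standing in for the real-size printed segmentation:

```python
import numpy as np

def pattern_masks(labels):
    """One boolean mask per segmented area; printed at the real size of
    the face and cut out, each mask acts as the pattern sheet through
    which that area's foundation is applied."""
    return {int(a): labels == a for a in np.unique(labels)}

# Toy label map: 0 = surrounding dark area, 1 = highlighted center.
labels = np.array([[0, 0, 0],
                   [0, 1, 0],
                   [0, 0, 0]])
masks = pattern_masks(labels)
# masks[1] exposes only the central (highlighted) pixel.
```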
- a third carrying-out mode of the present invention employs a projection scheme to apply foundations on the face of a subject person. Specifically, the image of the face, segmented into a plurality of areas in regard to bright areas and dark areas in the above-described image processing, is projected on the face of the subject person in real size, and foundations for the associated areas are separately applied based on this projected image.
- a fourth carrying-out mode of the present invention uses a transfer scheme to apply foundations on the face of a subject person. Specifically, first, in association with a plurality of areas segmented in regard to bright areas and dark areas in the above-discussed image processing, foundations for the respective areas are printed in an actual size on a thin base material to prepare a foundation transfer film. Alternatively, images of bright and dark states on the ideal post-makeup face determined in the above-described manner are printed on a thin base material using foundations to prepare a foundation transfer film. Then, such a foundation transfer film is pressed against the face of the subject person to transfer the foundations. Finally, the proper shade-off process or the like is performed on the transferred foundations at the boundaries between the individual areas.
- the foundation transfer film in this case does not necessarily correspond to the entire face, but a partial transfer film for the upper eyelids or a portion around the nose, for example, may be used.
- application of foundations to the face of the subject person is accomplished by directly printing the foundations on the face by means of a three-dimensional printer or the like.
- an image with separately applied foundations which is formed by the above-described image processing, is directly spray-printed on the face by a three-dimensional printer or the like which uses the necessary foundations as inks.
Abstract
New findings, such as the highlight theory concerning what determines the impression of a face, are utilized in the technique of modifying a photographed facial image and in the makeup technique. A method according to the present invention includes determining a desirable face desired by a person who is subjected to facial modification or makeup, and performing image processing like shape-merging on an original facial image, i.e., an image of a face of the subject person, based on the desirable face to thereby prepare images of selectable modified faces.
Description
- The present invention relates to a technique of modifying facial images and a makeup art of putting makeup on a face using the modifying technique.
- The present inventor has been making studies, for a long time, on factors which determine the faces of persons, particularly facial impressions, and beauty and ugliness, from the viewpoints of plastic surgery and cosmetic surgery. The study has provided several new findings. One of them is the “highlight theory”. Conventionally, it was a general thought that the components of a face, such as eyebrows, eyes, a nose and lips, would become significant factors to determine a facial impression. The research done by the present inventor however shows that the prime factor to determine a facial impression is the shading state of a face, i.e., the states of a bright area and a dark area, particularly, the state of the brightest area or highlighted area. The shading state relates to shading on a face which occurs under illumination of the natural light, and reflects or expresses the three-dimensional shape of the face. The states of a highlighted area in such shading particularly affects a facial impression.
- The analysis based on this highlight theory classifies faces into three general types. As exemplarily shown in FIG. 5, the shapes of highlighted areas are an inverted triangle type [(a) in FIG. 5], an oval type [(b) in FIG. 5] and a clover-leaf type [(c) in FIG. 5]. A typical beautiful face in each facial type has a good shape of the associated highlighted area. With regard to an inverted triangle type, for example, no or little distortion or deformation on its shape is a significant factor in giving a good impression on a face.
- Another finding is a theory on the shapes of faces people feel beautiful or preferable. Conventional theories on facial beauty include canon theory or a theory based on golden cut, and the average theory which has recently come up. The average theory means a theory that the face that appears most beautiful among those of a certain human race is an average face of the individual faces of that race. Faces based on those theories merely provide general “beautiful faces”. In other words, those theories cannot give ideal post-makeup faces which are makeup aims applicable to makeup methods. Unlike those theories, the theory found by the present inventor can be put to a practical use in makeup methods.
- According to what has been newly found by the present inventor, it is possible on most occasions, through the following procedures, to find the most beautiful or preferable face for each specific person, or the most beautiful or preferable face that can be achieved by the makeup available to that individual. The fundamental theory is that if faces are produced by combining a face of given standards, specifically a face generally considered beautiful or a face preferred by an individual, with the actual face of that particular individual at proper mixing ratios through an image processing technique, an “ideal face” which is considered most beautiful or preferable by that individual can in most cases be found among the resulting mixed faces. Specifically, one of several model faces, which are, for example, the faces of preferred talents, actresses or actors, or “beautiful faces” originating from the average theory and the like, is selected as a desirable face. Based on this desirable face, the actual face of the person is subjected to image processing, such as morphing, so that the actual face is mixed with the desirable face and approaches the latter. This process provides a plurality of “modified faces” with different mixing ratios or similarity levels with respect to the desirable face. In most cases, those “modified faces” contain an “ideal face” that person considers most beautiful or preferable. The data obtained by the present inventor shows that in 48 cases out of 50, the persons could find “ideal faces” they thought most beautiful or preferable. In the other two cases, “ideal faces” could be found by modifying the desirable faces. The mixing ratio in every case ranged from 10 to 50%.
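As an illustration of the mixing described above, the landmark coordinates of the actual face can be shifted toward those of the desirable face by a chosen ratio. The sketch below is a minimal, assumption-laden illustration: the patent names only "morphing", and every function name and coordinate here is hypothetical.

```python
# Illustrative sketch: mixing an actual face with a desirable face by
# linearly interpolating facial landmark coordinates. The patent only
# names "morphing"; this linear blend and all names here are assumptions.

def mix_landmarks(actual, desirable, ratio):
    """Shift each (x, y) landmark of the actual face toward the
    corresponding landmark of the desirable face by `ratio` (0.0-1.0)."""
    return [
        (ax + ratio * (dx - ax), ay + ratio * (dy - ay))
        for (ax, ay), (dx, dy) in zip(actual, desirable)
    ]

# The reported "ideal faces" fell at mixing ratios of 10-50%, so a set
# of candidate modified faces can be generated over that range.
actual = [(100.0, 120.0), (140.0, 120.0), (120.0, 160.0)]    # hypothetical eye/nose points
desirable = [(98.0, 118.0), (142.0, 118.0), (120.0, 155.0)]
candidates = {r / 10: mix_landmarks(actual, desirable, r / 10) for r in range(1, 6)}
```

Each entry in `candidates` corresponds to one "modified face"; the subject person would pick the one that looks most beautiful or preferable.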
- It is to be noted that a model face itself is not necessarily the most preferable face for a particular person. That is, the “ideal face” a particular person can find for herself or himself is a “modified face” which results from mixing a model face with her or his own face, not the model face itself. The actual results of the experiments conducted by the present inventor show that most subject persons did not prefer model faces using “beautiful faces” created by the average theory or the like.
- The present invention has been devised to effectively utilize the above-described findings and aims at applying those findings to a technique of modifying, for example, photographed facial images and to a makeup technique.
- According to the present invention, facial images picked up by a still camera, a video camera or the like are modified as follows. First, a desirable face is determined, and then an original facial image is subjected to image processing to become closer to this desirable face so that a modified facial image is acquired.
- Model faces can be used in determining a desirable face. The faces of preferable talents, actresses or actors can be used as model faces. If one has no preferable talent or the like, standard good-looking faces acquired from a plurality of beautiful faces which have previously been prepared corresponding to human races or the like according to the foregoing average theory may be used. In the foregoing method, it may also be arranged that an original facial image is subjected to image processing to become closer to a desirable face, thereby forming images of a plurality of selectable modified faces with different similarity levels with respect to the desirable face, from which a selection is then made to acquire a modified facial image. This further facilitates acquiring a face ideal to the subject person.
- It is preferable to use the foregoing highlight theory in modifying an original facial image and making it closer to a desirable face. That is, the key point is to make the states of the highlighted areas of the facial image to be modified closer to those of the desirable face. This can provide a more beautiful and preferable face and thus facilitate approaching the desirable face.
- This facial image modifying method is an application of the foregoing theory on the shapes of faces that people think beautiful or preferable. Unlike the conventional photograph modification techniques, which mainly depend on manual work, this method can automate the modification work by electronic processing. What is more, the modification results can give greater satisfaction. This method of the present invention can be adapted to surgical simulation in plastic surgery and cosmetic surgery as well as to ordinary photograph modification.
- In a makeup simulation method embodying the present invention and a makeup method which uses this simulation method, a desirable face is determined first. In the case where a makeup instructor instructs a subject person how to put on makeup, for example, a desirable face may be determined by asking the subject person about preferences. Alternatively, model faces may be used in determining a desirable face. The faces of talents, actresses or actors the subject person likes can be used as model faces. If the subject person has no preference for any talent or the like, averages of good-looking faces which have previously been prepared corresponding to human races or the like may be used as model faces. In the former case, a preferred desirable face can be selected from model faces using the name or the like of the preferable talent as an index. In the latter case, a preferred desirable face should be selected from model faces of standard beauty which have previously been input as data. In both cases a selection is made while sequentially displaying the model faces on a monitor screen, or based on the type of the face of the subject person. In the latter case, the model faces need not necessarily be shown to the subject person.
- Once the desirable face is decided, images of imagined post-makeup faces based on the desirable face, i.e., post-makeup faces which are imagined as preferable, are prepared. This involves a process of mixing the face of the subject person with the desirable face to make the former closer to the latter by performing image processing, like shape-merging, on the image of the face of the subject person. Then, an ideal post-makeup face the subject person desires is determined from the imagined post-makeup faces. Specifically, a plurality of imagined post-makeup faces with different mixing ratios or similarity levels with respect to the desirable face are acquired through the foregoing image processing, and a preferable selection is made from those faces as an ideal post-makeup face within the range of similarity levels which can be obtained by makeup. Thus, the ideal post-makeup face which is expected to be finally obtained is given in advance. That is, the subject person can know the makeup-finished face in a short time. As is apparent from the above, the significant feature of the makeup method embodying the present invention lies in that a final, makeup-finished face can be presented in a short time by determining a desirable face on which a makeup procedure is based and acquiring an ideal post-makeup face with respect to this desirable face. In this feature, the above-described theory on the shapes of faces people think beautiful or preferable is effectively used.
- When an ideal post-makeup face is decided through the foregoing makeup simulation, a makeup procedure is derived from this ideal post-makeup face. Specifically, a sequence of makeup procedures for achieving the ideal post-makeup face, such as selecting which portions of the eyebrows to shave and where to draw the eyebrows, selecting the lines and ranges for eye liners and eye shadows, selecting the colors of eye shadows, setting the drawing range for a lipstick and separately applying different foundations, is derived from a previously set makeup program. Based on this makeup procedure, makeup is applied to the subject person. This makes it possible to accurately realize, on the face of the subject person, the ideal post-makeup face or the makeup state the subject person has previously found appropriate. That is, the makeup the subject person desires can be done freely and in a short time.
- To realize the foregoing feature of the makeup method of the present invention which lies in the preparation of an ideal post-makeup face through image processing based on a desirable face or a model face, it is important to mix the current face of a subject person with a model face through image processing to get the former face closer to the latter. That is, it is essential to acquire an ideal post-makeup face by making the current face of the subject person closer to the reference model face by image processing. The feature also requires an effective scheme or standards for making the current face of the subject person closer to the ideal post-makeup face by actual makeup. This can be accomplished by the foregoing highlight theory.
- The highlight theory teaches that highlighted areas considerably affect the impression of a face, and that a beautiful and preferable face can be made by erasing any unnecessary shade (dark area) to improve the shapes of the highlighted areas. Based on this theory, it becomes relatively easy to get the face of a subject person closer to a model face through image processing, and it is likewise easy to make the face of the subject person closer to a desirable face by applying makeup that adjusts the states of the highlighted areas in association with those of the model face (desirable face). In other words, a more effective scheme of making the face of the subject person closer to the reference model face can be acquired based on the highlight theory.
- The image processing for preparing the images of imagined post-makeup faces on the basis of the highlight theory involves a process of making the shapes of the eyebrows, the eye lines, the lip line, and so forth closer to those of a desirable face within a range where modification by makeup is possible, in addition to the basic process of getting the states of highlighted areas on a face closer to those of the desirable face within a range where modification by makeup is possible. A shape-merging scheme or the like can be also used in this case.
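The basic operation of working with highlighted areas can be illustrated by extracting the brightest region of a facial image. This is only a sketch under assumed values; the patent does not define a brightness threshold or a pixel representation, so both are hypothetical here.

```python
# Minimal sketch of identifying the highlighted (brightest) area on a
# face image represented as a 2-D grid of brightness values 0-255.
# The threshold value is an assumption; the patent does not give one.

def highlight_mask(brightness, threshold=200):
    """Return a boolean mask marking pixels bright enough to count as
    part of the highlighted area."""
    return [[v >= threshold for v in row] for row in brightness]

face = [
    [120, 180, 210, 205, 130],
    [125, 190, 220, 215, 140],
    [110, 150, 170, 160, 120],
]
mask = highlight_mask(face)
```

The shape of the resulting mask could then be compared with that of the desirable face (e.g. the oval type) to decide which portions to adjust.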
- Based on the highlight theory, attention should be paid to the greatest highlighted area as an impression-forming element of a face in the makeup simulation method and makeup method, too. Since makeup concerning highlighted areas is accomplished by separately applying different foundations, the types of foundations and the procedures for applying them separately are derived as essential factors from an ideal post-makeup face.
- To accomplish this process, image processing is performed to divide the ideal post-makeup face into a plurality of areas originating from differences between bright and dark areas, like a pattern of contour lines. Then, color data is acquired for each segmented area. Color data in this case consists of hue and brightness, and chroma if necessary; in a computer system, the R, G and B values used in a television system, or the C, M, Y and K values which are the basic printing colors, are normally used. Based on the color data of the individual areas and a previously set foundation program, foundations to be applied on the individual areas are determined. Finally, the thus determined foundations are applied on the face of the subject person in accordance with the segmented areas.
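The per-area color data acquisition and foundation determination described above can be sketched as follows. The shade table and the nearest-color rule stand in for the patent's "previously set foundation program" and are assumptions, as are all names in the sketch.

```python
# Illustrative sketch: after the ideal post-makeup face is segmented
# into areas, average R, G, B color data is taken per area and the
# closest foundation shade is chosen. The shade table and the
# nearest-color rule are assumptions standing in for the patent's
# "previously set foundation program".

def mean_rgb(pixels):
    """Average the R, G, B components over the pixels of one area."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def pick_foundation(rgb, shades):
    """Choose the foundation whose R, G, B values are closest (squared
    Euclidean distance) to the area's color data."""
    return min(shades, key=lambda name: sum((a - b) ** 2 for a, b in zip(rgb, shades[name])))

shades = {"light": (235, 210, 190), "medium": (205, 170, 140), "dark": (160, 120, 95)}
area_pixels = [(210, 175, 150), (200, 165, 135), (205, 172, 142)]
chosen = pick_foundation(mean_rgb(area_pixels), shades)  # → "medium"
```

Repeating this per segmented area yields one foundation per area, which matches the patent's separate-application step.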
- One makeup simulation method according to the present invention and a makeup method using this simulation method therefore include the following steps: (a) determining a desirable face and performing image processing on an image of a face of a subject person based on the desirable face, thereby forming images of imagined post-makeup faces; (b) displaying the imagined post-makeup faces, prepared in the step (a), on a monitor screen or the like and selectively determining an ideal post-makeup face desired by the subject person from the displayed imagined post-makeup faces; (c) deriving a makeup procedure from an image of the ideal post-makeup face, determined in the step (b), based on a previously set makeup program; and (d) applying makeup on the face of the subject person in accordance with the makeup procedure derived in the step (c).
- Another makeup simulation method embodying the present invention and another makeup method using this simulation method, particularly, a makeup simulation method for applying makeup based on dark and bright states on a face and a makeup method using this simulation method, include the following steps: (e) determining a desirable face and performing image processing on an image of a face of a subject person based on the desirable face, thereby forming images of imagined post-makeup faces; (f) displaying the imagined post-makeup faces, prepared in the step (e), on a monitor screen or the like and selectively determining an ideal post-makeup face desired by the subject person from the displayed imagined post-makeup faces; (g) performing image processing to segment an image of the ideal post-makeup face, determined in the step (f), to a plurality of areas in regard to bright areas and dark areas; (h) acquiring color data for the segmented areas obtained in the step (g); (i) determining foundations to be applied on individual areas based on the color data for those areas, acquired in the step (h), and a previously set foundation program; and (j) separately applying foundations, determined in the step (i), on the face of the subject person in association with the individual areas.
- The makeup method based on dark and bright states may be modified in such a way that the image of the ideal post-makeup face, segmented into a plurality of areas in regard to bright areas and dark areas in the step (g), is printed in the actual size of the face of the subject person, the individual areas are cut out from the printed image to prepare a pattern mask for foundation, and the separate application of foundations in the step (j) is executed by using this pattern mask. This design allows different foundations to be separately applied on the associated areas easily and accurately. That is, the separate application of different foundations associated with an ideal post-makeup face can be carried out easily and accurately.
- The makeup method which is based on the highlight theory may be modified in such a manner that the image of the ideal post-makeup face, segmented into a plurality of areas in regard to bright areas and dark areas in the step (g), is projected in the actual size on the face of the subject person, and the separate application of foundations in the step (j) is executed based on this projected image. This modification also permits different foundations to be separately applied on the associated areas easily and accurately.
- Another makeup method according to the present invention which is based on dark and bright states includes the following steps: (A) determining a desirable face and performing image processing on an image of a face of a subject person based on the desirable face, thereby forming images of imagined post-makeup faces; (B) displaying the imagined post-makeup faces, prepared in the step (A), on a monitor screen or the like and selectively determining from the displayed imagined post-makeup faces an ideal post-makeup face the subject person desires; (C) performing image processing to segment an image of the ideal post-makeup face, determined in the step (B), to a plurality of areas in regard to bright areas and dark areas; (D) acquiring color data for the segmented areas obtained in the step (C); (E) determining foundations to be applied on individual areas based on the color data for those areas, acquired in the step (D), and a previously set foundation program; (F) printing foundations, determined in the step (E) in association with the plurality of areas segmented in regard to bright areas and dark areas in the image processing in the step (C), on a thin base material to prepare a transfer film; and (G) transferring foundations, printed on the transfer film prepared in the step (F), on the face of the subject person to thereby apply foundations, associated with the individual areas, on the face of the subject person.
- A further makeup method according to the present invention which is based on dark and bright states includes the following steps: (H) determining a desirable face and performing image processing on an image of a face of a subject person based on the desirable face, thereby forming images of imagined post-makeup faces; (I) displaying the imagined post-makeup faces, prepared in the step (H), on a monitor screen or the like and selectively determining from the displayed imagined post-makeup faces an ideal post-makeup face the subject person desires; (K) printing an image of the ideal post-makeup face in the bright and dark states, determined in the step (I), on a thin base material using foundations to prepare a foundation transfer film; and (L) transferring foundations, printed on the foundation transfer film prepared in the step (K), on the face of the subject person to thereby apply the foundations on the face of the subject person.
- Those methods can repeatedly realize makeup for an ideal post-makeup face by carrying out simple procedures, such as transferring foundations from a transfer film accurately corresponding to foundations for the ideal post-makeup face and properly shading off the transferred foundations at the boundaries between the individual areas. Thus, desired makeup can be carried out more easily and in a shorter time.
- A makeup support apparatus according to the present invention for use in the foregoing makeup simulation methods and makeup methods comprises image storage means for storing images of a plurality of model faces; imaging and input means for imaging and inputting a face of a subject person; a monitor for displaying a necessary image; and processing means capable of performing a process of causing an image of the face of the subject person to approach a desirable face already determined to thereby form images of imagined post-makeup faces, a process of displaying the imagined post-makeup faces on a screen of the monitor, and a process of deriving a makeup procedure from a determined ideal post-makeup face.
- It is preferable that the processing means in the makeup support apparatus may further perform a process of segmenting the determined ideal post-makeup face to a plurality of areas in regard to bright areas and dark areas, a process of acquiring color data for the segmented areas, and a process of determining foundations to be applied on individual areas based on the acquired color data for those areas.
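The apparatus described in the two preceding paragraphs can be given a minimal structural sketch. All class, method and field names below are hypothetical; the patent claims means (storage, imaging, monitor, processing), not an implementation.

```python
# A minimal structural sketch of the makeup support apparatus: image
# storage means, imaging/input means, a monitor, and processing means.
# Every name here is an assumption, not the patent's specification.

class MakeupSupportApparatus:
    def __init__(self, model_faces):
        self.model_faces = model_faces   # image storage means (model face data)
        self.subject_face = None

    def capture(self, image):
        """Imaging and input means: take in the face of the subject person."""
        self.subject_face = image

    def form_imagined_faces(self, desirable, ratios=(0.1, 0.2, 0.3, 0.4, 0.5)):
        """Processing means: approach the desirable face at several mixing
        ratios, yielding selectable imagined post-makeup faces."""
        return [{"base": self.subject_face, "target": desirable, "ratio": r}
                for r in ratios]

    def display(self, faces):
        """Monitor: in this sketch, simply hand the faces back for selection."""
        return faces
```

The optional processing of the segmented areas and color data (the preceding paragraph) would be further methods on the same processing means.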
- FIG. 1 is a diagram for explaining one example of image processing in a facial image modifying method.
- FIG. 2 is a structural diagram of a makeup support apparatus according to one carrying-out mode of the present invention.
- FIG. 3 is an explanatory diagram concerning a highlight theory.
- FIG. 4 is a diagram for explaining segmentation into highlighted areas.
- FIG. 5 is an explanatory diagram of the types of highlighted areas.
- Hereinbelow, carrying-out modes of the present invention will be described. To begin with, a carrying-out mode of a facial image modifying method according to the present invention will be described. Normally, a system which includes a data processing apparatus like a personal computer and a monitor connected to the apparatus is used in implementing the facial image modifying method of the present invention. An image of a face to be modified, or an original facial image, is put into the data processing apparatus. Then, a desirable face is selected from previously prepared model faces, e.g., multiple standard faces which have been prepared based on the average theory, and the original facial image is subjected to image processing based on this desirable face.
- The image processing is executed by a program which employs an image processing scheme like morphing or tweening. Specifically, the coordinate values of facial constituents, such as the eyebrows, eyes, nose and mouth, of both an original facial image M and a desirable face T, and the coordinate values of the bright and dark patterns of each face, are acquired as shown in FIG. 1, for example, and the original facial image M is mixed with and transformed toward the desirable face T based on those coordinate values.
- More specifically, first, essential portions, such as the eyebrows, eyes, nose, mouth and the lines of highlighted areas, are plotted for each of the original facial image M and the desirable face T. For example, the tip portion of the nose is expressed as “0”, the eye lines as “1-2” and “3-4”, and the eyebrows as “5-6” and “7-8”. Then, those points on the original facial image M are shifted toward the corresponding points on the desirable face T by a morphing program. The lines of the highlighted areas or the like are shifted, or their inclinations are changed, by a tweening program. Such image processing by shifting points or the like includes a plurality of phases according to the degree of such shifting. That is, selectable modified faces (imagined post-makeup faces in the case of a makeup method which will be described later) corresponding to those phases can be obtained. If point shifting is performed at every one tenth of the distance between corresponding points on both facial images, for example, there are ten phases and thus ten imagined post-makeup faces. According to the experiments conducted by the present inventor, subject persons most often found their “ideal faces” (selectable modified faces or imagined post-makeup faces) among the selectable modified faces at the second or third phase, i.e., selectable modified faces which had approached their desirable faces by 20 to 30%. That is, while the desirable face exists as an image, a face too close to this desirable face departs from the actual “ideal face”.
- As the above process forms the images of a plurality of selectable modified faces with different similarity levels with respect to the desirable face, they are displayed on the screen of the monitor to permit the subject person to make the best selection, which is then treated as the modified facial image.
- Now, carrying-out modes of a makeup simulation method and a makeup method according to the present invention will be described. The makeup method of the present invention is carried out by using a makeup support apparatus. The makeup support apparatus, whose structure is illustrated in FIG. 2, includes a data processing apparatus 1, a storage device 2, a monitor 3 and an image sensing unit 4. The data processing apparatus 1, which executes various kinds of data processing required in the makeup method of the present invention, is a personal computer or the like. The storage device 2 is used to store data of model faces which are used in the makeup method of the present invention. An internal memory device or an external memory device is used as the storage device 2.
- In an exemplified case where a makeup instructor instructs a subject person in a makeup procedure using such a makeup support system, the makeup method proceeds as follows. First, the instructor asks the subject person specific questions to determine the impression of a face desired by the subject person, or a desirable face. Model faces are used to make the desirable face concrete. As model faces, image data of the faces of popular talents, actresses or actors or the like, or standard good-looking faces prepared in association with human races or the like, stored in the storage device 2, are used. A desirable face is selected from the model faces by using an average face for each race, each era, each personality, each work or the like, or the name of a specific talent, or is selected from various model faces sequentially displayed on the screen of the monitor 3.
- Once the desirable face is determined, the face of the subject person is picked up by the image sensing unit 4 and its image is sent to the storage device 2. By performing image processing on the image of the face of the subject person using the data processing apparatus 1, the image of the face of the subject person and the image of the desirable face are mixed to make the former image closer to the latter. In the image processing, the original facial image of the face of the subject person acquired in the above process is subjected to the same scheme as employed in the above-described modifying method to make the highlighted areas, the shapes of the eyebrows, the eye lines, the lip line and the like on the face approach those of the desirable face step by step.
- FIG. 3 exemplifies highlighted areas. (a) in FIG. 3 shows the states of bright and dark areas on the original facial image of the face of the subject person, and (b) in FIG. 3 shows the states of bright and dark areas made closer to those of the desirable face by image processing. In the figure, the densely dotted portions represent relatively dark portions or dark areas, and the portion surrounded by the dark portions represents the brightest area (highlighted area). As is apparent from this example, in which the highlighted area belongs to the oval type, the current face of the subject person is off the ideal oval shape, and the face of the subject person can be made closer to the desirable face, which has an ideal oval highlighted area, mainly by modifying the off portions.
- As this process proceeds, the images of a plurality of imagined post-makeup faces with different similarity levels with respect to the desirable face are formed as described above, and those images are displayed on the screen of the monitor 3 to permit the subject person to make a selection. If the subject person agrees with any of the displayed images, the selected face is set as the ideal post-makeup face.
- Once the ideal post-makeup face is determined, a sequence of makeup procedures concerning which portions of the eyebrows to shave and where to draw the eyebrows, the lines and ranges for eye liners and eye shadows, the drawing range for a lipstick, and the separate application of different foundations is derived from the ideal post-makeup face. This derivation is carried out by a previously set makeup program.
- With regard to foundation, in particular, how to apply foundations is derived by performing the following processes under the control of a foundation program which is included in the makeup program. First, image processing is performed to divide the ideal post-makeup face into a plurality of bright and dark areas, e.g., two to four areas. This process is exemplified in FIG. 4. In the exemplified diagram, the face is segmented into four areas with different brightnesses (a black-filled portion, a densely-dotted portion, a thinly-dotted portion and a white portion). Next, color data for the segmented areas are acquired. The color data in this case are based on the R, G and B values used in a television system. Based on the obtained color data of each area, a foundation to be applied on that area is determined.
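The division into bright and dark areas, like a pattern of contour lines, can be sketched as a simple brightness quantization. The band boundaries below are assumptions; the patent specifies only that two to four areas are used (FIG. 4 shows four).

```python
# Sketch of segmenting the ideal post-makeup face into a small number
# of areas by brightness, like a pattern of contour lines (FIG. 4 uses
# four areas). The equal-width band edges are an assumption.

def segment_brightness(brightness, bands=4):
    """Map each 0-255 brightness value to a band index 0..bands-1,
    with 0 darkest (the black-filled portion in FIG. 4)."""
    width = 256 // bands
    return [[min(v // width, bands - 1) for v in row] for row in brightness]

face = [
    [30, 90, 150, 220],
    [40, 100, 160, 230],
]
areas = segment_brightness(face)  # → [[0, 1, 2, 3], [0, 1, 2, 3]]
```

The pixels of each band would then feed the per-area color data acquisition and foundation selection described earlier.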
- As the makeup procedure needed for making up the face of the subject person into the ideal post-makeup face, and the cosmetics to be used in the makeup procedure, are determined in the above manner, actual makeup is finally applied to the face of the subject person based on what has been determined. In utilizing the thus derived makeup procedure and the like in the actual makeup, data about the makeup procedure, etc. may be displayed on the monitor screen or printed out.
- According to a second carrying-out mode of the present invention, a pattern mask is used to apply foundations on the face of a subject person. A pattern mask is formed like a pattern sheet by cutting out individual areas from a printed image of the face, segmented into a plurality of areas in regard to bright areas and dark areas in the above-described image processing, in the real size of the face of the subject person. Then, the pattern mask is placed on the face of the subject person and foundations for the individual areas are applied over the pattern mask.
- A third carrying-out mode of the present invention employs a projection scheme to apply foundations on the face of a subject person. Specifically, the image of the face, segmented into a plurality of areas in regard to bright areas and dark areas in the above-described image processing, is projected on the face of the subject person in real size, and foundations for the associated areas are separately applied based on this projected image.
- A fourth carrying-out mode of the present invention uses a transfer scheme to apply foundations on the face of a subject person. Specifically, first, in association with a plurality of areas segmented in regard to bright areas and dark areas in the above-discussed image processing, foundations for the respective areas are printed in an actual size on a thin base material to prepare a foundation transfer film. Alternatively, images of bright and dark states on the ideal post-makeup face determined in the above-described manner are printed on a thin base material using foundations to prepare a foundation transfer film. Then, such a foundation transfer film is pressed against the face of the subject person to transfer the foundations. Finally, the proper shade-off process or the like is performed on the transferred foundations at the boundaries between the individual areas. The foundation transfer film in this case does not necessarily correspond to the entire face, but a partial transfer film for the upper eyelids or a portion around the nose, for example, may be used.
- According to a fifth carrying-out mode of the present invention, application of foundations to the face of the subject person is accomplished by directly printing the foundations on the face by means of a three-dimensional printer or the like. Specifically, an image with separately applied foundations, which is formed by the above-described image processing, is directly spray-printed on the face by a three-dimensional printer or the like which uses the necessary foundations as inks.
Claims (14)
1. A facial image modifying method comprising the steps of determining a desirable face, and causing an original facial image to approach an image of said desirable face through image processing to acquire a modified facial image.
2. A makeup simulation method comprising the steps of:
(a) determining a desirable face and performing image processing on an image of a face of a subject person based on said desirable face, thereby forming images of imagined post-makeup faces;
(b) displaying said imagined post-makeup faces, prepared in said step (a), on a monitor screen or the like and selectively determining from said displayed imagined post-makeup faces an ideal post-makeup face said subject person desires; and
(c) deriving a makeup procedure from an image of said ideal post-makeup face, determined in said step (b), based on a previously set makeup program.
3. A makeup simulation method for executing makeup simulation based on shading states of a face, comprising the steps of:
(e) determining a desirable face and performing image processing on an image of a face of a subject person based on said desirable face, thereby forming images of imagined post-makeup faces;
(f) displaying said imagined post-makeup faces, prepared in said step (e), on a monitor screen or the like and selectively determining from said displayed imagined post-makeup faces an ideal post-makeup face said subject person desires;
(g) performing image processing to segment an image of said ideal post-makeup face, determined in said step (f), to a plurality of areas in regard to bright areas and dark areas;
(h) acquiring color data for said segmented areas obtained in said step (g); and
(i) determining foundations to be applied on individual areas based on said color data for those areas, acquired in said step (h), and a previously set foundation program.
4. The makeup simulation method according to claim 2 or 3, further comprising a step of preparing a plurality of model faces and determining said desirable face based on said model faces.
5. A makeup method comprising the steps of:
(a) determining a desirable face and performing image processing on an image of a face of a subject person based on said desirable face, thereby forming images of imagined post-makeup faces;
(b) displaying said imagined post-makeup faces, prepared in said step (a), on a monitor screen or the like and selectively determining from said displayed imagined post-makeup faces an ideal post-makeup face said subject person desires;
(c) deriving a makeup procedure from an image of said ideal post-makeup face, determined in said step (b), based on a previously set makeup program; and
(d) applying makeup on said face of said subject person in accordance with said makeup procedure derived in said step (c).
6. A makeup method for applying makeup based on shading states of a face, comprising the steps of:
(e) determining a desirable face and performing image processing on an image of a face of a subject person based on said desirable face, thereby forming images of imagined post-makeup faces;
(f) displaying said imagined post-makeup faces, prepared in said step (e), on a monitor screen or the like and selectively determining from said displayed imagined post-makeup faces an ideal post-makeup face said subject person desires;
(g) performing image processing to segment an image of said ideal post-makeup face, determined in said step (f), into a plurality of areas in regard to bright areas and dark areas;
(h) acquiring color data for said segmented areas obtained in said step (g);
(i) determining foundations to be applied on individual areas based on said color data for those areas, acquired in said step (h), and a previously set foundation program; and
(j) separately applying foundations, determined in said step (i), on said face of said subject person in association with said individual areas.
7. The makeup method according to claim 6, further comprising:
(k) printing said image of said ideal post-makeup face, segmented into a plurality of areas in regard to bright areas and dark areas in said step (g) in claim 3, at the actual size of said face of said subject person;
(l) cutting individual areas from a printed image obtained in said step (k) to prepare a pattern mask for foundation; and
(m) executing separate application of foundations in said step (j) in claim 6 using said pattern mask prepared in said step (l).
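The pattern mask of steps (k) through (m) can be pictured as a boolean stencil cut from the printed, area-segmented image: each area's stencil is open where the image belongs to that area and blocked elsewhere. The sketch below is an illustrative model only; the 3x3 "image", its labels, and the function names are made up for this example.

```python
# Hedged sketch of a pattern mask for steps (k)-(m): each area of the
# segmented face image becomes a boolean stencil, and foundation is
# applied only where the stencil is open.

def make_mask(labels, area):
    """Return a boolean stencil that is open where `labels` equals `area`."""
    return [[cell == area for cell in row] for row in labels]

def apply_through_mask(mask, shade):
    """Apply `shade` through the stencil; blocked cells stay None."""
    return [[shade if is_open else None for is_open in row] for row in mask]

# Hypothetical 3x3 area-labelled image of a face region.
labels = [["bright", "bright", "dark"],
          ["bright", "dark", "dark"],
          ["dark", "dark", "dark"]]

bright_mask = make_mask(labels, "bright")
applied = apply_through_mask(bright_mask, 200)
```

Repeating this per area, each with its own foundation, models the separate application of step (j).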
8. The makeup method according to claim 6, further comprising:
(n) projecting said image of said ideal post-makeup face, segmented into a plurality of areas in regard to bright areas and dark areas in the image processing in said step (g) in claim 3, at actual size on said face of said subject person; and
(o) executing separate application of foundations in said step (j) in claim 6 based on said image projected in said step (n).
9. A makeup method for applying makeup based on bright and dark states of a face, comprising the steps of:
(A) determining a desirable face and performing image processing on an image of a face of a subject person based on said desirable face, thereby forming images of imagined post-makeup faces;
(B) displaying said imagined post-makeup faces, prepared in said step (A), on a monitor screen or the like and selectively determining from said displayed imagined post-makeup faces an ideal post-makeup face said subject person desires;
(C) performing image processing to segment an image of said ideal post-makeup face, determined in said step (B), into a plurality of areas in regard to bright areas and dark areas;
(D) acquiring color data for said segmented areas obtained in said step (C);
(E) determining foundations to be applied on individual areas based on said color data for those areas, acquired in said step (D), and a previously set foundation program;
(F) printing foundations, determined in said step (E) in association with said plurality of areas segmented in regard to bright areas and dark areas in said image processing in said step (C), on a thin base material to prepare a foundation transfer film; and
(G) transferring foundations, printed on said foundation transfer film prepared in said step (F), on said face of said subject person to thereby apply foundations, associated with said individual areas, on said face of said subject person.
10. A makeup method comprising the steps of:
(H) determining a desirable face and performing image processing on an image of a face of a subject person based on said desirable face, thereby forming images of imagined post-makeup faces;
(I) displaying said imagined post-makeup faces, prepared in said step (H), on a monitor screen or the like and selectively determining an ideal post-makeup face desired by said subject person from said displayed imagined post-makeup faces;
(K) printing an image of said ideal post-makeup face, determined in said step (I), on a thin base material using foundations to prepare a foundation transfer film; and
(L) transferring foundations, printed on said foundation transfer film prepared in said step (K), on said face of said subject person to thereby apply said foundations on said face of said subject person.
11. The makeup method according to any one of claims 5 to 10, further comprising a step of preparing a plurality of model faces and determining said desirable face based on said model faces.
12. A makeup support apparatus for use in the makeup simulation method as recited in any one of claims 2 to 4 or in the makeup method as recited in any one of claims 5 to 11, comprising:
image storage means for storing images of a plurality of model faces;
image pick-up and input means for imaging and inputting a face of a subject person;
a monitor for displaying a necessary image; and
processing means capable of performing a process of causing an image of said face of said subject person to approach a desirable face already determined to thereby form images of imagined post-makeup faces, a process of displaying said imagined post-makeup faces on a screen of said monitor, and a process of deriving a makeup procedure from a determined ideal post-makeup face.
13. The makeup support apparatus according to claim 12, wherein said processing means further performs a process of segmenting said determined ideal post-makeup face into a plurality of areas in regard to bright areas and dark areas, a process of acquiring color data for said segmented areas, and a process of determining foundations to be applied on individual areas based on said acquired color data for those areas.
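The foundation-determination process of claim 13 matches each area's color data against a previously set foundation program. The sketch below models that program as a simple nearest-shade lookup table; the table values, names, and the use of mean brightness as color data are hypothetical stand-ins, not from the disclosure.

```python
# Hedged sketch of foundation determination in claim 13: per-area color
# data (a mean brightness here) is matched to the foundation whose
# reference shade is nearest. The shade table is a made-up stand-in for
# the "previously set foundation program".

FOUNDATION_SHADES = {"light": 200, "medium": 140, "dark": 80}  # hypothetical

def choose_foundation(area_brightness, shades=FOUNDATION_SHADES):
    """Pick the foundation whose reference shade is nearest the area's brightness."""
    return min(shades, key=lambda name: abs(shades[name] - area_brightness))

# One foundation per segmented area (area names are illustrative):
plan = {area: choose_foundation(b)
        for area, b in {"highlight": 190, "shadow": 95}.items()}
```

The resulting per-area plan is what the apparatus would hand to the separate-application step.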
14. A foundation transfer film for use in separate application of foundations, said foundation transfer film having foundations printed on a thin base material in regard to bright areas and dark areas on a face.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/281,232 US20030062058A1 (en) | 1997-03-06 | 2002-10-28 | Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film |
US11/316,834 US20060132506A1 (en) | 1997-03-06 | 2005-12-27 | Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP9-51460 | 1997-03-06 | ||
JP05146097A JP3912834B2 (en) | 1997-03-06 | 1997-03-06 | Face image correction method, makeup simulation method, makeup method, makeup support apparatus, and foundation transfer film |
US09/380,406 US6502583B1 (en) | 1997-03-06 | 1998-03-06 | Method of correcting face image, makeup simulation method, makeup method makeup supporting device and foundation transfer film |
US10/281,232 US20030062058A1 (en) | 1997-03-06 | 2002-10-28 | Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/380,406 Division US6502583B1 (en) | 1997-03-06 | 1998-03-06 | Method of correcting face image, makeup simulation method, makeup method makeup supporting device and foundation transfer film |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/316,834 Division US20060132506A1 (en) | 1997-03-06 | 2005-12-27 | Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030062058A1 true US20030062058A1 (en) | 2003-04-03 |
Family
ID=12887558
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/380,406 Expired - Fee Related US6502583B1 (en) | 1997-03-06 | 1998-03-06 | Method of correcting face image, makeup simulation method, makeup method makeup supporting device and foundation transfer film |
US10/281,232 Abandoned US20030062058A1 (en) | 1997-03-06 | 2002-10-28 | Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film |
US11/316,834 Abandoned US20060132506A1 (en) | 1997-03-06 | 2005-12-27 | Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/380,406 Expired - Fee Related US6502583B1 (en) | 1997-03-06 | 1998-03-06 | Method of correcting face image, makeup simulation method, makeup method makeup supporting device and foundation transfer film |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/316,834 Abandoned US20060132506A1 (en) | 1997-03-06 | 2005-12-27 | Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film |
Country Status (6)
Country | Link |
---|---|
US (3) | US6502583B1 (en) |
EP (1) | EP1030267B1 (en) |
JP (1) | JP3912834B2 (en) |
DE (1) | DE69841479D1 (en) |
TW (1) | TW416041B (en) |
WO (1) | WO1998039735A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070049832A1 (en) * | 2005-08-12 | 2007-03-01 | Edgar Albert D | System and method for medical monitoring and treatment through cosmetic monitoring and treatment |
US20070047761A1 (en) * | 2005-06-10 | 2007-03-01 | Wasilunas Elizabeth A | Methods Of Analyzing Human Facial Symmetry And Balance To Provide Beauty Advice |
US20080192999A1 (en) * | 2007-02-12 | 2008-08-14 | Edgar Albert D | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image |
US20080194971A1 (en) * | 2007-02-12 | 2008-08-14 | Edgar Albert D | System and method for applying a reflectance modifying agent electrostatically to improve the visual attractiveness of human skin |
US20090025747A1 (en) * | 2007-05-29 | 2009-01-29 | Edgar Albert D | Apparatus and method for the precision application of cosmetics |
US20100142755A1 (en) * | 2008-11-26 | 2010-06-10 | Perfect Shape Cosmetics, Inc. | Method, System, and Computer Program Product for Providing Cosmetic Application Instructions Using Arc Lines |
US20110124989A1 (en) * | 2006-08-14 | 2011-05-26 | Tcms Transparent Beauty Llc | Handheld Apparatus And Method For The Automated Application Of Cosmetics And Other Substances |
WO2011085727A1 (en) | 2009-01-15 | 2011-07-21 | Tim Schyberg | Advice information system |
EP2571395A4 (en) * | 2010-05-18 | 2014-03-19 | Elc Man Llc | Method and system for automatic or manual evaluation to provide targeted and individualized delivery of cosmetic actives in a mask or patch form |
EP3039990A4 (en) * | 2013-08-30 | 2016-09-14 | Panasonic Ip Man Co Ltd | Makeup assistance device, makeup assistance system, makeup assistance method, and makeup assistance program |
EP3530142A4 (en) * | 2016-10-24 | 2019-10-30 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device, image processing method, and image processing program |
US11058208B2 (en) | 2018-04-13 | 2021-07-13 | Chanel Parfums Beaute | Method for selecting a cosmetic product for an intended user |
Families Citing this family (96)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3912834B2 (en) * | 1997-03-06 | 2007-05-09 | 有限会社開発顧問室 | Face image correction method, makeup simulation method, makeup method, makeup support apparatus, and foundation transfer film |
JP2000007530A (en) * | 1998-06-22 | 2000-01-11 | Kaihatsu Komonshitsu:Kk | Pigmentation make-up and cosmetic auxiliary tool, floodlight and pack using same |
US6293284B1 (en) | 1999-07-07 | 2001-09-25 | Division Of Conopco, Inc. | Virtual makeover |
US6437866B1 (en) | 1999-07-07 | 2002-08-20 | Fd Management, Inc. | System for assisting customers in selecting an optimum color cosmetic product |
TW511040B (en) * | 1999-07-07 | 2002-11-21 | Fd Man Inc | Color cosmetic selection system |
FR2799022B1 (en) * | 1999-09-29 | 2002-02-01 | Oreal | MAKEUP ASSISTANCE DEVICE AND ASSEMBLY CONSISTING OF SUCH A DEVICE AND A DEVICE FOR DELIVERING A PRODUCT HAVING A PREDETERMINED BRDF, SELECTED BY THE MAKEUP ASSISTANCE DEVICE |
JP4396873B2 (en) | 1999-10-01 | 2010-01-13 | 株式会社資生堂 | How to choose lipstick or eye shadow |
JP2001268594A (en) * | 2000-03-15 | 2001-09-28 | Infiniteface.Com Inc | Client server system for three-dimensional beauty simulation |
US7809153B2 (en) * | 2000-04-27 | 2010-10-05 | Inter-Images Partners, Lp | System and method for assessment of health risks and visualization of weight loss and muscle gain |
AU7664301A (en) | 2000-06-27 | 2002-01-21 | Ruth Gal | Make-up and fashion accessory display and marketing system and method |
JP2002109555A (en) * | 2000-07-24 | 2002-04-12 | Mitsubishi Electric Corp | Virtual cosmetic surgery system and virtual cosmetic surgery method |
KR100405328B1 (en) * | 2000-07-28 | 2003-11-12 | 이성환 | The real time hair,makeup and fashion accessary design system and method in the internet |
KR20020031488A (en) * | 2000-10-20 | 2002-05-02 | 홍광기 | Method for service providing a gift of cosmetics and system for the performing the same |
JP2002140492A (en) * | 2000-10-31 | 2002-05-17 | Daizo:Kk | Merchandise development system and merchandise development method |
US6959119B2 (en) * | 2000-11-03 | 2005-10-25 | Unilever Home & Personal Care Usa | Method of evaluating cosmetic products on a consumer with future predictive transformation |
FR2818529A1 (en) | 2000-12-21 | 2002-06-28 | Oreal | METHOD FOR DETERMINING A DEGREE OF A BODY TYPE CHARACTERISTIC |
US6761697B2 (en) | 2001-10-01 | 2004-07-13 | L'oreal Sa | Methods and systems for predicting and/or tracking changes in external body conditions |
US7634103B2 (en) | 2001-10-01 | 2009-12-15 | L'oreal S.A. | Analysis using a three-dimensional facial image |
US20030063102A1 (en) * | 2001-10-01 | 2003-04-03 | Gilles Rubinstenn | Body image enhancement |
US7324668B2 (en) | 2001-10-01 | 2008-01-29 | L'oreal S.A. | Feature extraction in beauty analysis |
US20030065588A1 (en) * | 2001-10-01 | 2003-04-03 | Gilles Rubinstenn | Identification and presentation of analogous beauty case histories |
US7437344B2 (en) | 2001-10-01 | 2008-10-14 | L'oreal S.A. | Use of artificial intelligence in providing beauty advice |
FR2831014B1 (en) * | 2001-10-16 | 2004-02-13 | Oreal | METHOD AND DEVICE FOR DETERMINING THE DESIRED AND / OR EFFECTIVE DEGREE OF AT LEAST ONE CHARACTERISTIC OF A PRODUCT |
KR20030059685A (en) * | 2002-01-04 | 2003-07-10 | 주식회사 태평양 | Unit for controlling a make-up simulation status |
KR20030091419A (en) * | 2002-05-28 | 2003-12-03 | 주식회사 태평양 | Makeup Simulation System Based On Facial Affection Type |
US7082211B2 (en) | 2002-05-31 | 2006-07-25 | Eastman Kodak Company | Method and system for enhancing portrait images |
JP2004005265A (en) * | 2002-05-31 | 2004-01-08 | Omron Corp | Image composing method, device and system |
US20030234871A1 (en) * | 2002-06-25 | 2003-12-25 | Squilla John R. | Apparatus and method of modifying a portrait image |
US7039222B2 (en) | 2003-02-28 | 2006-05-02 | Eastman Kodak Company | Method and system for enhancing portrait images that are processed in a batch mode |
US6945431B2 (en) * | 2003-07-24 | 2005-09-20 | Fluid Management, Inc. | Sanitizable piston pumps and dispensing systems incorporating the same |
JP2005092588A (en) * | 2003-09-18 | 2005-04-07 | Hitachi Software Eng Co Ltd | Composite image print device and image editing method |
US7347344B2 (en) | 2003-10-27 | 2008-03-25 | Fluid Management Operation Llc | Apparatus for dispensing a plurality of fluids and container for use in the same |
US6935386B2 (en) | 2003-10-30 | 2005-08-30 | Fluid Management, Inc. | Automated cosmetics dispenser for point of sale cosmetics products |
US20070019882A1 (en) * | 2004-01-30 | 2007-01-25 | Shoji Tanaka | Makeup simulation program, makeup simulation device, and makeup simulation method |
JP4551683B2 (en) | 2004-03-30 | 2010-09-29 | 任天堂株式会社 | Image processing apparatus and image processing program having imaging function |
JP2006004158A (en) * | 2004-06-17 | 2006-01-05 | Olympus Corp | Image processing program, image processing method, image processor, and recording medium |
US20060038812A1 (en) * | 2004-08-03 | 2006-02-23 | Warn David R | System and method for controlling a three dimensional morphable model |
ES2373366T3 (en) * | 2004-10-22 | 2012-02-02 | Shiseido Co., Ltd. | LIP MAKEUP PROCEDURE. |
US20060098076A1 (en) * | 2004-11-05 | 2006-05-11 | Liang Jason J | Desktop Personal Digital Cosmetics Make Up Printer |
US7796827B2 (en) * | 2004-11-30 | 2010-09-14 | Hewlett-Packard Development Company, L.P. | Face enhancement in a digital video |
US8107672B2 (en) | 2006-01-17 | 2012-01-31 | Shiseido Company, Ltd. | Makeup simulation system, makeup simulator, makeup simulation method, and makeup simulation program |
US8265351B2 (en) * | 2006-05-05 | 2012-09-11 | Parham Aarabi | Method, system and computer program product for automatic and semi-automatic modification of digital images of faces |
CN100527170C (en) * | 2006-09-20 | 2009-08-12 | 清华大学 | Complex expression emulation system and implementation method |
WO2008102440A1 (en) * | 2007-02-21 | 2008-08-28 | Tadashi Goino | Makeup face image creating device and method |
JP2007204487A (en) * | 2007-03-30 | 2007-08-16 | Kaihatsu Komonshitsu:Kk | Make-up method, make-up apparatus used in the same method and make-up transfer membrane |
US7933454B2 (en) * | 2007-06-25 | 2011-04-26 | Xerox Corporation | Class-based image enhancement system |
US8425477B2 (en) * | 2008-09-16 | 2013-04-23 | Elc Management Llc | Method and system for providing targeted and individualized delivery of cosmetic actives |
US8597667B2 (en) | 2008-05-09 | 2013-12-03 | Elc Management Llc | Targeted and individualized cosmetic delivery |
US8285059B2 (en) * | 2008-05-20 | 2012-10-09 | Xerox Corporation | Method for automatic enhancement of images containing snow |
US8194992B2 (en) * | 2008-07-18 | 2012-06-05 | Xerox Corporation | System and method for automatic enhancement of seascape images |
US8491926B2 (en) | 2008-09-16 | 2013-07-23 | Elc Management Llc | Method and system for automatic or manual evaluation to provide targeted and individualized delivery of cosmetic actives in a mask or patch form |
US20100224205A1 (en) * | 2008-12-09 | 2010-09-09 | Shekhar Mitra | Device and methods for modifying keratinous surfaces |
CA2749085A1 (en) * | 2009-01-16 | 2010-07-22 | The Procter & Gamble Company | Apparatus and methods for modifying keratinous surfaces |
KR101045964B1 (en) | 2009-09-22 | 2011-07-04 | 조소영 | Semi-permanent eye line special makeup method |
TW201212852A (en) * | 2010-09-21 | 2012-04-01 | Zong Jing Investment Inc | Facial cosmetic machine |
KR20120087256A (en) * | 2010-12-17 | 2012-08-07 | 한국전자통신연구원 | Method For Operating Makeup Robot Based On Expert Knowledge And System Thereof |
JP2012181688A (en) * | 2011-03-01 | 2012-09-20 | Sony Corp | Information processing device, information processing method, information processing system, and program |
JP2015507241A (en) | 2011-12-04 | 2015-03-05 | Digital Makeup Ltd | Digital makeup |
US20130159895A1 (en) * | 2011-12-15 | 2013-06-20 | Parham Aarabi | Method and system for interactive cosmetic enhancements interface |
FR2984726B1 (en) * | 2011-12-23 | 2014-08-29 | Oreal | MAKE-UP PROCESS |
US11246820B2 (en) | 2011-12-23 | 2022-02-15 | L'oreal | Makeup process |
AU2013203345A1 (en) * | 2012-01-11 | 2013-07-25 | Steven Liew | A method and apparatus for facial aging assessment and treatment management |
TWI463955B (en) * | 2012-02-20 | 2014-12-11 | Zong Jing Investment Inc | Eye makeup device |
KR101398188B1 (en) * | 2012-03-13 | 2014-05-30 | 주식회사 네오위즈블레스스튜디오 | Method for providing on-line game supporting character make up and system there of |
US9129147B1 (en) | 2012-05-22 | 2015-09-08 | Image Metrics Limited | Optimal automatic capture of facial movements and expressions in video sequences |
US9104908B1 (en) | 2012-05-22 | 2015-08-11 | Image Metrics Limited | Building systems for adaptive tracking of facial features across individuals and groups |
US9111134B1 (en) | 2012-05-22 | 2015-08-18 | Image Metrics Limited | Building systems for tracking facial features across individuals and groups |
US9449412B1 (en) | 2012-05-22 | 2016-09-20 | Image Metrics Limited | Adaptive, calibrated simulation of cosmetic products on consumer devices |
US9460462B1 (en) | 2012-05-22 | 2016-10-04 | Image Metrics Limited | Monetization using video-based simulation of cosmetic products |
JP6288404B2 (en) * | 2013-02-28 | 2018-03-07 | パナソニックIpマネジメント株式会社 | Makeup support device, makeup support method, and makeup support program |
CN104270996B (en) * | 2013-03-01 | 2018-04-03 | 松下知识产权经营株式会社 | Device and printing process to the film printing functional material with biocompatibility |
CN104599297B (en) * | 2013-10-31 | 2018-07-10 | 厦门美图网科技有限公司 | A kind of image processing method for going up blush automatically to face |
CN103632165B (en) * | 2013-11-28 | 2017-07-04 | 小米科技有限责任公司 | A kind of method of image procossing, device and terminal device |
WO2015127394A1 (en) | 2014-02-23 | 2015-08-27 | Northeastern University | System for beauty, cosmetic, and fashion analysis |
GB201419438D0 (en) * | 2014-10-31 | 2014-12-17 | Microsoft Corp | Modifying video call data |
JP6607343B2 (en) * | 2015-03-30 | 2019-11-20 | パナソニックIpマネジメント株式会社 | Printing apparatus and method for manufacturing thin-film printed body |
WO2016175675A1 (en) * | 2015-04-29 | 2016-11-03 | Sarokvasha Fedor Valeryevich | Method of modifying a portrait image and device for doing the same |
US10324739B2 (en) | 2016-03-03 | 2019-06-18 | Perfect Corp. | Systems and methods for simulated application of cosmetic effects |
US11055762B2 (en) | 2016-03-21 | 2021-07-06 | The Procter & Gamble Company | Systems and methods for providing customized product recommendations |
WO2017177259A1 (en) * | 2016-04-12 | 2017-10-19 | Phi Technologies Pty Ltd | System and method for processing photographic images |
JP6731616B2 (en) * | 2016-06-10 | 2020-07-29 | パナソニックIpマネジメント株式会社 | Virtual makeup device, virtual makeup method, and virtual makeup program |
TWI573093B (en) * | 2016-06-14 | 2017-03-01 | Asustek Comp Inc | Method of establishing virtual makeup data, electronic device having method of establishing virtual makeup data and non-transitory computer readable storage medium thereof |
WO2018163356A1 (en) * | 2017-03-09 | 2018-09-13 | 株式会社 資生堂 | Information processing device and program |
US10621771B2 (en) | 2017-03-21 | 2020-04-14 | The Procter & Gamble Company | Methods for age appearance simulation |
US10614623B2 (en) | 2017-03-21 | 2020-04-07 | Canfield Scientific, Incorporated | Methods and apparatuses for age appearance simulation |
WO2018222808A1 (en) | 2017-05-31 | 2018-12-06 | The Procter & Gamble Company | Systems and methods for determining apparent skin age |
EP3635626A1 (en) | 2017-05-31 | 2020-04-15 | The Procter and Gamble Company | System and method for guiding a user to take a selfie |
DE102017127066A1 (en) * | 2017-11-17 | 2019-05-23 | Thomas Ebert | A method of creating a visual make-up instruction for a person to make up their face, a method of visualizing a make-up instruction for a person to make up their face, computer program product, server equipment, and a person instructing a person to make-up their face |
JP2019115653A (en) * | 2017-12-26 | 2019-07-18 | パナソニックIpマネジメント株式会社 | Body appearance correction support method and device, and computer program |
CN108257084B (en) * | 2018-02-12 | 2021-08-24 | 北京中视广信科技有限公司 | Lightweight face automatic makeup method based on mobile terminal |
CN109315913B (en) * | 2018-10-16 | 2022-03-18 | 陕西科技大学 | Wearable eyebrow drawing auxiliary device |
EP3664035B1 (en) | 2018-12-03 | 2021-03-03 | Chanel Parfums Beauté | Method for simulating the realistic rendering of a makeup product |
US10949649B2 (en) | 2019-02-22 | 2021-03-16 | Image Metrics, Ltd. | Real-time tracking of facial features in unconstrained video |
CN111652792B (en) * | 2019-07-05 | 2024-03-05 | 广州虎牙科技有限公司 | Local processing method, live broadcasting method, device, equipment and storage medium for image |
US11253045B2 (en) | 2019-07-18 | 2022-02-22 | Perfect Mobile Corp. | Systems and methods for recommendation of makeup effects based on makeup trends and facial analysis |
CN110853119B (en) * | 2019-09-15 | 2022-05-20 | 北京航空航天大学 | Reference picture-based makeup transfer method with robustness |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5939934Y2 (en) * | 1982-05-14 | 1984-11-10 | オクス工業株式会社 | spot face paint |
US5690130A (en) * | 1986-06-17 | 1997-11-25 | Color Prelude Inc. | Cosmetic sampler with integral applicator |
US5647941A (en) * | 1986-06-17 | 1997-07-15 | Color Prelude, Inc. | Method of making a lipstick sampler |
US4848378A (en) * | 1986-06-17 | 1989-07-18 | Alford Industries Inc. | Cosmetic sampler |
JPH0823871B2 (en) * | 1986-09-24 | 1996-03-06 | 鐘紡株式会社 | Make-up simulation system |
US4865057A (en) * | 1987-05-28 | 1989-09-12 | Braun Edward H | Apparatus and method for making hairpieces undetectable |
JPS644189A (en) * | 1987-06-26 | 1989-01-09 | Kanebo Ltd | Makeup simulation system |
JP2967228B2 (en) * | 1988-09-07 | 1999-10-25 | 株式会社トプコン | Image data transfer device |
JP3040466B2 (en) | 1990-07-17 | 2000-05-15 | ブリテイッシュ・テレコミュニケーションズ・パブリック・リミテッド・カンパニー | Image processing method |
JPH06319613A (en) * | 1993-04-30 | 1994-11-22 | Onishi Netsugaku:Kk | Make-up support device for face |
DE69301239T2 (en) * | 1993-08-03 | 1996-08-14 | Parfums Christian Dior, Paris | Process for determining the color of a complexion foundation for the sensitive restoration of the skin color of a person and device for its use |
JPH0863582A (en) * | 1994-04-18 | 1996-03-08 | Kunio Nogi | Picture processing method and simulating method for facial makeup using the same |
JP2603445B2 (en) * | 1994-11-10 | 1997-04-23 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Hair image adaptation method and computer system |
JP3912834B2 (en) * | 1997-03-06 | 2007-05-09 | 有限会社開発顧問室 | Face image correction method, makeup simulation method, makeup method, makeup support apparatus, and foundation transfer film |
US6691872B1 (en) * | 1997-04-08 | 2004-02-17 | Aki, Inc. | Method of making a cosmetic sampler using bulk thin film application techniques |
US6190730B1 (en) * | 1998-05-22 | 2001-02-20 | Color Prelude, Inc. | Cosmetic sampler with sample screen printed on film |
US6950541B1 (en) * | 1999-05-11 | 2005-09-27 | Authentec, Inc. | Fingerprint sensor package including flexible circuit substrate and associated methods |
EP1136937B1 (en) * | 2000-03-22 | 2006-05-10 | Kabushiki Kaisha Toshiba | Facial image forming recognition apparatus and a pass control apparatus |
1997
- 1997-03-06 JP JP05146097A patent/JP3912834B2/en not_active Expired - Fee Related

1998
- 1998-03-05 TW TW087103212A patent/TW416041B/en not_active IP Right Cessation
- 1998-03-06 WO PCT/JP1998/000943 patent/WO1998039735A1/en active Application Filing
- 1998-03-06 DE DE69841479T patent/DE69841479D1/en not_active Expired - Lifetime
- 1998-03-06 EP EP98905802A patent/EP1030267B1/en not_active Expired - Lifetime
- 1998-03-06 US US09/380,406 patent/US6502583B1/en not_active Expired - Fee Related

2002
- 2002-10-28 US US10/281,232 patent/US20030062058A1/en not_active Abandoned

2005
- 2005-12-27 US US11/316,834 patent/US20060132506A1/en not_active Abandoned
Patent Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4276570A (en) * | 1979-05-08 | 1981-06-30 | Nancy Burson | Method and apparatus for producing an image of a person's face at a different age |
US5293313A (en) * | 1990-11-21 | 1994-03-08 | Picker International, Inc. | Real time physician view box |
US5353391A (en) * | 1991-05-06 | 1994-10-04 | Apple Computer, Inc. | Method apparatus for transitioning between sequences of images |
US5359712A (en) * | 1991-05-06 | 1994-10-25 | Apple Computer, Inc. | Method and apparatus for transitioning between sequences of digital information |
US5537662A (en) * | 1992-05-29 | 1996-07-16 | Casio Computer Co., Ltd. | Electronic montage composing apparatus |
US5375195A (en) * | 1992-06-29 | 1994-12-20 | Johnston; Victor S. | Method and apparatus for generating composites of human faces |
US5966137A (en) * | 1992-12-25 | 1999-10-12 | Casio Computer Co., Ltd. | Device for creating a new object image relating to plural object images |
US5638502A (en) * | 1992-12-25 | 1997-06-10 | Casio Computer Co., Ltd. | Device for creating a new object image relating to plural object images |
US5818457A (en) * | 1993-05-25 | 1998-10-06 | Casio Computer Co., Ltd. | Face image data processing devices |
US5623587A (en) * | 1993-10-15 | 1997-04-22 | Kideo Productions, Inc. | Method and apparatus for producing an electronic image |
US5568598A (en) * | 1994-09-09 | 1996-10-22 | Intel Corporation | Displaying images using progressive fade-in |
US5640522A (en) * | 1994-12-05 | 1997-06-17 | Microsoft Corporation | Method and system for previewing transition effects between pairs of images |
US5602583A (en) * | 1995-02-10 | 1997-02-11 | Zenith Electronics Corporation | NTSC rejection filter with switched tomlinson precoder for reducing NTSC co-channel interference in ATV receivers |
US5687259A (en) * | 1995-03-17 | 1997-11-11 | Virtual Eyes, Incorporated | Aesthetic imaging system |
US5825941A (en) * | 1995-03-17 | 1998-10-20 | Mirror Software Corporation | Aesthetic imaging system |
US5854850A (en) * | 1995-03-17 | 1998-12-29 | Mirror Software Corporation | Method and apparatus for selectively illustrating image modifications in an aesthetic imaging system |
US5774129A (en) * | 1995-06-07 | 1998-06-30 | Massachusetts Institute Of Technology | Image analysis and synthesis networks using shape and texture information |
US5850463A (en) * | 1995-06-16 | 1998-12-15 | Seiko Epson Corporation | Facial image processing method and facial image processing apparatus |
US5785960A (en) * | 1997-03-19 | 1998-07-28 | Elizabeth Arden Co., Division Of Conopco, Inc. | Method and system for customizing dermatological foundation products |
US6293284B1 (en) * | 1999-07-07 | 2001-09-25 | Division Of Conopco, Inc. | Virtual makeover |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070047761A1 (en) * | 2005-06-10 | 2007-03-01 | Wasilunas Elizabeth A | Methods Of Analyzing Human Facial Symmetry And Balance To Provide Beauty Advice |
US8915562B2 (en) | 2005-08-12 | 2014-12-23 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin |
US10016046B2 (en) | 2005-08-12 | 2018-07-10 | Tcms Transparent Beauty, Llc | System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin |
US11147357B2 (en) | 2005-08-12 | 2021-10-19 | Tcms Transparent Beauty, Llc | System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin |
US20070049832A1 (en) * | 2005-08-12 | 2007-03-01 | Edgar Albert D | System and method for medical monitoring and treatment through cosmetic monitoring and treatment |
US8007062B2 (en) | 2005-08-12 | 2011-08-30 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin |
US9247802B2 (en) | 2005-08-12 | 2016-02-02 | Tcms Transparent Beauty Llc | System and method for medical monitoring and treatment through cosmetic monitoring and treatment |
US11445802B2 (en) | 2005-08-12 | 2022-09-20 | Tcms Transparent Beauty, Llc | System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin |
US10043292B2 (en) | 2006-08-14 | 2018-08-07 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image |
US9449382B2 (en) | 2006-08-14 | 2016-09-20 | Tcms Transparent Beauty, Llc | System and method for applying a reflectance modifying agent to change a persons appearance based on a digital image |
US20110124989A1 (en) * | 2006-08-14 | 2011-05-26 | Tcms Transparent Beauty Llc | Handheld Apparatus And Method For The Automated Application Of Cosmetics And Other Substances |
US8942775B2 (en) | 2006-08-14 | 2015-01-27 | Tcms Transparent Beauty Llc | Handheld apparatus and method for the automated application of cosmetics and other substances |
US8582830B2 (en) | 2007-02-12 | 2013-11-12 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent to change a persons appearance based on a digital image |
US10467779B2 (en) | 2007-02-12 | 2019-11-05 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image |
US8184901B2 (en) | 2007-02-12 | 2012-05-22 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image |
US20080194971A1 (en) * | 2007-02-12 | 2008-08-14 | Edgar Albert D | System and method for applying a reflectance modifying agent electrostatically to improve the visual attractiveness of human skin |
US20080192999A1 (en) * | 2007-02-12 | 2008-08-14 | Edgar Albert D | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image |
US10163230B2 (en) | 2007-02-12 | 2018-12-25 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image |
US10486174B2 (en) | 2007-02-12 | 2019-11-26 | Tcms Transparent Beauty Llc | System and method for applying a reflectance modifying agent electrostatically to improve the visual attractiveness of human skin |
US20090025747A1 (en) * | 2007-05-29 | 2009-01-29 | Edgar Albert D | Apparatus and method for the precision application of cosmetics |
US10092082B2 (en) | 2007-05-29 | 2018-10-09 | Tcms Transparent Beauty Llc | Apparatus and method for the precision application of cosmetics |
US20100142755A1 (en) * | 2008-11-26 | 2010-06-10 | Perfect Shape Cosmetics, Inc. | Method, System, and Computer Program Product for Providing Cosmetic Application Instructions Using Arc Lines |
WO2011085727A1 (en) | 2009-01-15 | 2011-07-21 | Tim Schyberg | Advice information system |
EP2571395A4 (en) * | 2010-05-18 | 2014-03-19 | Elc Man Llc | Method and system for automatic or manual evaluation to provide targeted and individualized delivery of cosmetic actives in a mask or patch form |
EP3039990A4 (en) * | 2013-08-30 | 2016-09-14 | Panasonic Ip Man Co Ltd | Makeup assistance device, makeup assistance system, makeup assistance method, and makeup assistance program |
EP3530142A4 (en) * | 2016-10-24 | 2019-10-30 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device, image processing method, and image processing program |
US10789748B2 (en) | 2016-10-24 | 2020-09-29 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device, image processing method, and non-transitory computer-readable recording medium storing image processing program |
US11058208B2 (en) | 2018-04-13 | 2021-07-13 | Chanel Parfums Beaute | Method for selecting a cosmetic product for an intended user |
Also Published As
Publication number | Publication date |
---|---|
JPH10255066A (en) | 1998-09-25 |
WO1998039735A1 (en) | 1998-09-11 |
EP1030267B1 (en) | 2010-01-27 |
EP1030267A1 (en) | 2000-08-23 |
US6502583B1 (en) | 2003-01-07 |
EP1030267A4 (en) | 2000-08-23 |
US20060132506A1 (en) | 2006-06-22 |
JP3912834B2 (en) | 2007-05-09 |
TW416041B (en) | 2000-12-21 |
DE69841479D1 (en) | 2010-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6502583B1 (en) | 2003-01-07 | Method of correcting face image, makeup simulation method, makeup method, makeup supporting device and foundation transfer film |
US10467779B2 (en) | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image | |
US4872056A (en) | Method for displaying selected hairstyles in video form | |
CN107680071B (en) | Method and system for fusion processing of human face and human body | |
JP4753025B2 (en) | Makeup simulation method | |
CA2692667C (en) | Method and apparatus for realistic simulation of wrinkle aging and de-aging | |
WO2008100878A1 (en) | System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image | |
JP5261586B2 (en) | Makeup simulation system, makeup simulation apparatus, makeup simulation method, and makeup simulation program | |
US20100189357A1 (en) | Method and device for the virtual simulation of a sequence of video images | |
US20070252831A1 (en) | Methods and Systems for Compositing Images | |
JP2009064423A (en) | Makeup simulation system, makeup simulation device, makeup simulation method, and makeup simulation program | |
CN102682420A (en) | Method and device for converting real character image to cartoon-style image | |
WO2021197186A1 (en) | Auxiliary makeup method, terminal device, storage medium and program product | |
JP2003044837A (en) | Device for simulating makeup, method for controlling makeup simulation and computer-readable recording medium having makeup simulation program recorded thereon | |
CN104282002A (en) | Quick digital image beautifying method | |
KR102037166B1 (en) | Personal Color Analysis System with complex perception of the auditory and visual and fashion persona color matching method using the same | |
JP2005216131A (en) | Makeup simulation apparatus, method and program | |
JP2007144194A (en) | Method for face image modification, method for makeup simulation, method for makeup, support equipment for makeup and cosmetic foundation transcription film | |
CN112541955B (en) | Image processing method, device and equipment | |
JP5029852B2 (en) | Makeup simulation method | |
KR100422470B1 (en) | Method and apparatus for replacing a model face of moving image | |
JP4487961B2 (en) | Makeup simulation method | |
Rojas | Photographing women: posing, lighting, and shooting techniques for portrait and fashion photography | |
Nishimura et al. | iMake: eye makeup design generator | |
CN110248104A (en) | A kind of image processing method, device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: SCALAR CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: UTSUGI, RYUICHI; REEL/FRAME: 013442/0743. Effective date: 20021015. Owner name: DRDC LIMITED, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: UTSUGI, RYUICHI; REEL/FRAME: 013442/0743. Effective date: 20021015. |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |