WO2018008138A1 - Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program - Google Patents
- Publication number
- WO2018008138A1 (PCT/JP2016/070212)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color
- cosmetic
- unit
- image
- user
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0623—Item investigation
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
- A45D44/005—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present invention relates to a cosmetic information providing system, a cosmetic information providing apparatus, a cosmetic information providing method, and a program.
- Patent Document 1 describes a makeup simulation system that performs makeup processing on a moving image obtained by photographing a user's face.
- Patent Document 2 describes a game device that performs a makeup simulation using a user's face image.
- The technique described in Patent Document 1 can only simulate predetermined makeup, such as makeup using existing cosmetics.
- the user's desired color is not always among the predetermined makeup colors. For this reason, in the technique described in Patent Document 1, it may not be possible to simulate makeup of a user's desired color or obtain information on cosmetics corresponding to the desired color.
- The technique described in Patent Document 2 is merely for enjoying a makeup simulation. Even if a favorite makeup is found as a result of the simulation, information on cosmetics for realizing that makeup cannot be obtained.
- An object of the present invention is to provide information on cosmetics corresponding to a user's desired color.
- To this end, the present invention provides a cosmetic information providing system including a first acquisition unit that acquires a first image representing a face to which makeup of a first color has been applied, a determining unit that analyzes the acquired first image to determine a cosmetic corresponding to the first color, and an output unit that outputs information identifying the determined cosmetic.
- information on cosmetics corresponding to the user's desired color can be provided.
- FIG. 4 is a diagram illustrating an example of a color table 132.
- FIG. 6 is a diagram illustrating an example of a functional configuration of the server 10.
- DESCRIPTION OF SYMBOLS: 1 ... Cosmetic information providing system, 10 ... Server, 20 ... User terminal, 101 ... First acquisition unit, 102 ... Extraction unit, 103 ... First calculation unit, 104 ... Determination unit, 105 ... Transmission unit, 106 ... Third acquisition unit, 107 ... Second calculation unit, 108 ... Correction unit, 109 ... Granting unit, 201, 206 ... Second acquisition unit, 202, 209 ... Generation unit, 203, 207 ... Transmission unit, 204, 208 ... Receiving unit, 205, 210 ... Output unit
- FIG. 1 is a diagram illustrating an example of a configuration of a cosmetic information providing system 1 according to the first embodiment.
- the cosmetic information providing system 1 includes a server 10 and a user terminal 20.
- the server 10 and the user terminal 20 are connected via the network 2.
- the network 2 is the Internet, for example.
- the network 2 is not limited to the Internet, and may be another communication line.
- the server 10 provides cosmetic information corresponding to the user's desired color to the user terminal 20.
- the user terminal 20 is used by a user. For example, the user uses the user terminal 20 to perform a makeup color simulation. In addition, the user browses information on cosmetics provided from the server 10 using the user terminal 20.
- FIG. 2 is a diagram illustrating an example of a hardware configuration of the server 10.
- the server 10 is a computer.
- the server 10 includes a control unit 11, a communication unit 12, and a storage unit 13.
- the control unit 11 controls each unit of the server 10.
- a processor such as a CPU (Central Processing Unit) and a memory such as a ROM (Read Only Memory) or a RAM (Random Access Memory) are used as the control unit 11.
- the number of processors and memories may be singular or plural.
- the communication unit 12 is a communication interface connected to the network 2.
- the communication unit 12 performs data communication with the user terminal 20 via the network 2 under the control of the control unit 11.
- the storage unit 13 stores various data and programs.
- a hard disk drive is used as the storage unit 13.
- the program stored in the storage unit 13 includes a server program.
- the server program describes procedures on the server 10 side in a purchase support process, a correction process, and a feedback process, which will be described later.
- the storage unit 13 stores a product table 131, a color table 132, and a user table 133.
- FIG. 3 is a diagram illustrating an example of the product table 131.
- the product table 131 stores information on cosmetics.
- the product table 131 stores cosmetic names, image data, and product information.
- the name of the cosmetic is information for identifying the cosmetic.
- the image data is data indicating a cosmetic image. This image is, for example, a photograph of the appearance of a cosmetic package.
- the product information is detailed information on cosmetics.
- the product information includes a cosmetic price and a brand name.
- the product information may include the release date and popularity of the cosmetic product.
- FIG. 4 is a diagram illustrating an example of the color table 132.
- the color table 132 stores information on the color of cosmetics. Specifically, the color table 132 stores the name of the cosmetic and the color information. The name of the cosmetic is the same as the name of the cosmetic stored in the product table 131.
- the color information is information indicating the color of makeup using the cosmetic. As the color information, for example, Munsell values are used. As the color information, values represented by other display systems may be used.
- the color information may be a makeup color using a single cosmetic or a makeup color using a combination of a plurality of cosmetics. For example, the color information “C2” indicates a makeup color when the eye shadow B and the eye shadow C are applied in an overlapping manner.
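- As a minimal sketch, the color table 132 can be modeled as a mapping from a single cosmetic or a combination of cosmetics to its color information; the Python representation below, including the tuple keys and the lookup helper, is an illustrative assumption rather than the patent's data format.

```python
# Sketch of the color table 132. Each key is a single cosmetic or a
# combination of cosmetics; each value is the associated color information.
# The tuple keys and string color IDs are illustrative assumptions.
color_table = {
    ("Eye shadow A",): "C1",
    ("Eye shadow B", "Eye shadow C"): "C2",  # color when applied in layers
    ("Eye shadow D",): "C3",
}

def cosmetics_for_color(color_id):
    """Return every cosmetic (or combination) whose makeup color matches."""
    return [names for names, color in color_table.items() if color == color_id]
```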
- FIG. 5 is a diagram illustrating an example of the user table 133.
- the user table 133 stores information about users. Specifically, the user table 133 stores a user ID, a skin color type, a technical level, a makeup tool type, and a favorite brand name.
- the user ID is information for identifying the user.
- the skin color type is the user's skin color type.
- the technical level is the technical level of the user's makeup.
- the type of makeup tool is the type of makeup tool owned by the user.
- the favorite brand name is the name of the cosmetic brand that the user likes. Such information is registered in advance by the user, for example.
- FIG. 6 is a diagram illustrating an example of a functional configuration of the server 10.
- the server 10 has the functions of a first acquisition unit 101, an extraction unit 102, a first calculation unit 103, a determination unit 104, a transmission unit 105, a third acquisition unit 106, a second calculation unit 107, and a correction unit 108.
- The first acquisition unit 101, the transmission unit 105, and the third acquisition unit 106 are realized by the communication unit 12 under the control of the control unit 11.
- the extraction unit 102, the first calculation unit 103, the determination unit 104, the second calculation unit 107, and the correction unit 108 are realized by the control unit 11.
- these functions are realized, for example, by the processor loading the server program into the memory and executing it.
- the first acquisition unit 101 acquires a first image representing a face on which makeup of the first color has been applied from the user terminal 20.
- the first image is generated using, for example, a user's face image.
- the extraction unit 102 extracts a first color from the first image acquired by the first acquisition unit 101. This color includes not only the color type but also the color density and texture.
- the first calculation unit 103 calculates the similarity between the first color extracted by the extraction unit 102 and the second color of makeup using a predetermined cosmetic.
- the predetermined cosmetic is a cosmetic whose information is stored in the product table 131, for example.
- the cosmetic may be a single cosmetic or a combination of a plurality of cosmetics.
- the determination unit 104 determines a cosmetic whose similarity calculated by the first calculation unit 103 is higher than a threshold value. This threshold is set in advance, for example.
- the transmitting unit 105 transmits information identifying the cosmetic determined by the determining unit 104 to the user terminal 20.
- information for identifying the cosmetic for example, the name of the cosmetic is used.
- the information for identifying the cosmetic is not limited to the name of the cosmetic, and may be any information as long as the information can identify the cosmetic.
- an image obtained by encoding information for identifying cosmetics such as a barcode or a two-dimensional code may be used.
- the third acquisition unit 106 acquires, from the user terminal 20, a fifth image representing the face of the user who has applied makeup of a third color using the cosmetic, after the information identifying the cosmetic determined by the determination unit 104 has been output from the user terminal 20.
- the second calculation unit 107 calculates a correction value according to the difference between the first color and the third color.
- the correction unit 108 corrects the fourth color using the correction value calculated by the second calculation unit 107 when the first acquisition unit 101 acquires a sixth image representing a face with makeup of a fourth color.
- FIG. 7 is a diagram illustrating an example of a hardware configuration of the user terminal 20.
- the user terminal 20 is a computer such as a tablet terminal or a personal computer.
- the user terminal 20 includes a control unit 21, a communication unit 22, a storage unit 23, an input unit 24, a display unit 25, an imaging unit 26, and an image reading unit 27.
- the control unit 21 controls each unit of the user terminal 20.
- a processor such as a CPU and a memory such as a ROM or a RAM are used as the control unit 21.
- the number of processors and memories may be singular or plural.
- the communication unit 22 is a communication interface connected to the network 2.
- the communication unit 22 performs data communication with the server 10 via the network 2 under the control of the control unit 21.
- the storage unit 23 stores various data and programs. As the storage unit 23, for example, a hard disk drive is used.
- the program stored in the storage unit 23 includes a client program.
- the client program describes procedures on the user terminal 20 side in a purchase support process, a correction process, and a feedback process, which will be described later.
- the data stored in the storage unit 23 includes makeup patterns for each part of the face to which makeup is applied, such as the eyes, cheeks, and mouth. Each makeup pattern has a shape corresponding to its part.
- the input unit 24 inputs information according to the user's operation.
- a mouse and a keyboard are used.
- a touch panel may be used as the input unit 24.
- the display unit 25 displays various information.
- a liquid crystal display is used as the display unit 25.
- the imaging unit 26 captures an image.
- a camera is used as the imaging unit 26.
- the image reading unit 27 reads an image.
- a scanner is used as the image reading unit 27, for example.
- FIG. 8 is a diagram illustrating an example of a functional configuration of the user terminal 20.
- the user terminal 20 has functions of a second acquisition unit 201, a generation unit 202, a transmission unit 203, a reception unit 204, and an output unit 205.
- the transmission unit 203 and the reception unit 204 are realized by the communication unit 22 under the control of the control unit 21.
- the second acquisition unit 201, the generation unit 202, and the output unit 205 are realized by the control unit 21.
- These functions are realized, for example, by the processor loading the client program into the memory and executing it.
- the second acquisition unit 201 acquires a second image representing the user's face from the imaging unit 26.
- the generation unit 202 uses the second image acquired by the second acquisition unit 201 to generate a third image representing a face on which makeup of the color selected by the user has been applied.
- the transmission unit 203 transmits image data indicating the third image generated by the generation unit 202 to the server 10.
- the receiving unit 204 receives information for identifying cosmetics from the server 10.
- the output unit 205 outputs the information received by the receiving unit 204.
- the output unit 205 causes the display unit 25 to display information.
- the output of information is not limited to the display of information.
- the information may be printed and output by a printer.
- information may be output from the speaker as sound.
- FIG. 9 is a sequence diagram illustrating an example of the purchase support process according to the first embodiment.
- FIG. 10 is a diagram illustrating an example of screen transition of the user terminal 20 in the purchase support process.
- In the purchase support process, a color simulation of makeup is performed using the face image of the user, and a cosmetic corresponding to the selected color is recommended.
- In step S11, a color simulation process is performed. Specifically, the user first captures his or her face image 311 using the imaging unit 26.
- the face image 311 is an example of a “second image” according to the present invention.
- the second acquisition unit 201 acquires the face image 311 captured by the imaging unit 26.
- a screen 31 shown in FIG. 10 is displayed on the display unit 25.
- the screen 31 includes a face image 311.
- the screen 31 accepts an operation for selecting a part to which makeup is to be applied (hereinafter referred to as the "first target part"). For example, when the user performs an operation of selecting an eye using the input unit 24, the eye is selected as the first target part.
- the “eye” here refers to the eye hole, not the eyeball portion between the eyelids.
- a screen 32 shown in FIG. 10 is displayed on the display unit 25.
- the screen 32 includes a plurality of color samples 321.
- the color of the sample 321 may be a color of an existing cosmetic product or a color unrelated to the existing cosmetic product.
- the screen 32 accepts an operation for selecting a makeup color from the colors of these samples 321. For example, when the user performs an operation of selecting brown using the input unit 24, brown is selected as the makeup color.
- a screen 33 shown in FIG. 10 is displayed on the display unit 25.
- the screen 33 receives an operation for adjusting the characteristics of the selected color.
- This feature includes color density and texture.
- the color characteristics are not limited to the color density and texture, but may be other characteristics of the color.
- the screen 33 includes sliders 331 to 333.
- the sliders 331 to 333 are used for operations for adjusting the color density of the makeup, the amount of pearl, and the amount of glitter, respectively.
- the amount of pearl and the amount of glitter affect the texture of the color.
- the user adjusts the color feature by operating the sliders 331 to 333 using the input unit 24. Further, the screen 33 includes a sample 334 of the adjusted color.
- the sample 334 represents the adjusted color and is updated every time a color feature is adjusted by a user operation. Further, the screen 33 includes a determination button 335. When the adjustment of the color features is completed, the user presses the determination button 335 using the input unit 24.
- the adjusted color is referred to as a “first target color”.
- the first target color is an example of the “first color” according to the present invention.
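- The slider-based adjustment above can be sketched in code as follows; representing the first target color as a base RGB value plus density, pearl, and glitter parameters on a 0 to 100 scale is an assumption for illustration, not the patent's internal format.

```python
# Sketch of the color-feature adjustment driven by sliders 331 to 333.
# The 0-100 slider range and the simple clamping are assumptions.
from dataclasses import dataclass

def _clamp(v, lo=0, hi=100):
    return max(lo, min(hi, v))

@dataclass
class TargetColor:
    rgb: tuple          # base hue chosen from the samples 321
    density: int = 50   # slider 331: color density
    pearl: int = 0      # slider 332: amount of pearl (texture)
    glitter: int = 0    # slider 333: amount of glitter (texture)

    def adjust(self, density=None, pearl=None, glitter=None):
        """Apply a slider operation, keeping every feature in range."""
        if density is not None:
            self.density = _clamp(density)
        if pearl is not None:
            self.pearl = _clamp(pearl)
        if glitter is not None:
            self.glitter = _clamp(glitter)
        return self
```

In this sketch the sample 334 would simply re-render from the current `TargetColor` after each `adjust` call.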
- the generation unit 202 uses the face image 311 described above to generate a user face image 341 in which the first target color is applied to the first target region.
- the face image 341 is an example of the “first image” and the “third image” according to the present invention.
- the generation unit 202 identifies the first target part from the face image 311 by image recognition.
- the generation unit 202 overlays the makeup pattern of the first target color on the identified first target region.
- This makeup pattern is a makeup pattern for eyes and has a shape corresponding to the eyehole. This makeup pattern is read from the storage unit 23 and used. In this way, the user's face image 341 is generated.
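- The overlay performed by the generation unit 202 can be sketched as a simple alpha blend of the first target color over the identified region; the nested-list image representation and per-pixel opacity mask below are assumptions for illustration, not the patent's implementation.

```python
# Minimal sketch of overlaying a makeup pattern on the face image 311.
# An image is a nested list of RGB tuples; the pattern is a mask of
# per-pixel opacities in 0.0-1.0. Both representations are assumptions.
def apply_makeup(image, mask, color):
    """Alpha-blend `color` into `image` wherever `mask` is non-zero."""
    out = []
    for img_row, mask_row in zip(image, mask):
        row = []
        for (r, g, b), a in zip(img_row, mask_row):
            row.append((
                round(r * (1 - a) + color[0] * a),
                round(g * (1 - a) + color[1] * a),
                round(b * (1 - a) + color[2] * a),
            ))
        out.append(row)
    return out
```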
- a screen 34 shown in FIG. 10 is displayed on the display unit 25.
- the screen 34 includes a face image 341. Further, the screen 34 includes a determination button 342. If the user likes the first target color, the user presses the determination button 342 using the input unit 24. When the determination button 342 is pressed, the process proceeds to step S12.
- In step S12, the transmission unit 203 transmits the image data indicating the face image 341 of the user and the part information indicating the first target part to the server 10, together with the user ID of the user who uses the user terminal 20.
- As the user ID, for example, the one input by an operation using the input unit 24 when the user logs in to the user terminal 20 is used.
- the user ID, image data, and part information transmitted from the user terminal 20 reach the server 10 via the network 2.
- the first acquisition unit 101 receives this user ID, image data, and part information.
- In step S13, the server 10 analyzes the face image 341 represented by the received image data to determine the cosmetic corresponding to the color of the makeup applied to the first target region, that is, the first target color.
- the extraction unit 102 specifies the first target region from the face image 341 by image recognition. This identification is performed using a known face recognition technique based on the positional relationship of feature points such as the eyes, nose, and mouth. For example, when the first target region is an eye, the eyehole region is specified. Subsequently, the extraction unit 102 extracts the first target color from the specified region.
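- The extraction of the first target color from the specified region can be sketched as averaging the pixel values inside a region mask; this averaging approach is an assumption, since the patent does not specify how the color is computed from the region.

```python
# Sketch of color extraction from the specified region. The image is a
# nested list of RGB tuples and the mask marks pixels inside the region
# (e.g. the eyehole); averaging is an assumed extraction strategy.
def extract_region_color(image, mask):
    """Average the RGB values of the pixels inside the region mask."""
    pixels = [px for img_row, m_row in zip(image, mask)
              for px, inside in zip(img_row, m_row) if inside]
    n = len(pixels)  # assumes the mask selects at least one pixel
    return tuple(round(sum(ch) / n) for ch in zip(*pixels))
```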
- the first calculation unit 103 reads out color information of cosmetics corresponding to the first target part from the color table 132.
- the cosmetic corresponding to the first target part refers to a cosmetic used for the makeup of the first target part.
- the cosmetic corresponding to the first target part is an eye shadow.
- the color information C1 to C3 stored in association with the eye shadows A to D is read from the color table 132 shown in FIG.
- the first calculation unit 103 calculates the similarity between the first target color extracted by the extraction unit 102 and the color indicated by the color information read from the color table 132.
- the color indicated by the color information is an example of the “second color” according to the present invention.
- the similarity between the first target color and the color indicated by the color information C1 to C3 is calculated.
- FIG. 11 is a diagram illustrating an example of the calculated similarity.
- the similarity between the first target color and the color indicated by the color information C1 is “98”.
- the similarity between the first target color and the color indicated by the color information C2 is “92”.
- the similarity between the first target color and the color indicated by the color information C3 is “89”.
- the determination unit 104 determines a cosmetic having a color whose similarity calculated by the first calculation unit 103 is higher than a threshold value.
- the cosmetic determined by the determination unit 104 may be a single cosmetic or a combination of a plurality of cosmetics.
- the threshold is “90”.
- the similarity “98” between the first target color and the color indicated by the color information C1 and the similarity “92” between the first target color and the color indicated by the color information C2 are higher than the threshold. Therefore, the eye shadow A and the combination of eye shadows B and C are determined.
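- The determination step can be sketched as filtering similarity scores against the threshold; the scores below reproduce the example above (threshold 90), while the distance-based similarity formula is an assumed stand-in for whatever color metric the system actually uses.

```python
# Sketch of the first calculation unit 103 and determination unit 104.
# Mapping Euclidean RGB distance onto a 0-100 scale is an assumption;
# the patent only states that a similarity is calculated.
def similarity(c1, c2):
    """Map Euclidean RGB distance onto a 0-100 similarity scale."""
    dist = sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5
    max_dist = (3 * 255 ** 2) ** 0.5
    return 100 * (1 - dist / max_dist)

def determine_cosmetics(scores, threshold=90):
    """Keep every cosmetic whose similarity exceeds the threshold."""
    return [name for name, s in scores.items() if s > threshold]

# The example from the text: C1 -> 98, C2 -> 92, C3 -> 89.
scores = {"Eye shadow A": 98,        # color information C1
          "Eye shadows B + C": 92,   # color information C2
          "Eye shadow D": 89}        # color information C3
```

With the threshold of 90, only the eye shadow A and the combination of eye shadows B and C are determined, matching the example.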
- In step S14, the transmission unit 105 transmits the cosmetic information determined in step S13 to the user terminal 20.
- the cosmetic information includes a cosmetic name, image data, and product information stored in the product table 131.
- the name, image data, and product information of the eye shadow A, the name, image data, and product information of the eye shadow B, and the name, image data, and product information of the eye shadow C are read from the product table 131 and transmitted.
- the cosmetic information transmitted from the server 10 reaches the user terminal 20 via the network 2.
- the receiving unit 204 receives this cosmetic information.
- In step S15, the output unit 205 displays the purchase screen 35 on the display unit 25 based on the received cosmetic information.
- the purchase screen 35 includes the name, image, and product information of the eye shadow A, the name, image, and product information of the eye shadow B, and the name, image, and product information of the eye shadow C.
- the purchase screen 35 includes purchase buttons 351 and 352.
- the purchase button 351 is used for an operation for executing the purchase of the eye shadow A.
- the purchase button 352 is used for an operation for executing the purchase of the combination of the eye shadows B and C.
- For example, when purchasing the eye shadow A, the user presses the purchase button 351 using the input unit 24.
- When the purchase button 351 is pressed, a procedure for purchasing the eye shadow A by electronic commerce is performed. Thereby, the purchase of the eye shadow A is completed.
- FIG. 12 is a sequence diagram illustrating the correction process.
- the correction process is performed after the user has actually purchased cosmetics by the purchase support process described above.
- the user purchases cosmetics having a color close to the color for which the color simulation has been performed. However, the color of the makeup actually applied using the purchased cosmetics may differ from the simulated color. This correction process is performed in order to reduce this difference in the next purchase support process.
- the storage unit 13 stores the user ID, image data, and part information received from the user terminal 20 in step S12 described above.
- this image data is referred to as “reference image data”.
- In step S21 shown in FIG. 12, after applying makeup using the purchased cosmetics, the user photographs his or her face image using the imaging unit 26.
- For example, when the eye shadow A has been purchased by the purchase support process described above, a face image of the user who applied makeup using the eye shadow A is photographed.
- This face image is an example of the “fifth image” according to the present invention.
- In step S22, the transmission unit 203 transmits image data indicating the face image captured in step S21 to the server 10, together with the user ID of the user who uses the user terminal 20.
- the user ID for example, the one input by an operation using the input unit 24 when the user logs in to the user terminal 20 is used.
- this image data is referred to as “target image data”.
- the user ID and image data transmitted from the user terminal 20 reach the server 10 via the network 2.
- the third acquisition unit 106 receives this user ID and target image data.
- a correction value is calculated using the received target image data and the reference image data stored in the storage unit 13.
- Specifically, the extraction unit 102 extracts the color of the makeup applied to the first target region (hereinafter referred to as the "ideal color") from the face image 341 indicated by the reference image data, as in step S13 of the purchase support process described above.
- the ideal color is an example of the “first color” according to the present invention.
- the extraction unit 102 extracts a makeup color (hereinafter, “actual color”) applied to the first target part from the face image indicated by the target image data.
- the actual color is an example of the “third color” according to the present invention.
- the second calculation unit 107 compares the ideal color extracted by the extraction unit 102 with the actual color, and obtains a difference between these colors.
- the second calculation unit 107 calculates a correction value based on the color difference. For example, when the actual color is darker than the ideal color, a correction value for reducing the color density is calculated.
- the calculated correction value is stored in the storage unit 13.
- the correction unit 108 corrects the first target color extracted by the extraction unit 102 using the correction value stored in the storage unit 13.
- the first target color is a makeup color included in the face image of the user newly acquired by the first acquisition unit 101.
- This face image is an example of the “sixth image” according to the present invention.
- the makeup color included in the face image is an example of the “fourth color” according to the present invention.
- For example, when the correction value is one that lowers the color density, the density of the first target color is lowered using the correction value.
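- The correction process can be sketched as computing a per-channel offset between the ideal color and the actual color and applying it to later target colors; representing colors and the correction value as RGB triples is an assumption for illustration, since the patent leaves the form of the correction value open.

```python
# Sketch of the second calculation unit 107 and the correction unit 108.
# The correction value is modeled as a per-channel RGB offset; this
# representation is an assumption, not the patent's specification.
def compute_correction(ideal, actual):
    """Offset that would move the actual color back toward the ideal one."""
    return tuple(i - a for i, a in zip(ideal, actual))

def apply_correction(color, correction):
    """Correct a newly extracted first target color, clamped to 0-255."""
    return tuple(max(0, min(255, c + d)) for c, d in zip(color, correction))
```

For instance, if the actual color came out darker than the ideal color, the computed offset brightens the next first target color before the similarity calculation.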
- the first calculation unit 103 calculates the similarity using the corrected first target color. As a result, the cosmetic corresponding to the corrected first target color is determined, and information on the determined cosmetic is provided.
- the first target color selected by the user in the purchase support process described above is the makeup color desired by the user.
- the information on the first target color is considered to be useful information for organizations such as manufacturers that manufacture and develop cosmetics.
- when the first target color is a color that does not exist in existing cosmetics, or when the same first target color is selected by many users, the information on the first target color is considered useful for the development of new cosmetics. Therefore, color information indicating the first target color may be provided to an organization such as a manufacturer that manufactures or develops cosmetics.
- the transmission unit 203 transmits color information indicating the first target color to the external device.
- the purchase screen 35 includes a close button 563.
- the user presses a close button 563 using the input unit 24. Thereby, the purchase screen 35 is closed.
- the transmission unit 203 may transmit color information indicating the first target color to the external device only when the close button 563 is pressed.
- the transmission unit 105 of the server 10 may transmit color information indicating the first target color to an external device.
- information on cosmetics corresponding to a user's desired color is provided.
- the user can purchase cosmetics corresponding to a desired color by browsing this information.
- this desired color is not limited to the color of makeup using existing cosmetics, but may be other colors.
- Second Embodiment: 2-1. Configuration
- the configuration of the cosmetic information providing system 1 according to the second embodiment, the hardware configuration of the server 10, and the hardware configuration of the user terminal 20 are the same as those in the first embodiment.
- the functional configuration of the server 10 is basically the same as that of the first embodiment.
- the first acquisition unit 101 acquires a first image representing a face of a person different from the user.
- the functional configuration of the user terminal 20 is different from that of the first embodiment.
- FIG. 13 is a diagram illustrating an example of a functional configuration of the user terminal 20 according to the second embodiment.
- the user terminal 20 has the functions of a second acquisition unit 206, a transmission unit 207, a reception unit 208, a generation unit 209, and an output unit 210.
- the transmission unit 207 and the reception unit 208 are realized by the communication unit 22 under the control of the control unit 21.
- the second acquisition unit 206, the generation unit 209, and the output unit 210 are realized by the control unit 21.
- these functions are realized by a processor loading a client program into a memory and executing it.
- the second acquisition unit 206 acquires a first image representing the face of a person different from the user from the image reading unit 27 or the imaging unit 26.
- the transmission unit 207 transmits image data indicating the first image acquired by the second acquisition unit 206 to the server 10.
- the receiving unit 208 receives information for identifying cosmetics from the server 10.
- the output unit 210 outputs information received by the receiving unit 208.
- the output unit 210 causes the display unit 25 to display information.
- the output of information is not limited to the display of information.
- the second acquisition unit 206 acquires a second image representing the user's face from the imaging unit 26.
- the generation unit 209 uses the second image acquired by the second acquisition unit 206 to generate a fourth image representing a face on which makeup has been applied using the cosmetic identified by the information received by the reception unit 208.
- the output unit 210 outputs the fourth image generated by the generation unit 209.
- FIG. 14 is a sequence diagram showing the purchase support process according to the second embodiment.
- FIG. 15 is a diagram illustrating an example of screen transition of the user terminal 20 in the purchase support process.
- the user inputs another person's face image with a favorite makeup into the user terminal 20.
- the cosmetics corresponding to this makeup color are recommended.
- a makeup simulation using the recommended cosmetic is performed using the user's face image.
- in step S31, the user inputs the face image 411 of another person with makeup applied into the user terminal 20.
- This face image 411 is a face image of a person different from the user. For example, assume that the user likes the makeup of a model published in a magazine. In this case, the user causes the image reading unit 27 to read the page on which this model is posted. As a result, the model face image 411 is input to the user terminal 20.
- model image 411 is an example of the “first image” according to the present invention.
- a screen 41 shown in FIG. 15 is displayed on the display unit 25.
- the screen 41 includes a model image 411.
- the screen 41 accepts an operation for selecting a part to which makeup has been applied.
- the user likes the makeup of the eyes included in the model image 411.
- the user performs an operation of selecting an eye using the input unit 24. Thereby, the eyes are selected.
- the part selected by the user's operation is referred to as “second target part”.
- the process proceeds to step S32.
- in step S32, the transmission unit 207 transmits, to the server 10, the image data representing the model image 411 input in step S31 and the part information indicating the second target part, together with the user ID of the user who uses the user terminal 20.
- as in the first embodiment described above, this user ID is, for example, the one input by an operation using the input unit 24 when the user logs in to the user terminal 20.
- the user ID, image data, and part information transmitted from the user terminal 20 reach the server 10 via the network 2.
- the first acquisition unit 101 receives this user ID, image data, and part information.
- in step S33, a cosmetic corresponding to the color of the makeup applied to the second target part is determined by analyzing the model image 411 represented by the received image data.
- the extraction unit 102 first identifies the region of the second target part from the model image 411 by image recognition and extracts the makeup color from the identified region, as in the first embodiment described above.
- the extracted makeup color is referred to as a “second target color”.
- the second target color is an example of the “first color” according to the present invention.
- the first calculation unit 103 reads the color information of cosmetics corresponding to the second target part from the color table 132 and calculates the similarity between the second target color extracted by the extraction unit 102 and the color indicated by the read color information.
- the color indicated by the color information is an example of the “second color” according to the present invention.
- the determination unit 104 determines a cosmetic product having a color whose calculated similarity is higher than a threshold value. Here, it is assumed that eye shadow A and a combination of eye shadows B and C are determined.
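The similarity calculation and threshold test described above can be sketched as follows. The patent does not specify a similarity metric, so Euclidean distance in RGB space mapped to a 0-100 score is assumed here; the function `candidates_above` and the sample color table are hypothetical.

```python
import math

# Assumed metric: RGB Euclidean distance, normalized so identical colors
# score 100 and maximally distant colors (black vs. white) score 0.

def similarity(color_a, color_b):
    """Return a 0-100 similarity score from RGB Euclidean distance."""
    dist = math.dist(color_a, color_b)
    max_dist = math.dist((0, 0, 0), (255, 255, 255))
    return 100.0 * (1.0 - dist / max_dist)

def candidates_above(target, color_table, threshold):
    """Return cosmetics whose color similarity to the target exceeds the threshold."""
    return [name for name, rgb in color_table.items()
            if similarity(target, rgb) > threshold]

table = {"eye shadow A": (150, 90, 110), "eye shadow D": (30, 200, 40)}
print(candidates_above((148, 92, 108), table, 80))  # ['eye shadow A']
```

A perceptual metric such as a CIE color difference would be a more faithful choice in practice; the simple RGB distance is only for illustration.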
- in step S34, the transmission unit 105 transmits the information on the cosmetics determined in step S33 to the user terminal 20, as in the first embodiment described above.
- the cosmetic information transmitted from the server 10 reaches the user terminal 20 via the network 2.
- the receiving unit 204 receives this cosmetic information.
- in step S35, a makeup simulation process is performed. Specifically, a screen 42 shown in FIG. 15 is first displayed on the display unit 25, based on the received cosmetic information.
- the screen 42 includes the name, image, and product information of each of the eye shadows A, B, and C.
- the screen 42 accepts an operation for selecting cosmetics.
- the screen 42 includes selection buttons 421 and 422.
- the selection button 421 is used for an operation of selecting the eye shadow A.
- the selection button 422 is used for an operation of selecting a combination of eye shadows B and C.
- the user wants to simulate the makeup of the eye shadow A.
- the user presses the selection button 421 using the input unit 24. Thereby, the eye shadow A is selected.
- the selected cosmetic is referred to as “target cosmetic”.
- the second acquisition unit 206 acquires the user's face image captured by the imaging unit 26. This face image is an example of the “second image” according to the present invention.
- image data indicating a user's face image may be stored in the storage unit 23 in advance. In this case, the image data is read from the storage unit 23 and used.
- the generation unit 209 generates the face image 431 of the user in which makeup using the target cosmetic is applied to the second target part, using the acquired face image, as in the first embodiment described above.
- the face image 431 is an example of a “fourth image” according to the present invention.
- a screen 43 shown in FIG. 15 is displayed on the display unit 25.
- the screen 43 includes a face image 431. Further, the screen 43 includes a forward button 432. If the user likes this makeup, the user presses a forward button 432 using the input unit 24. When the forward button 432 is pressed, the process proceeds to step S36.
- the output unit 210 displays the purchase screen 44 on the display unit 25.
- the purchase screen 44 includes information on the target cosmetic product among the received cosmetic product information.
- the purchase screen 44 includes the name, image, and product information of the eye shadow A.
- the purchase screen 44 includes a purchase button 441.
- the purchase button 441 is used for an operation for executing the purchase of the eye shadow A. For example, when purchasing the eye shadow A, the user presses the purchase button 441 using the input unit 24. When the purchase button 441 is pressed, a procedure for purchasing eye shadow A by electronic commerce is performed. Thereby, the purchase of the eye shadow A is completed.
- the correction process and the feedback process described in the first embodiment may be performed.
- information on cosmetics corresponding to the color desired by the user is provided.
- This desired color is extracted from a face image of a person different from the user.
- the user can purchase cosmetics corresponding to a desired color by browsing this information.
- the present invention is not limited to the above-described embodiment.
- the embodiment may be modified as follows. Moreover, you may implement combining the following modifications.
- in this modification, the user terminal 20 has the functions of an extraction unit and a determination unit in addition to the functions shown in FIG. 8.
- the extraction unit and the determination unit are realized by the control unit 21.
- the extraction unit and the determination unit are realized by a processor loading the client program into a memory and executing it.
- the display unit 25 displays a screen 36 instead of the screen 32.
- FIG. 16 is a diagram illustrating an example of the screen 36.
- the screen 36 includes a color sample 362 that suits the user.
- the color that suits the user is determined by the following method, for example.
- the extraction unit extracts the skin color from the user's face image 311 acquired by the second acquisition unit 201 by image recognition.
- the determination unit determines a color type that suits the user based on the color extracted by the extraction unit. For example, when the user's skin color is yellowish, a yellow-based type is determined as the color type that suits the user, and yellow-based colors are determined as colors that suit the user.
- the screen 36 receives an operation of selecting a makeup color from a plurality of colors including colors that suit the user. According to this modification, the user can easily select a color that suits him.
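The color-type decision in this modification can be sketched with a trivial heuristic. The patent gives only the yellowish-skin example, so the comparison rule below is purely an assumption for illustration; a real system would use a calibrated color-analysis model.

```python
# Hypothetical heuristic: yellowish (warm) skin has high red+green relative
# to blue, so compare the red/green average against the blue channel.

def color_type(skin_rgb):
    """Classify a skin color as a warm (yellow-based) or cool (blue-based) type."""
    r, g, b = skin_rgb
    return "yellow-based" if (r + g) / 2 > b else "blue-based"

print(color_type((230, 200, 160)))  # yellowish skin -> 'yellow-based'
print(color_type((210, 190, 205)))  # bluish skin    -> 'blue-based'
```

The returned type would then drive which color samples 362 appear on the screen 36.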
- a new color may be generated by a user using a plurality of colors, and the generated color may be selected as a makeup color.
- a screen 37 is displayed on the display unit 25 instead of the screen 32.
- FIG. 17 is a diagram illustrating an example of the screen 37.
- the screen 37 includes a plurality of color samples 371.
- the screen 37 accepts an operation for selecting a plurality of colors.
- a new color obtained by mixing these colors is generated.
- the screen 37 includes a generated new color sample 372.
- the screen 37 accepts an operation for selecting the generated new color. According to this modification, when there is no user-desired color in the plurality of color samples 371, a desired color can be generated using the plurality of colors.
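The color-mixing step in this modification can be sketched as a per-channel average. The patent does not specify the mixing rule, so averaging is an assumption made here for illustration.

```python
# Hypothetical mixing rule: average each RGB channel across the selected colors.

def mix_colors(colors):
    """Mix RGB colors by averaging each channel."""
    n = len(colors)
    return tuple(round(sum(c[i] for c in colors) / n) for i in range(3))

# Mixing a red and a blue sample yields a purple mid-tone,
# which would appear as the new color sample 372 on the screen 37.
print(mix_colors([(255, 0, 0), (0, 0, 255)]))  # (128, 0, 128)
```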
- a color may be selected from an image input to the user terminal 20.
- This image is input to the user terminal 20 using the image reading unit 27, for example.
- This image need not be a human face image.
- This image may be any image as long as it has a color.
- the user likes the color of a certain flower.
- the user captures the flower image 381 using the imaging unit 26.
- the flower image 381 is input to the user terminal 20.
- a screen 38 is displayed on the display unit 25 instead of the screen 32.
- FIG. 18 is a diagram illustrating an example of the screen 38.
- the screen 38 includes the flower image 381.
- the screen 38 accepts an operation for selecting a color from among the colors included in the flower image 381. For example, when the user performs an operation of selecting a part of the flower using the input unit 24, the color of that part is selected. According to this modification, when a user finds a favorite color, that color can be selected as a makeup color.
- cosmetics may be determined in consideration of the user's skin color.
- the extraction unit 102 according to this modification extracts the skin color from the image acquired by the first acquisition unit 101.
- the first calculation unit 103 calculates the degree of similarity between the color obtained by superimposing the makeup color on the color extracted by the extraction unit 102 and the color of the makeup using a predetermined cosmetic product.
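The modified similarity calculation, which first superimposes the makeup color on the extracted skin color, might look like the following sketch. The alpha-blend ratio and the distance-based score are assumptions not stated in the source.

```python
import math

# Assumed composition: the makeup color is alpha-blended over the skin color
# before being compared with a cosmetic's color.

def overlay(skin, makeup, alpha=0.6):
    """Composite a semi-transparent makeup color over the skin color."""
    return tuple(round(alpha * m + (1 - alpha) * s) for s, m in zip(skin, makeup))

def similarity(a, b):
    """Return a 0-100 score from normalized RGB Euclidean distance."""
    return 100.0 * (1.0 - math.dist(a, b) / math.dist((0, 0, 0), (255, 255, 255)))

skin = (225, 190, 170)
makeup = (200, 60, 90)
blended = overlay(skin, makeup)
print(blended)  # (210, 112, 122): the makeup color as it would appear on this skin
print(similarity(blended, (205, 110, 120)) > 95)  # True
```

Comparing the blended color rather than the raw makeup color accounts for how the same cosmetic looks different on different skin tones.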
- FIG. 19 is a diagram illustrating an example of a functional configuration of the server 10 according to this modification.
- the server 10 has the function of the assigning unit 109 in addition to the functions shown in FIG. 6.
- the assigning unit 109 assigns a priority to each cosmetic whose similarity calculated by the first calculation unit 103 is higher than the threshold, based on additional information about the cosmetic or the user.
- this additional information includes, for example, the price of the cosmetic, the popularity of the cosmetic, the release date of the cosmetic, the user's skin color type, the brand name of the user's favorite cosmetics, the user's makeup skill level, or the type of makeup tool owned by the user.
- the price, popularity, and release date of a cosmetic are included in, for example, the product information stored in the product table 131; these pieces of information are extracted from that product information and used.
- the user's skin color type, the brand name of the user's favorite cosmetics, the user's makeup skill level, and the type of makeup tool owned by the user are stored in, for example, the user table 133. For example, when the user ID assigned to the user is “001”, the information stored in association with the user ID “001” is read from the user table 133 and used.
- when the price of a cosmetic is included in the additional information, the priority may be made higher the higher the price.
- when the popularity of a cosmetic is included in the additional information, the higher the popularity, the higher the priority.
- when the release date of a cosmetic is included in the additional information, the newer the release date, the higher the priority.
- when the user's skin color type is included in the additional information, the priority of cosmetics having a color that matches this type is higher than that of other cosmetics. For example, when the user's skin color type is yellow-based, the priority of yellow-based cosmetics is higher than that of other cosmetics.
- when the brand name of the user's favorite cosmetics is included in the additional information, the priority of cosmetics of that brand is higher than that of cosmetics of other brands.
- when the user's makeup skill level is included in the additional information, the priority of cosmetics matching that level is higher than that of other cosmetics. For example, when the user's makeup skill level is low, the priority of cosmetics that are easy to use is higher than that of other cosmetics.
- when the type of makeup tool owned by the user is included in the additional information, the priority of cosmetics applied with that tool is higher than that of cosmetics applied with other tools. For example, when the user owns a brush, the priority of cosmetics applied with a brush is higher than that of other cosmetics.
- the determination unit 104 determines a predetermined number of cosmetics in descending order of the priority assigned by the assigning unit 109.
- FIG. 20 is a diagram illustrating an example of the assigned priorities.
- the first target color used in the first embodiment described above and the second target color used in the second embodiment described above are collectively referred to as “target colors”.
- the degree of similarity between the target color and the color indicated by the color information C1 is “98”.
- the similarity between the target color and the color indicated by the color information C2 is “92”.
- the similarity between the target color and the color indicated by the color information C3 is “89”.
- suppose the threshold is 80. In this case, all of these similarities are higher than the threshold.
- priority is given to the combination of eye shadow A, eye shadows B and C, and eye shadow D.
- the eye shadow A is given priority “1”.
- the combination of eye shadows B and C is given priority “3”.
- the eye shadow D is given priority “2”.
- the predetermined number is two.
- the eye shadow A given the priority “1” and the eye shadow D given the priority “2” are determined. According to this modification, it is possible to determine a cosmetic that has a greater appeal to the user.
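The selection illustrated with FIG. 20 can be sketched as filtering candidates by the similarity threshold and then keeping the predetermined number with the best (lowest-numbered) priorities. The data values below are taken from the example above; the list/tuple representation is an assumption.

```python
# Candidates from the FIG. 20 example: (name, similarity, assigned priority).
candidates = [
    ("eye shadow A", 98, 1),
    ("eye shadows B and C", 92, 3),
    ("eye shadow D", 89, 2),
]
threshold, predetermined_number = 80, 2

# Keep only candidates above the threshold, then take the top entries
# in descending order of priority (priority "1" is highest).
above = [c for c in candidates if c[1] > threshold]
chosen = sorted(above, key=lambda c: c[2])[:predetermined_number]
print([name for name, _, _ in chosen])  # ['eye shadow A', 'eye shadow D']
```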
- the second calculation unit 107 may calculate a correction value according to the characteristics of the display unit 25.
- the correction unit 108 corrects the target color using the correction value calculated by the second calculation unit 107 in the same manner as the correction processing described above. Thereby, the difference in color due to the difference in the characteristics of the display unit 25 can be reduced.
- FIG. 21 is a diagram illustrating an example of the product table 134.
- the explanation information of the makeup method is information that explains the makeup method using the cosmetic. This information may be indicated by characters or an image.
- the transmission unit 105 transmits the description information of the makeup method corresponding to the cosmetic determined by the determination unit 104 to the user terminal 20.
- the description information of the makeup method is read from the product table 134 and used. For example, when the eye shadow A is determined by the determining unit 104, the description information of the makeup method associated with the name of the eye shadow A is read and transmitted.
- the description information of the makeup method transmitted from the server 10 reaches the user terminal 20 via the network 2.
- the receiving unit 204 receives the explanation information of the makeup method.
- the output unit 205 outputs the description information of the makeup method received by the receiving unit 204.
- the output unit 205 causes the display unit 25 to display the makeup method description information.
- description information on the makeup method may be included in the screen 43 or the purchase screen 44 shown in FIG.
- a moving image showing makeup being applied to the user's face image according to this makeup method may be generated. This moving image is included in, for example, the screen 43 shown in FIG. 15. Thereby, the user can learn the makeup method using the cosmetic together with information on the cosmetic corresponding to the desired color.
- description information on a makeup method suited to the user's makeup skill level may be provided.
- in this modification, description information on the makeup method is stored for each makeup skill level.
- the transmission unit 105 transmits to the user terminal 20 the information on the cosmetic determined by the determination unit 104 and the description information of the makeup method corresponding to the user's makeup skill level.
- this makeup skill level is stored in the user table 133. For example, when the eye shadow A is determined by the determination unit 104 and the user's makeup skill level is “1”, the description information of the makeup method associated with the name of the eye shadow A and the skill level “1” is read from the product table 134 and transmitted.
- the description information of the makeup method transmitted from the server 10 reaches the user terminal 20 via the network 2.
- the receiving unit 204 receives the explanation information of the makeup method.
- the output unit 205 outputs the description information of the makeup method received by the receiving unit 204.
- the output unit 205 causes the display unit 25 to display the description information of the makeup method, as described above. According to this modification, the user can learn a makeup method suited to his or her own makeup skill level.
- a makeup tool used when applying makeup with a cosmetic may be recommended together with the cosmetic.
- the storage unit 13 according to this modification stores a product table 135 instead of the product table 131.
- FIG. 22 is a diagram illustrating an example of the product table 135.
- the product table 135 stores, in addition to the name, image data, and product information of the cosmetics stored in the product table 131, the name, image data, and product information of a makeup tool corresponding to each cosmetic.
- the name of the makeup tool is information for identifying the makeup tool.
- the image data of the makeup tool is data indicating an image of the makeup tool. This image is, for example, a photograph of a makeup tool.
- the product information of the makeup tool is detailed information of the makeup tool. For example, the product information includes the brand name and price of the makeup tool.
- the transmission unit 105 transmits information on the cosmetic tool corresponding to the cosmetic determined by the determination unit 104 to the user terminal 20.
- the makeup tool information includes the name of the makeup tool, image data, and product information. For example, when the eye shadow A is determined by the determination unit 104, the name of the makeup tool, the image data, and the product information stored in association with the eye shadow A are read from the product table 135 and transmitted.
- the information on the makeup tool transmitted from the server 10 reaches the user terminal 20 via the network 2.
- the receiving unit 204 receives information on this makeup tool.
- the output unit 205 outputs the information on the makeup tool received by the receiving unit 204.
- the output unit 205 causes the display unit 25 to display makeup tool information.
- makeup tool information may be included in the purchase screen 35 shown in FIG.
- information on makeup tools may be included in the purchase screen 44 shown in FIG.
- the purchase screens 35 and 44 may include a purchase button used for an operation of executing the purchase of the cosmetic tool. According to this modification, the user can easily purchase a makeup tool necessary for using the cosmetic.
- the extraction unit extracts the skin color from the user's face image by image recognition.
- the determination unit determines the state of the user's skin according to the skin color extracted by the extraction unit.
- This skin condition includes, for example, skin color, unevenness, or smoothness.
- the generation unit 202 changes the color of the face image 341 or 431 generated by the color simulation process or the makeup simulation process described above, according to the skin condition determined by the determination unit. For example, when the skin color is dull, the color of the face image 341 or 431 is changed so that the makeup color looks duller, for example by lowering the saturation of the makeup color. This brings the makeup color of the generated face image 341 or 431 closer to the makeup color that would appear when the user actually applies the makeup.
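Lowering the saturation of a makeup color, as in the dull-skin example above, can be sketched with the standard library's HSV conversion. The 30% reduction factor is an assumption for illustration; the patent does not specify the amount.

```python
import colorsys

# Hypothetical adjustment: convert to HSV, scale down saturation, convert back.

def dull_makeup_color(rgb, factor=0.7):
    """Lower the saturation of an RGB color via HSV, keeping hue and value."""
    h, s, v = colorsys.rgb_to_hsv(*(c / 255 for c in rgb))
    r, g, b = colorsys.hsv_to_rgb(h, s * factor, v)
    return tuple(round(c * 255) for c in (r, g, b))

# A vivid pink becomes a less saturated version of the same hue.
print(dull_makeup_color((200, 60, 90)))
```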
- when the face image 341 or 431 of the user with makeup applied is generated, the brightness of the face image 341 or 431 or the lighting conditions may be changed according to the scene, such as morning, noon, evening, outdoors, or indoors.
- in this way, makeup suitable for various scenes can be simulated.
- the server 10 may have some of the functions that the user terminal 20 has.
- the server 10 may have the generation unit 202.
- the server 10 may generate the purchase screen 353 or 442. In this case, screen data indicating the generated purchase screen 353 or 442 is transmitted from the server 10 to the user terminal 20.
- the user terminal 20 may have all the functions of the server 10.
- the user terminal 20 functions as an example of the “cosmetic information providing apparatus” according to the present invention. Further, it is not necessary to provide the server 10.
- an apparatus other than the server 10 and the user terminal 20 may have at least a part of the functions of the server 10 or at least a part of the functions of the user terminal 20.
- the present invention may be provided as a cosmetic information providing method including the steps of the processing performed in the cosmetic information providing system 1. Further, the present invention may be provided as a server program executed on the server 10 or a client program executed on the user terminal 20. These programs may be downloaded via the network 2 such as the Internet, or may be provided recorded on a computer-readable recording medium such as a magnetic recording medium (magnetic tape, magnetic disk, etc.), an optical recording medium (optical disc, etc.), a magneto-optical recording medium, or a semiconductor memory.
Abstract
Description
An object of the present invention is to provide information on cosmetics corresponding to a color desired by a user.
1-1. Configuration
FIG. 1 is a diagram illustrating an example of the configuration of the cosmetic information providing system 1 according to the first embodiment. The cosmetic information providing system 1 includes a server 10 and a user terminal 20, which are connected via a network 2. The network 2 is, for example, the Internet; however, it is not limited to the Internet and may be another communication line.
FIG. 9 is a sequence diagram illustrating an example of the purchase support process according to the first embodiment. FIG. 10 is a diagram illustrating an example of screen transitions of the user terminal 20 in this purchase support process. In this process, a makeup color simulation is performed using the user's face image. When the user views the result of the color simulation and selects a color he or she likes, cosmetics corresponding to that color are recommended.
FIG. 12 is a sequence diagram illustrating the correction process. The correction process is performed after the user has actually purchased a cosmetic through the purchase support process described above. In that process, the user purchases a cosmetic whose color is close to the simulated color. However, a difference may arise between the simulated color and the color obtained when the purchased cosmetic is actually used. The correction process is performed to reduce this difference in the next purchase support process.
The first target color selected by the user in the purchase support process described above is the makeup color desired by the user. Information on the first target color is considered useful for organizations such as manufacturers that manufacture and develop cosmetics. In particular, when the first target color does not exist in existing cosmetics, or when the same first target color is selected by many users, this information is considered useful for developing new cosmetics. Therefore, color information indicating the first target color may be provided to an organization such as a manufacturer that manufactures or develops cosmetics.
2-1. Configuration
In the second embodiment, a purchase support process different from that of the first embodiment is performed. The configuration of the cosmetic information providing system 1, the hardware configuration of the server 10, and the hardware configuration of the user terminal 20 according to the second embodiment are the same as those of the first embodiment. The functional configuration of the server 10 is also basically the same as in the first embodiment, except that the first acquisition unit 101 acquires a first image representing the face of a person different from the user. The functional configuration of the user terminal 20, on the other hand, differs from that of the first embodiment.
FIG. 14 is a sequence diagram illustrating the purchase support process according to the second embodiment. FIG. 15 is a diagram illustrating an example of screen transitions of the user terminal 20 in this purchase support process. In this process, the user inputs into the user terminal 20 a face image of another person wearing makeup that the user likes. Cosmetics corresponding to the color of this makeup are then recommended, and a makeup simulation using the recommended cosmetics is performed with the user's face image.
The present invention is not limited to the embodiments described above. The embodiments may be modified as follows, and the following modifications may be combined.
In the color simulation process described above, the method by which the user selects a makeup color is not limited to the method described in the first embodiment.
The method of determining a cosmetic corresponding to the user's desired color is not limited to the methods described in the first and second embodiments.
FIG. 19 is a diagram illustrating an example of the functional configuration of the server 10 according to this modification. In addition to the functions shown in FIG. 6, the server 10 has the function of the assigning unit 109.
In the first and second embodiments described above, description information on a makeup method using the cosmetic determined by the determination unit 104 may be provided. The storage unit 13 according to this modification stores a product table 134 instead of the product table 131.
In the first and second embodiments described above, a makeup tool used when applying makeup with a cosmetic may be recommended together with the cosmetic. The storage unit 13 according to this modification stores a product table 135 instead of the product table 131.
In the first and second embodiments described above, when the face image 341 or 431 of the user with makeup applied is generated, the color of the face image 341 or 431 may be changed according to the condition of the user's skin. The user terminal 20 according to this modification has, in addition to the functions shown in FIG. 8, functions similar to those of the extraction unit and the determination unit described in Modification 1 above.
The combination of each function of the cosmetic information providing system 1 and the entity that executes it is not limited to those of the first and second embodiments. For example, the server 10 may have some of the functions of the user terminal 20; for example, the server 10 may have the generation unit 202, or may generate the purchase screen 353 or 442. In this case, screen data indicating the generated purchase screen 353 or 442 is transmitted from the server 10 to the user terminal 20.
The present invention may be provided as a cosmetic information providing method including the steps of the processing performed in the cosmetic information providing system 1. The present invention may also be provided as a server program executed on the server 10 or a client program executed on the user terminal 20. These programs may be downloaded via the network 2 such as the Internet, or may be provided recorded on a computer-readable recording medium such as a magnetic recording medium (magnetic tape, magnetic disk, etc.), an optical recording medium (optical disc, etc.), a magneto-optical recording medium, or a semiconductor memory.
Claims (11)

- A cosmetic information providing system comprising:
a first acquisition unit that acquires a first image representing a face on which makeup of a first color has been applied;
a determination unit that analyzes the acquired first image and determines a cosmetic corresponding to the first color; and
an output unit that outputs information identifying the determined cosmetic.
- The cosmetic information providing system according to claim 1, further comprising:
a second acquisition unit that acquires a second image representing a user's face; and
a generation unit that uses the acquired second image to generate a third image representing a face on which makeup of a color selected by the user has been applied,
wherein the first image is the generated third image.
- The cosmetic information providing system according to claim 1, wherein the first image represents the face of a person different from a user.
- The cosmetic information providing system according to claim 3, further comprising:
a second acquisition unit that acquires a second image representing the user's face; and
a generation unit that uses the acquired second image to generate a fourth image representing a face on which makeup using the determined cosmetic has been applied,
wherein the output unit outputs the generated fourth image.
- The cosmetic information providing system according to any one of claims 1 to 4, further comprising:
an extraction unit that extracts the first color from the acquired first image; and
a first calculation unit that calculates a similarity between the extracted first color and a second color of makeup using a predetermined cosmetic,
wherein the determination unit determines a cosmetic whose calculated similarity is higher than a threshold.
- The cosmetic information providing system according to claim 5, wherein:
the extraction unit extracts a skin color from the acquired first image; and
the first calculation unit calculates a similarity between the second color and the color obtained when the extracted first color is superimposed on the extracted skin color.
- The cosmetic information providing system according to claim 5, further comprising:
an assigning unit that assigns a priority to cosmetics whose calculated similarity is higher than the threshold, based on additional information about the predetermined cosmetics or the user,
wherein the determination unit determines a predetermined number of cosmetics in descending order of priority.
- The cosmetic information providing system according to any one of claims 1 to 7, further comprising:
a third acquisition unit that acquires, after the information is output by the output unit, a fifth image representing a user's face on which makeup of a third color has been applied using the determined cosmetic;
a second calculation unit that calculates a correction value according to a difference between the first color and the third color; and
a correction unit that, when the first acquisition unit acquires a sixth image representing a face on which makeup of a fourth color has been applied, corrects the fourth color using the calculated correction value,
wherein the determination unit determines a cosmetic corresponding to the corrected fourth color.
- A cosmetic information providing apparatus comprising:
a first acquisition unit that acquires a first image representing a face on which makeup of a first color has been applied;
a determination unit that analyzes the acquired first image and determines a cosmetic corresponding to the first color; and
an output unit that outputs information identifying the determined cosmetic.
- A cosmetic information providing method comprising:
a step of acquiring a first image representing a face on which makeup of a first color has been applied;
a step of analyzing the acquired first image and determining a cosmetic corresponding to the first color; and
a step of outputting information identifying the determined cosmetic.
- A program for causing a computer to execute:
a step of acquiring a first image representing a face on which makeup of a first color has been applied;
a step of analyzing the acquired first image and determining a cosmetic corresponding to the first color; and
a step of outputting information identifying the determined cosmetic.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/070212 WO2018008138A1 (ja) | 2016-07-08 | 2016-07-08 | Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program |
US15/322,634 US10607372B2 (en) | 2016-07-08 | 2016-07-08 | Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program |
JP2016555364A JP6055160B1 (ja) | 2016-07-08 | 2016-07-08 | Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/070212 WO2018008138A1 (ja) | 2016-07-08 | 2016-07-08 | Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018008138A1 true WO2018008138A1 (ja) | 2018-01-11 |
Family
ID=57582288
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/070212 WO2018008138A1 (ja) | 2016-07-08 | 2016-07-08 | Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program |
Country Status (3)
Country | Link |
---|---|
US (1) | US10607372B2 (ja) |
JP (1) | JP6055160B1 (ja) |
WO (1) | WO2018008138A1 (ja) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6986676B2 (ja) * | 2016-12-28 | 2021-12-22 | Panasonic Intellectual Property Management Co., Ltd. | Cosmetic presentation system, cosmetic presentation method, and cosmetic presentation server |
US11335118B2 (en) * | 2017-05-02 | 2022-05-17 | Nippon Telegraph And Telephone Corporation | Signal retrieval apparatus, method, and program |
CN109508587A (zh) | 2017-09-15 | 2019-03-22 | Cal-Comp Big Data, Inc. | Body information analysis apparatus and foundation-makeup analysis method thereof |
KR102343251B1 (ko) * | 2018-04-13 | 2021-12-27 | Chanel Parfums Beauté | Method for selecting a cosmetic for an intended user |
CN110811115A (zh) * | 2018-08-13 | 2020-02-21 | Cal-Comp Big Data, Inc. | Electronic cosmetic mirror device and script running method thereof |
US11010636B2 (en) * | 2018-10-25 | 2021-05-18 | L'oreal | Systems and methods for providing personalized product recommendations using deep learning |
US11676354B2 (en) * | 2020-03-31 | 2023-06-13 | Snap Inc. | Augmented reality beauty product tutorials |
CN115699130A (zh) | 2020-03-31 | 2023-02-03 | Snap Inc. | Augmented reality beauty product tutorials |
US11423652B2 (en) | 2020-06-10 | 2022-08-23 | Snap Inc. | Adding beauty products to augmented reality tutorials |
US20220122354A1 (en) * | 2020-06-19 | 2022-04-21 | Pinterest, Inc. | Skin tone determination and filtering |
WO2022215772A1 (ko) * | 2021-04-07 | 2022-10-13 | Lee Ji-hyun | Cosmetic recommendation method and cosmetic display system implementing the same |
US11373059B2 (en) * | 2021-07-23 | 2022-06-28 | MIME, Inc. | Color image analysis for makeup color prediction model |
US11816144B2 (en) | 2022-03-31 | 2023-11-14 | Pinterest, Inc. | Hair pattern determination and filtering |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007257194A (ja) * | 2006-03-22 | 2007-10-04 | Kao Corp | Makeup simulation method |
JP2008003724A (ja) * | 2006-06-20 | 2008-01-10 | Kao Corp | Beauty simulation system |
JP2009251850A (ja) * | 2008-04-04 | 2009-10-29 | Albert:Kk | Product recommendation system using similar-image search |
JP2014149697A (ja) * | 2013-02-01 | 2014-08-21 | Panasonic Corp | Makeup support device, makeup support method, and makeup support program |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001268594A (ja) * | 2000-03-15 | 2001-09-28 | Infiniteface.Com Inc | Client-server system for three-dimensional beauty simulation |
JP2001292832A (ja) * | 2000-04-13 | 2001-10-23 | Shiseido Co., Ltd. | Makeup color image classification method and makeup color image map |
JP2009064423A (ja) * | 2007-08-10 | 2009-03-26 | Shiseido Co., Ltd. | Makeup simulation system, makeup simulation apparatus, makeup simulation method, and makeup simulation program |
JP5261586B2 (ja) | 2007-08-10 | 2013-08-14 | Shiseido Co., Ltd. | Makeup simulation system, makeup simulation apparatus, makeup simulation method, and makeup simulation program |
JP5442966B2 (ja) | 2008-07-10 | 2014-03-19 | Shiseido Co., Ltd. | Game device, game control method, game control program, and recording medium recording the program |
JP4760999B1 (ja) * | 2010-10-29 | 2011-08-31 | Omron Corporation | Image processing apparatus, image processing method, and control program |
JP4862955B1 (ja) * | 2010-10-29 | 2012-01-25 | Omron Corporation | Image processing apparatus, image processing method, and control program |
JP2012181688A (ja) * | 2011-03-01 | 2012-09-20 | Sony Corp | Information processing apparatus, information processing method, information processing system, and program |
JP2014021782A (ja) * | 2012-07-19 | 2014-02-03 | Canon Inc | Image processing apparatus, control method therefor, and program |
WO2014118842A1 (ja) * | 2013-02-01 | 2014-08-07 | Panasonic Corporation | Beauty assistance device, beauty assistance system, beauty assistance method, and beauty assistance program |
EP3005085A1 (en) * | 2013-05-29 | 2016-04-13 | Nokia Technologies Oy | An apparatus and associated methods |
JP6026655B2 (ja) * | 2013-06-07 | 2016-11-16 | FUJIFILM Corporation | Translucency evaluation device, operation method of a translucency evaluation device, translucency evaluation method, and translucency evaluation program |
CN104822292B (zh) * | 2013-08-30 | 2019-01-04 | Panasonic Intellectual Property Management Co., Ltd. | Makeup assistance device, makeup assistance system, makeup assistance method, and makeup assistance program |
WO2015029392A1 (ja) * | 2013-08-30 | 2015-03-05 | Panasonic Intellectual Property Management Co., Ltd. | Makeup support device, makeup support method, and makeup support program |
US10553006B2 (en) * | 2014-09-30 | 2020-02-04 | Tcms Transparent Beauty, Llc | Precise application of cosmetic looks from over a network environment |
JP6275086B2 (ja) * | 2015-07-25 | 2018-02-07 | Optim Corporation | Server, data providing method, and server program |
US10373348B2 (en) * | 2016-08-05 | 2019-08-06 | Optim Corporation | Image processing apparatus, image processing system, and program |
2016
- 2016-07-08: US application US15/322,634 filed; granted as US10607372B2 (status: Active)
- 2016-07-08: PCT application PCT/JP2016/070212 filed; published as WO2018008138A1 (Application Filing)
- 2016-07-08: JP application 2016-555364 filed; granted as JP6055160B1 (status: Active)
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113850096A (zh) * | 2018-04-24 | 2021-12-28 | LG Household & Health Care Ltd. | Mobile terminal |
WO2021153305A1 (ja) * | 2020-01-31 | 2021-08-05 | ZOZO, Inc. | Glasses, recommended-cosmetic presentation control system, and recommended-cosmetic presentation control method |
JP2021121880A (ja) * | 2020-01-31 | 2021-08-26 | ZOZO, Inc. | Glasses, recommended-cosmetic presentation control system, and recommended-cosmetic presentation control method |
JP2021180012A (ja) * | 2020-01-31 | 2021-11-18 | ZOZO, Inc. | Glasses, recommended-cosmetic presentation control system, and recommended-cosmetic presentation control method |
JP2022002116A (ja) * | 2020-01-31 | 2022-01-06 | ZOZO, Inc. | Control system and control method |
JP7469278B2 (ja) | 2020-01-31 | 2024-04-16 | ZOZO, Inc. | Computer system, control method, and program |
WO2023166911A1 (ja) * | 2022-03-04 | 2023-09-07 | ZOZO, Inc. | Information processing device, information processing method, and information processing program |
WO2023166910A1 (ja) * | 2022-03-04 | 2023-09-07 | ZOZO, Inc. | Information processing device, information processing method, and information processing program |
WO2024048274A1 (ja) * | 2022-08-29 | 2024-03-07 | Shiseido Co., Ltd. | Information processing device, information processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP6055160B1 (ja) | 2016-12-27 |
JPWO2018008138A1 (ja) | 2018-07-05 |
US10607372B2 (en) | 2020-03-31 |
US20190197736A1 (en) | 2019-06-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6055160B1 (ja) | Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program | |
US11854070B2 | Generating virtual makeup products | |
US11854072B2 | Applying virtual makeup products | |
US20240193833A1 | System and method for digital makeup mirror | |
RU2668408C2 | Devices, systems, and methods for virtualizing a mirror | |
US10373348B2 | Image processing apparatus, image processing system, and program | |
US20180268572A1 | Makeup part generating apparatus, makeup part utilizing apparatus, makeup part generating method, makeup part utilizing method, non-transitory computer-readable recording medium storing makeup part generating program, and non-transitory computer-readable recording medium storing makeup part utilizing program | |
JP3984191B2 (ja) | Virtual makeup apparatus and method thereof | |
US8498456B2 | Method and system for applying cosmetic and/or accessorial enhancements to digital images | |
JP4435809B2 (ja) | Virtual makeup apparatus and method thereof | |
CN109310196B (zh) | Makeup assisting device and makeup assisting method | |
JP7278724B2 (ja) | Information processing device, information processing method, and information processing program | |
JP2010211308A (ja) | Makeup advice device, makeup advice method, and program | |
JP2011209887A (ja) | Avatar creation method, avatar creation program, and network service system | |
RU2703327C1 | Method for processing a two-dimensional image and a user computing device implementing it | |
CN113454673A (zh) | Information processing apparatus and program | |
CN111429193A (zh) | Method, apparatus, device, and readable storage medium for determining product color parameters | |
KR101520863B1 (ko) | Method for creating a character using face recognition, and terminal supporting the same | |
JP7502711B1 (ja) | Program, information processing method, and information processing device | |
JP7413691B2 (ja) | Image processing apparatus, image processing method, and program | |
JP2024083217A (ja) | Information processing device, information processing method, and program | |
JP2024082674A (ja) | Information processing device, information processing method, and program | |
CN111582965A (zh) | Processing method for augmented reality images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| ENP | Entry into the national phase | Ref document number: 2016555364; Country of ref document: JP; Kind code of ref document: A |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16908182; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06.06.2019) |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16908182; Country of ref document: EP; Kind code of ref document: A1 |
Ref document number: 16908182 Country of ref document: EP Kind code of ref document: A1 |