WO2018025402A1 - Image processing apparatus, image processing system, and program - Google Patents
- Publication number
- WO2018025402A1 (PCT/JP2016/073115)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- makeup
- face image
- image
- color
- face
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/001—Texturing; Colouring; Generation of texture or colour (under G06T11/00—2D [Two Dimensional] image generation)
- G06T11/60—Editing figures and text; Combining figures or text (under G06T11/00—2D [Two Dimensional] image generation)
- G06T1/00—General purpose image data processing
- G06T2200/24—Indexing scheme for image data processing or generation, in general, involving graphical user interfaces [GUIs] (under G06T2200/00)
Definitions
- the present invention relates to an image processing apparatus, an image processing system, and a program.
- Patent Literature 1 describes a makeup simulation system that applies makeup processing to a moving image of a user's face.
- Patent Literature 2 describes a game device that performs a makeup simulation using a user's face image.
- An object of the present invention is to make it easy to recognize the effect of makeup in a simulation of makeup.
- The present invention provides an image processing apparatus including: an acquisition unit that acquires a face image of a user; a generation unit that generates at least one face image with makeup applied, using the acquired face image; a synthesizing unit that combines parts of a plurality of face images, including the acquired face image or the at least one generated face image, to generate a composite image; and an output unit that outputs the generated composite image.
- Brief description of the drawings: FIG. 1 shows an example of the configuration of the purchase support system 1. FIG. 2 shows an example of the hardware configuration of the user terminal 20. FIG. 3 shows an example of the functional configuration of the user terminal 20. FIG. 4 is a flowchart showing an example of the operation of the user terminal 20. FIGS. 5 and 6 show examples of screens. FIG. 8 shows an example of the screen 253. FIG. 9 shows an example of the layer 62-1.
- FIG. 1 is a diagram illustrating an example of a configuration of a purchase support system 1 according to the present embodiment.
- In the purchase support system 1, a makeup simulation using a user's face image is performed in order to support the purchase of cosmetics.
- the purchase support system 1 includes a server 10 and a user terminal 20 (an example of an image processing apparatus).
- the server 10 and the user terminal 20 are connected via the network 2.
- the network 2 is the Internet, for example. However, the network 2 is not limited to the Internet, and may be another communication line.
- the server 10 stores pre-registered cosmetic product information.
- the product information of the cosmetic includes, for example, a cosmetic name, brand name, price, and image data representing the cosmetic.
- the server 10 provides the user terminal 20 with cosmetic product information corresponding to the makeup for which the simulation has been performed.
- User terminal 20 is used by a user.
- the user performs a makeup simulation using the user terminal 20.
- the user browses the product information on the cosmetics provided from the server 10 using the user terminal 20.
- FIG. 2 is a diagram illustrating an example of a hardware configuration of the user terminal 20.
- the user terminal 20 is a computer such as a tablet terminal, a portable terminal, or a personal computer.
- the user terminal 20 includes a processor 21, a memory 22, a communication interface 23, an input unit 24, a display unit 25, an imaging unit 26, and a storage unit 27.
- the processor 21 executes various processes according to the program stored in the memory 22.
- For example, a CPU (Central Processing Unit) is used as the processor 21.
- the memory 22 stores a program executed by the processor 21.
- For example, a RAM (Random Access Memory) is used as the memory 22.
- the communication interface 23 is connected to the network 2 and performs communication via the network 2.
- the input unit 24 is used to operate the user terminal 20 and inputs information corresponding to the operation to the user terminal 20.
- the input unit 24 for example, a touch panel or a button is used.
- the display unit 25 displays various information.
- a liquid crystal display is used as the display unit 25.
- the imaging unit 26 captures an image.
- a camera is used as the imaging unit 26.
- the storage unit 27 stores various programs and data. As the storage unit 27, for example, a flash memory or a hard disk drive is used.
- FIG. 3 is a diagram illustrating an example of a functional configuration of the user terminal 20.
- the user terminal 20 functions as an acquisition unit 201, a generation unit 202, a synthesis unit 203, and an output unit 204. These functions are realized by one or more processors 21 executing a program stored in the memory 22.
- the acquisition unit 201 acquires a user's face image.
- the acquisition unit 201 acquires a user's face image captured by the imaging unit 26.
- the acquisition unit 201 may acquire the user's face image from the storage unit 27.
- the generating unit 202 generates at least one face image with makeup using the face image acquired by the acquiring unit 201.
- The generation unit 202 may generate a plurality of face images, each with makeup applied up to a corresponding step.
- Makeup applied up to a step includes both the makeup applied in that step and the makeup applied in all steps prior to it.
- the synthesizing unit 203 synthesizes a part of a plurality of face images including the face image acquired by the acquiring unit 201 or at least one face image generated by the generating unit 202 to generate a synthesized image.
- the plurality of face images are, for example, two face images of a first face image and a second face image.
- The first face image may be the face image acquired by the acquisition unit 201, that is, the face image before makeup is applied, or a face image with makeup applied up to a first step among a plurality of steps.
- The second face image may be a face image with makeup applied in all of the plurality of steps, or a face image with makeup applied up to a second step among the plurality of steps. Note that the first step and the second step are different steps.
- The output unit 204 outputs various images including the composite image generated by the synthesizing unit 203. For example, the output unit 204 displays the composite image on the display unit 25. As another example, the output unit 204 may transmit the composite image to an external device via the communication interface 23. As another example, when the user terminal 20 includes a printer, the composite image may be printed.
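Read as a whole, the four units form a simple acquire → generate → synthesize → output pipeline. The sketch below is a hypothetical minimal rendering of that flow using a flat numpy array as the "face image"; the class name, the tint used as stand-in makeup, and the 50/50 split are all illustrative assumptions, not details from the embodiment.

```python
import numpy as np

class MakeupPipeline:
    """Acquire a face image, generate a made-up variant, composite, output."""

    def acquire(self):
        # stand-in for the imaging unit 26: a flat 4x4 "face"
        return np.full((4, 4, 3), 0.8, dtype=np.float32)

    def generate(self, face):
        # stand-in makeup: tint the whole image slightly
        return np.clip(face + np.array([0.1, 0.0, -0.1]), 0.0, 1.0)

    def synthesize(self, first, second):
        # left half of `first` joined with right half of `second`
        w = first.shape[1]
        return np.concatenate([first[:, :w // 2], second[:, w // 2:]], axis=1)

    def run(self):
        face = self.acquire()                  # acquisition unit 201
        made_up = self.generate(face)          # generation unit 202
        return self.synthesize(face, made_up)  # synthesis unit 203 -> output

composite = MakeupPipeline().run()
```

The composite's left half stays the bare face while the right half carries the stand-in makeup, mirroring the half-and-half comparison described later.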
- FIG. 4 is a flowchart showing an example of the operation of the user terminal 20 according to the present embodiment.
- a makeup simulation is performed using the user's face image.
- the cosmetics corresponding to the makeup with which the simulation was performed are recommended to the user.
- In step S11, a makeup simulation process is performed. Specifically, the user first captures his or her face image 261 using the imaging unit 26.
- the acquisition unit 201 acquires the face image 261 captured by the imaging unit 26.
- The output unit 204 displays, on the display unit 25, a screen 251 that accepts selection of a part to which makeup is to be applied (hereinafter referred to as the "target part").
- FIG. 5 is a diagram illustrating an example of the screen 251.
- the screen 251 includes a face image 261 before makeup is applied. Further, the screen 251 includes buttons 511 to 514 used for selecting a target part.
- This target part includes face parts such as the entire face, eyes, cheeks, and mouth.
- The "eye" here refers to the eye hole (the eyelid area over which eye shadow is applied), not the eyeball portion between the eyelids.
- the user first paints the foundation color on the entire face area of the face image 261. In this case, the user presses the button 511 using the input unit 24. Thereby, the entire face is selected as the target part.
- the target part may be selected using a cursor displayed on the screen 251.
- the user uses the input unit 24 to move the cursor to the position of the desired target part, and performs an operation of selecting the target part.
- the target part may be selected by such an operation.
- FIG. 6 is a diagram illustrating an example of the screen 252.
- The screen 252 includes a plurality of color samples 521 used for selecting a foundation color. These color samples 521 may differ not only in color type but also in color density and texture. The texture changes depending on, for example, the amount of pearl, the amount of glitter, or the transparency. Note that the color of a color sample 521 may be the color of a product that is actually sold, or a color unrelated to any product that is actually sold.
- the user uses the input unit 24 to perform an operation of selecting the color sample 521 representing the desired foundation color from the color samples 521. Thereby, the color of the foundation is selected.
- the generation unit 202 applies the selected foundation color to the entire face area included in the face image 261. Specifically, the generation unit 202 extracts the entire face area from the face image 261 by image recognition. For this extraction, for example, a well-known face recognition technique using the positional relationship of feature points such as eyes, nose, and mouth may be used. Subsequently, the generation unit 202 generates a new layer 61, and in this layer 61, paints the color of the selected foundation in a range corresponding to the extracted region.
- FIG. 7 is a diagram illustrating an example of the layer 61.
- the layer 61 includes a layer image 611 representing the color portion of the foundation painted on the entire face area.
- When the layer 61 is generated, it is overlaid on the face image 261 shown in FIG. 5. Thereby, the face image 262 painted with the color of the foundation is displayed.
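The layer mechanism just described can be sketched as standard alpha compositing: a transparent RGBA layer is painted only where the extracted region mask is set, then composited over the face image. The array shapes, colors, and 0.5 opacity below are illustrative stand-ins for the face image 261, the extracted face region, and the layer 61.

```python
import numpy as np

def make_layer(shape, mask, color, alpha=1.0):
    """Create an RGBA layer painting `color` inside the boolean `mask`."""
    h, w = shape
    layer = np.zeros((h, w, 4), dtype=np.float32)
    layer[mask, :3] = color   # RGB of the foundation color
    layer[mask, 3] = alpha    # opacity only where the mask is set
    return layer

def overlay(base, layer):
    """Alpha-composite an RGBA layer onto an RGB base image."""
    a = layer[..., 3:4]
    return (1 - a) * base + a * layer[..., :3]

# toy 2x2 "face image" and a mask covering the left column
face = np.full((2, 2, 3), 0.8, dtype=np.float32)
mask = np.array([[True, False], [True, False]])
layer61 = make_layer((2, 2), mask, color=(0.9, 0.7, 0.6), alpha=0.5)
face262 = overlay(face, layer61)
```

Pixels outside the mask are untouched; masked pixels blend the foundation color with the skin according to the layer's opacity.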
- the screen 252 accepts selection of a makeup tool used when applying the foundation.
- the screen 252 includes a plurality of icons 522 used for selecting a makeup tool such as a sponge or a brush.
- Each icon 522 includes an image representing a makeup tool. Note that this image may be an image representing a product that is actually sold, or an illustration representing a makeup tool.
- the user uses the input unit 24 to perform an operation of selecting an icon 522 representing a desired makeup tool from the plurality of icons 522. Thereby, a makeup tool is selected. Note that the makeup tool need not necessarily be selected.
- The generation unit 202 paints the foundation color at a density, or with a uniformity of density, according to the selected makeup tool.
- For example, when the brush is selected, the foundation color may be painted at a lower density than a reference density and with a higher uniformity than a reference uniformity.
- On the other hand, when the sponge is selected, the foundation color may be painted at a higher density than the reference density and with a lower uniformity than the reference uniformity.
- In this way, the density of the makeup color, or the uniformity of that density, changes depending on the makeup tool selected by the user.
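One plausible way to realize this tool-dependent behavior is to scale the layer's opacity for density and smooth the mask edges for uniformity. The per-tool factors and the box blur used below are illustrative assumptions; the patent does not specify concrete values.

```python
import numpy as np

# hypothetical per-tool parameters: (opacity scale, smoothing passes)
TOOLS = {"brush": (0.6, 2), "sponge": (1.0, 0)}

def box_blur(a):
    """One 3x3 box-blur pass with edge replication (adds uniformity)."""
    p = np.pad(a, 1, mode="edge")
    return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def tool_alpha(mask, tool):
    """Turn a painted-region mask into a per-pixel opacity for a tool."""
    scale, passes = TOOLS[tool]
    alpha = mask.astype(np.float32) * scale
    for _ in range(passes):   # more passes -> softer, more uniform edges
        alpha = box_blur(alpha)
    return alpha

mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True
brush = tool_alpha(mask, "brush")    # lighter, smoother
sponge = tool_alpha(mask, "sponge")  # denser, harder-edged
```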
- the output unit 204 displays a screen 253 that accepts selection of the color of the eye shadow on the display unit 25.
- FIG. 8 is a diagram illustrating an example of the screen 253.
- the screen 253 includes a plurality of color samples 531 used for selecting the color of the eye shadow.
- the color of the color sample 531 may be the color of a product that is actually sold, or may be a color that is unrelated to the color of the product that is actually sold.
- the user uses the input unit 24 to perform an operation of selecting a color sample 531 representing a desired eye shadow color from a plurality of color samples 531. Thereby, the color of the eye shadow is selected.
- the generation unit 202 applies the selected eye shadow color to the eye region included in the face image 262. Specifically, the generation unit 202 extracts an eye region from the face image 261 by image recognition. Subsequently, the generation unit 202 generates a new layer 62-1 and paints the selected eye shadow color in a range corresponding to the extracted region in the layer 62-1.
- FIG. 9 is a diagram illustrating an example of the layer 62-1.
- the layer 62-1 includes a layer image 621 representing the color portion of the eye shadow painted on the eye area.
- When the layer 62-1 is generated, it is overlaid on the face image 262. As a result, the face image 263, painted with both the foundation color and the eye shadow color, is displayed.
- the screen 253 accepts selection of a makeup tool used when applying the eye shadow.
- the screen 253 includes a plurality of icons 532 used for selecting a makeup tool such as a tip or a brush.
- Each icon 532 includes an image representing a makeup tool. Note that this image may be an image representing a product that is actually sold, or an illustration of a makeup tool.
- the user uses the input unit 24 to perform an operation of selecting an icon 532 representing a desired makeup tool from the plurality of icons 532. Thereby, a makeup tool is selected. Note that the makeup tool need not necessarily be selected.
- The generation unit 202 may paint the eye shadow color at a density according to the selected makeup tool, or within a range according to the makeup tool. Further, the generation unit 202 may blur the boundary portion of the eye shadow color over a range corresponding to the selected makeup tool. For example, when the tip is selected, the eye shadow color may be painted at a higher density than a reference density and within a smaller range than a reference range. On the other hand, when the brush is selected, the eye shadow color may be painted at a lower density than the reference density and within a larger range than the reference range, and the boundary portion may be blurred over a larger range than a reference blur range. In this way, the makeup tool selected by the user changes the density of the makeup color, the range in which the makeup is applied, or the range over which the boundary portion of the makeup color is blurred.
- the user may apply the eye shadow color by overlapping it a plurality of times.
- In that case, the user again uses the input unit 24 to perform an operation of selecting a color sample 531 representing a desired eye shadow color from among the plurality of color samples 531 shown in FIG. 8. Thereby, the color of the eye shadow to be applied the second time is selected.
- When the same color is selected, applying the eye shadow color repeatedly makes the color darker.
- In other words, the density of the makeup color increases with the number of overlapping applications.
- When a different color is selected, applying the eye shadow color in an overlapping manner changes the resulting color. Specifically, the color becomes a combination of the eye shadow color applied the first time and the eye shadow color applied the second time. In this way, when different makeup colors are applied in a superimposed manner, the makeup color changes to a color obtained by combining all the overlaid makeup colors.
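Both behaviors, darkening on repeated application of the same color and blending of different colors, fall out of ordinary repeated alpha compositing; the colors and the 0.4 opacity here are illustrative.

```python
import numpy as np

def apply_coat(base, color, alpha):
    """Composite one semi-transparent coat of `color` over `base`."""
    return (1 - alpha) * base + alpha * np.asarray(color, dtype=np.float32)

skin = np.array([0.8, 0.7, 0.6], dtype=np.float32)
brown = (0.4, 0.25, 0.2)

once = apply_coat(skin, brown, 0.4)              # first coat
twice = apply_coat(once, brown, 0.4)             # same color again -> darker
mixed = apply_coat(once, (0.3, 0.2, 0.5), 0.4)   # different color -> blend
```

A second coat of the same (darker-than-skin) color moves every channel further toward the pigment, while a different second color pulls the result toward a mixture of both pigments.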
- a different layer may be generated each time the color of the eye shadow is applied once. For example, when the eye shadow color is applied for the first time, the layer 62-1 shown in FIG. 9 is generated. When the eye shadow color is applied for the second time, a new layer 62-2 is generated.
- FIG. 10 is a diagram illustrating an example of the layer 62-2. Like the layer 62-1, the layer 62-2 includes a layer image 622 representing the color portion of the eye shadow painted on the eye area. When the layer 62-2 is generated, it is overlaid on the face image 263. As a result, the face image 264, in which the foundation color is applied and the eye shadow color is applied twice, is displayed.
- FIG. 11 is a diagram illustrating an example of a process for applying makeup.
- the step of applying makeup includes steps 1, 2-1, 2-2, 3, and 4. These steps are performed in the order of steps 1, 2-1, 2-2, 3, and 4.
- In step 1, the foundation color is applied to the entire face area.
- In step 1, the layer 61 shown in FIG. 7 is generated.
- In step 2-1, the eye shadow color is applied to the eye region (first application).
- In step 2-1, the layer 62-1 shown in FIG. 9 is generated.
- In step 2-2, the eye shadow color is applied over the eye region (second application).
- In step 2-2, the layer 62-2 shown in FIG. 10 is generated.
- In step 3, a cheek color is applied to the cheek region.
- In step 3, a layer including a layer image representing the cheek color portion applied to the cheek region is generated.
- In step 4, lipstick is applied to the mouth region.
- In step 4, a layer including a layer image representing the lipstick color portion painted in the mouth region is generated.
- Steps 1, 3, and 4 are steps separated based on the area where makeup is applied.
- Steps 2-1 and 2-2 are steps separated based on the number of times makeup is applied.
- In this example, the step of applying the eye shadow color is divided into steps 2-1 and 2-2, but these steps may be treated as one step.
- The generating unit 202 generates, for each step shown in FIG. 11, a face image on which makeup has been applied up to that step.
- the face image 262 after applying makeup in step 1 is generated by overlaying the layer 61 shown in FIG. 7 on the face image 261 before applying makeup.
- a face image 263 after applying makeup in steps 1 and 2-1 is generated by overlaying the layer 62-1 shown in FIG. 9 on the face image 262.
- The face image 264 after applying the makeup of steps 1, 2-1, and 2-2 is generated by overlaying the layer 62-2 shown in FIG. 10 on the face image 263.
- In general, the face image for a given step is generated by sequentially superimposing, on the face image 261 before makeup is applied, all the layers generated up to and including that step.
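This stacking can be sketched by folding the same alpha-compositing overlay over all layers generated up to a step; the single-pixel toy layers and the list-based step bookkeeping are illustrative assumptions.

```python
from functools import reduce
import numpy as np

def overlay(base, layer):
    """Alpha-composite one RGBA layer onto an RGB base image."""
    a = layer[..., 3:4]
    return (1 - a) * base + a * layer[..., :3]

def face_up_to(base, layers, step):
    """Composite every layer generated in steps 1..step onto the bare face."""
    return reduce(overlay, layers[:step], base)

# toy data: a bare one-pixel face plus two RGBA layers
base = np.full((1, 1, 3), 0.8, dtype=np.float32)
l1 = np.array([[[0.9, 0.7, 0.6, 0.5]]], dtype=np.float32)  # "foundation"
l2 = np.array([[[0.4, 0.2, 0.2, 0.3]]], dtype=np.float32)  # "eye shadow"

face0 = face_up_to(base, [l1, l2], 0)  # before makeup
face1 = face_up_to(base, [l1, l2], 1)  # after step 1
face2 = face_up_to(base, [l1, l2], 2)  # after steps 1 and 2
```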
- the screen displayed in this simulation process includes a return button 515.
- When the user wants to change the makeup content, the user presses the return button 515 using the input unit 24.
- the output unit 204 displays a list 254 of processes for applying makeup on the display unit 25.
- the process list 254 may be included in a part of the screen already displayed on the display unit 25, or may be displayed in a pop-up format on this screen.
- FIG. 12 is a diagram showing an example of a list 254 of processes for applying makeup.
- the process list 254 includes icons 541 to 546 used for selecting each process.
- Each of the icons 541 to 546 includes thumbnails of face images on which makeup has been applied up to the corresponding process.
- the icon 541 is used to select a state before makeup.
- the icon 541 includes a thumbnail of the face image 261 before makeup is applied.
- the icon 542 is used for selection of the process 1.
- the icon 542 includes a thumbnail of the face image 262 after the makeup is applied in step 1.
- The icon 543 is used for selection of step 2-1.
- the icon 543 includes a thumbnail of the face image 263 after the makeup is applied in step 2-1.
- For example, assume that, after proceeding to step 4, the user wants to change the color of the foundation applied in step 1 (an example of the target step).
- the user performs an operation of selecting the icon 542 using the input unit 24.
- the output unit 204 displays the screen 252 shown in FIG. 6 on the display unit 25 again.
- the user uses the input unit 24 to perform an operation of selecting a color sample 521 representing the color of another foundation from the plurality of color samples 521. This selects a different foundation color.
- The generation unit 202 changes the foundation color applied to the entire face area in step 1 to the selected other color. This change in the foundation color is reflected in all the face images to which the makeup of step 1 has been applied. Specifically, the generation unit 202 changes the color of the layer image 611 included in the layer 61 illustrated in FIG. 7 to the selected other color. As a result, the foundation color is changed in all the face images including the layer 61, that is, in all the face images generated for step 1 and the subsequent steps.
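Because each per-step image can be recomposited from the stored layers, recoloring a single layer and recompositing is enough to propagate the change to every later step. A sketch under the same toy RGBA-layer representation used above:

```python
import numpy as np

def overlay(base, layer):
    """Alpha-composite one RGBA layer onto an RGB base image."""
    a = layer[..., 3:4]
    return (1 - a) * base + a * layer[..., :3]

def recolor(layer, color):
    """Replace the RGB of a layer wherever it is painted (alpha > 0)."""
    out = layer.copy()
    out[layer[..., 3] > 0, :3] = color
    return out

base = np.full((1, 1, 3), 0.8, dtype=np.float32)
layer61 = np.array([[[0.9, 0.7, 0.6, 0.5]]], dtype=np.float32)

before = overlay(base, layer61)
layer61 = recolor(layer61, (0.6, 0.5, 0.4))  # pick another foundation color
after = overlay(base, layer61)               # recomposite reflects the change
```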
- the generation unit 202 may generate a new layer, and in this layer, paint a different foundation color in a range corresponding to the entire face area. In this case, this new layer is used instead of the layer 61.
- At least one process included in the plurality of processes may be deleted later.
- For example, assume that step 2-1 (an example of the target step) is deleted.
- In this case, the user performs an operation of selecting and deleting the icon 543 shown in FIG. 12. Thereby, step 2-1 is deleted.
- The generation unit 202 then deletes the layer 62-1 generated in step 2-1.
- Thereby, the color portion of the eye shadow applied the first time is deleted from all the face images that included the layer 62-1, that is, all the face images generated for step 2-1 and the subsequent steps.
- the screen displayed in the simulation process includes an end button 516.
- the user presses the end button 516 using the input unit 24.
- When the end button 516 is pressed, the process proceeds to step S12 shown in FIG. 4.
- In step S12, the synthesizing unit 203 combines the left half of the face image 261 (an example of the first face image) before makeup is applied with the right half of a face image (an example of the second face image) on which makeup has been applied in all steps, to generate a composite image 551.
- Here, the face image on which makeup has been applied in all steps is, when makeup is applied in steps 1, 2-1, 2-2, 3, and 4, the face image after the makeup of all of these steps has been applied.
- Specifically, the synthesizing unit 203 first cuts out the left half from the face image 261.
- The synthesizing unit 203 then cuts out the right half from the user's face image on which makeup has been applied in all steps, and combines these parts to generate the composite image 551.
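The half-and-half composition can be sketched as a column split and concatenation; the 50/50 split and the stand-in images below are illustrative.

```python
import numpy as np

def compose_halves(left_img, right_img):
    """Join the left half of one face image with the right half of another."""
    assert left_img.shape == right_img.shape
    w = left_img.shape[1]
    return np.concatenate([left_img[:, :w // 2], right_img[:, w // 2:]], axis=1)

bare = np.zeros((2, 4, 3), dtype=np.float32)    # stand-in for face image 261
made_up = np.ones((2, 4, 3), dtype=np.float32)  # stand-in for all-steps image
composite = compose_halves(bare, made_up)       # stand-in for image 551
```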
- the output unit 204 displays a screen 255 including the composite image 551 on the display unit 25.
- FIG. 13 is a diagram illustrating an example of the screen 255.
- This screen 255 includes a composite image 551.
- the left half of the composite image 551 is a user's face image before applying makeup, and the right half is a user's face image after applying makeup in all steps.
- In addition, the screen 255 includes the process list 254 shown in FIG. 12. The user can freely change the left half or the right half of the composite image 551 by selecting a desired step.
- For example, a case is assumed where the left half of the composite image 551 is changed to the face image 263 (an example of the first face image), on which makeup has been applied up to step 2-1 (an example of the first step), and the right half of the composite image 551 is changed to the face image 264 (an example of the second face image), on which makeup has been applied up to step 2-2 (an example of the second step).
- the user performs an operation of selecting the icons 543 and 544 using the input unit 24.
- the synthesis unit 203 cuts out the left half from the face image 263. Further, the synthesis unit 203 cuts out the right half from the face image 264.
- the synthesizing unit 203 then synthesizes these parts to generate a synthesized image 552.
- When the composite image 552 is generated, the composite image 552 is displayed instead of the composite image 551.
- The screen 255 includes an enter button 553. If the user likes the simulated makeup, the user presses the enter button 553 using the input unit 24. When the enter button 553 is pressed, the process proceeds to step S13 shown in FIG. 4.
- step S13 the output unit 204 displays the purchase screen 256 on the display unit 25.
- This purchase screen 256 includes product information on cosmetics used for the makeup for which simulation has been performed.
- This cosmetic product information is provided from the server 10.
- the output unit 204 transmits information indicating the makeup on which the simulation is performed to the server 10.
- This information includes information indicating the color of makeup.
- The server 10 determines a cosmetic corresponding to the makeup for which the simulation has been performed. For example, when the simulated makeup color is the color of a cosmetic product that is actually sold, that cosmetic product may be determined.
- Alternatively, a cosmetic product used for makeup having a color similar to the simulated makeup color may be determined.
- Here, assume that foundation A, eye shadow B, blusher C, and lipstick D are determined.
- the server 10 transmits the determined cosmetic product information to the user terminal 20.
- FIG. 14 is a diagram illustrating an example of the purchase screen 256.
- The purchase screen 256 includes product information of the foundation A, the eye shadow B, the blusher C, and the lipstick D.
- the purchase screen 256 includes purchase buttons 561 to 564.
- Purchase buttons 561 to 564 are used to execute the purchase of the foundation A, the eye shadow B, the blusher C, and the lipstick D, respectively.
- the user presses the purchase button 562 using the input unit 24.
- When the purchase button 562 is pressed, a procedure for purchasing the eye shadow B by electronic commerce is performed. Thereby, the purchase of the eye shadow B is completed.
- According to the present embodiment, the face images before and after makeup is applied, or the face images before and after the makeup of a certain step, can be compared within one face image. Therefore, it becomes easy to recognize the effect of makeup in a makeup simulation.
- the present invention is not limited to the above-described embodiment.
- The embodiment may be modified as follows. The following modifications may also be combined.
- a makeup simulation may be performed using a cosmetic or a makeup tool owned by the user.
- the storage unit 27 of the user terminal 20 stores information indicating cosmetics or makeup tools owned by the user in advance.
- Each screen displayed in the simulation process described above includes a call button.
- the output unit 204 displays a list of the cosmetics or makeup tools on the display unit 25. This list of cosmetics or makeup tools may be included in a part of the screen already displayed on the display unit 25, or may be displayed in a pop-up format on this screen.
- the user uses the input unit 24 to select a desired cosmetic or makeup tool from the list of cosmetics or makeup tools.
- the generation unit 202 generates a face image with makeup according to the selected cosmetic or makeup tool. For example, the generation unit 202 applies a makeup color corresponding to the selected cosmetic to the target portion of the face image.
- In this way, the user can perform a makeup simulation using cosmetics or makeup tools that the user owns. Therefore, for example, the user can purchase cosmetics for makeup that matches the colors of the cosmetics the user already owns.
- the makeup color displayed on the display unit 25 may be different from the actual makeup color.
- the actual makeup color can be estimated from the makeup color displayed on the display unit 25.
- a background or light effect corresponding to the scene may be added to the composite image.
- A scene here refers to a scene according to time and place, such as outdoors in the daytime, an office in the daytime, or a restaurant at night.
- In each scene, the light that illuminates the user differs in, for example, intensity, color, direction, and light source position.
- The light effect is an image simulating the light that illuminates the user in each scene.
- the synthesizing unit 203 may add a typical image representing the interior of a restaurant at night to the background of the synthesized image 551 shown in FIG. 13, for example.
- the synthesis unit 203 may add a light effect imitating lighting typically used in a restaurant at night to the synthesized image 551.
- a shadow may be added to the composite image 551 in accordance with the unevenness of the user's face. The unevenness of the user's face may be set in advance.
- Alternatively, a three-dimensional composite image may be generated, and a shadow may be added to that composite image. The three-dimensional composite image may be generated, for example, from images of the user's head captured with the imaging unit 26 from all directions over 360 degrees.
- a sound that matches the scene selected by the user may be output.
- the user terminal 20 includes a speaker that outputs sound. For example, when a night restaurant scene is selected as described above, music with a calm atmosphere that is played in a night restaurant may be output.
- A face image after time has elapsed since makeup was applied may be displayed, in order to show how the makeup deteriorates over time.
- The state where the makeup has deteriorated includes, for example, a state where the makeup color has come off, a state where the makeup color has become dull, a state where gloss has increased due to sweat and sebum, and a state where the makeup color has spread to other areas.
- The generation unit 202 generates a face image after a predetermined time has elapsed since the makeup was applied, in order to simulate the state in which the makeup has deteriorated. Specifically, the generation unit 202 generates the face image after the predetermined time has elapsed by changing, relative to the face image immediately after makeup, the makeup color, the color of the face image, or the region where the makeup is applied.
- For example, the face image after the predetermined time has elapsed may have a lower density or saturation of the makeup color than the face image immediately after makeup. It may also be partially glossier than the face image immediately after makeup. Furthermore, the region where the makeup color is applied may be enlarged compared to the face image immediately after makeup.
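A crude way to simulate such elapsed time is to pull the made-up image back toward the bare face (lowering makeup density) and then mix in a little of its gray value (lowering saturation); the `fade` and `desat` factors are illustrative parameters, not values from the embodiment.

```python
import numpy as np

def aged(bare, made_up, fade=0.5, desat=0.2):
    """Fade makeup toward the bare face and slightly reduce saturation."""
    faded = (1 - fade) * made_up + fade * bare  # makeup density drops
    gray = faded.mean(axis=-1, keepdims=True)
    return (1 - desat) * faded + desat * gray   # colors become duller

bare = np.full((1, 1, 3), 0.8, dtype=np.float32)
made_up = np.array([[[0.9, 0.5, 0.4]]], dtype=np.float32)
later = aged(bare, made_up)
```

The channel spread of `later` is smaller than that of `made_up`, i.e. the makeup color has become both lighter and duller.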
- the synthesizing unit 203 may generate a synthesized image by synthesizing the left half of the face image immediately after makeup and the right half of the face image after a predetermined time has elapsed.
- The degree of makeup deterioration may vary depending on the cosmetic.
- selection of a cosmetic used for makeup may be received, and the above-described color change amount or region change amount may be changed depending on the selected cosmetic product.
- In addition, the synthesizing unit 203 may generate a composite image by combining the left half of a face image after a predetermined time has elapsed since makeup was applied using one cosmetic with the right half of a face image after the same time has elapsed since makeup was applied using another cosmetic.
- Modification 4: The method of generating the composite image is not limited to the method described in the embodiment.
- For example, the upper half of one face image and the lower half of another face image may be combined.
- As another example, half of a specific region of one face image may be combined with the portion of another face image other than that half of the region.
- For instance, the right half of the mouth region of one face image may be combined with the remaining regions of another face image, that is, the left half of the mouth region and the regions other than the mouth.
- Parts of three or more face images may also be combined.
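All of the compositing variations above reduce to choosing one boolean mask per source image. A minimal sketch (the function names and the NumPy representation are illustrative assumptions; the patent does not specify an implementation):

```python
import numpy as np

def composite(images, masks):
    """Combine parts of several face images into one composite image.

    images: list of (H, W, 3) arrays, all the same shape
    masks: list of boolean (H, W) arrays; masks[i] selects the pixels
           taken from images[i], and together the masks tile the image.
    """
    out = np.zeros_like(images[0])
    for img, m in zip(images, masks):
        out[m] = img[m]  # copy this image's pixels where its mask is True
    return out

def left_half_mask(h, w):
    """Mask selecting the left half of an h-by-w image."""
    m = np.zeros((h, w), dtype=bool)
    m[:, : w // 2] = True
    return m
```

Halves, upper/lower splits, a mouth-region split, or three or more sources are all just different mask choices passed to the same `composite` call.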
- Modification 5 After the composite image has been generated, the ratio of the plurality of face images constituting the composite image may be changed.
- As shown in FIG. 13, the screen 255 includes a dividing line 554 that indicates the boundary between the plurality of face images forming the composite image.
- the user performs an operation of moving the dividing line 554 using the input unit 24.
- When the dividing line 554 is moved, the synthesizing unit 203 re-synthesizes the two face images constituting the composite image 551 according to the ratio corresponding to the position of the dividing line 554 after the movement.
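One way to picture this re-synthesis is to treat the dividing line's column as the split point. This is a hypothetical sketch (the name `recomposite_at` and the column-based split are assumptions, not the patented method):

```python
import numpy as np

def recomposite_at(line_x, img_left, img_right):
    """Re-synthesize a composite after the dividing line is dragged to
    column `line_x`: columns left of the line come from img_left,
    the remaining columns come from img_right.

    img_left, img_right: (H, W, 3) arrays of the same shape
    """
    h, w, _ = img_left.shape
    line_x = max(0, min(w, int(line_x)))  # clamp the line to the image
    out = np.empty_like(img_left)
    out[:, :line_x] = img_left[:, :line_x]
    out[:, line_x:] = img_right[:, line_x:]
    return out
```

Dragging the line rightward grows the left image's share of the composite and shrinks the right image's share, matching the ratio change described above.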
- Modification 6 In the embodiment described above, parts of the plurality of face images were combined and displayed. However, these face images may instead be displayed alternately without being combined.
- As another example, in the composite image, the right halves or left halves of the face images made up through each of the steps preceding the target step may be displayed in order.
- For example, the face image constituting the right half of the composite image 551 shown in FIG. 13 is the face image after the makeup of steps 1, 2-1, 2-2, 3, and 4 has been applied.
- In this case, the right halves of the face images made up through each of these steps may be displayed in order in the right half of the composite image 551.
- If the display time of each face image is shortened, the makeup of each step appears to be applied in order to the right half of the user's face image, like a moving image.
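This step-by-step right-half display can be sketched as a frame generator (illustrative only; `right_half_frames` and the array representation are assumptions, not part of the disclosure):

```python
import numpy as np

def right_half_frames(bare, step_images):
    """Yield one composite per makeup step: the left half stays the bare
    face, while the right half cycles through the face image after each
    step. Shown quickly in order, the frames play like a short animation.

    bare: (H, W, 3) array; step_images: list of (H, W, 3) arrays
    """
    h, w, _ = bare.shape
    for step in step_images:
        frame = bare.copy()
        frame[:, w // 2 :] = step[:, w // 2 :]  # swap in this step's right half
        yield frame
```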
- Modification 7 The purchase screen 256 may include product information on the makeup tools used in the makeup simulation.
- the server 10 stores pre-registered product information of the makeup tool.
- the user terminal 20 transmits information indicating the makeup tool used in the makeup simulation to the server 10.
- the server 10 determines a makeup tool corresponding to the makeup tool used in the makeup simulation based on the information indicating the makeup tool received from the user terminal 20, and transmits product information of the makeup tool to the user terminal 20.
- The makeup tool determined by the server 10 may be, for example, a makeup tool that is the same as or similar to the one used in the makeup simulation, or a makeup tool having a similar function.
- Modification 8 The makeup simulation process described above may be performed jointly by the user terminal 20 used by the user and a terminal different from the user terminal 20.
- FIG. 15 is a diagram illustrating an example of the purchase support system 3 (an example of an image processing system) according to a modification.
- the purchase support system 3 includes a terminal 30 in addition to the server 10 and the user terminal 20 shown in FIG.
- the user terminal 20 and the terminal 30 are connected via the network 2.
- the terminal 30 is a computer such as a tablet terminal, a portable terminal, or a personal computer.
- the terminal 30 is used, for example, by a professional makeup artist.
- the user terminal 20 transmits the user's face image captured by the imaging unit 26 to the terminal 30.
- the terminal 30 applies makeup to the face image received from the user terminal 20, and generates a face image after makeup.
- the terminal 30 has the function of the generation unit 202 described above.
- the terminal 30 transmits the face image after makeup to the user terminal 20.
- The user terminal 20 combines a part of the face image before makeup with a part of the face image after makeup to generate a composite image, and outputs it. By viewing the composite image, the user can learn makeup suited to his or her own face.
- Modification 9 The steps of the processing performed in the purchase support system 1 are not limited to the example described in the embodiment above. These steps may be reordered as long as no contradiction arises.
- the present invention may be provided as a method including steps of processing performed in the purchase support system 1 or the user terminal 20.
- the present invention may be provided as a program executed on the user terminal 20.
- This program may be downloaded via the network 2 such as the Internet.
- These programs may be provided in a state where they are recorded on a computer-readable recording medium such as a magnetic recording medium (magnetic tape, magnetic disk, etc.), an optical recording medium (optical disc, etc.), a magneto-optical recording medium, or a semiconductor memory.
Abstract
Description
The present invention aims to make the effect of makeup easy to recognize in a makeup simulation.
FIG. 1 is a diagram illustrating an example of the configuration of the purchase support system 1 according to the present embodiment. In the purchase support system 1, a makeup simulation using the user's face image is performed to support the purchase of cosmetics. The purchase support system 1 includes the server 10 and the user terminal 20 (an example of an image processing apparatus). The server 10 and the user terminal 20 are connected via the network 2. The network 2 is, for example, the Internet; however, the network 2 is not limited to the Internet and may be another communication line.
FIG. 4 is a flowchart illustrating an example of the operation of the user terminal 20 according to the present embodiment. In this operation, a makeup simulation is performed using the user's face image, and cosmetics corresponding to the simulated makeup are then recommended to the user.
FIG. 6 is a diagram illustrating an example of the screen 252. The screen 252 includes a plurality of color swatches 521 used to select a foundation color. These color swatches 521 may include not only swatches of different hues but also swatches of different color densities and textures. The texture varies with, for example, the amount of pearl, the amount of glitter, or the transparency. The colors of the swatches 521 may be colors of products actually on sale, or colors unrelated to any actual product. Using the input unit 24, the user performs an operation of selecting the swatch 521 representing the desired foundation color, whereby the foundation color is selected.
FIG. 11 is a diagram illustrating an example of the steps of applying makeup. In this example, the steps of applying makeup include steps 1, 2-1, 2-2, 3, and 4, performed in that order.
The present invention is not limited to the embodiment described above. The embodiment may be modified as follows, and the following modifications may be combined.
In the embodiment described above, the makeup simulation may be performed using cosmetics or makeup tools owned by the user. In this case, information indicating the cosmetics or makeup tools owned by the user is stored in advance in the storage unit 27 of the user terminal 20. Each screen displayed in the simulation process described above includes a call button. When the user presses this call button using the input unit 24, a list of the cosmetics or makeup tools owned by the user is generated based on the information stored in the storage unit 27. The output unit 204 displays this list on the display unit 25. The list may be included in a part of the screen already displayed on the display unit 25, or may be displayed on top of that screen in a pop-up form.
In the embodiment described above, a background or light effect corresponding to a scene may be added to the composite image. A scene here means a setting defined by time or place, such as outdoors in the daytime, an office in the daytime, or a restaurant at night. These scenes differ in how the user is lit: the intensity, color, and direction of the light, the position of the light source, and so on. A light effect is an image imitating the light that illuminates the user in each scene.
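The scene-dependent light effect described above could be approximated with a simple "screen" blend of a light-effect image over the composite. This sketch, including the `apply_scene` name and the blend choice, is an assumption for illustration, not the patented method:

```python
import numpy as np

def apply_scene(composite, light_overlay, strength=0.5):
    """Add a scene-dependent light effect (e.g. warm restaurant light).

    composite, light_overlay: (H, W, 3) float arrays in [0, 1];
    light_overlay imitates the light of the chosen scene.
    strength: how strongly the scene lighting affects the result.
    """
    # 'Screen' blend brightens the image where the overlay is bright.
    screened = 1.0 - (1.0 - composite) * (1.0 - light_overlay)
    # Mix the screened result back in according to the chosen strength.
    return (1.0 - strength) * composite + strength * screened
```

A daytime-outdoor scene might use a bright, neutral overlay, while a night-restaurant scene might use a dim, warm one.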
In the simulation process described above, a face image after time has elapsed since the makeup was applied may be displayed. When makeup is actually applied to the user's face, it breaks down as time passes. The broken-down state includes, for example, a state where the makeup color has come off, a state where the makeup color has dulled, a state where gloss has increased due to sweat and sebum, or a state where the makeup color has spread to other areas. To simulate this state, the generation unit 202 generates a face image after a predetermined time has elapsed since the makeup was applied. Specifically, the generation unit 202 generates the face image after the predetermined time has elapsed by changing, from the face image immediately after makeup, the makeup color, the color of the face image, or the region where the makeup is applied.
The method of generating the composite image is not limited to the method described in the embodiment. For example, the upper half of one face image and the lower half of another face image may be combined. As another example, half of a specific region of one face image may be combined with the portion of another face image other than that half. For instance, the right half of the mouth region of one face image may be combined with the remaining regions of another face image, that is, the left half of the mouth region and the regions other than the mouth. Parts of three or more face images may also be combined.
In the embodiment described above, the ratio of the plurality of face images constituting the composite image may be changed after the composite image has been generated. As shown in FIG. 13, the screen 255 includes a dividing line 554 indicating the boundary between the plurality of face images constituting the composite image. Using the input unit 24, the user performs an operation of moving the dividing line 554. Assume here that the dividing line 554 on the composite image 551 is moved. When the dividing line 554 is moved, the synthesizing unit 203 re-synthesizes the two face images constituting the composite image 551 according to the ratio corresponding to the position of the dividing line 554 after the movement. As a result, when the dividing line 554 on the composite image 551 is moved rightward as seen facing the screen 255, for example, the portion of the face image constituting the right half of the composite image 551 decreases and the portion constituting the left half increases. Conversely, when the dividing line 554 is moved leftward as seen facing the screen 255, the portion of the face image constituting the left half decreases and the portion constituting the right half increases.
In the embodiment described above, parts of the plurality of face images were combined and displayed. However, these face images may be displayed alternately without being combined. As another example, in the composite image, the right halves or left halves of the face images made up through each of the steps preceding the target step may be displayed in order. For example, the face image constituting the right half of the composite image 551 shown in FIG. 13 is the face image after the makeup of steps 1, 2-1, 2-2, 3, and 4 has been applied. In this case, the right halves of the face images made up through each of these steps may be displayed in order in the right half of the composite image 551. If the display time of each face image is shortened, the makeup of each step appears to be applied in order to the right half of the user's face image, like a moving image.
In the embodiment described above, the purchase screen 256 may include product information on the makeup tools used in the makeup simulation. In this case, product information on makeup tools registered in advance is stored in the server 10. The user terminal 20 transmits information indicating the makeup tools used in the makeup simulation to the server 10. Based on the information received from the user terminal 20, the server 10 determines makeup tools corresponding to those used in the makeup simulation and transmits their product information to the user terminal 20. The makeup tools determined by the server 10 may be, for example, tools that are the same as or similar to those used in the makeup simulation, or tools having similar functions.
The makeup simulation process described above may be performed jointly by the user terminal 20 used by the user and a terminal different from the user terminal 20.
The steps of the processing performed in the purchase support system 1 are not limited to the example described in the embodiment above. These steps may be reordered as long as no contradiction arises. The present invention may also be provided as a method including the steps of the processing performed in the purchase support system 1 or the user terminal 20.
The present invention may be provided as a program executed on the user terminal 20. This program may be downloaded via the network 2, such as the Internet. These programs may also be provided recorded on a computer-readable recording medium such as a magnetic recording medium (magnetic tape, magnetic disk, etc.), an optical recording medium (optical disc, etc.), a magneto-optical recording medium, or a semiconductor memory.
Claims (10)
- An image processing apparatus comprising:
an acquisition unit that acquires a face image of a user;
a generation unit that generates, using the acquired face image, at least one face image to which makeup has been applied;
a synthesizing unit that generates a composite image by combining parts of a plurality of face images including the acquired face image or the generated at least one face image; and
an output unit that outputs the generated composite image.
- The image processing apparatus according to claim 1, wherein the step of applying the makeup includes a plurality of steps,
the plurality of face images include a first face image and a second face image,
the first face image is the acquired face image or a face image to which makeup up to a first step of the plurality of steps has been applied, and
the second face image is a face image to which the makeup of the plurality of steps has been applied, or a face image to which makeup up to a second step of the plurality of steps, different from the first step, has been applied.
- The image processing apparatus according to claim 1 or 2, wherein the step of applying the makeup includes a plurality of steps, and
when the color of the makeup applied in a target step among the plurality of steps is changed, the generation unit reflects the change of the makeup color in the at least one face image.
- The image processing apparatus according to any one of claims 1 to 3, wherein the step of applying the makeup includes a plurality of steps, and
when a target step is deleted from the plurality of steps, the generation unit deletes the color portion of the makeup applied in the target step from the at least one face image.
- The image processing apparatus according to any one of claims 2 to 4, wherein, when the makeup is applied to a plurality of regions included in the face image, the step of applying the makeup is divided based on the regions to which the makeup is applied, or, when the makeup color is applied in multiple overlapping coats, the step is divided based on the coat in which the makeup color is applied.
- The image processing apparatus according to any one of claims 1 to 5, wherein the generation unit changes, depending on a makeup tool selected by the user, the density of the makeup color, the uniformity of the density, the range over which the makeup is applied, or the range over which the boundary of the color is blurred.
- The image processing apparatus according to any one of claims 1 to 6, wherein the synthesizing unit adds a background or light effect corresponding to a scene selected by the user to the composite image.
- The image processing apparatus according to any one of claims 1 to 7, wherein, when the ratio of the plurality of face images constituting the composite image is changed, the synthesizing unit combines the parts of the plurality of face images according to the changed ratio.
- An image processing system comprising:
an acquisition unit that acquires a face image of a user;
a generation unit that generates, using the acquired face image, at least one face image to which makeup has been applied;
a synthesizing unit that generates a composite image by combining parts of a plurality of face images including the acquired face image or the generated at least one face image; and
an output unit that outputs the generated composite image.
- A program for causing a computer to execute:
a step of acquiring a face image of a user;
a step of generating, using the acquired face image, at least one face image to which makeup has been applied;
a step of generating a composite image by combining parts of a plurality of face images including the acquired face image or the generated at least one face image; and
a step of outputting the generated composite image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/096,341 US10373348B2 (en) | 2016-08-05 | 2016-08-05 | Image processing apparatus, image processing system, and program |
PCT/JP2016/073115 WO2018025402A1 (ja) | 2016-08-05 | 2016-08-05 | Image processing apparatus, image processing system, and program |
JP2018531712A JP6448869B2 (ja) | 2016-08-05 | 2016-08-05 | Image processing apparatus, image processing system, and program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/073115 WO2018025402A1 (ja) | 2016-08-05 | 2016-08-05 | Image processing apparatus, image processing system, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018025402A1 true WO2018025402A1 (ja) | 2018-02-08 |
Family
ID=61073550
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/073115 WO2018025402A1 (ja) | 2016-08-05 | 2016-08-05 | 画像処理装置、画像処理システム、及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US10373348B2 (ja) |
JP (1) | JP6448869B2 (ja) |
WO (1) | WO2018025402A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021072515A (ja) * | 2019-10-30 | 2021-05-06 | 大日本印刷株式会社 | Image processing apparatus, printed-matter production system, and printed-matter production method |
JP2021519992A (ja) * | 2018-04-24 | 2021-08-12 | エルジー ハウスホールド アンド ヘルスケア リミテッド | Mobile terminal and automatic cosmetics recognition system |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018008138A1 (ja) * | 2016-07-08 | 2018-01-11 | 株式会社オプティム | Cosmetics information providing system, cosmetics information providing apparatus, cosmetics information providing method, and program |
CN106780662B (zh) | 2016-11-16 | 2020-09-18 | 北京旷视科技有限公司 | Face image generation method, apparatus, and device |
CN106780658B (zh) * | 2016-11-16 | 2021-03-09 | 北京旷视科技有限公司 | Face feature addition method, apparatus, and device |
EP3700190A1 (en) * | 2019-02-19 | 2020-08-26 | Samsung Electronics Co., Ltd. | Electronic device for providing shooting mode based on virtual character and operation method thereof |
US20220101404A1 (en) * | 2020-09-28 | 2022-03-31 | Snap Inc. | Selecting color values for augmented reality-based makeup |
US20220101418A1 (en) * | 2020-09-28 | 2022-03-31 | Snap Inc. | Providing augmented reality-based makeup product sets in a messaging system |
US11798202B2 (en) | 2020-09-28 | 2023-10-24 | Snap Inc. | Providing augmented reality-based makeup in a messaging system |
KR20230147649A (ko) * | 2021-02-23 | 2023-10-23 | 베이징 센스타임 테크놀로지 디벨롭먼트 컴퍼니 리미티드 | Image processing method and apparatus, electronic device, and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000069404A (ja) * | 1998-08-25 | 2000-03-03 | Konami Co Ltd | Image print creation apparatus |
JP2004094917A (ja) * | 2002-07-08 | 2004-03-25 | Toshiba Corp | Virtual makeup apparatus and method |
JP2008003724A (ja) * | 2006-06-20 | 2008-01-10 | Kao Corp | Beauty simulation system |
WO2008102440A1 (ja) * | 2007-02-21 | 2008-08-28 | Tadashi Goino | Makeup face image generating apparatus and method |
JP2010515489A (ja) * | 2007-01-05 | 2010-05-13 | マイスキン インコーポレイテッド | System, apparatus, and method for imaging skin |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7634103B2 (en) * | 2001-10-01 | 2009-12-15 | L'oreal S.A. | Analysis using a three-dimensional facial image |
US7437344B2 (en) * | 2001-10-01 | 2008-10-14 | L'oreal S.A. | Use of artificial intelligence in providing beauty advice |
US7039222B2 (en) * | 2003-02-28 | 2006-05-02 | Eastman Kodak Company | Method and system for enhancing portrait images that are processed in a batch mode |
JP5261586B2 (ja) | 2007-08-10 | 2013-08-14 | 株式会社 資生堂 | Makeup simulation system, makeup simulation apparatus, makeup simulation method, and makeup simulation program |
US20090231356A1 (en) * | 2008-03-17 | 2009-09-17 | Photometria, Inc. | Graphical user interface for selection of options from option groups and methods relating to same |
US10872322B2 (en) * | 2008-03-21 | 2020-12-22 | Dressbot, Inc. | System and method for collaborative shopping, business and entertainment |
JP5442966B2 (ja) | 2008-07-10 | 2014-03-19 | 株式会社 資生堂 | Game device, game control method, game control program, and recording medium storing the program |
CN102077242B (zh) * | 2008-07-30 | 2013-08-07 | 松下电器产业株式会社 | Image generation device and method for super-resolution of 3D textures |
EP2786343A4 (en) * | 2011-12-04 | 2015-08-26 | Digital Makeup Ltd | DIGITAL MAKE-UP |
US8908904B2 (en) * | 2011-12-28 | 2014-12-09 | Samsung Electrônica da Amazônia Ltda. | Method and system for make-up simulation on portable devices having digital cameras |
JP6198555B2 (ja) * | 2012-11-13 | 2017-09-20 | 株式会社オカダ電子 | Customer management system with an image comparison function, and golf lesson support system using the customer management system with an image comparison function |
US10321747B2 (en) * | 2013-02-01 | 2019-06-18 | Panasonic Intellectual Property Management Co., Ltd. | Makeup assistance device, makeup assistance system, makeup assistance method, and makeup assistance program |
US20170177927A1 (en) * | 2014-02-17 | 2017-06-22 | Nec Solution Innovators, Ltd. | Impression analysis device, game device, health management device, advertising support device, impression analysis system, impression analysis method, and program recording medium |
US10916044B2 (en) * | 2015-07-21 | 2021-02-09 | Sony Corporation | Information processing apparatus, information processing method, and program |
WO2017013925A1 (ja) * | 2015-07-21 | 2017-01-26 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP6275086B2 (ja) * | 2015-07-25 | 2018-02-07 | 株式会社オプティム | Server, data providing method, and server program |
EP3396619A4 (en) * | 2015-12-25 | 2019-05-08 | Panasonic Intellectual Property Management Co., Ltd. | MAKE UP PART GENERATION, MAKE UP PART USE, MAKE UP PART GENERATION, MAKE UP PART USE, MAKE UP PART GENERATION AND MAKE UP PART USE |
US11315173B2 (en) * | 2016-09-15 | 2022-04-26 | GlamST LLC | Applying virtual makeup products |
US10540697B2 (en) * | 2017-06-23 | 2020-01-21 | Perfect365 Technology Company Ltd. | Method and system for a styling platform |
2016
- 2016-08-05 US US16/096,341 patent/US10373348B2/en active Active
- 2016-08-05 JP JP2018531712A patent/JP6448869B2/ja active Active
- 2016-08-05 WO PCT/JP2016/073115 patent/WO2018025402A1/ja active Application Filing
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021519992A (ja) * | 2018-04-24 | 2021-08-12 | エルジー ハウスホールド アンド ヘルスケア リミテッド | Mobile terminal and automatic cosmetics recognition system |
JP7064022B2 (ja) | 2018-04-24 | 2022-05-09 | エルジー ハウスホールド アンド ヘルスケア リミテッド | Mobile terminal and automatic cosmetics recognition system |
JP2022090005A (ja) | 2018-04-24 | 2022-06-16 | エルジー ハウスホールド アンド ヘルスケア リミテッド | Mobile terminal and automatic cosmetics recognition system |
JP7379582B2 (ja) | 2018-04-24 | 2023-11-14 | エルジー ハウスホールド アンド ヘルスケア リミテッド | Mobile terminal and automatic cosmetics recognition system |
JP2021072515A (ja) * | 2019-10-30 | 2021-05-06 | 大日本印刷株式会社 | Image processing apparatus, printed-matter production system, and printed-matter production method |
JP7314767B2 (ja) | 2019-10-30 | 2023-07-26 | 大日本印刷株式会社 | Image processing apparatus, printed-matter production system, and printed-matter production method |
Also Published As
Publication number | Publication date |
---|---|
JP6448869B2 (ja) | 2019-01-09 |
US20190156522A1 (en) | 2019-05-23 |
US10373348B2 (en) | 2019-08-06 |
JPWO2018025402A1 (ja) | 2018-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6448869B2 (ja) | Image processing apparatus, image processing system, and program | |
JP6778877B2 (ja) | Makeup part creation device, makeup part use device, makeup part creation method, makeup part use method, makeup part creation program, and makeup part use program | |
JP6055160B1 (ja) | Cosmetics information providing system, cosmetics information providing apparatus, cosmetics information providing method, and program | |
KR101306221B1 (ko) | Apparatus and method for producing a moving image using a three-dimensional user avatar | |
JP5324031B2 (ja) | Beauty simulation system | |
KR20210119438A (ko) | Systems and methods for face reenactment | |
CN112396679B (zh) | Virtual object display method and apparatus, electronic device, and medium | |
JP2011209887A (ja) | Avatar creation method, avatar creation program, and network service system | |
CN111935505B (zh) | Video cover generation method, apparatus, device, and storage medium | |
CN101925345A (zh) | Makeup method, makeup simulation apparatus, and makeup simulation program | |
EP3912136A1 (en) | Systems and methods for generating personalized videos with customized text messages | |
US11842433B2 (en) | Generating personalized videos with customized text messages | |
CN113709549A (zh) | Special-effect data package generation and image processing method, apparatus, device, and storage medium | |
GB2549160A (en) | Image processing system and method | |
KR20160134883A (ko) | Method of operating a digital actor applied to video content | |
CN116744820A (zh) | Digital makeup artist | |
KR20180039321A (ko) | Method for producing a responsive video, method for producing a responsive-video generation file, method for analyzing advertising effect based on responsive video, and program using the same | |
JP2013178789A (ja) | Beauty simulation system | |
CN112083863A (zh) | Image processing method and apparatus, electronic device, and readable storage medium | |
KR20200092893A (ko) | Augmented-reality video production system using 3D scan data, and method therefor | |
JP2000306092A (ja) | Mirror realized by digital image processing, and medium storing a program for causing a computer to perform the processing | |
CN114793286A (zh) | Avatar-based video editing method and system | |
Qiu et al. | Fencing Hallucination: An Interactive Installation for Fencing with AI and Synthesizing Chronophotographs | |
JP2024505359A (ja) | Digital makeup artist | |
Antoine | Understanding and improving the relation between interaction simplicity and expressivity for producing illustrations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 2018531712; Country of ref document: JP |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16911661; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15/05/2019) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 16911661 Country of ref document: EP Kind code of ref document: A1 |