WO2017179134A1 - 化粧シミュレーションシステム、化粧シミュレーション方法、および化粧シミュレーションプログラム - Google Patents
- Publication number
- WO2017179134A1 (PCT/JP2016/061846)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- contour
- makeup simulation
- designation
- component part
- makeup
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
Definitions
- the present invention relates to a makeup simulation system for performing makeup simulation on a face image, a makeup simulation method, and a makeup simulation program.
- Patent Document 1 describes a technique related to a display method when applying makeup or adjusting appearance while confirming the display content of a captured image of a user.
- Patent Document 2 describes a technique for assisting a user's makeup using three-dimensional information acquired using a three-dimensional measuring unit.
- Patent Document 1: JP 2013-223001 A
- Patent Document 2: JP 2015-197710 A
- In the technique of Patent Document 2, the use of three-dimensional information is considered to make the state of the face easier to recognize than in Patent Document 1, but a measurement pattern must be projected by a light projecting unit, so there is a problem that the apparatus becomes expensive.
- the present invention has an object to provide a makeup simulation system, a makeup simulation method, and a makeup simulation program that perform makeup simulation on a face image by accepting designation of a plurality of feature points by plotting.
- the present invention provides the following solutions.
- The invention according to the first feature provides a makeup simulation system for performing a makeup simulation on a face image, comprising: component part type designation receiving means for receiving designation of the type of a component part of the face image; feature point designation receiving means for receiving designation of a plurality of feature points by plotting on the contour of the designated component part; and contour drawing means for drawing the contour of the component part based on the plurality of feature points designated by the plot.
- According to the invention of the first feature, the makeup simulation system includes a component part type designation receiving unit that receives designation of the type of a component part of the face image, a feature point designation receiving unit that receives designation of a plurality of feature points by plotting on the contour of the designated component part, and a contour drawing unit that draws the contour of the component part based on the plurality of feature points designated by the plot.
- the invention according to the first feature is a category of a makeup simulation system, but the makeup simulation method and the makeup simulation program have the same functions and effects.
- The invention according to the second feature is the makeup simulation system according to the first feature, wherein the feature points to be designated by the plot include four points: the topmost, bottommost, leftmost, and rightmost points of the component part.
- According to the invention of the second feature, the feature points designated by the plot include the four points at the top, bottom, leftmost, and rightmost of the component part.
- The invention according to the third feature is the makeup simulation system according to the first or second feature, further comprising attribute information input receiving means for receiving input of attribute information of the face image, wherein the contour drawing means draws the contour of the component part in consideration of the input attribute information.
- According to the invention of the third feature, the makeup simulation system includes attribute information input receiving means for receiving input of attribute information of the face image, and the contour drawing means draws the contour of the component part in consideration of the input attribute information.
- The invention according to the fourth feature is the makeup simulation system according to any one of the first to third features, further comprising eyelid drawing means for, when the designated component part is an eye, predicting and drawing the contour of the eyelid from the contour of the eye drawn by the contour drawing means.
- According to the invention of the fourth feature, when the designated component part is an eye, the makeup simulation system includes eyelid drawing means for predicting and drawing the eyelid contour from the eye contour drawn by the contour drawing means.
- The invention according to the fifth feature is the makeup simulation system according to any one of the first to fourth features, further comprising drawn contour fine adjustment means for finely adjusting the drawn contour when there is a deviation between the drawn contour and the contour of the component part of the face image.
- According to the invention of the fifth feature, the makeup simulation system includes drawn contour fine adjustment means for finely adjusting the drawn contour when there is a deviation between the drawn contour and the contour of the component part of the face image.
- The invention according to the sixth feature is the makeup simulation system according to any one of the first to fifth features, further comprising screen sharing means for sharing a screen with a terminal connected via a network, and remote operation means capable of operating the shared screen from the terminal connected via the network.
- According to the invention of the sixth feature, the makeup simulation system includes screen sharing means for sharing a screen with a terminal connected via a network, and remote operation means capable of operating the shared screen from the terminal connected via the network.
- The invention according to the seventh feature provides a makeup simulation method for performing a makeup simulation on a face image, comprising: a component part type designation receiving step of receiving designation of the type of a component part of the face image; a feature point designation receiving step of receiving designation of a plurality of feature points by plotting on the contour of the designated component part; and a contour drawing step of drawing the contour of the component part based on the plurality of feature points designated by the plot.
- The invention according to the eighth feature provides a makeup simulation program for causing a computer to perform a makeup simulation on a face image, the program causing the computer to execute: a component part type designation receiving step of receiving designation of the type of a component part of the face image; a feature point designation receiving step of receiving designation of a plurality of feature points by plotting on the contour of the designated component part; and a contour drawing step of drawing the contour of the component part based on the plurality of feature points designated by the plot.
- According to the present invention, it is possible to provide a makeup simulation system, a makeup simulation method, and a makeup simulation program that perform a makeup simulation on a face image by receiving designation of a plurality of feature points by plotting.
- FIG. 1 is a schematic diagram of a preferred embodiment of the present invention.
- FIG. 2 is a diagram illustrating the relationship between the function blocks of the terminal 100 and each function.
- FIG. 3 is a flowchart of the makeup simulation process of the terminal 100.
- FIG. 4 is a diagram illustrating the relationship between the function blocks of the terminal 100 having the attribute information input reception function and the functions.
- FIG. 5 is a flowchart of makeup simulation processing of the terminal 100 having the attribute information input reception function.
- FIG. 6 is a diagram illustrating the relationship between the function blocks of the terminal 100 having the eyelid drawing function and each function.
- FIG. 7 is a flowchart of the makeup simulation process of the terminal 100 having the eyelid drawing function.
- FIG. 8 is a diagram illustrating the relationship between the function blocks of the terminal 100 having the contour fine adjustment function and each function.
- FIG. 9 is a flowchart of makeup simulation processing of the terminal 100 having the contour fine adjustment function.
- FIG. 10 is a diagram illustrating the functional blocks of the connection source terminal 100a and the connection destination terminal 100b having the screen sharing function and the remote operation function, and the relationship between the functions.
- FIG. 11 is a flowchart of the connection source terminal 100a and the connection destination terminal 100b having a screen sharing function and a remote operation function.
- FIG. 12 is a diagram illustrating the functional blocks of the terminal 100 and the server 300 and the relationship between the functions when the server includes a contour database.
- FIG. 13 is a flowchart of the terminal 100 and the server 300 when the server includes a contour database.
- FIG. 14 is an example of a face image acquisition screen display on the terminal 100.
- FIG. 15 is an example of a display of a component part type designation reception screen on the terminal 100.
- FIG. 16 is an example of a feature point designation reception screen displayed on the terminal 100.
- FIG. 17 is an example of a display of a contour drawing screen on the terminal 100.
- FIG. 18 is an example of a display of a makeup drawing screen on the terminal 100.
- FIG. 19 is an example of a display of the attribute information input acceptance screen on the terminal 100.
- FIG. 20 is an example of a display of the outline drawing screen of the eyelids on the terminal 100.
- FIG. 21 is an example of the display of the contour fine adjustment screen on the terminal 100.
- FIG. 22 is an example of a display when screen sharing and remote operation are performed between the connection source terminal 100a and the connection destination terminal 100b.
- FIG. 23 shows an example of the data structure of the contour database.
- FIG. 24 shows an example of the data structure of the contour database when classified for each attribute information.
- the terminal 100 includes a camera unit 110, an input unit 120, an output unit 130, a storage unit 140, and a control unit 150.
- the input unit 120 implements a component part type designation receiving module 121 and a feature point designation receiving module 122 in cooperation with the control unit 150.
- the output unit 130 implements the contour drawing module 131 in cooperation with the control unit 150.
- the terminal 100 acquires a face image (step S01).
- a face image for performing a makeup simulation is determined by performing imaging by the camera unit 110 or selecting an image stored in the terminal 100.
- the component part type designation accepting module 121 of the terminal 100 accepts designation of a component part from the user (step S02).
- the user is allowed to select the type of the constituent parts of the face for which the makeup simulation is to be performed from options such as the right eye, the left eye, the nose, the mouth, and the entire face.
- appropriate processing suitable for each part can be performed thereafter.
- the feature point designation receiving module 122 accepts feature point designation from the user (step S03).
- the feature points are, for example, the leftmost, rightmost, topmost, and bottommost points of the component parts, and are a plurality of points that help to infer the contour from the face image.
- the specification of the feature points is performed by the user plotting the points on the face image.
- The contour drawing module 131 infers the contour based on the plurality of designated feature points and draws it on the face image (step S04).
- In the contour inference processing for a component part, a database storing contour data of component parts is prepared, and the feature points of the designated contour are collated with the database to acquire contour data that matches the feature points.
- In this way, simply by plotting feature points, the user can have an accurate contour drawn and a makeup simulation performed at the appropriate position along the contour, without requiring any special device.
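As a concrete illustration of this database-matching step (the part names, reference points, and contour values below are hypothetical, not data from the patent), the four plotted extremes can be normalized to a unit box and compared against stored contour entries by total point distance:

```python
import math

# Hypothetical contour database: each entry holds reference feature points
# (leftmost, rightmost, topmost, bottommost) normalized to a unit bounding
# box, plus the contour polyline that would be drawn on the face image.
CONTOUR_DB = {
    "right_eye": [
        {"ref_points": [(0.0, 0.5), (1.0, 0.5), (0.45, 0.0), (0.5, 1.0)],
         "contour": [(0.0, 0.5), (0.2, 0.15), (0.45, 0.0), (0.8, 0.2),
                     (1.0, 0.5), (0.7, 0.9), (0.5, 1.0), (0.2, 0.85)]},
        {"ref_points": [(0.0, 0.5), (1.0, 0.5), (0.55, 0.0), (0.5, 1.0)],
         "contour": [(0.0, 0.5), (0.3, 0.1), (0.55, 0.0), (0.85, 0.25),
                     (1.0, 0.5), (0.65, 0.95), (0.5, 1.0), (0.15, 0.8)]},
    ],
}

def normalize(points):
    """Scale plotted pixel points into the unit box spanned by their extremes."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in points]

def match_contour(part, plotted_points):
    """Return the database entry whose reference points best fit the plot."""
    query = normalize(plotted_points)
    def score(entry):
        return sum(math.dist(q, r) for q, r in zip(query, entry["ref_points"]))
    return min(CONTOUR_DB[part], key=score)

# Leftmost, rightmost, topmost, bottommost points plotted on the face image.
plotted = [(120, 240), (200, 238), (155, 225), (158, 252)]
best = match_contour("right_eye", plotted)
```

The normalization makes the match independent of where on the face image the user plotted, so the database only needs shape variants, not positions.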
- FIG. 2 is a diagram illustrating the relationship between the function blocks of the terminal 100 and each function.
- The terminal 100 includes the camera unit 110, the input unit 120, the output unit 130, the storage unit 140, and the control unit 150.
- the input unit 120 implements a component part type designation receiving module 121 and a feature point designation receiving module 122 in cooperation with the control unit 150.
- the output unit 130 implements the contour drawing module 131 in cooperation with the control unit 150.
- The terminal 100 may be a general information terminal with which the user can capture images with a camera, and is an information device or electrical appliance having the functions described later.
- For example, it can be a mobile phone, a smartphone, a tablet PC, a notebook PC, a wearable device, or a PC with a display that can be connected to an external camera such as a webcam.
- the smartphone illustrated as the terminal 100 is just one example.
- the terminal 100 includes a camera in the camera unit 110.
- the camera unit 110 converts the captured image into digital data and stores it in the storage unit 140.
- the captured image may be a still image or a moving image.
- A part of the moving image may also be cut out by the operation of the control unit 150 and stored in the storage unit 140 as a still image.
- The image obtained by photographing should be a precise image having the necessary amount of information, and the number of pixels and image quality can be designated.
- The input unit 120 implements the component part type designation receiving module 121 and the feature point designation receiving module 122 in cooperation with the control unit 150, and provides the functions necessary for instructing imaging by the camera unit 110. Examples include a liquid crystal display realizing a touch panel function, a keyboard, a mouse, a pen tablet, hardware buttons on the device, and a microphone for voice recognition. The functions of the present invention are not particularly limited by the input method.
- The output unit 130 implements the contour drawing module 131 in cooperation with the control unit 150, and provides the functions necessary for displaying the captured image and the simulation image. For example, forms such as a liquid crystal display, a PC monitor, and projection by a projector are conceivable. The functions of the present invention are not particularly limited by the output method.
- The storage unit 140 includes a data storage device such as a hard disk or semiconductor memory, and stores captured moving images and still images, data necessary for the contour inference processing, makeup method data, and the like. In addition, a contour database storing contour data is provided as necessary.
- the control unit 150 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like.
- FIG. 3 is a flowchart of the makeup simulation process of the terminal 100. The processing executed by the modules of each device described above will be described together with this processing.
- the terminal 100 acquires a face image for which a makeup simulation is desired (step S101).
- a face image for performing a makeup simulation is determined by performing imaging by the camera unit 110 or selecting an image stored in the terminal 100.
- FIG. 14 is an example of a face image acquisition screen displayed on the terminal 100.
- Imaging by the camera unit 110 is started by selecting the camera icon 1401. The user may shoot himself or herself using the in-camera of a smartphone or tablet, have another person shoot using the rear (out) camera, or photograph himself or herself reflected in a mirror.
- When the determination button 1403 is selected, the image for performing the makeup simulation is determined.
- When the cancel button 1404 is selected, the displayed image is discarded, and the image is captured again or selected from the library again.
- the component part type designation accepting module 121 of the terminal 100 accepts designation of a component part from the user (step S102).
- the user is allowed to select the type of the constituent parts of the face for which the makeup simulation is to be performed from options such as the right eye, left eye, nose, mouth, and the entire face.
- appropriate processing suitable for each part can be performed thereafter.
- FIG. 15 is an example of a display of a component part type designation reception screen on the terminal 100.
- the radio button 1501 allows the user to select a component from the eyes (both eyes), right eye, left eye, nose, mouth, cheek, and the entire face. The options listed here are merely examples, and may be set according to the system.
- the right eye is selected with a radio button.
- When the determination button 1502 is selected, the component part for which the makeup simulation is performed is determined. Until a selection is made with the radio buttons, the determination button 1502 may be grayed out.
- When the cancel button 1503 is selected, the designation of the component part is discarded and the component part selection screen is closed. In this case, the process may return to step S101 to acquire a face image again.
- the feature point designation receiving module 122 of the terminal 100 accepts feature point designation from the user (step S103).
- the feature points are, for example, the leftmost, rightmost, topmost, and bottommost points of the component parts, and are a plurality of points that help to infer the contour from the face image.
- the specification of the feature points is performed by the user plotting the points on the face image.
- FIG. 16 is an example of a display of a feature point designation reception screen on the terminal 100.
- FIG. 16 illustrates an example in which feature points are plotted using a touch panel assuming that the terminal 100 is a smartphone.
- the right eye that is the component part selected in step S102 is displayed, and the user is made to input the four feature points of the leftmost, rightmost, topmost, and bottommost.
- the user plots the leftmost feature point 1601, the rightmost feature point 1602, the uppermost feature point 1603, and the lowermost feature point 1604.
- the feature points may be plotted by the user one by one in order, or temporary feature points may be displayed on the image to allow the user to make adjustments.
- When the determination button 1605 is selected, the feature points are determined.
- When the cancel button 1606 is selected, the feature point input is discarded and the feature point input screen is closed. In this case, the process may return to step S102 to select a component part again.
- The contour drawing module 131 infers the contour based on the plurality of designated feature points and draws the contour on the face image (step S104).
- In the contour inference processing for a component part, a database storing contour data of component parts is prepared, and the feature points of the designated contour are collated with the database to acquire contour data that matches the feature points.
- FIG. 23 shows an example of the data structure of the contour database.
- Contour data is stored for each component such as the right eye, left eye, nose, mouth, and entire face.
- the contour data includes data necessary for drawing the contours of the component parts together with the face image.
- the data necessary for drawing the contour may be pixel position data of the contour or vector data.
- the contour drawing module 131 acquires appropriate contour data by collating the designated feature points with each contour data, and performs drawing on the output unit 130 of the terminal 100 based on the acquired contour data.
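As a minimal sketch of this drawing step, assuming the acquired contour data is stored as vector data normalized to a unit box (the function and the sample values are illustrative assumptions), the contour can be mapped into image pixel coordinates using the plotted feature points as anchors:

```python
def draw_coordinates(contour_unit, plotted_points):
    """Scale a unit-box contour polyline into the pixel region spanned by
    the plotted leftmost/rightmost/topmost/bottommost feature points."""
    xs = [p[0] for p in plotted_points]
    ys = [p[1] for p in plotted_points]
    left, top = min(xs), min(ys)
    w = max(xs) - left
    h = max(ys) - top
    return [(left + u * w, top + v * h) for u, v in contour_unit]

# Illustrative contour data and plotted feature points (pixels).
contour_unit = [(0.0, 0.5), (0.2, 0.1), (0.5, 0.0), (0.8, 0.2), (1.0, 0.5)]
plotted = [(120, 240), (200, 238), (155, 225), (158, 252)]
pixels = draw_coordinates(contour_unit, plotted)
```

Storing vector data rather than raw pixel positions keeps one database entry usable for faces of any size in the image.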
- attribute information such as sex, age, race, etc. may be stored together with the contour data. Details of the attribute information will be described later.
- FIG. 17 is an example of a contour drawing screen display on the terminal 100.
- A contour is drawn on the image in the display field 1705 based on the contour data inferred in accordance with the input feature points.
- When the determination button is selected, the contour data is determined.
- When the cancel button 1702 is selected, the contour data is discarded and the contour drawing confirmation screen is closed. In this case, the process may return to step S103 to input the feature points again.
- When the change button 1703 is selected, collation with the contour database is performed again, and other contour data is acquired and displayed.
- Alternatively, a plurality of candidate contour data (first, second, and third candidates) may be acquired in advance, and selecting the change button 1703 may switch the display to the next candidate's contour data.
- When the fine adjustment button 1704 is selected, the user finely adjusts the contour. Details of the fine contour adjustment will be described later.
- the terminal 100 draws a makeup image along the contour on the output unit 130 (step S105).
- the user selects a desired makeup method and performs drawing in accordance with the contour determined in step S104.
- FIG. 18 is an example of a display of a makeup drawing screen on the terminal 100.
- the makeup method selected by the user is drawn according to the contour.
- Here, since the right eye is selected as the component part, the user is prompted to select eye makeup.
- the makeup method is reflected on the image in the display column 1804.
- When the cancel button is selected, the selected makeup method is discarded and the makeup drawing screen is closed. In this case, the process may return to step S104 to confirm the contour again.
- Here, the image displayed in the display field 1804 shows only the right eye, but the makeup method may be drawn on the entire face image. Further, by repeating steps S102 to S105, a makeup simulation may be performed for each component part in turn, finally completing a makeup simulation of the entire face.
- FIG. 4 is a diagram illustrating the relationship between the function blocks of the terminal 100 having the attribute information input reception function and the functions.
- the input unit 120 and the control unit 150 cooperate to implement the attribute information input reception module 123.
- FIG. 5 is a flowchart of the makeup simulation process of the terminal 100 having the attribute information input reception function.
- The face image acquisition (step S201), component part type designation reception (step S203), feature point designation reception (step S204), contour drawing (step S205), and makeup drawing along the contour (step S206) in FIG. 5 correspond to the face image acquisition (step S101), component part type designation reception (step S102), feature point designation reception (step S103), contour drawing (step S104), and makeup drawing along the contour (step S105) in FIG. 3.
- Therefore, the attribute information input reception in step S202 will be mainly described here.
- the attribute information input acceptance module 123 accepts input of attribute information from the user after acquiring the face image in step S201 (step S202).
- FIG. 19 is an example of the display of the attribute information input acceptance screen on the terminal 100.
- The attribute information here is information about the person in the face image that is considered to affect the contours of the component parts of the face, such as gender, age, and race.
- a radio button 1901 is used to select whether the sex is male or female.
- Radio buttons 1902 allow the user to select his or her age group: teens, 20s, 30s, 40s, 50s, or 60s and over.
- a radio button 1903 is used to select whether the race is Japanese or not. These options are not necessarily those described above, and may be set appropriately according to the system.
- Thereafter, the process may return to step S201 to acquire a face image again, or may proceed to the next step S203.
- The component part type designation reception in step S203 and the feature point designation reception in step S204 are the same as the component part type designation reception in step S102 and the feature point designation reception in step S103 in FIG. 3.
- FIG. 24 shows an example of the data structure of the contour database when classified for each attribute information.
- the contour database data shown in FIG. 23 is classified for each attribute information.
- For example, the contour data of the right eye of a Japanese teenager is classified according to whether the person is male, female, or of unknown gender. If the attribute information input in step S202 is "teenage Japanese male" and the designated component part is the right eye, the "male + teens + Japanese" right eye contour database at the top right of FIG. 24 is queried first.
- If an appropriate entry is not found there, the system then queries right eye contour databases with relaxed attributes, such as "gender unknown + teens + Japanese" or "male + 20s + Japanese". The order in which the databases for each attribute combination are queried can be set for each system.
- Further, when appropriate contour data cannot be found, the contour data considered closest may be used. In this case, when there are a plurality of candidate contour data, those with similar attribute information may be selected with priority.
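The fallback query order described above can be sketched as follows; the relaxation sequence, the table keys, and the stored values are all assumptions for illustration, since the patent leaves the order configurable per system:

```python
# Hypothetical attribute-classified contour database, keyed by
# (gender, age group, race). Values stand in for contour data sets.
ATTR_DB = {
    ("male", "10s", "japanese"): "right-eye contours A",
    ("unknown", "10s", "japanese"): "right-eye contours B",
    ("male", "20s", "japanese"): "right-eye contours C",
}

def query_contours(gender, age, race):
    """Try the exact attribute combination first, then progressively
    relaxed variants, returning the first data set found."""
    candidates = [
        (gender, age, race),        # exact match
        ("unknown", age, race),     # relax gender
        (gender, "unknown", race),  # relax age
        (gender, age, "unknown"),   # relax race
    ]
    for key in candidates:
        if key in ATTR_DB:
            return ATTR_DB[key]
    return None  # caller falls back to the closest available data
```

A system could reorder the `candidates` list to change which attribute is relaxed first, matching the per-system configurability the text describes.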
- FIG. 6 is a diagram illustrating the relationship between the function blocks of the terminal 100 having the eyelid drawing function and each function.
- the output unit 130 and the control unit 150 cooperate to implement the eyelid drawing module 132.
- FIG. 7 is a flowchart of the makeup simulation process of the terminal 100 having the eyelid drawing function.
- The face image acquisition (step S301), component part type designation reception (step S302), feature point designation reception (step S303), contour drawing (step S304), and makeup drawing along the contour (step S307) in FIG. 7 correspond to the face image acquisition (step S101), component part type designation reception (step S102), feature point designation reception (step S103), contour drawing (step S104), and makeup drawing along the contour (step S105) in FIG. 3. Therefore, step S305 and step S306 will be mainly described here.
- The eyelid drawing module 132 checks whether the component part designated in step S302 is an eye (step S305). If the designated component part is not an eye, the process proceeds to step S307.
- If it is an eye, the eyelid drawing module 132 draws the contour of the eyelid (step S306).
- FIG. 20 is an example of the display of the outline drawing screen of the eyelids on the terminal 100.
- Here, an example in which the right eye is designated as the component part is shown.
- The eyelid drawing module 132 infers the contour of the eyelid of the right eye and draws the eyelid contour on the image in the display field 2005.
- FIG. 20 shows an example in which the outline of the eyelid is drawn with a broken line.
- When the determination button 2001 is selected, the eyelid contour data is determined.
- When the cancel button 2002 is selected, the eyelid contour data is discarded and the eyelid contour drawing confirmation screen is closed. In this case, the process may return to step S304 to draw the right eye contour again.
- Also in step S306, a plurality of candidates (first, second, and third candidates) may be acquired as eyelid contour data, and selecting the change button 2003 may switch the display to the next candidate's eyelid contour data.
- When the fine adjustment button 2004 is selected, the user finely adjusts the contour of the eyelid. Details of the fine adjustment of the eyelid contour will be described later.
- The eyelid contour inference processing by the eyelid drawing module 132 may be performed by predicting the eyelid thickness based on the eye contour, or eyelid contour data may be registered in the contour database as information attached to the eye contour data and acquired from there.
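The first of the two strategies, predicting the eyelid from the eye contour, might be sketched as follows; the thickness ratio and the upward-offset heuristic are illustrative assumptions, not the method claimed in the patent:

```python
def predict_eyelid(eye_contour, thickness_ratio=0.35):
    """Predict an eyelid contour by lifting the upper half of the eye
    contour upward by a thickness proportional to the eye height.
    eye_contour: list of (x, y) pixels; image y grows downward."""
    ys = [p[1] for p in eye_contour]
    mid_y = (min(ys) + max(ys)) / 2
    height = max(ys) - min(ys)
    offset = thickness_ratio * height  # assumed eyelid thickness
    # Keep lower-half points; shift upper-half points up to form the eyelid.
    return [(x, y - offset) if y < mid_y else (x, y)
            for x, y in eye_contour]

# A crude triangular "eye": two corners and a top-of-eye point.
eyelid = predict_eyelid([(0, 10), (5, 0), (10, 10)])
```

Scaling the offset by the eye height keeps the prediction consistent across face images of different resolutions.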
- a makeup image is drawn along the contours of the drawn right eye and eyelid (step S307).
- the user selects a desired makeup method and performs drawing in accordance with the contour determined in steps S304 and S306.
- FIG. 8 is a diagram illustrating the relationship between the function blocks of the terminal 100 having the contour fine adjustment function and each function.
- the input unit 120 and the control unit 150 cooperate to implement the contour fine adjustment module 124.
- FIG. 9 is a flowchart of makeup simulation processing of the terminal 100 having the contour fine adjustment function.
- The face image acquisition (step S401), component part type designation reception (step S402), feature point designation reception (step S403), contour drawing (step S404), and makeup drawing along the contour (step S407) in FIG. 9 correspond to the face image acquisition (step S101), component part type designation reception (step S102), feature point designation reception (step S103), contour drawing (step S104), and makeup drawing along the contour (step S105) in FIG. 3. Therefore, step S405 and step S406 will be mainly described here.
- the contour fine adjustment module 124 confirms whether or not the contour fine adjustment is performed after the contour drawing in step S404 (step S405).
- Whether or not to perform fine adjustment may be selected by the user, or image analysis may be performed to determine whether there is a deviation between the contour drawn in step S404 and the contour of the component part in the actual face image, with fine adjustment performed only when there is a deviation. If fine contour adjustment is to be performed, the process proceeds to step S406; if not, the process proceeds to step S407.
- the contour fine adjustment module 124 finely adjusts the contours of the component parts based on the input from the user (step S406).
- FIG. 21 is an example of the display of the contour fine adjustment screen on the terminal 100. Here, an example in which the right eye is designated as a component part is shown.
- the user finely adjusts the contour by designating and moving a point on the contour of the right eye drawn in the display field 2103.
- The point to be designated and moved need not be one of the feature points input earlier; any point on the contour can be designated and moved.
- The contour fine adjustment module 124 redraws the contour by smoothly connecting the moved point with the nearby points on the original contour so that the line is not interrupted.
- an enlargement button 2104 and a reduction button 2105 may be provided so that the user can freely enlarge and reduce the image in the display field 2103 to facilitate fine adjustment of the contour.
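The smooth reconnection described above might be sketched as follows, blending the dragged point's displacement into neighboring contour points with a decaying weight; the falloff radius and the linear weighting are illustrative assumptions, since the patent only requires that the line stay unbroken:

```python
def move_point(contour, index, new_pos, radius=2):
    """Drag contour[index] to new_pos, spreading the displacement over
    neighbors within `radius` so the redrawn contour stays smooth."""
    dx = new_pos[0] - contour[index][0]
    dy = new_pos[1] - contour[index][1]
    adjusted = []
    for i, (x, y) in enumerate(contour):
        d = abs(i - index)
        if d <= radius:
            w = 1.0 - d / (radius + 1)  # 1.0 at the dragged point, decaying
            adjusted.append((x + w * dx, y + w * dy))
        else:
            adjusted.append((x, y))
    return adjusted

# Drag the middle point of a flat contour upward by 3 pixels.
contour = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]
result = move_point(contour, 2, (2, 3))
```

A production implementation might instead refit a spline through the moved point, but the weighted blend already guarantees the drawn line has no break at the edited point.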
- When the enter button 2101 is selected, the finely adjusted contour data is saved.
- When the cancel button 2102 is selected, the finely adjusted contour data is discarded and the contour fine adjustment screen is closed. In this case, the process may return to step S404 and the contour drawing may be performed again.
- FIG. 21 shows an example of normal contour fine adjustment, but the same applies to the case of performing the contour fine adjustment of the eyelids.
- FIG. 10 is a diagram illustrating a relationship between the blocks of the connection source terminal 100a and the connection destination terminal 100b having a screen sharing function and a remote operation function, and each function.
- Each of the connection source terminal 100a and the connection destination terminal 100b includes a communication unit 160 in addition to the functions of the terminal 100 described with reference to FIG.
- the communication unit 160 implements the screen sharing module 161 and the remote operation module 162 in cooperation with the control unit 150. It is assumed that the terminals are connected via the communication network 200.
- the communication network 200 may be a public communication network or a dedicated communication network.
- the connection source terminal 100a is a terminal that issues a screen sharing request to the connection destination terminal 100b on the network via the communication network 200.
- The communication unit 160 includes a device that enables communication with other devices, for example, a Wi-Fi (Wireless Fidelity) device compliant with IEEE 802.11, or a wireless device compliant with the IMT-2000 standard such as a third-generation or fourth-generation mobile communication system. A wired LAN connection may also be used.
- FIG. 11 is a flowchart of the connection source terminal 100a and the connection destination terminal 100b having a screen sharing function and a remote operation function. The processing executed by the modules of each device described above will be described together with this processing.
- the connection source terminal 100a and the connection destination terminal 100b are described as performing direct communication. However, a configuration may be adopted in which a server is interposed therebetween depending on the system.
- the screen sharing module 161 of the connection source terminal 100a notifies the connection destination terminal 100b of the start of screen sharing (step S501).
- Here, only the notification of the start of screen sharing from the connection source terminal 100a to the connection destination terminal 100b is illustrated; however, processing required by a typical screen sharing system, such as authentication processing for maintaining security, approval processing for the screen sharing on the connection destination terminal 100b side, and response processing from the connection destination terminal 100b to the connection source terminal 100a, is also performed.
- The response processing from the connection destination terminal 100b to the connection source terminal 100a is performed by the screen sharing module 161 of the connection destination terminal 100b.
- Next, the screen sharing module 161 of the connection source terminal 100a creates shared data for screen sharing (step S502).
- Here, it is assumed that the appropriate one of the flowcharts of FIGS. 3, 5, 7, and 9 is being executed.
- The screen data of the connection source terminal 100a is converted into a format suitable for display on the connection destination terminal 100b and used as the shared data.
- the screen sharing module 161 of the connection source terminal 100a transmits the created shared data to the connection destination terminal 100b (step S503).
- the screen sharing module 161 of the connection destination terminal 100b receives the shared data (step S504), and displays a shared screen based on the received shared data (step S505).
- Next, the remote operation module 162 of the connection destination terminal 100b confirms whether an operation is to be performed on the shared screen (step S506). If no operation is to be performed, the process proceeds to step S508.
- If an operation is to be performed, the remote operation module 162 of the connection destination terminal 100b accepts the operation input from the user (step S507).
- the operation on the connection destination terminal 100b is equivalent to an operation that can be performed on the input unit 120 of the connection source terminal 100a.
- the remote operation module 162 of the connection destination terminal 100b transmits operation data to the connection source terminal 100a (step S508).
- When no operation has been performed in step S507, only a response may be transmitted instead of the operation data.
- the screen sharing module 161 of the connection source terminal 100a receives operation data from the connection destination terminal 100b (step S509).
- Next, the screen sharing module 161 of the connection source terminal 100a updates the screen display based on the received operation data (step S510). For example, when the contour has been finely adjusted on the connection destination terminal 100b, the screen after the fine adjustment is displayed on the output unit 130 of the connection source terminal 100a.
- Next, the screen sharing module 161 of the connection source terminal 100a confirms whether to end the screen sharing (step S511).
- If the screen sharing is not to be ended, the process returns to step S502 and continues.
- If it is to be ended, the connection destination terminal 100b is notified of the end of the screen sharing, and the screen sharing process is terminated (step S512).
- Here, whether to end the screen sharing is confirmed only at the connection source terminal 100a.
- Depending on the system or the user's request, the screen of the connection destination terminal 100b may be shared instead of the screen of the connection source terminal 100a.
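The exchange of steps S501 to S512 can be modeled in simplified form as follows. This is an in-process Python sketch, not the disclosed implementation: the queues stand in for the communication network 200, the dictionary screen state is assumed for illustration, and `None` plays the role of a plain response without operation data.

```python
import json
from queue import Queue


def run_screen_sharing(source_screen, operations):
    """Simplified model of steps S501-S512: the source repeatedly
    shares its screen state and applies operation data returned by
    the destination, until sharing ends."""
    to_dest, to_source = Queue(), Queue()
    screen = dict(source_screen)
    for op in operations + [None]:           # None ends the sharing loop
        to_dest.put(json.dumps(screen))      # S502-S503: create and send shared data
        shared = json.loads(to_dest.get())   # S504-S505: destination shows `shared`
        to_source.put(op)                    # S507-S508: operation (or response only)
        received = to_source.get()           # S509: source receives operation data
        if received:                         # S510: update display from the operation
            screen.update(received)
    return screen
```

For example, a contour fine adjustment performed on the destination side would arrive as operation data and be reflected in the source terminal's screen state.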
- FIG. 22 is an example of display when screen sharing and remote operation are performed between the connection source terminal 100a and the connection destination terminal 100b.
- Here, the connection source terminal 100a and the connection destination terminal 100b are sharing a screen via the communication network 200. It is assumed that the user 2201 of the connection source terminal 100a has selected the right eye as a component part and tries to execute the makeup simulation, but the contour drawing is not successful and the contour fine adjustment processing described above cannot be performed as expected.
- the user 2202 of the connection destination terminal 100b can perform a makeup simulation in accordance with a desired contour by finely adjusting the contour on the screen of the connection destination terminal 100b.
- The user 2202 may be a friend or acquaintance of the user 2201, a help desk operator of the makeup simulation system, or a specialist, such as a makeup artist, who gives advice on the makeup simulation.
- FIG. 12 is a diagram illustrating the functional blocks of the terminal 100 and the server 300 and the relationship between the functions when the server includes a contour database.
- the communication network 200 may be a public communication network or a dedicated communication network.
- the server 300 includes a communication unit 310, a control unit 320, and a storage unit 330.
- the communication unit 310 can communicate with other devices by wire or wireless.
- the control unit 320 includes a CPU, RAM, ROM, and the like.
- the storage unit 330 includes a data storage unit such as a hard disk or a semiconductor memory, and also includes a contour database 331.
- FIG. 13 is a flowchart diagram of the terminal 100 and the server 300 when the server includes a contour database.
- The face image acquisition (step S601), attribute information input reception (step S602), component part type designation reception (step S603), feature point designation reception (step S604), contour drawing (step S611), and makeup drawing along the contour (step S612) correspond respectively to the face image acquisition (step S101) in FIG. 3, the attribute information input reception (step S202) in FIG. 5, the component part type designation reception (step S102), the feature point designation reception (step S103), the contour drawing (step S104), and the makeup drawing along the contour (step S105). Therefore, the description here focuses on steps S605 to S610.
- After accepting the feature point designation in step S604, the contour drawing module 131 transmits the type of the component part and the data of the designated feature points to the server 300 (step S605). When attribute information has been input, the attribute information is also transmitted. In addition, when the face image itself is used for the contour analogy processing of the component parts, the face image may be transmitted together.
- the server 300 receives data necessary for contour analogy processing from the terminal 100 (step S606).
- Next, the server 300 refers to the contour database 331 based on the received data and performs the contour analogy processing (step S607).
- Here, the contour data corresponding to the designated feature points is acquired.
- When attribute information has been received, the contour data is preferentially acquired from the entries corresponding to that attribute information, as described above.
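One possible realization of this preferential lookup is sketched below. This is illustrative only: the database schema, field names, and the distance measure between feature point sets are assumptions for the sketch, not part of the disclosed contour database 331.

```python
def lookup_contour(db, part_type, feature_points, attributes=None):
    """Pick the stored contour whose feature points are closest to the
    designated ones, preferring entries tagged with matching attributes."""
    def distance(entry):
        # Sum of point-to-point distances between the stored and the
        # designated feature points (assumed to be in the same order).
        return sum(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
                   for (x1, y1), (x2, y2) in zip(entry["features"],
                                                 feature_points))
    candidates = [e for e in db if e["part"] == part_type]
    if attributes:
        matched = [e for e in candidates if e.get("attributes") == attributes]
        if matched:              # fall back to all entries when nothing matches
            candidates = matched
    return min(candidates, key=distance)["contour"]
```

With attribute information supplied, only attribute-matched entries compete; without it, the nearest entry overall is used.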
- Next, the server 300 performs the contour data creation processing based on the estimated contour data (step S608).
- The face image used in the makeup simulation and the images registered in the contour database 331 may differ in actual size. In this case, the feature points of the face image and the contour data in the contour database 331 need to be compared after scaling. When the contour analogy processing is performed after such scaling, the resulting contour data is scaled back to the actual size of the face image.
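The scaling step can be illustrated as follows. This is a minimal sketch assuming independent horizontal and vertical scale factors between the registered image size and the actual face image size.

```python
def scale_contour(contour, src_size, dst_size):
    """Scale contour points from the registered image size (src_size,
    as (width, height)) to the actual face image size (dst_size)."""
    sx = dst_size[0] / src_size[0]
    sy = dst_size[1] / src_size[1]
    return [(x * sx, y * sy) for x, y in contour]
```

The same function can be applied in both directions: once before the comparison with the database, and once to map the resulting contour data back onto the face image.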
- the server 300 transmits the created contour data to the terminal 100 (step S609).
- the contour drawing module 131 of the terminal 100 receives contour data from the server 300 (step S610).
- the contour drawing module 131 draws a contour on the face image based on the received contour data (step S611).
- the terminal 100 draws a makeup image along the contour on the output unit 130 (step S612).
- By providing the contour database in the server 300, it is not necessary to provide the database in the terminal 100, so the capacity of the storage unit 140 of the terminal 100 is not consumed. Moreover, since contour data can be added to the database of the server 300 at any time, a more accurate makeup simulation system can be provided.
- the means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program.
- the program is provided in a form recorded on a computer-readable recording medium such as a flexible disk, CD (CD-ROM, etc.), DVD (DVD-ROM, DVD-RAM, etc.), compact memory, and the like.
- the computer reads the program from the recording medium, transfers it to the internal storage device or the external storage device, stores it, and executes it.
- the program may be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to a computer via a communication line.
- 100: terminal, 100a: connection source terminal, 100b: connection destination terminal, 200: communication network, 300: server
Description
A makeup simulation system for performing makeup simulation on a face image is provided, comprising:
component part type designation accepting means for accepting designation of the type of a component part of the face image;
feature point designation accepting means for accepting, by plotting, designation of a plurality of feature points on the contour of the component part whose designation has been accepted; and
contour drawing means for drawing the contour of the component part based on the plurality of feature points designated by the plotting.
A makeup simulation system is also provided in which the feature points to be designated by the plotting include the four points at the top, the bottom, the leftmost, and the rightmost of the component part.
A makeup simulation system is also provided that comprises attribute information input accepting means for accepting input of attribute information of the face image, wherein the contour drawing means draws the contour of the component part taking the input attribute information into account.
A makeup simulation system is also provided that comprises eyelid drawing means for, when the designated component part is an eye, further predicting and drawing the contour of an eyelid from the contour of the eye drawn by the contour drawing means.
A makeup simulation system is also provided that comprises drawn contour fine adjustment means for causing the drawn contour to be finely adjusted when there is a deviation between the drawn contour and the contour of the component part of the face image.
A makeup simulation system is also provided that comprises screen sharing means for sharing a screen with a terminal connected via a network, and remote operation means for enabling an operation to be performed on the shared screen from the terminal connected via the network.
A makeup simulation method is also provided that comprises:
a component part type designation accepting step of accepting designation of the type of a component part of the face image;
a feature point designation accepting step of accepting, by plotting, designation of a plurality of feature points on the contour of the component part whose designation has been accepted; and
a contour drawing step of drawing the contour of the component part based on the plurality of feature points designated by the plotting.
A makeup simulation program is also provided for causing the system to execute:
a component part type designation accepting step of accepting designation of the type of a component part of the face image;
a feature point designation step of accepting, by plotting, designation of a plurality of feature points on the contour of the component part whose designation has been accepted; and
a contour drawing step of drawing the contour of the component part based on the plurality of feature points designated by the plotting.
An overview of the present invention will be described with reference to FIG. 1. As shown in FIG. 2, the terminal 100 includes a camera unit 110, an input unit 120, an output unit 130, a storage unit 140, and a control unit 150. The input unit 120 cooperates with the control unit 150 to implement the component part type designation reception module 121 and the feature point designation reception module 122. The output unit 130 cooperates with the control unit 150 to implement the contour drawing module 131.
FIG. 2 is a diagram showing the functional blocks of the terminal 100 and the relationship between the functions. The terminal 100 includes a camera unit 110, an input unit 120, an output unit 130, a storage unit 140, and a control unit 150. The input unit 120 cooperates with the control unit 150 to implement the component part type designation reception module 121 and the feature point designation reception module 122. The output unit 130 cooperates with the control unit 150 to implement the contour drawing module 131.
FIG. 3 is a flowchart of the makeup simulation processing of the terminal 100. The processing executed by the modules of each device described above will be described together with this processing.
FIG. 4 is a diagram showing the functional blocks of the terminal 100 provided with the attribute information input reception function and the relationship between the functions. In addition to the functions described with reference to FIG. 2, the input unit 120 and the control unit 150 cooperate to implement the attribute information input reception module 123.
FIG. 6 is a diagram showing the functional blocks of the terminal 100 provided with the eyelid drawing function and the relationship between the functions. In addition to the functions described with reference to FIG. 2, the output unit 130 and the control unit 150 cooperate to implement the eyelid drawing module 132.
FIG. 8 is a diagram showing the functional blocks of the terminal 100 provided with the contour fine adjustment function and the relationship between the functions. In addition to the functions described with reference to FIG. 2, the input unit 120 and the control unit 150 cooperate to implement the contour fine adjustment module 124.
FIG. 10 is a diagram showing the blocks of the connection source terminal 100a and the connection destination terminal 100b, which have the screen sharing function and the remote operation function, and the relationship between the functions. The connection source terminal 100a and the connection destination terminal 100b each include a communication unit 160 in addition to the functions of the terminal 100 described with reference to FIG. 2. The communication unit 160 cooperates with the control unit 150 to implement the screen sharing module 161 and the remote operation module 162. The terminals are assumed to be connected via the communication network 200. The communication network 200 may be a public communication network or a dedicated communication network. Here, the connection source terminal 100a is the terminal that issues a screen sharing request via the communication network 200 to the connection destination terminal 100b on the network.
Claims (8)
- A makeup simulation system for performing makeup simulation on a face image, comprising:
component part type designation accepting means for accepting designation of the type of a component part of the face image;
feature point designation accepting means for accepting, by plotting, designation of a plurality of feature points on the contour of the component part whose designation has been accepted; and
contour drawing means for drawing the contour of the component part based on the plurality of feature points designated by the plotting.
- The makeup simulation system according to claim 1, wherein the feature points to be designated by the plotting include the four points at the top, the bottom, the leftmost, and the rightmost of the component part.
- The makeup simulation system according to claim 1 or 2, further comprising attribute information input accepting means for accepting input of attribute information of the face image, wherein the contour drawing means draws the contour of the component part taking the input attribute information into account.
- The makeup simulation system according to any one of claims 1 to 3, further comprising eyelid drawing means for, when the designated component part is an eye, further predicting and drawing the contour of an eyelid from the contour of the eye drawn by the contour drawing means.
- The makeup simulation system according to any one of claims 1 to 4, further comprising drawn contour fine adjustment means for causing the drawn contour to be finely adjusted when there is a deviation between the drawn contour and the contour of the component part of the face image.
- The makeup simulation system according to any one of claims 1 to 5, further comprising:
screen sharing means for sharing a screen with a terminal connected via a network; and
remote operation means for enabling an operation to be performed on the shared screen from the terminal connected via the network.
- A makeup simulation method for performing makeup simulation on a face image, comprising:
a component part type designation accepting step of accepting designation of the type of a component part of the face image;
a feature point designation accepting step of accepting, by plotting, designation of a plurality of feature points on the contour of the component part whose designation has been accepted; and
a contour drawing step of drawing the contour of the component part based on the plurality of feature points designated by the plotting.
- A makeup simulation program for causing a makeup simulation system that performs makeup simulation on a face image to execute:
a component part type designation accepting step of accepting designation of the type of a component part of the face image;
a feature point designation step of accepting, by plotting, designation of a plurality of feature points on the contour of the component part whose designation has been accepted; and
a contour drawing step of drawing the contour of the component part based on the plurality of feature points designated by the plotting.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/061846 WO2017179134A1 (ja) | 2016-04-12 | 2016-04-12 | 化粧シミュレーションシステム、化粧シミュレーション方法、および化粧シミュレーションプログラム |
JP2018511805A JP6427297B2 (ja) | 2016-04-12 | 2016-04-12 | 化粧シミュレーションシステム、化粧シミュレーション方法、および化粧シミュレーションプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017179134A1 true WO2017179134A1 (ja) | 2017-10-19 |
Family
ID=60042425
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6427297B2 (ja) |
WO (1) | WO2017179134A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109191569A (zh) * | 2018-09-29 | 2019-01-11 | 深圳阜时科技有限公司 | 一种模拟化妆装置、模拟化妆方法及设备 |
US10691932B2 (en) | 2018-02-06 | 2020-06-23 | Perfect Corp. | Systems and methods for generating and analyzing user behavior metrics during makeup consultation sessions |
WO2023188160A1 (ja) * | 2022-03-30 | 2023-10-05 | 日本電気株式会社 | 入力支援装置、入力支援方法、及び非一時的なコンピュータ可読媒体 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009053981A (ja) * | 2007-08-28 | 2009-03-12 | Kao Corp | 化粧シミュレーション装置 |
JP2012119798A (ja) * | 2010-11-30 | 2012-06-21 | Casio Comput Co Ltd | 画像生成方法、画像生成装置及びプログラム |
JP2014056546A (ja) * | 2012-09-14 | 2014-03-27 | Konica Minolta Inc | 情報共有システム及び共有端末並びに共有制御プログラム |
JP2014115821A (ja) * | 2012-12-10 | 2014-06-26 | Secom Co Ltd | 顔特徴抽出装置および顔認証システム |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3567261B2 (ja) * | 2002-03-29 | 2004-09-22 | カシオ計算機株式会社 | 似顔絵画像表示制御装置及び似顔絵画像表示制御方法 |
JP2008059108A (ja) * | 2006-08-30 | 2008-03-13 | Hitachi Ltd | 画像処理装置,画像処理方法、そのプログラムおよび人流監視システム |
- 2016-04-12: JP application JP2018511805A (patent JP6427297B2, active)
- 2016-04-12: WO application PCT/JP2016/061846 (publication WO2017179134A1, application filing)
Also Published As
Publication number | Publication date |
---|---|
JP6427297B2 (ja) | 2018-11-21 |
JPWO2017179134A1 (ja) | 2018-11-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| ENP | Entry into the national phase | Ref document number: 2018511805; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16898594; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.01.2019) |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16898594; Country of ref document: EP; Kind code of ref document: A1 |