CN103885461A - Movement method for makeup tool of automatic makeup machine - Google Patents

Movement method for makeup tool of automatic makeup machine

Info

Publication number
CN103885461A
CN103885461A (application CN201210563393.8A)
Authority
CN
China
Prior art keywords
image
makeup
tool
edge
moving stage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201210563393.8A
Other languages
Chinese (zh)
Other versions
CN103885461B (en)
Inventor
王雪龄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZONGJING INVESTMENT Co Ltd
Original Assignee
ZONGJING INVESTMENT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZONGJING INVESTMENT Co Ltd filed Critical ZONGJING INVESTMENT Co Ltd
Priority to CN201210563393.8A priority Critical patent/CN103885461B/en
Publication of CN103885461A publication Critical patent/CN103885461A/en
Application granted granted Critical
Publication of CN103885461B publication Critical patent/CN103885461B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a movement method for a makeup tool of an automatic makeup machine. The method comprises: driving a moving stage to move an image capture module and the makeup tool relative to a region of a user that is to be made up; during the movement of the stage, sequentially capturing images of the region with the image capture module to obtain a plurality of images; sequentially comparing the image difference between each pair of consecutively captured images; determining from the comparison result of the image difference whether the moving stage is positioned; and, when the moving stage is positioned, applying makeup to the region with the makeup tool.

Description

Movement method for the makeup tool of an automatic makeup machine
Technical field
The present invention relates to a method for moving the makeup tool of an automatic makeup machine.
Background technology
The love of beauty is human nature, so manufacturers offer a wide variety of skin-care products and cosmetics for consumers to choose from. However, drawing a look that one likes and that suits one's face requires repeated practice of makeup technique, as well as purchasing many different cosmetics and applicators to draw different eyebrow shapes, eye lines, lashes, eyeliner, blush, lip makeup, contouring, and various color changes. Because skill levels vary and cosmetics are so diverse, the achieved result often falls short, to some degree, of the effect the consumer expects.
With the continual evolution of information technology, makeup and skin-care simulation devices have been proposed. Using such a device, a user can preview the made-up effect on a screen before buying, instead of trying the products in person. However, reproducing the makeup effect simulated on the screen still depends on manual makeup skill to apply the color to the face, and the true effect achieved by hand may not equal the effect shown on the screen.
Therefore, if makeup could be applied under automated control, the effect simulated on the screen could be rendered more faithfully. In three-dimensional automated control, laser, infrared, or ultrasonic sensing is generally used to provide relative-distance information. However, laser ranging instruments are relatively expensive, and laser ranging carries a risk of harming the human body, making it unsuitable for ranging relative to a moving person. Infrared ranging instruments are cheaper, easy to make, and safe, but their precision is low and their directivity poor. Ultrasonic ranging is strongly affected by the surrounding environment and is therefore also of low precision.
Summary of the invention
In one embodiment, a movement method for the makeup tool of an automatic makeup machine comprises: driving a moving stage so that it moves an image capture module and a makeup tool relative to a region of a user that is to be made up; during the movement of the stage, sequentially capturing images of the region with the image capture module to obtain a plurality of images; sequentially comparing the image difference between each pair of consecutively captured images; determining from the comparison result of the image difference whether the moving stage is positioned; and, when the moving stage is positioned, applying makeup to the region with the makeup tool.
In some embodiments, comparing the image difference between two images may comprise: performing feature analysis on each image to obtain, in each image, a feature image corresponding to the same feature of the region to be made up; and using a processing unit to compute the size change between the feature images of the two images.
In this case, determining whether the moving stage is positioned may comprise: using the processing unit to compute, from the size change and the displacement of the stage between the two captures, the distance between the stage and the region to be made up; comparing the computed distance with a threshold; and judging that the stage is positioned when the distance is less than or equal to the threshold.
Here, the size change may be the image magnification ratio between the feature images of the two images.
Here, the position of the region to be made up is fixed.
In some embodiments, comparing the image difference between two images may comprise: performing edge analysis on the two images to obtain an edge in each image; and computing the variation between the edges of the two images.
In this case, determining whether the moving stage is positioned may comprise: comparing the computed variation with a set variation; and judging that the stage is positioned when the computed variation reaches the set variation.
Here, comparing the image difference between two images may further comprise: before performing edge analysis on at least one of the two images, adjusting the size of the image of the pair that has not yet undergone edge analysis, according to the positions of the moving stage and the camera parameters of the image capture module when the two images were obtained. The edge analysis then comprises: performing edge analysis on the image of the pair whose size was not adjusted; and performing edge analysis on the adjusted image.
Here, comparing the image difference between two images may further comprise: before computing the variation, adjusting the image size of the edge of one of the two images according to the positions of the moving stage and the camera parameters of the image capture module when the two images were obtained. The variation is then computed between the edge whose image size was not adjusted and the adjusted edge.
In some embodiments, the variation is the pixel-count difference between the edges. In this case, the edges may include the image of the front end of the makeup tool and the image of an indentation on the region to be made up.
In some embodiments, the variation is the difference in corresponding positions between the edges. In this case, the edges may include the image of the front end of the makeup tool.
In some embodiments, the edge analysis of each image comprises: performing feature analysis on the image to obtain the image of the front end of the makeup tool; opening an analysis window centered on that front-end image; and performing edge analysis on the image block within the analysis window to obtain the edge. Here, the size of the analysis window is smaller than the size of the image to which it belongs.
In this case, the edge may include the image of the front end of the makeup tool, and may further include the image of an indentation on the region to be made up.
In summary, the movement method for the makeup tool of an automatic makeup machine according to the present invention applies makeup under automated control and can therefore render the makeup effect simulated on screen more faithfully. Moreover, the method uses image change to judge precisely and safely whether the moving stage is positioned.
The present invention is described below in conjunction with the drawings and specific embodiments, which are not intended as a limitation of the invention.
Description of the drawings
Fig. 1 is a perspective view of an automatic makeup machine according to an embodiment of the invention;
Fig. 2 is a schematic block diagram of an automatic makeup machine according to an embodiment of the invention;
Fig. 3 is a flowchart of the movement method for the makeup tool of the automatic makeup machine according to the first embodiment of the invention;
Fig. 4 and Fig. 5 are flowcharts of the movement method for the makeup tool of the automatic makeup machine according to the second embodiment of the invention;
Fig. 6 is a schematic view of the first embodiment of the first image;
Fig. 7 is a schematic view of the first embodiment of the second image;
Fig. 8 and Fig. 9 are flowcharts of the movement method for the makeup tool of the automatic makeup machine according to the third embodiment of the invention;
Figure 10 is a detailed flowchart of an embodiment of step S357;
Figure 11 is a detailed flowchart of an embodiment of step S359;
Figure 12 is a partial flowchart according to the fourth embodiment of the invention;
Figure 13 is a partial flowchart according to the fifth embodiment of the invention;
Figure 14 is a partial flowchart according to the sixth embodiment of the invention;
Figure 15 is a partial flowchart according to the seventh embodiment of the invention;
Figure 16 is a schematic view of the second embodiment of the first image;
Figure 17 is a schematic view of the second embodiment of the second image;
Figure 18 is a partial flowchart according to the eighth embodiment of the invention;
Figure 19 is a schematic view of the third embodiment of the first image;
Figure 20 is a schematic view of the third embodiment of the second image.
Wherein, reference numerals:
10 automatic makeup machine
11 moving module
13 makeup tool
14 fixing module
15 drive unit
17 image capture module
18 user interface
19 processing unit
20 user
111 moving stage
112 lifter
113 horizontal rail
114 telescopic platform
141 lower-jaw support
143 crown holder
144 abutting portion
Po1 first image
Po2 second image
Pf1 first feature image
Pf2 second feature image
P13 image of the makeup tool
Pp image of the pen tip
Pe1 pen-tip edge
Pe2 pen-tip edge
Pb image of the pen body
Pe11 pen-tip edge
Ps image of the indentation
Pe21 pen-tip edge
Pe22 indentation edge
Embodiments
The structural principles and working principles of the present invention are described in detail below with reference to the accompanying drawings.
The terms "first" and "second" used below distinguish the elements they refer to; they neither impose an order nor limit the differences between those elements, and they do not limit the scope of the invention.
Referring to Figs. 1 to 3, the automatic makeup machine 10 comprises a moving module 11, a makeup tool 13, a fixing module 14, a drive unit 15, an image capture module 17, a user interface 18, and a processing unit 19.
The processing unit 19 is electrically connected to the drive unit 15, the image capture module 17, and the user interface 18.
The user interface 18 lets the user perform a makeup design on a two-dimensional image, so as to produce a makeup instruction corresponding to the design result. The processing unit 19 can then automatically apply cosmetics to the surface of the user 20 with the makeup tool 13 according to the makeup instruction, completing the makeup.
The moving module 11 comprises a moving stage 111, a lifter 112, a horizontal rail 113, and a telescopic platform 114. The horizontal rail 113 is mounted on the lifter 112, and adjusting the lifter 112 moves the horizontal rail 113 up and down along a first direction (e.g., the Y axis in the figure). The telescopic platform 114 is slidably mounted on the horizontal rail 113 and can move left and right along a second direction (e.g., the X axis) on the horizontal rail 113. The moving stage 111 is mounted on the telescopic platform 114 and can move back and forth along a third direction (e.g., the Z axis) on the telescopic platform 114.
Here, a drive unit 15 controlled by the processing unit 19 (for example, a motor) drives the moving stage 111, the lifter 112, and the telescopic platform 114, so that the moving stage 111 can be moved three-dimensionally and positioned precisely.
The makeup tool 13 and the image capture module 17 are mounted on the moving stage 111. Provided they are carried by the stage (i.e., they move along with it), the makeup tool 13 and the image capture module 17 may be mounted on the stage directly or indirectly. The sensing side of the image capture module 17 faces the region of the user 20 to be made up (for example: the face, an eye, a nail, the back of the hand, or an arm).
Depending on how the cosmetic is applied, the makeup tool 13 may be of contact type, non-contact type, or a combination of both. A non-contact makeup tool 13 is, for example, a spray head, a nozzle, or an equivalent, and may have one or more discharge openings for spraying cosmetic onto the surface of the region of the user 20 to be made up. A contact makeup tool 13 is, for example, a brush, a powder puff, a coating pen, or an equivalent, and may have one or more front ends (for example: the free ends of bristles, or a pen tip) for coating cosmetic onto the surface of the region of the user 20 to be made up.
The fixing module 14 is where the region of the user 20 to be made up is placed, fixing the user's position. In some embodiments, taking the face as the region to be made up, the fixing module 14 comprises a lower-jaw support 141 and a crown holder 143. The lower-jaw support 141 receives the user's chin, propping up the head (face) of the user 20. The crown holder 143 is located above the lower-jaw support 141 and has an abutting portion 144 curved to fit the forehead. In use, the user 20 rests the forehead against the abutting portion 144 of the crown holder 143 and the chin on the lower-jaw support 141, fixing the position of the face relative to the moving stage 111.
The makeup instruction contains a two-dimensional trace formed from a plurality of anchor points, each consisting of an X coordinate value and a Y coordinate value. The processing unit 19 positions the moving stage 111 according to each anchor point of the makeup instruction in turn.
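The per-anchor-point flow just described, in which each anchor point fixes X and Y while Z is settled separately by the image-based method, can be sketched as follows. This is an illustrative Python sketch only; the parameter names position_xy, settle_z, and apply_at are hypothetical stand-ins for the hardware control and do not come from the patent.

```python
def follow_trace(anchor_points, position_xy, settle_z, apply_at):
    """Visit each (X, Y) anchor point of the makeup instruction in turn:
    position the stage in X/Y, settle the Z position with the image-based
    method, then apply cosmetic at that point."""
    for x, y in anchor_points:
        position_xy(x, y)   # X/Y positioning from the anchor point (step S30)
        z = settle_z()      # image-based Z positioning (steps S31-S37)
        apply_at(x, y, z)   # apply cosmetic at the located point (step S39)
```

The callbacks isolate the motion-control details, so the same trace-following loop works for both the size-change and the edge-change positioning embodiments described below.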
The processing unit 19 uses the drive unit 15 to control the position of the moving stage 111 on the X and Y axes according to the X and Y coordinate values of the anchor point (step S30), and controls and determines the position of the stage on the Z axis by carrying out the movement method for the makeup tool of the automatic makeup machine according to the present invention. In other words, the movement method according to the present invention can be realized by a software program executable by the processing unit 19.
The following illustrates how the processing unit 19 controls the position of the moving stage 111 on the Z axis.
The processing unit 19 controls the drive unit 15 according to the makeup instruction to drive the moving stage 111, so that the stage moves the image capture module 17 and the makeup tool 13 along the Z axis relative to the region of the user 20 to be made up (step S31).
While the moving stage 111 moves, the image capture module 17 sequentially captures images of the region of the user 20 to be made up, obtaining a plurality of images (step S33).
The processing unit 19 receives the images captured by the image capture module 17 and sequentially compares the image difference between each pair of consecutively captured images (step S35).
The processing unit 19 judges from the comparison result of the image difference whether the moving stage 111 is positioned on the Z axis (step S37).
When the moving stage 111 is positioned, the makeup tool 13 applies makeup to the region of the user 20 to be made up (step S39).
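The capture/compare/position loop of steps S31–S39 can be sketched with a simulated camera in which the apparent size of a tracked feature scales inversely with distance (the pinhole behavior behind the magnification formulas given later). Everything here is a noise-free simulation under that assumption, not the machine's actual implementation; note the size ratio of the newer to the older capture stands in for the size change used in the later distance formula.

```python
def run_positioning(start_distance, step, threshold):
    """Advance the stage in fixed steps toward the target; after each
    move, compare the two latest captures and infer the remaining
    distance from the size change and the displacement, stopping once
    the inferred distance is within the threshold."""
    feature_size = lambda d: 100.0 / d   # simulated apparent feature size
    distance = start_distance
    prev = feature_size(distance)        # first capture (step S33)
    while True:
        distance -= step                 # stage advances (step S31)
        cur = feature_size(distance)     # next capture (step S33)
        inferred = step / (cur / prev - 1)   # compare sizes (steps S35, S37)
        if inferred <= threshold:
            return distance              # positioned: apply makeup (step S39)
        prev = cur                       # latest capture becomes the "first image"
```

With start_distance=50, step=5, and threshold=12, the loop stops at distance 10, since in this ideal simulation each inferred distance equals the true remaining distance.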
Steps S35 and S37 can be embodied by analyzing the change in image size, or by analyzing the change of edges in the images, among other embodiments.
For ease of description, the position moved to first is called the first position; the position moved to next, after the first, is called the second position; the image captured first (i.e., at the first position) is called the first image; and the image captured after it (i.e., at the second position) is called the second image.
The embodiment that analyzes the change in image size is described first.
Referring to Figs. 4 and 5, the processing unit 19 controls the drive unit 15 to drive the moving stage 111 toward the region of the user 20 to be made up, so that the stage carries the makeup tool 13 and the image capture module 17 mounted on it to the first position (step S311).
At the current position (now the first position), the image capture module 17 captures an image of the region of the user 20 to be made up, obtaining the first image (step S331).
The processing unit 19 receives the first image and performs feature analysis on it to obtain the feature image in the first image (called the first feature image below for convenience) (step S351). Referring to Fig. 6, the first image Po1 contains a first feature image Pf1 corresponding to a feature of the user 20. For instance, when there is a mole (a feature of the user 20) on the face, the captured first image Po1 contains an image of that mole, i.e., the first feature image Pf1.
After the first image has been captured, the processing unit 19 again controls the drive unit 15 to drive the moving stage 111 toward the region of the user 20 to be made up, so that the stage carries the makeup tool 13 and the image capture module 17 from the first position to the next position (i.e., the second position) (step S313).
Then, at the current position (now the second position), the image capture module 17 again captures an image of the region of the user 20 to be made up, obtaining the second image (step S333).
The processing unit 19 receives the second image and performs feature analysis on it to obtain the feature image in the second image (called the second feature image below) (step S353). Referring to Fig. 7, the second image Po2 contains a second feature image Pf2 corresponding to a feature of the user 20, and the second feature image Pf2 in Po2 corresponds to the same feature of the user 20 as the first feature image Pf1 in Po1. For instance, with a mole on the face, the processing unit 19 finds the image of the mole in the captured first image Po1 (i.e., the first feature image Pf1) and the image of the same mole in the second image Po2 (i.e., the second feature image Pf2).
The processing unit 19 computes the size change between the first feature image Pf1 and the second feature image Pf2 (step S355). Here, the processing unit 19 can obtain the size change by computing an image magnification ratio between Pf1 and Pf2, such as their area ratio, pixel ratio, or image-length ratio.
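A minimal sketch of one of the measures named in step S355, the pixel ratio between the two matched feature images, assuming the feature has already been segmented into binary masks in each capture. The helper name and the mask representation are illustrative, not taken from the patent.

```python
def pixel_ratio(mask1, mask2):
    """Size change as the ratio of pixel counts between the second and
    first feature images, given binary masks (lists of 0/1 rows) of the
    matched feature in the two captures."""
    count = lambda m: sum(sum(row) for row in m)   # number of feature pixels
    return count(mask2) / count(mask1)
```

A ratio above 1 means the feature appears larger in the second capture; the area-ratio and length-ratio measures named in the text would be computed analogously.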
Then, from the computed size change and the displacement of the moving stage 111 (i.e., the distance between the first position and the second position), the processing unit 19 computes the distance between the stage and the region of the user 20 to be made up at the moment the second image was captured, i.e., the distance between the second position and the region to be made up (step S371).
For instance, the magnification of the first image satisfies Formula 1 below, and the magnification of the second image satisfies Formula 2. Furthermore, the first and second positions give the relation between P1 and P2 in Formula 3.
h1 / H = P1 / Q    (Formula 1)
h2 / H = P2 / Q    (Formula 2)
P1 = P2 + X    (Formula 3)
Here, h1 is the image length of the first image, h2 the image length of the second image, H the object height, Q the image distance of the image capture module, P1 the object distance when the first image is captured, P2 the object distance when the second image is captured, and X the distance between the first position and the second position. X is positive when the second position is closer to the region of the user 20 to be made up than the first position, and negative when the second position is farther from the region than the first position.
From Formulas 1, 2, and 3, the relation between the first image and the second image, Formula 4, is obtained.
P2 = X / (h1/h2 - 1)    (Formula 4)
Here, h1/h2 represents the size change between the first feature image Pf1 and the second feature image Pf2.
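Formula 4 can be evaluated directly as stated. The sketch below implements the computation of step S371 under the patent's own convention (X positive when the second position is the closer one); the function name is illustrative.

```python
def distance_from_size_change(h1, h2, x):
    """Object distance P2 at the second capture, per Formula 4:
    P2 = X / (h1/h2 - 1), where h1/h2 is the size change between the
    feature images and X is the stage displacement between captures."""
    ratio = h1 / h2
    if ratio == 1.0:
        # equal sizes: the displacement produced no depth cue
        raise ValueError("no size change between captures")
    return x / (ratio - 1.0)
```

For example, a size change h1/h2 of 1.2 over a 10 mm displacement places the target 50 mm from the second position.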
In some embodiments, supposing the computed distance is taken as the distance between the discharge opening or front end of the makeup tool 13 and the surface of the region of the user 20 to be made up, the discharge opening of the makeup tool 13 can be aligned with the lens of the image capture module 17 when the elements are assembled, or the processing unit 19 can take the relative distance between the discharge opening of the makeup tool 13 and the lens of the image capture module 17 into account when computing the distance (step S371) or when setting the threshold used for the judgment.
Then, the processing unit 19 compares the computed distance with the threshold (step S373) to judge whether the computed distance is greater than the threshold (step S375).
When the distance is greater than the threshold, the processing unit 19 judges that the moving stage 111 is not yet positioned (step S377) and again controls the drive unit 15 to drive the stage toward the region of the user 20 to be made up, moving the stage to the next position (step S313). The image capture module 17 then captures another image of the region to obtain the next image (step S333). In the steps that follow, the image captured in the previous pass of step S333 serves directly as the first image, and the image newly captured in step S333 serves as the second image. The first image therefore need not undergo feature analysis again (step S351 need not be performed), and in step S355 the previously obtained analysis result is used directly (i.e., the second feature image analyzed in the previous pass of step S353 serves as the first feature image).
When the distance is less than or equal to the threshold, the processing unit 19 judges that the moving stage 111 is positioned (step S379) and then controls the makeup tool 13 to apply cosmetic to the surface of the region of the user 20 to be made up (step S391). After the applying step (step S391) is completed, when the moving stage 111 does not need to move to the next anchor point (step S41), the processing unit 19 controls the drive unit 15 to stop driving the stage. When the stage does need to move to the next anchor point (step S41), the processing unit 19 controls the drive unit 15 to drive the stage toward the next anchor point, and the flow returns to step S311 and repeats the steps above.
In some embodiments, the threshold may be a set range bounded by a first value and a second value, the first value being smaller than the second value.
In other words, the processing unit 19 compares the computed distance with the set range (step S375).
When the comparison in step S375 shows that the distance falls between the first value and the second value (i.e., equals the first or second value, or is greater than the first value and less than the second value), the processing unit 19 judges that the moving stage 111 is positioned (step S379).
When the comparison in step S375 shows that the distance is less than the first value, the processing unit 19 judges that the stage is not positioned (step S377) and again controls the drive unit 15 to drive the stage away from the region of the user 20 to be made up, moving the stage to the next position; the flow returns to step S313 and repeats.
When the comparison in step S375 shows that the distance is greater than the second value, the processing unit 19 judges that the stage is not positioned (step S411) and again controls the drive unit 15 to drive the stage toward the region of the user 20 to be made up, moving the stage to the next position; the flow returns to step S313 and repeats.
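The three-way judgment with a banded threshold can be written as a pure decision function. A sketch only; the return labels are illustrative names for the actions described in the text.

```python
def stage_decision(distance, first_value, second_value):
    """Banded-threshold judgment on the computed distance: positioned
    when it falls within [first_value, second_value]; back off when
    below the band; advance toward the target when above it."""
    if first_value <= distance <= second_value:
        return "positioned"   # stage is positioned (step S379)
    if distance < first_value:
        return "retreat"      # too close: move away from the region (step S377)
    return "advance"          # too far: move toward the region (step S411)
```

Separating the decision from the motion control makes the band limits easy to tune for different makeup tools.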
Although the first feature image Pf1 and the second feature image Pf2 above take the image of a mole as an example, this is not a limitation of the invention. In some embodiments, the feature image (i.e., Pf1 and Pf2) may be a point (for example: the image of a mole, a scar, or acne), a line (for example: the image of a hair such as an eyelash, fine hair, head hair, or mole hair, or of a scar), or any complete pattern in the image.
In some embodiments, when the region of the user 20 to be made up bears no feature, whether positioning is reached can be determined by detecting whether the front end of the makeup tool 13 contacts the region to be made up (i.e., the embodiment that analyzes the change of edges in the images).
Referring to Figs. 8 and 9: in other words, in step S33, the images captured by the image capture module 17 contain, in addition to the image of the region of the user 20 to be made up, the image of the front end of the makeup tool 13 (steps S332 and S334).
The processing unit 19 performs edge analysis on the first of two adjacent images to obtain a first edge (step S357), and performs edge analysis on the second of the two adjacent images to obtain a second edge (step S359).
Then, the processing unit 19 computes the variation between the first edge and the second edge (step S363) and compares the computed variation with a set variation (step S374), to determine whether the variation between the two reaches the set variation (step S376).
When the variation between the first edge and the second edge reaches the set variation (i.e., the computed variation is greater than or equal to the set variation), the processing unit 19 judges that the moving stage 111 is positioned (step S379). The processing unit 19 then controls the makeup tool 13 to apply cosmetic to the surface of the region of the user 20 to be made up (step S391).
After the applying step (step S391) is completed, when the moving stage 111 does not need to move to the next anchor point (step S41), the processing unit 19 controls the drive unit 15 to stop driving the stage. When the stage does need to move to the next anchor point (step S41), the processing unit 19 controls the drive unit 15 to drive the stage toward the next anchor point, and the flow returns to step S311 and repeats the steps above.
When there is no change between the first edge and the second edge, or the variation does not reach the set variation (i.e., the computed variation is less than the set variation), the processing unit 19 judges that the moving stage 111 is not yet positioned (step S377) and again controls the drive unit 15 to drive the stage toward the region of the user 20 to be made up, moving the stage to the next position (step S313). The image capture module 17 then captures another image of the region to obtain the next image (step S333'). In the steps that follow, the image captured in the previous pass of step S334 serves directly as the first image, and the image newly captured in step S333' serves as the second image. The first image therefore need not undergo edge analysis again (step S357 need not be performed), and in step S363 the previously obtained edge analysis result is used directly (i.e., the second edge analyzed in the previous pass of step S357 serves as the first edge).
In some embodiments, when the variation between the first edge and the second edge reaches the set variation, the front end of the makeup tool 13 can be taken to have touched the region of the user 20 to be made up, so the movement of the stage can be stopped and the makeup applied.
When there is no change between the first edge and the second edge, or the variation is less than the set variation, the front end of the makeup tool 13 has not contacted, or has only just touched, the region of the user 20 to be made up, so the stage can again be driven forward toward the region.
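The contact test built from steps S363–S376 can be sketched using the pixel-count difference the text names as one possible variation measure. Binary edge maps and the set variation are assumed inputs; the function name is illustrative.

```python
def tool_in_contact(edge1, edge2, set_variation):
    """Return True when the pixel-count difference between consecutive
    binary edge maps reaches the set variation, i.e., the tool's front
    end is taken to have touched (and indented) the target surface."""
    count = lambda e: sum(sum(row) for row in e)   # edge pixels in the map
    variation = abs(count(edge1) - count(edge2))
    return variation >= set_variation
```

When the tool tip presses into the skin, the indentation contributes new edge pixels, which is why the pixel count jumps at contact in this embodiment.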
In certain embodiments, the set amount of change can be a set range bounded by a first threshold and a second threshold, where the first threshold is less than the second threshold.
In other words, in step S374, the processing unit 19 compares the calculated amount of change against the set range.
When the comparison result in step S376 is that the amount of change falls between the first threshold and the second threshold (that is, it equals the first or second threshold, or is greater than the first threshold and less than the second threshold), the processing unit 19 judges that the transfer table 111 is positioned (step S379), controls the driver unit 15 to stop driving the transfer table 111, and controls the color make-up instrument 13 to perform makeup (step S391).
When the comparison result in step S376 is that the amount of change is less than the first threshold, the processing unit 19 judges that the transfer table 111 is not positioned (step S377), and again controls the driver unit 15 to drive the transfer table 111 to approach the portion of the user 20 to be made up, so that the transfer table 111 moves to the next position (step S313).
When the comparison result in step S376 is that the amount of change is greater than the second threshold, the processing unit 19 judges that the transfer table 111 is not positioned (step S377), and again controls the driver unit 15 to drive the transfer table 111 away from the portion of the user 20 to be made up, so that the transfer table 111 moves to the next position (step S313).
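The three-way comparison of steps S376 to S379 amounts to a band test. In the sketch below, the returned action strings are illustrative labels, not terms from the patent:

```python
def stage_command(change, first_threshold, second_threshold):
    """Decide the next stage action from the calculated amount of change,
    assuming first_threshold < second_threshold (the set range)."""
    if change < first_threshold:
        return "advance"    # not positioned: move toward the face (S377, S313)
    if change > second_threshold:
        return "retreat"    # not positioned: move away from the face (S377, S313)
    return "positioned"     # within the set range: stop and apply makeup (S379, S391)
```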
In certain embodiments, with reference to Figure 10 and Figure 11, in the edge analysis of each image (step S357 or step S359), the processing unit 19 can first perform feature analysis of the image (the first image or the second image) to locate the image of the front end of the color make-up instrument 13 within it (step S3571 or step S3591), and then open an analysis window centered on that front-end image (step S3573 or step S3593). The image block covered by the analysis window is smaller than the original image.
Then, the processing unit 19 performs edge analysis on the image block within the analysis window to obtain the edge (the first edge or the second edge) in the image block (step S3575 or step S3595).
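A minimal sketch of this windowed edge analysis (steps S3571 to S3575): locate the tool tip, crop a block around it, and run an edge operator on the block only. The neighbour-difference test below is a toy stand-in for whatever edge detector is actually used, and images are assumed to be plain nested lists of pixel values:

```python
def analysis_window(image, center, half_size):
    """Open an analysis window centered on the tool-tip pixel
    (steps S3573/S3593); the block is smaller than the full image."""
    row, col = center
    return [r[max(0, col - half_size):col + half_size + 1]
            for r in image[max(0, row - half_size):row + half_size + 1]]

def edge_pixels(block):
    """Toy edge analysis (steps S3575/S3595): a pixel is an edge pixel
    when it differs from its right or lower neighbour."""
    edges = set()
    for r in range(len(block) - 1):
        for c in range(len(block[0]) - 1):
            if block[r][c] != block[r][c + 1] or block[r][c] != block[r + 1][c]:
                edges.add((r, c))
    return edges
```

Restricting the edge operator to the window keeps the per-image cost proportional to the window size rather than to the full frame.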
In certain embodiments, before calculating the amount of change, the processing unit 19 can first resize one of the two successively captured images (or its edge image) so that the two images (or their edge images) have the same magnification.
In certain embodiments, with reference to Figure 12 and Figure 13, before performing edge analysis, the processing unit 19 can adjust the image size of one of the two images (that is, the first image or the second image) according to the positions of the transfer table 111 when the two images were captured (that is, the first position and the second position) and the camera parameters of the image acquisition module 17 (for example, focal length and image distance), so that the two images have the same magnification (step S356 or step S358). The processing unit 19 then performs edge analysis on the adjusted first image or the adjusted second image to obtain the edge in the image (step S357' or step S359').
In certain embodiments, with reference to Figure 14, after obtaining the first edge (step S357), the processing unit 19 can directly adjust the size of the edge image of the first edge according to the first position, the second position, and the camera parameters of the image acquisition module 17 (step S361), and then compare the second edge with the adjusted first edge to obtain the amount of change between the two (step S363').
In certain embodiments, with reference to Figure 15, after obtaining the second edge (step S359), the processing unit 19 can directly adjust the size of the edge image of the second edge according to the first position, the second position, and the camera parameters of the image acquisition module 17 (step S362), and then compare the first edge with the adjusted second edge to obtain the amount of change between the two (step S363").
When the processing unit 19 adjusts the size of the earlier-captured image (that is, the first image) or of its edge image (that is, the first edge), step S356 or step S361 is carried out after the processing unit 19 has obtained the information of the second position (that is, after step S313).
When the distance between the first position and the portion of the user 20 to be made up is greater than the distance between the second position and that portion, in step S356 or step S361 the processing unit 19 enlarges the first image or the first edge according to the first position, the second position, and the camera parameters of the image acquisition module 17.
Likewise, when the distance between the first position and the portion of the user 20 to be made up is greater than the distance between the second position and that portion, in step S358 or step S362 the processing unit 19 shrinks the second image or the second edge according to the first position, the second position, and the camera parameters of the image acquisition module 17.
Conversely, when the distance between the first position and the portion of the user 20 to be made up is less than the distance between the second position and that portion, in step S356 or step S361 the processing unit 19 shrinks the first image or the first edge according to the first position, the second position, and the camera parameters of the image acquisition module 17.
Likewise, when the distance between the first position and the portion of the user 20 to be made up is less than the distance between the second position and that portion, in step S358 or step S362 the processing unit 19 enlarges the second image or the second edge according to the first position, the second position, and the camera parameters of the image acquisition module 17.
For instance, the magnification of the first image satisfies formula 5 below, and the magnification of the second image satisfies formula 6. Further, from the first position and the second position, the relation between P1 and P2 of formula 7 can be obtained.
h1/H = P1/Q formula 5
h2/H = P2/Q formula 6
P1 = P2 + X formula 7
Here, h1 is the image height of the first image, h2 is the image height of the second image, H is the object height, P1 is the image distance when the first image is captured, P2 is the image distance when the second image is captured, Q is the object distance of the image acquisition module 17, and X is the distance between the first position and the second position.
From formula 5, formula 6, formula 7, and the basic optical formula (formula 8), the relation between the first image and the second image, formula 9, can be obtained.
1/P2 + 1/Q = 1/f2 formula 8
h1/h2 = 1 + X/P2 = 1 + X(1/f2 - 1/Q) formula 9
Here, f2 is the focal length when the second image is captured.
Therefore, the processing unit 19 can adjust the image size or the edge-image size according to formula 9.
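Substituting formula 8 (1/P2 + 1/Q = 1/f2) into the first equality of formula 9 gives h1/h2 = 1 + X(1/f2 - 1/Q), which can be evaluated directly. The numeric values in the example are illustrative assumptions, not figures from the patent:

```python
def size_ratio(X, f2, Q):
    """h1/h2 per formula 9, using 1/P2 = 1/f2 - 1/Q from formula 8."""
    inv_P2 = 1.0 / f2 - 1.0 / Q   # 1/P2 obtained from formula 8
    return 1.0 + X * inv_P2

# Illustrative values (millimetres): focal length f2 = 50,
# object distance Q = 500, displacement X = 10 between the two positions.
ratio = size_ratio(X=10, f2=50, Q=500)   # factor by which to scale the image
```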
In certain embodiments, the above edge change (the amount of change between the first edge and the second edge) can arise from deformation of the front end of the color make-up instrument 13 upon touching the portion of the user 20 to be made up, or from an impression (for example, a depression or lines) that the pressing front end of the color make-up instrument 13 leaves on that portion.
For instance, when the front end of the color make-up instrument 13 is not deformed at the time the second image is captured, the position of the first edge in the first image (that is, the edge of the image of the front end of the color make-up instrument 13) and the position of the second edge in the second image (likewise the edge of the image of the front end of the color make-up instrument 13) are substantially identical; the second edge falls on the position in the second image that corresponds to the first edge.
When the front end of the color make-up instrument 13 is deformed by touching the portion of the user 20 to be made up at the time the second image is captured, the positions of the first edge and the second edge differ: the second edge is offset and no longer falls on the position corresponding to the first edge.
Take as an example the color make-up instrument 13 being a writing brush and the portion to be made up being the cheek of the user 20. When the transfer table 111 is at the first position, the brush has not yet touched the cheek of the user 20. The first image Po1 captured by the image acquisition module 17 at the first position contains an image Pp of the brush tip (that is, the image of the front end of the color make-up instrument 13 within the image P13 of the color make-up instrument 13), as shown in Figure 16. With reference to Figure 16, edge analysis of the first image Po1 yields the first edge, namely the tip edge Pe1 of the brush-tip image Pp.
Then, when the transfer table 111 is at the second position, the brush touches the cheek of the user 20. The second image Po2 captured by the image acquisition module 17 at the second position likewise contains the brush-tip image Pp (that is, the image of the front end of the color make-up instrument 13 within the image P13 of the color make-up instrument 13), but the brush is deformed by being pressed against the cheek, as shown in Figure 17. With reference to Figure 17, edge analysis of the second image Po2 yields the second edge, namely the tip edge Pe2 of the brush-tip image Pp.
Comparing tip edge Pe1 with tip edge Pe2 shows that some pixels in tip edge Pe2 sit at positions different from the corresponding pixels in tip edge Pe1. In other words, the amount of change between the first edge and the second edge (that is, the number of pixels whose corresponding positions differ) falls between the first threshold and the second threshold, so the processing unit 19 can judge that the transfer table 111 is positioned on the Z axis.
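If edges are represented as sets of (row, col) pixel coordinates (an assumed representation), the deformation-case amount of change, namely the second-edge pixels that no longer fall on corresponding first-edge positions, reduces to a set difference:

```python
def position_change(first_edge, second_edge):
    """Amount of change of step S363 for the deformation case: the number
    of second-edge pixels absent from the corresponding positions of the
    first edge. Both edges are sets of (row, col) tuples."""
    return len(second_edge - first_edge)
```

Positioning is then judged by checking this count against the first and second thresholds.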
In certain embodiments, with reference to Figures 16 to 18, before performing step S363 (or step S363' or step S363"), the processing unit 19 can first align the first image and the second image (step S360). In step S360, the processing unit 19 performs feature analysis of the first image and the second image to obtain an image of a feature common to both (for example, the image of a feature on the portion of the user 20 to be made up, or the image Pb of the body of the color make-up instrument 13) (step S3601), and uses the common-feature image to align the first image with the second image (step S3603). The processing unit 19 then carries out the calculation of the amount of change between the first edge and the second edge (step S363, step S363', or step S363").
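Reducing the alignment of step S3603 to a pure translation is a simplifying assumption (a real alignment may also need rotation and scaling), but it shows the idea: shift the second image's coordinates so that a feature shared by both images, such as the body image Pb, lands on its first-image position:

```python
def align_edge(edge, feature_in_first, feature_in_second):
    """Translate second-image edge coordinates (a set of (row, col)
    tuples) so that the shared feature point coincides with its
    first-image position."""
    dr = feature_in_first[0] - feature_in_second[0]
    dc = feature_in_first[1] - feature_in_second[1]
    return {(r + dr, c + dc) for (r, c) in edge}
```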
In another case, when the front end of the color make-up instrument 13 is not pressed against the portion of the user 20 to be made up at the time the second image is captured, the pixel count of the first edge in the first image (that is, the edge of the image of the front end of the color make-up instrument 13) and the pixel count of the second edge in the second image (likewise the edge of the image of the front end of the color make-up instrument 13) are substantially identical.
When, at the time the second image is captured, the front end of the color make-up instrument 13 is pressed against the surface of the portion of the user 20 to be made up and leaves an impression on it, the pixel count of the first edge in the first image (that is, the edge of the image of the front end of the color make-up instrument 13) is less than the pixel count of the second edge in the second image (that is, the edge of the image of the front end of the color make-up instrument 13 together with the edge of the image of the impression).
Moreover, even if the front end of the color make-up instrument 13 is pressed against the surface of the portion of the user 20 to be made up at both captures, the depth to which the front end presses differs between the first and second images because of the difference in displacement of the transfer table 111, so the size of the impression on the surface differs accordingly. For example, a deeper press yields a larger impression (a deeper depression or more lines). The pixel count of the edge in the first image and in the second image then also differs with the size of the impression (for example, a larger impression yields more edge pixels).
Take as another example the color make-up instrument 13 being an eyebrow pencil. When the transfer table 111 is at the first position, the eyebrow pencil has not yet touched the cheek of the user 20. The first image Po1 captured by the image acquisition module 17 at the first position contains an image Pp of the pencil tip (that is, the image of the front end of the color make-up instrument 13 within the image P13 of the color make-up instrument 13), as shown in Figure 19. With reference to Figure 19, edge analysis of the first image Po1 yields the first edge, namely the tip edge Pe11 of the pencil-tip image Pp.
Then, when the transfer table 111 is at the second position, the eyebrow pencil touches the cheek of the user 20. The second image Po2 captured at the second position likewise contains the pencil-tip image Pp (that is, the image of the front end of the color make-up instrument 13 within the image P13 of the color make-up instrument 13) and additionally an image Ps of the impression produced by the pencil tip pressing the cheek, as shown in Figure 20. With reference to Figure 20, edge analysis of the second image Po2 yields the second edge, which comprises the tip edge Pe21 of the pencil-tip image Pp and the impression edge Pe22 of the impression image Ps.
Comparing the first edge with the second edge shows that the second edge has the impression edge Pe22 in addition. In other words, the amount of change between the first edge and the second edge (that is, the pixel count of the impression edge Pe22) falls between the first threshold and the second threshold, so the processing unit 19 can judge that the transfer table 111 is positioned on the Z axis.
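For the impression case the amount of change is a pixel-count difference, which is immediate once edges are kept as sets of pixel coordinates (again an assumed representation):

```python
def pixel_count_change(first_edge, second_edge):
    """Amount of change for the impression case: the impression edge Pe22
    adds pixels to the second edge, so its pixel count exceeds that of
    the first edge. Both edges are sets of (row, col) tuples."""
    return len(second_edge) - len(first_edge)
```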
In certain embodiments, step S35 can first analyze the image-size change and, when no feature can be found in step S35, switch to analyzing the edge change in the images, using the corresponding amount of edge change to decide whether positioning is reached. For instance, when no feature can be found in step S351 or step S353, execution switches to step S357 and proceeds with its subsequent steps.
The execution order of the steps does not limit the present invention; within reason, some steps can be exchanged in order or performed simultaneously. For instance, in certain embodiments, the processing unit 19 can perform the feature analysis of a captured image immediately after the image acquisition module 17 captures it. Alternatively, the processing unit 19 can perform the feature analysis of the previously captured image while the image acquisition module 17 captures the next image. In other words, step S351 can be performed between step S331 and step S313, simultaneously with step S313 or step S353, or between step S313 and step S353. Likewise, in certain embodiments, the processing unit 19 can perform the edge analysis of a captured image immediately after the image acquisition module 17 captures it, or perform the edge analysis of the previously captured image while the next image is being captured. In other words, step S357 can be performed between step S331' and step S313, simultaneously with step S313 or step S359, or between step S313 and step S359.
In certain embodiments, the movement method of the color make-up instrument of the automatic color make-up machine according to the present invention can be realized by a computer program, so that after a computer (that is, the processing unit 19 of the automatic color make-up machine) loads and executes the program, the movement method of any embodiment of the present invention is carried out. In certain embodiments, the computer program can be stored on a computer-readable recording medium and loaded from it by a computer. In certain embodiments, the program itself can be a computer program product transferred to the computer by wired or wireless means.
In summary, the movement method of the color make-up instrument of the automatic color make-up machine according to the present invention performs makeup under automated control and can more faithfully reproduce the color make-up effect simulated on screen. Moreover, the method uses image changes to judge precisely and safely whether the transfer table is positioned.
Certainly, the present invention can have various other embodiments. Without departing from the spirit and essence of the present invention, those of ordinary skill in the art can make various corresponding changes and variations according to the present invention, and all such changes and variations shall belong to the protection scope of the appended claims of the present invention.

Claims (22)

1. A movement method of a color make-up instrument of an automatic color make-up machine, characterized by comprising:
driving a transfer table to move an image acquisition module and a color make-up instrument relative to a portion of a user to be made up;
during the movement of the transfer table, sequentially capturing images of the portion to be made up with the image acquisition module to obtain a plurality of images;
sequentially comparing the image difference between each two successively captured images among the plurality of images;
judging, according to each comparison result of the image difference, whether the transfer table is positioned; and
when the transfer table is positioned, performing makeup on the portion to be made up with the color make-up instrument.
2. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 1, characterized in that the step of comparing the image difference between each two images comprises:
performing feature analysis of each of the images to obtain, in each image, a feature image of a same feature of the portion to be made up; and
calculating, with a processing unit, the size change between the feature images of the two images.
3. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 2, characterized in that the judging step comprises:
calculating, with the processing unit, the distance between the transfer table and the portion to be made up from the size change and from the displacement of the transfer table between the captures of the two images;
comparing the distance with a threshold; and
judging that the transfer table is positioned when the distance is less than or equal to the threshold.
4. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 1, characterized in that the judging step comprises:
calculating, with a processing unit, the distance between the transfer table and the portion to be made up from the size change between the feature images, in the two images, corresponding to a same feature of the portion to be made up, and from the displacement of the transfer table between the captures of the two images;
comparing the distance with a threshold; and
judging that the transfer table is positioned when the distance is less than or equal to the threshold.
5. The movement method of the color make-up instrument of the automatic color make-up machine according to any one of claims 2 to 4, characterized in that the size change is the image magnification ratio between the feature images.
6. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 5, characterized in that the position of the portion to be made up is fixed.
7. The movement method of the color make-up instrument of the automatic color make-up machine according to any one of claims 1 to 4, characterized in that the position of the portion to be made up is fixed.
8. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 1, characterized in that the step of comparing the image difference between each two images comprises:
performing edge analysis of the two images to obtain an edge in each of the images; and
calculating the amount of change between the edges of the two images.
9. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 8, characterized in that the judging step comprises:
comparing the calculated amount of change with a set amount of change; and
judging that the transfer table is positioned when the calculated amount of change reaches the set amount of change.
10. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 9, characterized in that the step of comparing the image difference between each two images further comprises:
before performing the edge analysis of at least one of the two images, adjusting, according to the positions of the transfer table when the two images were captured and the camera parameters of the image acquisition module, the size of the image of the two images on which the edge analysis has not been performed;
wherein the edge analysis step comprises:
performing the edge analysis of the image of the two images whose size is not adjusted; and
performing the edge analysis of the adjusted image.
11. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 9, characterized in that the step of comparing the image difference between each two images further comprises:
before calculating the amount of change, adjusting, according to the positions of the transfer table when the two images were captured and the camera parameters of the image acquisition module, the image size of the edge of one of the two images;
wherein the step of calculating the amount of change comprises:
calculating the amount of change between the edge whose image size is not adjusted and the adjusted edge.
12. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 8, characterized in that the step of comparing the image difference between each two images further comprises:
before performing the edge analysis of at least one of the two images, adjusting, according to the positions of the transfer table when the two images were captured and the camera parameters of the image acquisition module, the size of the image of the two images on which the edge analysis has not been performed;
wherein the edge analysis step comprises:
performing the edge analysis of the image of the two images whose size is not adjusted; and
performing the edge analysis of the adjusted image.
13. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 8, characterized in that the step of comparing the image difference between each two images further comprises:
before calculating the amount of change, adjusting, according to the positions of the transfer table when the two images were captured and the camera parameters of the image acquisition module, the image size of the edge of one of the two images;
wherein the step of calculating the amount of change comprises:
calculating the amount of change between the edge whose image size is not adjusted and the adjusted edge.
14. The movement method of the color make-up instrument of the automatic color make-up machine according to any one of claims 8 to 13, characterized in that the amount of change is the pixel-count difference between the edges.
15. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 14, characterized in that the edge comprises the image of the front end of the color make-up instrument and the image of an impression on the portion to be made up.
16. The movement method of the color make-up instrument of the automatic color make-up machine according to any one of claims 8 to 13, characterized in that the amount of change is the difference in corresponding positions between the edges.
17. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 16, characterized in that the edge comprises the image of the front end of the color make-up instrument.
18. The movement method of the color make-up instrument of the automatic color make-up machine according to any one of claims 8 to 13, characterized in that the edge analysis step of each of the images comprises:
performing feature analysis of the image to obtain the image of the front end of the color make-up instrument in the image;
opening an analysis window centered on the image of the front end, wherein the size of the analysis window is less than the size of the image to which it belongs; and
performing edge analysis of the image block of the image within the analysis window to obtain the edge.
19. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 18, characterized in that the edge comprises the image of the front end.
20. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 19, characterized in that the edge further comprises the image of an impression on the portion to be made up.
21. The movement method of the color make-up instrument of the automatic color make-up machine according to any one of claims 8 to 13, characterized in that the edge comprises the image of the front end of the color make-up instrument.
22. The movement method of the color make-up instrument of the automatic color make-up machine according to claim 21, characterized in that the edge further comprises the image of an impression on the portion to be made up.
CN201210563393.8A 2012-12-21 2012-12-21 Movement method for makeup tool of automatic makeup machine Active CN103885461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210563393.8A CN103885461B (en) Movement method for makeup tool of automatic makeup machine

Publications (2)

Publication Number Publication Date
CN103885461A true CN103885461A (en) 2014-06-25
CN103885461B CN103885461B (en) 2017-03-01

Family

ID=50954408

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210563393.8A Active CN103885461B (en) Movement method for makeup tool of automatic makeup machine

Country Status (1)

Country Link
CN (1) CN103885461B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108078262A * 2017-12-23 2018-05-29 中合国际知识产权股份有限公司 Multifunctional cosmetic chair
CN108308834A * 2018-02-05 2018-07-24 天津机电职业技术学院 Makeup machine

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07306009A (en) * 1994-05-10 1995-11-21 Casio Comput Co Ltd Method for matching position of transparent substrate
JPH1151946A (en) * 1997-08-08 1999-02-26 Fuji Xerox Co Ltd Shape measuring device
US6286517B1 (en) * 1998-12-22 2001-09-11 Pearl Technology Holdings, Llc Fingernail and toenail decoration using ink jets
US6341831B1 (en) * 1999-03-09 2002-01-29 Paul J. Weber Skin decoration apparatus and method
CN1489112A (en) * 2002-10-10 2004-04-14 北京中星微电子有限公司 Sports image detecting method
CN1601549A (en) * 2003-09-26 2005-03-30 中国科学院自动化研究所 Human face positioning and head gesture identifying method based on multiple features harmonization
US20050135675A1 (en) * 2003-12-19 2005-06-23 Institute For Information Industry Simulation method for makeup trial and the device thereof
JP2007213561A (en) * 2006-01-16 2007-08-23 Honda Motor Co Ltd Vehicle periphery supervisory unit
JP2007216000A (en) * 2006-01-17 2007-08-30 Shiseido Co Ltd Makeup simulation system, makeup simulation device, makeup simulation method and makeup simulation program
US20090185746A1 (en) * 2008-01-22 2009-07-23 The University Of Western Australia Image recognition
CN101593352A (en) * 2009-06-12 2009-12-02 浙江大学 Driving safety monitoring system based on face orientation and visual focus
CN201716740U (en) * 2010-07-06 2011-01-19 上海慧广科技发展有限公司 Intelligent facial recognition system
JP3172833U (en) * 2011-10-25 2012-01-12 エキスパートマグネティックス株式会社 Makeup robot
US20120067364A1 (en) * 2010-09-21 2012-03-22 Zong Jing Investment, Inc. Facial make-up application machine and make-up application method using the same
CN102496165A (en) * 2011-12-07 2012-06-13 四川九洲电器集团有限责任公司 Method for comprehensively processing video based on motion detection and feature extraction
CN102567727A (en) * 2010-12-13 2012-07-11 中兴通讯股份有限公司 Method and device for replacing background target
US20120223956A1 (en) * 2011-03-01 2012-09-06 Mari Saito Information processing apparatus, information processing method, and computer-readable storage medium

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07306009A (en) * 1994-05-10 1995-11-21 Casio Comput Co Ltd Method for matching position of transparent substrate
JPH1151946A (en) * 1997-08-08 1999-02-26 Fuji Xerox Co Ltd Shape measuring device
US6286517B1 (en) * 1998-12-22 2001-09-11 Pearl Technology Holdings, Llc Fingernail and toenail decoration using ink jets
US6341831B1 (en) * 1999-03-09 2002-01-29 Paul J. Weber Skin decoration apparatus and method
CN1489112A (en) * 2002-10-10 2004-04-14 北京中星微电子有限公司 Sports image detecting method
CN1601549A (en) * 2003-09-26 2005-03-30 中国科学院自动化研究所 Human face positioning and head gesture identifying method based on multiple features harmonization
US20050135675A1 (en) * 2003-12-19 2005-06-23 Institute For Information Industry Simulation method for makeup trial and the device thereof
JP2007213561A (en) * 2006-01-16 2007-08-23 Honda Motor Co Ltd Vehicle periphery supervisory unit
JP2007216000A (en) * 2006-01-17 2007-08-30 Shiseido Co Ltd Makeup simulation system, makeup simulation device, makeup simulation method and makeup simulation program
US20090185746A1 (en) * 2008-01-22 2009-07-23 The University Of Western Australia Image recognition
CN101593352A (en) * 2009-06-12 2009-12-02 浙江大学 Driving safety monitoring system based on face orientation and visual focus
CN201716740U (en) * 2010-07-06 2011-01-19 上海慧广科技发展有限公司 Intelligent facial recognition system
US20120067364A1 (en) * 2010-09-21 2012-03-22 Zong Jing Investment, Inc. Facial make-up application machine and make-up application method using the same
CN102567727A (en) * 2010-12-13 2012-07-11 中兴通讯股份有限公司 Method and device for replacing background target
US20120223956A1 (en) * 2011-03-01 2012-09-06 Mari Saito Information processing apparatus, information processing method, and computer-readable storage medium
JP3172833U (en) * 2011-10-25 2012-01-12 Expert Magnetics Co., Ltd. Makeup robot
CN102496165A (en) * 2011-12-07 2012-06-13 四川九洲电器集团有限责任公司 Method for comprehensively processing video based on motion detection and feature extraction

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
EROL SAHIN et al.: "Development of a Visual Object Localization Module for Mobile Robots", Advanced Mobile Robots, 1999 (EUROBOT '99), Third European Workshop on *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108078262A (en) * 2017-12-23 2018-05-29 中合国际知识产权股份有限公司 Multi-functional cosmetic chair
CN108308834A (en) * 2018-02-05 2018-07-24 天津机电职业技术学院 Makeup machine

Also Published As

Publication number Publication date
CN103885461B (en) 2017-03-01

Similar Documents

Publication Publication Date Title
EP2747030A2 (en) Method for moving color-makeup tool of automatic color-makeup machine
JP6435516B2 (en) Makeup support device, makeup support method, and makeup support program
US20200167983A1 (en) Precise application of cosmetic looks from over a network environment
US20210177124A1 (en) Information processing apparatus, information processing method, and computer-readable storage medium
Arora et al. Experimental Evaluation of Sketching on Surfaces in VR.
US9607347B1 (en) Systems and methods of 3D scanning and robotic application of cosmetics to human
US10404890B2 (en) Manicure device and manicure, health management, and information pushing methods
JP6778877B2 (en) Makeup parts creation device, makeup parts utilization device, makeup parts creation method, makeup parts usage method, makeup parts creation program, and makeup parts utilization program
TWI543726B (en) Automatic coloring system and method thereof
US9811717B2 (en) Systems and methods of robotic application of cosmetics
CN204580251U (en) Manicure device
Rahman et al. Augmented rendering of makeup features in a smart interactive mirror system for decision support in cosmetic products selection
WO2008098235A2 (en) System and method for providing simulated images through cosmetic monitoring
JP2008017936A (en) Makeup apparatus and method
US20150366327A1 (en) Cosmetics Applicator System and Method
CN101779218A (en) Makeup simulation system, makeup simulation apparatus, makeup simulation method, and makeup simulation program
CN108466265B (en) Mechanical arm path planning and operation method, device and computer equipment
Treepong et al. Makeup creativity enhancement with an augmented reality face makeup system
CN103885461A (en) Movement method for makeup tool of automatic makeup machine
KR101719927B1 (en) Real-time make up mirror simulation apparatus using leap motion
WO2021070698A1 (en) Automatic makeup machine, method, program, and control device
KR20200069483A (en) Apparatus for automatic facial makeup
CN103853067A (en) Automatic coloring system and method thereof
Lo et al. Brush footprint acquisition and preliminary analysis for Chinese calligraphy using a robot drawing platform
CN103884315A (en) Distance measuring method and computer program product

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant