US20140161507A1 - Automatic coloring system and method - Google Patents

Automatic coloring system and method

Info

Publication number
US20140161507A1
US20140161507A1 (application US13/829,526)
Authority
US
United States
Prior art keywords
coloring
automatic
image
instructions
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/829,526
Inventor
Charlene Hsueh-Ling Wong
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zong Jing Investment Inc
Original Assignee
Zong Jing Investment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zong Jing Investment Inc filed Critical Zong Jing Investment Inc
Assigned to ZONG JING INVESTMENT, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WONG, CHARLENE HSUEH-LING
Publication of US20140161507A1

Classifications

    • A: HUMAN NECESSITIES
    • A45: HAND OR TRAVELLING ARTICLES
    • A45D: HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00: Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D44/005: Other cosmetic or toiletry articles, e.g. for hairdressers' rooms, for selecting or displaying personal cosmetic colours or hairstyle
    • A45D40/00: Casings or accessories specially adapted for storing or handling solid or pasty toiletry or cosmetic substances, e.g. shaving soaps or lipsticks
    • A45D33/00: Containers or accessories specially adapted for handling powdery toiletry or cosmetic substances
    • A45D33/02: Containers or accessories specially adapted for handling powdery toiletry or cosmetic substances with dispensing means, e.g. sprinkling means
    • A45D34/00: Containers or accessories specially adapted for handling liquid toiletry or cosmetic substances, e.g. perfumes
    • A45D34/04: Appliances specially adapted for applying liquid, e.g. using roller or ball

Definitions

  • the present invention relates to a coloring technology, and more particularly to an automatic coloring system and method for coloring a three-dimensional object.
  • some research provides a simulation device for trying color makeup or a care product.
  • a user may simulate an effect of makeup on a screen before purchase instead of trying a color makeup product in person; for example, US Patent Publication No. 2005/0135675A1.
  • simulating the effect of the color makeup on the screen still depends on manual makeup skills that apply the color makeup on the human face.
  • the real effect of manual makeup performed by the user is not necessarily equal to the effect presented by the simulation on the screen.
  • an automatic coloring system is used for coloring a three-dimensional object.
  • the automatic coloring system includes an automatic coloring machine, and the automatic coloring machine includes a first connecting interface, a material supply module, a moving module, at least one coloring tool, and a control unit.
  • the material supply module has at least one pigment.
  • the coloring tool is disposed on the moving module.
  • the control unit is connected electrically to the first connecting interface, the material supply module, and the moving module.
  • the first connecting interface is used for receiving a coloring procedure in a wireless manner or in a wired manner.
  • the coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof.
  • the control unit sequentially executes the coloring instructions in the coloring procedure, and according to the executed coloring instruction controls the material supply module to select at least one pigment and controls the moving module to move one coloring tool to apply the selected pigment to the three-dimensional object.
  • the automatic coloring system may further include an electronic device, and the electronic device includes a processing unit, a user interface, and a second connecting interface.
  • the processing unit is connected electrically to the user interface and the second connecting interface.
  • the processing unit is used for receiving an appearance image of the three-dimensional object, and generating an outline image through feature analysis of the appearance image.
  • the user interface is used for displaying the outline image, and sequentially outputting at least one edit instruction corresponding to the outline image, so that the processing unit obtains the coloring procedure in response to the edit instruction.
  • the second connecting interface then outputs the coloring procedure to the first connecting interface in a wireless manner or in a wired manner.
  • the automatic coloring system may further include an image capturing module, and the image capturing module is used for capturing the appearance image of the three-dimensional object.
  • the electronic device, the automatic coloring machine, and the image capturing module may be devices capable of being separated from each other. Alternatively, the image capturing module is built in the electronic device or in the automatic coloring machine.
  • an automatic coloring method includes receiving an appearance image of a three-dimensional object; generating an outline image through feature analysis on the appearance image; displaying the outline image on a user interface; using the user interface to sequentially output at least one edit instruction corresponding to the outline image; in response to the at least one edit instruction, obtaining a coloring procedure; and outputting the obtained coloring procedure to an automatic coloring machine in a wireless manner or in a wired manner.
  • the coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof.
  • the automatic coloring method may further include sequentially executing the coloring instructions in the coloring procedure.
  • An execution step of each coloring instruction includes: according to the executed coloring instruction, controlling the material supply module of the automatic coloring machine to select at least one pigment; and according to the executed coloring instruction, controlling the moving module of the automatic coloring machine to move a coloring tool, so as to apply the selected pigment to the three-dimensional object.
  • each coloring instruction includes track information represented by two-dimensional coordinates or represented by three-dimensional coordinates.
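  • As a concrete illustration of the two track representations, the following sketch shows one coloring instruction carrying a two-dimensional track and one carrying a three-dimensional track; the field names and values are hypothetical, since the patent does not prescribe a data format.

```python
# Hypothetical illustration only: the patent does not define a concrete data format.
# A coloring instruction bundles tool information, color information, and track
# information; the track is a sequence of positioning points in 2D or 3D coordinates.

instruction_2d = {
    "tool": "spray head A",                            # tool information
    "color": "red",                                    # color information
    "track": [(10, 20), (12, 22), (14, 25)],           # (x, y) positioning points
}

instruction_3d = {
    "tool": "coating pen",
    "color": "skin tone 2",
    "track": [(10, 20, 5), (12, 22, 5), (14, 25, 6)],  # (x, y, z) positioning points
}

# A coloring procedure is simply the instructions kept in their generation order.
coloring_procedure = [instruction_2d, instruction_3d]
```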
  • the outline image may be a three-dimensional simulated image.
  • the automatic coloring system and method according to the present invention are used for coloring a three-dimensional object.
  • the electronic device executes a coloring design process to obtain a coloring procedure corresponding to the three-dimensional object.
  • the coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof.
  • the coloring procedure is output from the electronic device to the automatic coloring machine.
  • the automatic coloring machine directly executes the coloring instructions in the coloring procedure sequentially.
  • the automatic coloring system and method according to the present invention have the coloring design process (executed by the electronic device), and the actual coloring process (executed by the automatic coloring machine), that are separable, so that the user can design and exchange a colored pattern anytime anywhere.
  • the automatic coloring system and method according to the present invention enable an external device to provide a coloring procedure to be directly executed by the automatic coloring machine, thereby facilitating simplification of the structure of the automatic coloring machine.
  • the automatic coloring machine can execute the actual coloring process more precisely.
  • the coloring action in the coloring design process is closer to that in the actual coloring process.
  • FIG. 1 is a schematic block diagram of an automatic coloring system according to a first embodiment of the present invention
  • FIG. 2 is a schematic block diagram of an electronic device according to a first embodiment of the present invention.
  • FIG. 3 is a schematic block diagram of an electronic device according to a first embodiment of the present invention.
  • FIG. 4 is a schematic block diagram of an automatic coloring machine according to a first embodiment of the present invention.
  • FIG. 5 is a schematic view of an automatic coloring system according to a second embodiment of the present invention.
  • FIG. 6 is a schematic view of an automatic coloring system according to a third embodiment of the present invention.
  • FIG. 7 is a flow chart of an automatic coloring method according to a first embodiment of the present invention.
  • FIG. 8 is a schematic view of a user interface according to an embodiment
  • FIG. 9 is a schematic view of a tool option according to an embodiment
  • FIG. 10A is a schematic view of a color palette option according to a first embodiment
  • FIG. 10B is a schematic view of a color palette option according to a second embodiment
  • FIG. 11 is a schematic view of an automatic coloring system according to a fourth embodiment of the present invention.
  • FIG. 12 is a schematic view of a template option according to an embodiment.
  • FIG. 13 is a flow chart of an automatic coloring method according to a second embodiment of the present invention.
  • an automatic coloring system 10 includes an electronic device 11 and an automatic coloring machine 12 .
  • the electronic device 11 may output a coloring procedure corresponding to a colored pattern of a three-dimensional object 14 to the automatic coloring machine 12 , and the automatic coloring machine 12 colors the three-dimensional object 14 by executing the coloring procedure.
  • the electronic device 11 may be a device capable of executing an application or an equivalent device thereof, such as a portable electronic device or a personal computer.
  • the portable electronic device may be a smart phone, a notebook computer, a tablet computer, or other equivalent device.
  • the three-dimensional object 14 may be a human body, a specific part of a human body (such as the face, an eye, and a nail), or an article (such as a mask and a cup).
  • the automatic coloring system 10 further includes an image capturing module 13 :
  • the image capturing module 13 is used for capturing an appearance image Pf of the three-dimensional object 14 .
  • the electronic device 11 , the automatic coloring machine 12 , and the image capturing module 13 may be devices capable of being separated from each other.
  • the separable image capturing module 13 is, for example, a digital camera or a webcam.
  • the image capturing module 13 is an image pickup device capable of color photographing.
  • the image capturing module 13 may be built in the electronic device 11 (as shown in FIG. 2 ) or in the automatic coloring machine 12 (as shown in FIG. 4 ).
  • the electronic device 11 includes a processing unit 110 , a user interface 120 , a connecting interface 130 , and a storage unit 140 .
  • the automatic coloring machine 12 includes a control unit 210 , a connecting interface 230 , a material supply module 240 , a moving module 250 , and at least one coloring tool 260 and 262 .
  • the connecting interface 230 of the automatic coloring machine 12 is referred to as the first connecting interface 230
  • the connecting interface 130 of the electronic device 11 is referred to as the second connecting interface 130 .
  • the processing unit 110 is connected electrically to the user interface 120 , the second connecting interface 130 , and the storage unit 140 .
  • the second connecting interface 130 is used for being connected electrically to the first connecting interface 230 of the automatic coloring machine 12 in a wireless manner or in a wired manner.
  • the electrical connection in the wired manner may be direct connection (for example, the first connecting interface 230 and the second connecting interface 130 are a male connector and a female connector, which are physical connectors, respectively), or indirect connection (for example, through a connecting cable 15 or an equivalent device thereof).
  • the electronic device 11 may have the built-in image capturing module 13 , and the image capturing module 13 is connected electrically to the processing unit 110 , as shown in FIG. 2 .
  • the appearance image Pf captured by the image capturing module 13 may be transmitted to the processing unit 110 , or may be stored in the storage unit 140 in advance.
  • the electronic device 11 may further include another connecting interface 132 .
  • the connecting interface 132 is referred to as the third connecting interface 132 .
  • the third connecting interface 132 is connected electrically to the processing unit 110 .
  • the image capturing module 13 outside the electronic device 11 is connected to the third connecting interface 132 in a wireless manner, in a directly connected manner, or through a connecting cable, so that the image capturing module 13 is connected electrically to the processing unit 110 through the third connecting interface 132 , as shown in FIG. 3 .
  • the appearance image Pf captured by the image capturing module 13 may be transmitted to the processing unit 110 through the third connecting interface 132 .
  • the image capturing module 13 may be a Charge Coupled Device (CCD) element, a Complementary Metal Oxide Semiconductor (CMOS) element, or other equivalent element.
  • the image capturing module 13 is an image pickup device capable of color photographing.
  • the user interface 120 may be a touch screen, a combination of a touch screen and at least one physical button, a combination of a screen and an input assembly (for example, a keyboard, a mouse, a handwriting pad, or a combination thereof), or an equivalent device.
  • the control unit 210 is connected electrically to the first connecting interface 230 , the material supply module 240 , and the moving module 250 .
  • the coloring tools 260 and 262 are disposed on the moving module 250.
  • the material supply module 240 has at least one pigment.
  • An example of makeup of a human face is taken in the following to exemplarily illustrate the structure of the automatic coloring machine 12 in detail.
  • the three-dimensional object 14 is the face of a user.
  • the automatic coloring machine 12 may further include a table 202 and a face positioning module 220 .
  • the control unit 210 , the face positioning module 220 , and the moving module 250 are disposed on the table 202 .
  • the face positioning module 220 is disposed to be corresponding to the moving module 250 .
  • the face positioning module 220 is provided so that the user's head rests on it, which fixes the position of the face.
  • the face positioning module 220 includes a lower-jaw support 221 and an overhead positioning member 222 .
  • the lower-jaw support 221 is used by the user to place the lower jaw thereon, so as to support the user's head (face).
  • the overhead positioning member 222 is disposed above the lower-jaw support 221 .
  • the overhead positioning member 222 is slightly inverted U-shaped, and an arc-shaped holding portion 223 is formed in an upper middle position corresponding to the forehead.
  • the user may urge the forehead against the holding portion 223 of the overhead positioning member 222, and urge the chin against the lower-jaw support 221, so as to ensure that the face of the user is opposite the moving module 250.
  • the moving module 250 includes a moving block 251 , a lifter 252 , a horizontal rail 253 , and a telescopic platform 254 .
  • the horizontal rail 253 spans and is above the lifter 252 , and by adjusting the lifter 252 , the horizontal rail 253 is enabled to move vertically along a first direction (for example, the Y-axis direction).
  • the telescopic platform 254 is slidably disposed on the horizontal rail 253 , and the telescopic platform 254 can move left and right on the horizontal rail 253 along a second direction (for example, the X-axis direction in the drawing).
  • the moving block 251 is disposed on the telescopic platform 254 , and the moving block 251 can move back and forth on the telescopic platform 254 along a third direction (for example, the Z-axis direction in the drawing). Further, a motor controlled by the control unit 210 drives the moving block 251 , the lifter 252 , and the telescopic platform 254 , so that the moving block 251 can move in a three-dimensional manner accordingly to be precisely positioned.
  • the output and makeup operations of the material supply module 240 are controlled through the control unit 210.
  • the material supply module 240 is disposed on the moving block 251 of the moving module 250 .
  • the material supply module 240 stores various coloring materials.
  • An output port of the material supply module 240 is appropriately connected to each of the coloring tools 260 and 262, and supplies a corresponding pigment to them.
  • each of the coloring tools 260 and 262 may be a spray head, a nozzle, or a coating pen.
  • the material supply module 240 may have a supply cup and an air pressure pipe.
  • the supply cup stores a pigment.
  • the air pressure pipe is connected to an air compressor, provides air flowing to the output port, and can absorb the pigment in the supply cup and spray the pigment out through the output port.
  • the material supply module 240 may be designed with a rotary wheel in which various output ports are disposed, so as to output the pigment to the outside.
  • the output ports are disposed on the circumference of the rotary wheel, and rotating the rotary wheel selects different pigments.
  • the diversified material supply module 240 facilitates automatic coating using different coloring tools 260 and 262 or pigments.
  • a control module 204 may be disposed on the table 202 .
  • the control module 204 has the control unit 210 and the first connecting interface 230 .
  • the first connecting interface 230 receives a coloring procedure from the electronic device 11 in a wireless manner or in a wired manner, and transmits the received coloring procedure to the control unit 210 to sequentially execute each coloring instruction in the coloring procedure.
  • the coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof.
  • Each coloring instruction may include track information represented by two-dimensional coordinates or track information represented by three-dimensional coordinates.
  • the control unit 210 controls the movement of the moving module 250 based on the track information in the currently executed coloring instruction, so that the moving block 251 moves into position.
  • the automatic coloring machine 12 may further include a range finding device 270 .
  • the range finding device 270 is mounted on the moving block 251 of the moving module 250 .
  • the range finding device 270 can measure a position in the third direction to provide a position signal and a calibration signal, so as to convert a two-dimensional image into a three-dimensional image for operation, thereby ensuring that the coloring tools 260 and 262 contact the face of the user safely or keep a safe distance from the face of the user.
  • the control unit 210 controls movement of the moving module 250 based on track information in a currently executed coloring instruction, so as to make the moving block 251 drive the coloring tool to apply a selected pigment to the face of the user. Further, according to the type of the selected coloring tool and the position signal obtained by the range finding device 270 , the control unit 210 controls a distance of movement of the moving module 250 relative to the face, so that the moving block 251 moves the coloring tool to the position for contacting the face of the user safely or a position for keeping a safe distance from the face of the user.
  • the range finding device 270 may be a laser range finder, a tellurometer, an infrared range finder, an image capturing module, or other equivalent range finding devices.
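  • As a hedged illustration of how the position signal from the range finding device 270 might keep a coloring tool at a safe working distance, the sketch below clamps the advance along the third direction; the distances, tool names, and function are assumptions, not taken from the patent.

```python
# Illustrative sketch: clamp the Z-axis (third direction) advance of the moving
# block so the coloring tool either touches the face gently or keeps a safe gap.

SAFE_CONTACT_MM = 0.0   # assumed offset for tools that touch the skin (e.g. coating pen)
SAFE_GAP_MM = 15.0      # assumed stand-off for tools that spray (e.g. nozzle)

def target_z(measured_distance_mm: float, tool_type: str) -> float:
    """Return how far the moving block may advance along Z, given the distance
    to the face reported by the range finding device."""
    stand_off = SAFE_CONTACT_MM if tool_type == "coating pen" else SAFE_GAP_MM
    advance = measured_distance_mm - stand_off
    return max(advance, 0.0)   # never advance past the measured surface

# Example: the range finder reports the face 42 mm away and a nozzle is selected,
# so the block may advance at most 27 mm before spraying.
print(target_z(42.0, "nozzle"))   # -> 27.0
```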
  • the three-dimensional object 14 may be an eye of the user.
  • the aforementioned face positioning module 220 may be an eye mask to enable the eye of the user to correspond to the moving module 250 of the automatic coloring machine 12 .
  • the second connecting interface 130 may be a wireless transceiver module, a Universal Serial Bus (USB) connector, or an External Serial Advanced Technology Attachment (e-SATA) connector.
  • the third connecting interface may be a wireless transceiver module, a USB, or an e-SATA connector.
  • the wireless transceiver module may adopt various wireless communications technologies in the prior art, such as the Bluetooth technology, the Wireless Fidelity (WiFi) technology, and the Near Field Communication (NFC) technology.
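  • The patent only requires that the coloring procedure be delivered from the second connecting interface 130 to the first connecting interface 230 in a wired or wireless manner. As one illustrative possibility, the sketch below pushes a JSON-encoded procedure over a plain TCP socket; the host, port, and encoding are assumptions.

```python
# Illustrative only: the connecting interfaces could equally be USB, e-SATA,
# Bluetooth, WiFi, or NFC. Here a JSON payload is pushed over TCP as a stand-in.
import json
import socket

def send_coloring_procedure(procedure: list, host: str = "192.168.0.50", port: int = 9000) -> None:
    payload = json.dumps(procedure).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(len(payload).to_bytes(4, "big"))  # simple length prefix
        conn.sendall(payload)

# send_coloring_procedure(coloring_procedure)  # procedure built by the electronic device
```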
  • the pigment may be powdery, foamy, gelatinous, or liquid, in any one of these phases or a combination thereof, for example, shining pieces, mist, or another special state.
  • the pigment is, for example, a makeup base material, a concealing material, an eyebrow color material, a cheek color material, a labial makeup material, a decorative color makeup material, basic care material, various colors of inks, or various colors of dyeing materials, which may be mixed arbitrarily.
  • the operation of the automatic coloring system 10 is illustrated below in detail for demonstration. Please refer to FIG. 1 to FIG. 8 , in which the storage unit 140 stores a coloring application.
  • the processing unit 110 executes the coloring application, so as to display a coloring editing window 121 on the user interface 120 (Step S 21 ).
  • the coloring editing window 121 includes an image preview box 122 and a design function bar 124 .
  • the design function bar 124 has an edit option 125 , a return option 126 , a clear option 127 , a complete option 128 , and a file option 129 .
  • the edit option 125 has a tool option 1251 and a color palette option 1252 , as shown in FIG. 9 .
  • the tool option 1251 and the color palette option 1252 may be located on the same level of menu, as shown in FIG. 9 and FIG. 10A . In some embodiments, the tool option 1251 and the color palette option 1252 may be located on different levels of menu, as shown in FIG. 10B .
  • Please refer to FIG. 10B, in which the tool option 1251 has multiple tool pictures A1 and A2, and each tool picture A1 and A2 is connected to a color palette option 1252.
  • when the user selects the tool picture A1, the coloring application provides the color palette option 1252 connected to the tool picture A1 for selection by the user.
  • the color palette option 1252 has multiple color pictures C1 and C2 to be selected by the user.
  • the processing unit 110 may receive an appearance image Pf of a three-dimensional object 14 from the image capturing module 13 , read a stored appearance image Pf from the storage unit 140 , or receive an appearance image Pf from an external electronic device or storage device (Step S 23 ).
  • the appearance image Pf may be a plane simulated image, i.e. the appearance image contains a 2D image of the three-dimensional object 14.
  • the appearance image Pf may be a three-dimensional simulated image, i.e. the appearance image contains a 3D model of the three-dimensional object 14.
  • the processing unit 110 performs feature analysis on the received appearance image Pf, so as to generate an outline image Pp (Step S 25 ).
  • the processing unit 110 may directly read a stored outline image Pp from the storage unit 140 , or receive an outline image Pp from an external electronic device or storage device.
  • the user may use the file option 129 to select an outline image Pp to be displayed in the image preview box 122 .
  • the processing unit 110 then displays the outline image Pp in the image preview box 122 on the user interface 120 (Step S 27 ).
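  • The patent does not specify the feature-analysis algorithm. One plausible way to obtain an outline image Pp from the appearance image Pf is edge detection, sketched below with OpenCV; the threshold values are assumptions.

```python
# Sketch only: generate an outline image Pp from an appearance image Pf by edge
# detection. The patent leaves the feature-analysis method open; Canny edges are
# just one common choice.
import cv2

def outline_from_appearance(appearance_path: str):
    appearance = cv2.imread(appearance_path)               # appearance image Pf
    gray = cv2.cvtColor(appearance, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    outline = cv2.Canny(blurred, 50, 150)                  # outline image Pp
    return outline

# outline = outline_from_appearance("face.jpg")
# cv2.imwrite("outline.png", outline)
```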
  • the user may use the edit option 125 to perform coloring design of the outline image Pp.
  • the user may use the tool option 1251 to select a coloring tool (that is, click a tool picture A1/A2 in the tool option), use the color palette option 1252 to select a color to be used (that is, click a color picture C1/C2 in the color palette option), and use the selected coloring tool and color to perform a coloring action on the outline image Pp in the image preview box 122 (that is, move the mouse to perform simulated coloring on the outline image Pp).
  • For each coloring action performed by the user, the user interface 120 outputs an edit instruction in response to the coloring action (Step S29), so that the coloring application (that is, the processing unit 110) generates a coloring instruction in response to the edit instruction.
  • the coloring instruction includes tool information indicating the coloring tool selected by the user, color information indicating the color selected by the user, and track information indicating a movement track of the coloring action.
  • the track information is formed of multiple consecutive positioning points.
  • the start of the coloring action corresponds to a first positioning point
  • the end of the coloring action corresponds to a last positioning point
  • a movement process of the coloring action corresponds to a second positioning point to a penultimate positioning point sequentially.
  • Each positioning point may be coordinate data.
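  • To make the structure of a coloring instruction concrete, the sketch below assembles one from a single coloring action; the field names and the sampled stroke points are hypothetical.

```python
# Illustrative sketch: build a coloring instruction from one coloring action.
# The positioning points run from the start of the stroke to its end, in order.

def make_coloring_instruction(tool_picture, color_picture, stroke_points):
    """tool_picture / color_picture: the user's selections (e.g. "A1", "C2").
    stroke_points: consecutive (x, y) or (x, y, z) coordinates sampled while the
    user drags the pointer over the outline image."""
    return {
        "tool": tool_picture,          # tool information
        "color": color_picture,        # color information
        "track": list(stroke_points),  # track information: first point = stroke start,
                                       # last point = stroke end
    }

instruction = make_coloring_instruction("A1", "C2", [(120, 85), (124, 88), (130, 94)])
```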
  • the user may click the complete option 128 to make the user interface 120 output a confirm instruction.
  • In response to the confirm instruction, the coloring application (that is, the processing unit 110) sequences the multiple coloring instructions corresponding to the multiple coloring actions according to their generation order, so as to generate a coloring procedure (Step S31), and outputs the generated coloring procedure to the outside or stores it in the storage unit 140 (Step S33).
  • the coloring procedure has multiple coloring instructions, and the coloring instructions are sequenced according to each individual generation order thereof (that is, an order in which the user performs the multiple coloring actions).
  • the coloring application may obtain a colored pattern Pc through the outline image Pp in the image preview box 122 in response to the confirm instruction.
  • the processing unit 110 may store the colored pattern Pc and the coloring procedure corresponding to the colored pattern Pc in the storage unit 140 , so as to form a pattern database.
  • the coloring application may have a pattern database.
  • the pattern database is stored in the storage unit 140 .
  • the pattern database has one or more colored patterns Pc that are edited and stored in advance, and each colored pattern Pc has a corresponding coloring procedure Sp. Therefore, during next time of use, the user may directly use the file option 129 to select a colored pattern Pc to be used from the pattern database, and display the colored pattern Pc in the image preview box 122 , which is for confirmation by the user.
  • the example in which the user edits and designs the colored pattern Pc is provided above, but the present invention is not limited thereto. That is to say, please refer to FIG. 11, in which after the user captures the appearance image Pf of the three-dimensional object 14 (such as the face, an eye, or another object) to be colored, the appearance image Pf or the outline image Pp may be transmitted to another electronic device 11′ through the second connecting interface 130 in a wireless manner, in a wired manner, or in another far-end transmission manner.
  • the second connecting interface 130 may be a telecommunication module, so as to transmit the appearance image Pf as a multimedia message (MMS).
  • a designer may perform coloring design on the outline image Pp through the electronic device 11 ′, that is, Step S 25 to Step S 31 or Step S 27 to Step S 31 .
  • the colored pattern Pc and the corresponding coloring procedure Sp are transmitted back to the electronic device 11 of the user through the second connecting interface 130 in a wireless manner or in a wired manner (Step S33).
  • the edit option 125 may further have a template option 1253 .
  • the template option 1253 has multiple template patterns E1 and E2.
  • each of the template patterns E1 and E2 is a colored pattern Pc that is edited and stored in the pattern database in advance. That is to say, each of the template patterns E1 and E2 has a respective corresponding coloring procedure Sp, and the coloring procedure Sp is already stored in the pattern database correspondingly in advance.
  • each edited colored pattern Pc and the corresponding coloring procedure thereof may be optionally stored as a template, so as to become an option in the template option 1253 .
  • the colored pattern Pc may act as a template pattern.
  • each of the template patterns E1 and E2 is a colored pattern Pc, but is not limited to being represented (shown) as the outline image Pp of the three-dimensional object 14.
  • each of the template patterns E1 and E2 presents a result of coloring design.
  • the user interface 120 may output an edit instruction corresponding to the template pattern E1 in response to a select operation of the user, so that the coloring application reads the coloring procedure Sp corresponding to the template pattern E1 from the pattern database in response to the edit instruction (Step S 31 ), and outputs the read coloring procedure Sp when the user clicks the complete option 128 (Step S 33 ).
  • the coloring procedure is generated in a script form.
  • the coloring procedure in the script form is, for example, as follows:
  • <coloring interface skin> (that is, the type of the three-dimensional object 14 to be colored)
  • <coloring pigment No. 1 spray material> (that is, the type of the coloring pigment)
  • <color spray fineness A> (that is, the type of the coloring tool)
  • <color spray color red> (that is, the color of the coloring pigment)
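  • The patent shows the script form by example only and does not define its grammar. The sketch below reads such bracketed lines into field/value pairs under an assumed convention (split on the last token), purely for illustration.

```python
# Sketch only: the patent presents the coloring procedure "in a script form" as
# lines such as <coloring interface skin>, without defining a grammar. Here each
# bracketed line is split into a field name and a value on the last token, purely
# as an illustrative convention.

def parse_script(script_text: str) -> list:
    entries = []
    for line in script_text.splitlines():
        line = line.strip()
        if not (line.startswith("<") and line.endswith(">")):
            continue  # ignore anything that is not a bracketed script line
        tokens = line[1:-1].split()
        entries.append({"field": " ".join(tokens[:-1]), "value": tokens[-1]})
    return entries

script = """<coloring interface skin>
<color spray fineness A>
<color spray color red>"""
print(parse_script(script))
# [{'field': 'coloring interface', 'value': 'skin'}, ...]
```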
  • the outline image Pp displayed in the image preview box 122 may be a plane simulated image. In other words, it presents a 2D image of the three-dimensional object 14.
  • the outline image Pp displayed in the image preview box 122 may be a three-dimensional simulated image.
  • Implementation of the three-dimensional simulated image is well known by persons skilled in the art, and is not repeated herein.
  • each coloring action correspondingly generates track information represented by three-dimensional coordinates.
  • each positioning point in the track information is three-dimensional coordinate data.
  • the user may connect the electronic device 11 to the automatic coloring machine 12 , that is, electrically connect the first connecting interface 230 of the automatic coloring machine 12 to the second connecting interface 130 of the electronic device 11 (Step S 41 ).
  • the user may operate the electronic device 11 , so that the processing unit 110 outputs a coloring procedure to the automatic coloring machine 12 .
  • the automatic coloring machine 12 receives through the first connecting interface 230 the coloring procedure transmitted in a wireless manner or in a wired manner (for example, through the connecting cable 15 or a physical connector) (Step S 43 ).
  • the automatic coloring machine 12 performs makeup on the face of the user based on the coloring procedure.
  • the control unit 210 of the automatic coloring machine 12 sequentially executes each coloring instruction in the coloring procedure.
  • the control unit 210 controls, according to the executed coloring instruction, the material supply module 240 to select a pigment corresponding to color information in the coloring instruction (Step S 45 ), and controls, according to the executed coloring instruction, the moving module 250 to select a coloring tool corresponding to tool information in the coloring instruction (Step S 47 ).
  • the execution order of Step S 45 and Step S 47 is not limited by the present invention. That is to say, besides sequential execution of Step S 45 and Step S 47 , Step S 45 and Step S 47 may be executed at the same time, or Step S 47 is executed before Step S 45 .
  • the color indicated by the color information may be one of multiple pigments included in the material supply module 240. Further, the color indicated by the color information may not be among the multiple pigments included in the material supply module 240. In this case, the material supply module 240 may select, according to the color indicated by the color information, two or more pigments from the multiple pigments to obtain the needed pigment (that is, the color indicated by the color information) by mixing.
  • the automatic coloring machine 12 may have a storage unit, and a color database is established in the storage unit.
  • the color database has multiple colors and corresponding mixing methods (for example, pigments and proportions thereof required for mixing).
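  • A small sketch of how such a color database might map a requested color to stored pigments and mixing proportions follows; the colors, pigment names, and ratios are invented for illustration.

```python
# Illustrative color database: each entry maps a requested color either to a single
# stored pigment or to a mix (pigments plus proportions). Values are invented.
COLOR_DATABASE = {
    "red":    {"pigments": {"red": 1.0}},
    "orange": {"pigments": {"red": 0.5, "yellow": 0.5}},
    "peach":  {"pigments": {"red": 0.2, "yellow": 0.3, "white": 0.5}},
}

def select_pigments(color_name: str) -> dict:
    """Return the pigments (and proportions) the material supply module should use."""
    entry = COLOR_DATABASE.get(color_name)
    if entry is None:
        raise KeyError(f"no mixing method stored for color {color_name!r}")
    return entry["pigments"]

print(select_pigments("peach"))  # {'red': 0.2, 'yellow': 0.3, 'white': 0.5}
```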
  • the control unit 210 moves the moving module 250 according to the track information in the executed coloring instruction, and applies the selected pigment to the face of the user (the three-dimensional object 14) through the selected coloring tool (Step S49).
  • the control unit 210 moves the moving module 250 in the first direction and in the second direction according to each positioning point in the track information, so as to move the moving module 250 to a corresponding designated position. Further, during movement to each positioning point or upon arriving at the designated position, the control unit 210 receives a position signal from the range finding device 270 to control movement of the moving module 250 relative to the face (that is, the movement in the third direction), so that the coloring tool is positioned in a position capable of applying the pigment to the face of the user safely.
  • the first direction, the second direction, and the third direction are the Y-axis, the X-axis, and the Z-axis of a movement coordinate system of the moving module 250 respectively.
  • After a coloring instruction is executed (that is, after a coloring action is completed), the control unit 210 continues to execute the next coloring instruction, until all coloring instructions are executed (Step S51).
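  • Putting Steps S45 to S51 together, the control flow of the control unit 210 can be sketched as below; the hardware-facing helpers (select_pigment, select_tool, move_xy, approach_z, read_range_finder) are hypothetical placeholders, since the patent describes these operations only functionally.

```python
# Sketch of the execution loop (Steps S45-S51): for each coloring instruction,
# pick the pigment and tool, then follow the track point by point, using the
# range finder to control the approach along the third direction.
# The callables passed in are hypothetical hardware wrappers, not APIs defined
# in the patent.

def execute_procedure(procedure, select_pigment, select_tool, move_xy, approach_z, read_range_finder):
    for instruction in procedure:                    # sequential, in generation order
        select_pigment(instruction["color"])         # Step S45
        select_tool(instruction["tool"])             # Step S47
        for point in instruction["track"]:           # Step S49
            x, y = point[0], point[1]
            move_xy(x, y)                            # first and second directions
            distance = read_range_finder()
            approach_z(distance)                     # third direction, kept safe
        # Step S51: continue with the next instruction until all are executed
```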
  • the coloring application has a coordinate system conversion step, so that an image coordinate system of the outline image Pp corresponds to the movement coordinate system of the moving module 250 .
  • the coordinate system conversion step may use features or edges obtained in the feature analysis step (Step S 25 ) as corresponding points, so that the image coordinate system of the outline image Pp corresponds to the movement coordinate system of the moving module 250 . That is to say, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250 .
  • the coordinate system conversion step may be implemented by using a scaling object with known actual size.
  • when the image capturing module 13 captures the appearance image Pf, the scaling object is within the capture coverage at the same time, so the captured appearance image Pf includes the image of the three-dimensional object 14 and the image of the scaling object.
  • from the known actual size of the scaling object and its size in the captured image, the scale between the image coordinate system of the outline image Pp and the movement coordinate system of the moving module 250 is calculated. Then, by using the features or edges obtained in the feature analysis step (Step S25) as the corresponding points and by using the calculated scale, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250.
  • the coordinate system conversion step may be implemented by using camera parameters (such as a focal length of the lens and an image format) of the image capturing module 13 and specifications of the screen in the user interface 120.
  • the coloring application may calculate a scale between the image size of the outline image Pp and the actual size of the three-dimensional object 14 according to the camera parameters of the image capturing module 13 and the specifications of the screen in the user interface 120. Then, by using the features or edges obtained in the feature analysis step (Step S25) as the corresponding points and by using the calculated scale, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250.
  • the coloring action applied to the outline image Pp on the user interface 120 can enable the coloring application to generate corresponding track information based on the movement coordinate system of the moving module 250 .
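  • A compact sketch of the scaling-object variant of the coordinate system conversion follows: from the known real size of the scaling object, compute a millimetres-per-pixel scale and map an image-coordinate point into the movement coordinate system. The numbers and the simple linear mapping are assumptions.

```python
# Illustrative coordinate-system conversion using a scaling object of known size.
# Assumes a simple linear relation between image pixels and machine millimetres,
# anchored at one corresponding feature point; real calibration may be richer.

def compute_scale(object_size_mm: float, object_size_px: float) -> float:
    """Millimetres per pixel, from a scaling object of known actual size."""
    return object_size_mm / object_size_px

def image_to_machine(point_px, anchor_px, anchor_mm, scale):
    """Map an (x, y) point of the outline image into the moving module's coordinates,
    using one corresponding feature point (anchor) found in the feature analysis."""
    dx = (point_px[0] - anchor_px[0]) * scale
    dy = (point_px[1] - anchor_px[1]) * scale
    return (anchor_mm[0] + dx, anchor_mm[1] + dy)

scale = compute_scale(object_size_mm=50.0, object_size_px=200.0)   # 0.25 mm per pixel
print(image_to_machine((320, 180), anchor_px=(300, 160), anchor_mm=(0.0, 0.0), scale=scale))
# -> (5.0, 5.0)
```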
  • the coloring application may be implemented by a computer program product, so that after a computer (that is, the electronic device) is loaded with the coloring application and executes the coloring application, the automatic coloring method according to any embodiment of the present invention may be performed.
  • the computer program product may be a readable recording medium, and the coloring application is stored in the readable recording medium to be loaded into a computer.
  • the coloring application may be a computer program product, and transmitted to the computer in a wired manner or wireless manner.
  • the automatic coloring system and method according to the present invention have the coloring design process (executed by the electronic device), and the actual coloring process (executed by the automatic coloring machine), that are separable, so that the user can design and exchange a colored pattern anytime anywhere.
  • the automatic coloring system and method according to the present invention enable an external device to provide a coloring procedure to be directly executed by the automatic coloring machine, thereby facilitating simplification of the structure of the automatic coloring machine.
  • the control unit of the automatic coloring machine does not need to have a powerful processing function, and may be implemented by, for example, a microcontroller, or the automatic coloring machine does not need to be provided with the image capturing module.
  • the automatic coloring machine can execute the actual coloring process more precisely. In some embodiments, by directly presenting a three-dimensional simulated image, the coloring action in the coloring design process is closer to that in the actual coloring process.

Landscapes

  • Processing Or Creating Images (AREA)
  • Control Of El Displays (AREA)

Abstract

An automatic coloring system and method are used for coloring a three-dimensional object. An electronic device executes a coloring design process to obtain a coloring procedure corresponding to the three-dimensional object. The coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof. A connecting interface electrically connects the electronic device to an automatic coloring machine in a separable manner, so as to output the coloring procedure from the electronic device to the automatic coloring machine. Finally, the automatic coloring machine directly executes the coloring instructions in the coloring procedure sequentially.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This non-provisional application claims priority under 35 U.S.C. §119(a) on Patent Application No. 101146207 filed in Taiwan, R.O.C. on 2012 Dec. 7, the entire contents of which are hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Technical Field
  • The present invention relates to a coloring technology, and more particularly to an automatic coloring system and method for coloring a three-dimensional object.
  • 2. Related Art
  • Wanting to be beautiful is a natural human desire, so major manufacturers provide the market with a wide variety of care products and cosmetics for consumers to purchase. However, in order to compose makeup that a person likes and that suits the person, makeup techniques must be practiced repeatedly, and various cosmetics and makeup tools must be purchased, so as to draw various eyebrow shapes, eye lines, eyelashes, eye contours, face makeup, labial makeup, appearance modifications, and color changes. The difference in proficiency in makeup techniques and the wide range of cosmetics usually result in a difference between the effect of the makeup and the effect expected by the consumer.
  • As information technology continues to evolve, some research has provided a simulation device for trying color makeup or a care product. Through such a simulation device, a user may simulate an effect of makeup on a screen before purchase instead of trying a color makeup product in person; see, for example, US Patent Publication No. 2005/0135675A1. However, simulating the effect of the color makeup on the screen still depends on manual makeup skills to apply the color makeup on the human face, and the real effect of manual makeup performed by the user is not necessarily equal to the effect presented by the simulation on the screen.
  • SUMMARY
  • In an embodiment, an automatic coloring system is used for coloring a three-dimensional object. The automatic coloring system includes an automatic coloring machine, and the automatic coloring machine includes a first connecting interface, a material supply module, a moving module, at least one coloring tool, and a control unit. The material supply module has at least one pigment. The coloring tool is disposed on the moving module. The control unit is connected electrically to the first connecting interface, the material supply module, and the moving module.
  • The first connecting interface is used for receiving a coloring procedure in a wireless manner or in a wired manner. The coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof. The control unit sequentially executes the coloring instructions in the coloring procedure, and according to the executed coloring instruction controls the material supply module to select at least one pigment and controls the moving module to move one coloring tool to apply the selected pigment to the three-dimensional object.
  • In some embodiments, the automatic coloring system may further include an electronic device, and the electronic device includes a processing unit, a user interface, and a second connecting interface. The processing unit is connected electrically to the user interface and the second connecting interface. The processing unit is used for receiving an appearance image of the three-dimensional object, and generating an outline image through feature analysis of the appearance image. The user interface is used for displaying the outline image, and sequentially outputting at least one edit instruction corresponding to the outline image, so that the processing unit obtains the coloring procedure in response to the edit instruction. The second connecting interface then outputs the coloring procedure to the first connecting interface in a wireless manner or in a wired manner.
  • In some embodiments, the automatic coloring system may further include an image capturing module, and the image capturing module is used for capturing the appearance image of the three-dimensional object. The electronic device, the automatic coloring machine, and the image capturing module may be devices capable of being separated from each other. Alternatively, the image capturing module is built in the electronic device or in the automatic coloring machine.
  • In an embodiment, an automatic coloring method includes receiving an appearance image of a three-dimensional object; generating an outline image through feature analysis on the appearance image; displaying the outline image on a user interface; using the user interface to sequentially output at least one edit instruction corresponding to the outline image; in response to the at least one edit instruction, obtaining a coloring procedure; and outputting the obtained coloring procedure to an automatic coloring machine in a wireless manner or in a wired manner. The coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof.
  • In some embodiments, the automatic coloring method may further include sequentially executing the coloring instructions in the coloring procedure. An execution step of each coloring instruction includes: according to the executed coloring instruction, controlling the material supply module of the automatic coloring machine to select at least one pigment; and according to the executed coloring instruction, controlling the moving module of the automatic coloring machine to move a coloring tool, so as to apply the selected pigment to the three-dimensional object.
  • In some embodiments, each coloring instruction includes track information represented by two-dimensional coordinates or represented by three-dimensional coordinates.
  • In some embodiments, the outline image may be a three-dimensional simulated image.
  • In view of the above, the automatic coloring system and method according to the present invention are used for coloring a three-dimensional object. Herein, the electronic device executes a coloring design process to obtain a coloring procedure corresponding to the three-dimensional object. The coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof. Through separable electrical connection of the connecting interface, the coloring procedure is output from the electronic device to the automatic coloring machine. Then, the automatic coloring machine directly executes the coloring instructions in the coloring procedure sequentially. In other words, the automatic coloring system and method according to the present invention have the coloring design process (executed by the electronic device), and the actual coloring process (executed by the automatic coloring machine), that are separable, so that the user can design and exchange a colored pattern anytime anywhere. Further, the automatic coloring system and method according to the present invention enable an external device to provide a coloring procedure to be directly executed by the automatic coloring machine, thereby facilitating simplification of the structure of the automatic coloring machine. In some embodiments, by directly providing track information represented by three-dimensional coordinates, the automatic coloring machine can execute the actual coloring process more precisely. In some embodiments, by directly presenting a three-dimensional simulated image, the coloring action in the coloring design process is closer to that in the actual coloring process.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will become more fully understood from the detailed description given herein below for illustration only, and thus not limitative of the present invention, wherein:
  • FIG. 1 is a schematic block diagram of an automatic coloring system according to a first embodiment of the present invention;
  • FIG. 2 is a schematic block diagram of an electronic device according to a first embodiment of the present invention;
  • FIG. 3 is a schematic block diagram of an electronic device according to a first embodiment of the present invention;
  • FIG. 4 is a schematic block diagram of an automatic coloring machine according to a first embodiment of the present invention;
  • FIG. 5 is a schematic view of an automatic coloring system according to a second embodiment of the present invention;
  • FIG. 6 is a schematic view of an automatic coloring system according to a third embodiment of the present invention;
  • FIG. 7 is a flow chart of an automatic coloring method according to a first embodiment of the present invention;
  • FIG. 8 is a schematic view of a user interface according to an embodiment;
  • FIG. 9 is a schematic view of a tool option according to an embodiment;
  • FIG. 10A is a schematic view of a color palette option according to a first embodiment;
  • FIG. 10B is a schematic view of a color palette option according to a second embodiment;
  • FIG. 11 is a schematic view of an automatic coloring system according to a fourth embodiment of the present invention; and
  • FIG. 12 is a schematic view of a template option according to an embodiment; and
  • FIG. 13 is a flow chart of an automatic coloring method according to a second embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Terms such as “first”, “second”, and “third” in the following description are used for distinguishing elements, not used for sequencing or limiting differences between the elements, and not used for limiting the scope of the present invention.
  • Please refer to FIG. 1 to FIG. 4, an automatic coloring system 10 includes an electronic device 11 and an automatic coloring machine 12. The electronic device 11 may output a coloring procedure corresponding to a colored pattern of a three-dimensional object 14 to the automatic coloring machine 12, and the automatic coloring machine 12 colors the three-dimensional object 14 by executing the coloring procedure. Herein, the electronic device 11 may be a device capable of executing an application or an equivalent device thereof, such as a portable electronic device or a personal computer. The portable electronic device may be a smart phone, a notebook computer, a tablet computer, or other equivalent device. The three-dimensional object 14 may be a human body, a specific part of a human body (such as the face, an eye, and a nail), or an article (such as a mask and a cup).
  • The automatic coloring system 10 further includes an image capturing module 13. The image capturing module 13 is used for capturing an appearance image Pf of the three-dimensional object 14. In some embodiments, the electronic device 11, the automatic coloring machine 12, and the image capturing module 13 may be devices capable of being separated from each other; the separable image capturing module 13 is, for example, a digital camera or a webcam. Preferably, the image capturing module 13 is an image pickup device capable of color photographing. In some embodiments, the image capturing module 13 may be built in the electronic device 11 (as shown in FIG. 2) or in the automatic coloring machine 12 (as shown in FIG. 4).
  • Please refer to FIG. 2 and FIG. 3, in which the electronic device 11 includes a processing unit 110, a user interface 120, a connecting interface 130, and a storage unit 140.
  • Please refer to FIG. 4, in which the automatic coloring machine 12 includes a control unit 210, a connecting interface 230, a material supply module 240, a moving module 250, and at least one coloring tool 260 and 262.
  • To make the description clear, in the following, the connecting interface 230 of the automatic coloring machine 12 is referred to as the first connecting interface 230, and the connecting interface 130 of the electronic device 11 is referred to as the second connecting interface 130.
  • Please refer to FIG. 2 and FIG. 3, in which the processing unit 110 is connected electrically to the user interface 120, the second connecting interface 130, and the storage unit 140. The second connecting interface 130 is used for being connected electrically to the first connecting interface 230 of the automatic coloring machine 12 in a wireless manner or in a wired manner. The electrical connection in the wired manner may be direct connection (for example, the first connecting interface 230 and the second connecting interface 130 are a male connector and a female connector, which are physical connectors, respectively), or indirect connection (for example, through a connecting cable 15 or an equivalent device thereof).
  • In some embodiments, please refer to FIG. 2, in which the electronic device 11 may have the built-in image capturing module 13, and the image capturing module 13 is connected electrically to the processing unit 110, as shown in FIG. 2. The appearance image Pf captured by the image capturing module 13 may be transmitted to the processing unit 110, or may be stored in the storage unit 140 in advance.
  • In some embodiments, please refer to FIG. 3, in which the electronic device 11 may further include another connecting interface 132. To make the description clear, in the following, the connecting interface 132 is referred to as the third connecting interface 132.
  • The third connecting interface 132 is connected electrically to the processing unit 110. The image capturing module 13 outside the electronic device 11 is connected to the third connecting interface 132 in a wireless manner, in a directly connected manner, or through a connecting cable, so that the image capturing module 13 is connected electrically to the processing unit 110 through the third connecting interface 132, as shown in FIG. 3. In this case, the appearance image Pf captured by the image capturing module 13 may be transmitted to the processing unit 110 through the third connecting interface 132. Herein, the image capturing module 13 may be a Charge Coupled Device (CCD) element, a Complementary Metal Oxide Semiconductor (CMOS) element, or other equivalent element. Preferably, the image capturing module 13 is an image pickup device capable of color photographing.
  • In some embodiments, the user interface 120 may be a touch screen, a combination of a touch screen and at least one physical button, a combination of a screen and an input assembly (for example, a keyboard, a mouse, a handwriting pad, or a combination thereof), or an equivalent device.
  • In the automatic coloring machine 12, please refer to FIG. 4, in which the control unit 210 is connected electrically to the first connecting interface 230, the material supply module 240, and the moving module 250. The coloring tools 260 and 262 are disposed on the moving module 250. The material supply module 240 has at least one pigment.
  • An example of makeup of a human face is taken in the following to exemplarily illustrate the structure of the automatic coloring machine 12 in detail. In other words, in this example, the three-dimensional object 14 is the face of a user.
  • Please refer to FIG. 5, in which the automatic coloring machine 12 may further include a table 202 and a face positioning module 220. The control unit 210, the face positioning module 220, and the moving module 250 are disposed on the table 202.
  • The face positioning module 220 is disposed corresponding to the moving module 250. The face positioning module 220 is provided so that the head of the user can rest on it, thereby fixing the position of the face.
  • The face positioning module 220 includes a lower-jaw support 221 and an overhead positioning member 222. The lower-jaw support 221 is used by the user to place the lower jaw thereof, so as to support the head (face) of the user. The overhead positioning member 222 is disposed above the lower-jaw support 221. Herein, the overhead positioning member 222 is slightly inverted U-shaped, and an arc-shaped holding portion 223 is formed in an upper middle position corresponding to the forehead. During use, the user may urge the forehead thereof against the holding portion 223 of the overhead positioning member 222, and urge the chin against the lower-jaw support 221, so as to ensure that the face of the user is opposite to the moving module 250.
  • The moving module 250 includes a moving block 251, a lifter 252, a horizontal rail 253, and a telescopic platform 254. The horizontal rail 253 spans above the lifter 252, and by adjusting the lifter 252, the horizontal rail 253 can be moved vertically along a first direction (for example, the Y-axis direction). The telescopic platform 254 is slidably disposed on the horizontal rail 253, and the telescopic platform 254 can move left and right on the horizontal rail 253 along a second direction (for example, the X-axis direction in the drawing). The moving block 251 is disposed on the telescopic platform 254, and the moving block 251 can move back and forth on the telescopic platform 254 along a third direction (for example, the Z-axis direction in the drawing). Further, a motor controlled by the control unit 210 drives the moving block 251, the lifter 252, and the telescopic platform 254, so that the moving block 251 can move three-dimensionally and be precisely positioned.
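  • As a hedged illustration of the three motor-driven axes just described, the following minimal Python sketch models the moving block's position; the class and method names are assumptions made for explanation and do not come from the patent.
      class MovingModule:
          """Illustrative model of the axes described above: the lifter sets the
          first (Y) direction, the telescopic platform the second (X) direction,
          and the moving block the third (Z) direction."""

          def __init__(self):
              self.x = 0.0  # second direction: left/right along the horizontal rail
              self.y = 0.0  # first direction: vertical, set by the lifter
              self.z = 0.0  # third direction: back/forth on the telescopic platform

          def move_to(self, x, y, z):
              # In the real machine each axis is driven by a motor under the
              # control unit 210; here the target position is simply recorded.
              self.x, self.y, self.z = x, y, z
              return (self.x, self.y, self.z)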
  • In this embodiment, the material supply module 240 controls output and makeup operations thereof through the control unit 210. The material supply module 240 is disposed on the moving block 251 of the moving module 250. The material supply module 240 stores various coloring materials. An output port of the material supply module 240 is appropriately connected to each of the coloring tools 260 and 262, and supplies a corresponding pigment to the coloring tools 260 and 262. Each of the coloring tools 260 and 262 may be a spray head, a nozzle, or a coating pen.
  • When the coloring tool 260 is a nozzle, the material supply module 240 may have a supply cup and an air pressure pipe. The supply cup stores a pigment. The air pressure pipe is connected to an air compressor, provides air flowing to the output port, and can absorb the pigment in the supply cup and spray the pigment out through the output port.
  • When the coloring tool 262 is a coating pen, the material supply module 240 may be designed to have a rotary wheel in which various output ports are disposed, so as to output the pigment to the outside. The output ports are disposed on the circumference of the rotary wheel, and rotating the rotary wheel switches among the different pigments.
  • The diversified material supply module 240 facilitates automatic coating using different coloring tools 260 and 262 or pigments.
  • A control module 204 may be disposed on the table 202. The control module 204 has the control unit 210 and the first connecting interface 230.
  • The first connecting interface 230 receives a coloring procedure from the electronic device 11 in a wireless manner or in a wired manner, and transmits the received coloring procedure to the control unit 210 to sequentially execute each coloring instruction in the coloring procedure. In other words, the coloring procedure has multiple coloring instructions sequenced according to each individual generation order thereof. Each coloring instruction may include track information represented by two-dimensional coordinates or track information represented by three-dimensional coordinates.
  • In some embodiments, when each coloring instruction includes track information represented by three-dimensional coordinates, the control unit 210 controls, based on track information in a currently executed coloring instruction, movement of the moving module 250, to make the moving block 251 move to be positioned.
  • In some embodiments, when each coloring instruction includes track information represented by two-dimensional coordinates, the automatic coloring machine 12 may further include a range finding device 270. The range finding device 270 is mounted on the moving block 251 of the moving module 250. The range finding device 270 can measure a position in the third direction to provide a position signal and a calibration signal, so as to convert a two-dimensional image into a three-dimensional image for operation, thereby ensuring that the coloring tool 260 or 262 contacts the face of the user safely or keeps a safe distance from the face of the user.
  • The control unit 210 controls movement of the moving module 250 based on track information in a currently executed coloring instruction, so as to make the moving block 251 drive the coloring tool to apply a selected pigment to the face of the user. Further, according to the type of the selected coloring tool and the position signal obtained by the range finding device 270, the control unit 210 controls a distance of movement of the moving module 250 relative to the face, so that the moving block 251 moves the coloring tool to the position for contacting the face of the user safely or a position for keeping a safe distance from the face of the user.
  • In some embodiments, the range finding device 270 may be a laser range finder, a tellurometer, an infrared range finder, an image capturing module, or other equivalent range finding devices.
  • In some embodiments, the three-dimensional object 14 may be an eye of the user.
  • Please refer to FIG. 6, in which for the automatic coloring machine 12 dedicated to the eye, the aforementioned face positioning module 220 may be an eye mask to enable the eye of the user to correspond to the moving module 250 of the automatic coloring machine 12.
  • In some embodiments, the second connecting interface 130 may be a wireless transceiver module, a Universal Serial Bus (USB) connector, or an External Serial Advanced Technology Attachment (e-SATA) connector. The third connecting interface 132 may likewise be a wireless transceiver module, a USB connector, or an e-SATA connector.
  • The wireless transceiver module may adopt various wireless communications technologies in the prior art, such as the Bluetooth technology, the Wireless Fidelity (WiFi) technology, and the Near Field Communication (NFC) technology.
  • In some embodiments, the pigment may be powdery, foamy, gelatinous, or in a liquid state, and may be in any one of the three phases of matter or a combination thereof, for example, shining pieces, mist, or another special state. The pigment is, for example, a makeup base material, a concealing material, an eyebrow color material, a cheek color material, a labial makeup material, a decorative color makeup material, a basic care material, various colors of inks, or various colors of dyeing materials, which may be mixed arbitrarily.
  • The operation of the automatic coloring system 10 is illustrated below in detail for demonstration. Please refer to FIG. 1 to FIG. 8, in which the storage unit 140 stores a coloring application.
  • The processing unit 110 executes the coloring application, so as to display a coloring editing window 121 on the user interface 120 (Step S21). The coloring editing window 121 includes an image preview box 122 and a design function bar 124. The design function bar 124 has an edit option 125, a return option 126, a clear option 127, a complete option 128, and a file option 129. The edit option 125 has a tool option 1251 and a color palette option 1252, as shown in FIG. 9.
  • In some embodiments, the tool option 1251 and the color palette option 1252 may be located on the same level of menu, as shown in FIG. 9 and FIG. 10A. In some embodiments, the tool option 1251 and the color palette option 1252 may be located on different levels of menu, as shown in FIG. 10B. For example, please refer to FIG. 10B, in which the tool option 1251 has multiple tool pictures A1 and A2, and each tool picture A1 and A2 is connected to a color palette option 1252. When a tool picture A1 is selected, the coloring application provides the color palette option 1252 connected to the tool picture A1, for selection by the user. The color palette option 1252 has multiple color pictures C1 and C2 to be selected by the user.
  • The processing unit 110 may receive an appearance image Pf of a three-dimensional object 14 from the image capturing module 13, read a stored appearance image Pf from the storage unit 140, or receive an appearance image Pf from an external electronic device or storage device (Step S23). In some embodiments, the appearance image Pf may be a plane simulated image, i.e., the appearance image contains a 2D image of the three-dimensional object 14. In some embodiments, the appearance image Pf may be a three-dimensional simulated image, i.e., the appearance image contains a 3D model of the three-dimensional object 14.
  • Then, the processing unit 110 performs feature analysis on the received appearance image Pf, so as to generate an outline image Pp (Step S25). In some embodiments, the processing unit 110 may directly read a stored outline image Pp from the storage unit 140, or receive an outline image Pp from an external electronic device or storage device. For example, the user may use the file option 129 to select an outline image Pp to be displayed in the image preview box 122.
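  • The patent does not prescribe a particular feature-analysis algorithm for Step S25. Purely as an illustration, an outline image could be obtained from the appearance image with ordinary edge detection, for example using OpenCV as in the sketch below; the thresholds are arbitrary assumptions.
      import cv2

      def generate_outline_image(appearance_image_path):
          """Hedged sketch of Step S25: derive an outline image Pp from an
          appearance image Pf. Canny edge detection is only one possible
          feature-analysis method; the patent does not specify one."""
          appearance = cv2.imread(appearance_image_path)  # Pf
          gray = cv2.cvtColor(appearance, cv2.COLOR_BGR2GRAY)
          outline = cv2.Canny(gray, 50, 150)              # Pp as an edge map
          return outline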
  • The processing unit 110 then displays the outline image Pp in the image preview box 122 on the user interface 120 (Step S27).
  • At the moment, the user may use the edit option 125 to perform coloring design of the outline image Pp.
  • In the process of coloring design, the user may use the tool option 1251 to select a coloring tool to be used (that is, click a tool picture A1/A2 in the tool option), use the color palette option 1252 to select a color to be used (that is, click a color picture C1/C2 in the color palette option), and use the selected coloring tool and color to perform a coloring action on the outline image Pp in the image preview box 122 (that is, move the mouse to perform simulated coloring on the outline image Pp), to apply the selected color to the outline image Pp.
  • For each coloring action performed by the user, the user interface 120 outputs an edit instruction in response to the coloring action of the user (Step S29), so that the coloring application (that is, the processing unit 110) generates a coloring instruction in response to the edit instruction. The coloring instruction includes tool information indicating the coloring tool selected by the user, color information indicating the color selected by the user, and track information indicating a movement track of the coloring action.
  • In some embodiments, the track information is formed of multiple consecutive positioning points. Herein, the start of the coloring action corresponds to a first positioning point, the end of the coloring action corresponds to a last positioning point, and a movement process of the coloring action corresponds to a second positioning point to a penultimate positioning point sequentially. Each positioning point may be coordinate data.
  • After the user performs multiple coloring actions, the user may click the complete option 128 to make the user interface 120 output a confirm instruction. At the moment, the coloring application (that is, the processing unit 110), in response to the confirm instruction, sequences the multiple coloring instructions corresponding to the multiple coloring actions according to a generation order, so as to generate a coloring procedure (Step S31), and output the generated coloring procedure to the outside or store the generated coloring procedure in the storage unit 140 (Step S33). In other words, the coloring procedure has multiple coloring instructions, and the coloring instructions are sequenced according to each individual generation order thereof (that is, an order in which the user performs the multiple coloring actions).
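  • To make the relationship among coloring actions, coloring instructions, and the coloring procedure concrete, the following sketch models them as simple Python data structures; the field names are assumptions chosen to mirror the tool information, color information, and track information described above.
      from dataclasses import dataclass, field
      from typing import List, Tuple

      @dataclass
      class ColoringInstruction:
          tool: str                         # tool information (selected coloring tool)
          color: str                        # color information (selected color)
          track: List[Tuple[float, float]]  # track information: consecutive positioning points

      @dataclass
      class ColoringProcedure:
          instructions: List[ColoringInstruction] = field(default_factory=list)

          def add(self, instruction: ColoringInstruction) -> None:
              # Instructions are kept in generation order, i.e. the order in
              # which the user performed the coloring actions (Step S31).
              self.instructions.append(instruction)

      # Example: one coloring action recorded as one coloring instruction.
      procedure = ColoringProcedure()
      procedure.add(ColoringInstruction("spray", "red", [(10, 20), (12, 22), (15, 25)]))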
  • Further, the coloring application (that is, the processing unit 110) may obtain a colored pattern Pc through the outline image Pp in the image preview box 122 in response to the confirm instruction. In some embodiments, the processing unit 110 may store the colored pattern Pc and the coloring procedure corresponding to the colored pattern Pc in the storage unit 140, so as to form a pattern database. In other words, the coloring application may have a pattern database. The pattern database is stored in the storage unit 140. The pattern database has one or more colored patterns Pc that are edited and stored in advance, and each colored pattern Pc has a corresponding coloring procedure Sp. Therefore, during the next use, the user may directly use the file option 129 to select a colored pattern Pc to be used from the pattern database and display it in the image preview box 122 for confirmation by the user.
  • Herein, the example in which the user edits and designs the colored pattern Pc is provided, but the present invention is not limited thereto. That is to say, please refer to FIG. 11, in which after the user captures the appearance image Pf of the three-dimensional object 14 (such as the face, an eye, or another object) to be colored, the appearance image Pf or the outline image Pp may be transmitted to another electronic device 11′ through the second connecting interface 130 in a wireless manner, in a wired manner, or in another far-end transmission manner. Herein, the second connecting interface 130 may be a telecommunication module, so as to transmit the appearance image Pf as a multimedia message (MMS).
  • A designer may perform coloring design on the outline image Pp through the electronic device 11′, that is, Step S25 to Step S31 or Step S27 to Step S31. Herein, after the design is completed, the colored pattern Pc and the corresponding coloring procedure Sp are transmitted back to the electronic device 11 of the user through the second connecting interface 130 in a wireless manner or in a wired manner (Step S33).
  • In some embodiments, please refer to FIG. 12, in which the edit option 125 may further have a template option 1253. The template option 1253 has multiple template patterns E1 and E2. Herein, each of the template patterns E1 and E2 is a colored pattern Pc that is edited and stored in the pattern database in advance. That is to say, each of the template patterns E1 and E2 has a respective corresponding coloring procedure Sp, and the coloring procedure Sp is already stored in the pattern database correspondingly in advance.
  • In other words, each edited colored pattern Pc and the corresponding coloring procedure thereof may be optionally stored as a template, so as to become an option in the template option 1253. When the edited colored pattern Pc and the corresponding coloring procedure thereof are stored as a template, the colored pattern Pc may act as a template pattern. In some embodiments, each of the template patterns E1 and E2 is a colored pattern Pc, but is not limited to being represented (shown) as the outline image Pp of the three-dimensional object 14. In other words, each of the template patterns E1 and E2 presents a result of coloring design.
  • When the user clicks a template pattern E1 in the template option 1253, the result of coloring design represented by the template pattern E1 is applied to the outline image Pp of the three-dimensional object 14 (that is, the image displayed in the image preview box 122), so as to obtain a final colored pattern Pc. At the moment, the user interface 120 may output an edit instruction corresponding to the template pattern E1 in response to a select operation of the user, so that the coloring application reads the coloring procedure Sp corresponding to the template pattern E1 from the pattern database in response to the edit instruction (Step S31), and outputs the read coloring procedure Sp when the user clicks the complete option 128 (Step S33).
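  • A minimal sketch of the pattern-database lookup triggered by selecting a template pattern is shown below; the dictionary-based store and its keys are assumptions made only for illustration.
      # Hypothetical pattern database: each template pattern is stored together
      # with the coloring procedure Sp that was edited for it in advance.
      pattern_database = {
          "E1": {"colored_pattern": "E1.png", "coloring_procedure": ["<draw line 0,0 5,5>"]},
          "E2": {"colored_pattern": "E2.png", "coloring_procedure": ["<draw point 3,4>"]},
      }

      def select_template(template_name):
          """Step S31 for a template: read the coloring procedure Sp stored
          for the selected template pattern from the pattern database."""
          return pattern_database[template_name]["coloring_procedure"]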
  • In some embodiments, the coloring procedure is generated in a script form. The coloring procedure in the script form is, for example, as follows:
  •  <coloring interface = skin> (that is, the type of the three-dimensional object 14 to be colored)
      <coloring pigment = No. 1 spray material> (that is, the type of the coloring pigment)
      <color spray fineness = A> (that is, the type of the coloring tool)
      <color spray color = red> (that is, the color of the coloring pigment)
      <system positioning points X, Y scale X>
      <draw point X,Y>
      <draw line X0,Y0 X1,Y1>
      <draw plane X0,Y0 X1,Y1>
      <draw picture picture name>
      <draw character character name>
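  • Purely as an illustration, a coloring procedure in the script form above could be parsed into tagged entries roughly as follows; the parsing rules are guesses based on the sample script and are not a format defined by the patent.
      import re

      def parse_coloring_script(script_text):
          """Split a script-form coloring procedure into tagged entries.
          Lines such as "<coloring pigment = No. 1 spray material>" become
          ("set", key, value); lines such as "<draw line X0,Y0 X1,Y1>"
          become ("command", body). This is a best-guess reading only."""
          entries = []
          for match in re.finditer(r"<([^>]+)>", script_text):
              body = match.group(1).strip()
              if "=" in body:
                  key, value = [part.strip() for part in body.split("=", 1)]
                  entries.append(("set", key, value))
              else:
                  entries.append(("command", body))
          return entries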
  • In some embodiments, the outline image Pp displayed in the image preview box 122 may be a plane simulated image. In other words, the outline image Pp contains a 2D image of the three-dimensional object 14.
  • In some embodiments, the outline image Pp displayed in the image preview box 122 may be a three-dimensional simulated image. In other words, the outline image Pp contains a 3D model of the three-dimensional object 14. Implementation of the three-dimensional simulated image is well known by persons skilled in the art, and is not repeated herein.
  • Therefore, when the user performs a coloring action on the three-dimensional simulated image, each coloring action correspondingly generates track information represented by three-dimensional coordinates. In other words, each positioning point in the track information is three-dimensional coordinate data.
  • Please refer to FIG. 13, in which when the user actually performs makeup on the face, the user may connect the electronic device 11 to the automatic coloring machine 12, that is, electrically connect the first connecting interface 230 of the automatic coloring machine 12 to the second connecting interface 130 of the electronic device 11 (Step S41).
  • After the connection, the user may operate the electronic device 11, so that the processing unit 110 outputs a coloring procedure to the automatic coloring machine 12. In other words, the automatic coloring machine 12 receives through the first connecting interface 230 the coloring procedure transmitted in a wireless manner or in a wired manner (for example, through the connecting cable 15 or a physical connector) (Step S43).
  • Then, the automatic coloring machine 12 performs makeup on the face of the user based on the coloring procedure. Herein, the control unit 210 of the automatic coloring machine 12 sequentially executes each coloring instruction in the coloring procedure.
  • The control unit 210 controls, according to the executed coloring instruction, the material supply module 240 to select a pigment corresponding to color information in the coloring instruction (Step S45), and controls, according to the executed coloring instruction, the moving module 250 to select a coloring tool corresponding to tool information in the coloring instruction (Step S47). The execution order of Step S45 and Step S47 is not limited by the present invention. That is to say, besides sequential execution of Step S45 and Step S47, Step S45 and Step S47 may be executed at the same time, or Step S47 is executed before Step S45.
  • In Step S45, the color indicated by the color information may be one of multiple pigments included in the material supply module 240. Further, the color indicated by the color information may not be among the multiple pigments included in the material supply module 240. In this case, the material supply module 240 may select, according to the color indicated by the color information, two or more pigments from the multiple pigments to obtain the needed pigment (that is, the color indicated by the color information) by mixing. In other words, the automatic coloring machine 12 may have a storage unit, and a color database is established in the storage unit. The color database has multiple colors and corresponding mixing methods (for example, pigments and proportions thereof required for mixing).
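  • A possible shape for such a color database is sketched below; the stored colors, stock pigments, and mixing proportions are invented for illustration, since the patent does not specify a storage format.
      # Hypothetical color database: requested color -> {stock pigment: proportion}.
      color_database = {
          "red":    {"red": 1.0},                 # already stocked, no mixing needed
          "orange": {"red": 0.5, "yellow": 0.5},  # mixed from two stock pigments
          "pink":   {"red": 0.3, "white": 0.7},
      }

      def select_pigment(requested_color, stock_pigments):
          """Return the mixing recipe for the color named by the color
          information, using only pigments actually held by the material
          supply module 240."""
          recipe = color_database[requested_color]
          if not all(pigment in stock_pigments for pigment in recipe):
              raise ValueError("color cannot be mixed from the stocked pigments")
          return recipe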
  • Then, the control unit 210 moves the moving module 250 according to track information in the executed coloring instruction, and applies the selected pigment to the face of the user (the three-dimensional object 14), through the selected coloring tool (Step S49).
  • In some embodiments, when the track information is represented by two-dimensional coordinates, the control unit 210 moves the moving module 250 in the first direction and in the second direction according to each positioning point in the track information, so as to move the moving module 250 to a corresponding designated position. Further, during movement to each positioning point or upon arriving at the designated position, the control unit 210 receives a position signal from the range finding device 270 to control movement of the moving module 250 relative to the face (that is, the movement in the third direction), so that the coloring tool is positioned in a position capable of applying the pigment to the face of the user safely. Herein, the first direction, the second direction, and the third direction are the Y-axis, the X-axis, and the Z-axis of a movement coordinate system of the moving module 250 respectively.
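  • The X/Y positioning plus range-finder-driven approach described above might look roughly like the following sketch; the safe-distance value and the read_distance and move_to callbacks are assumptions standing in for the range finding device 270 and the motor control of the moving module 250.
      SAFE_DISTANCE_MM = 5.0  # assumed clearance between the coloring tool and the face

      def follow_2d_track(track_points, read_distance, move_to):
          """For each 2D positioning point, move along the second (X) and
          first (Y) directions, then use the position signal from the range
          finder to pick a third-direction (Z) position that keeps the tool
          a safe distance from the face."""
          for x, y in track_points:
              distance = read_distance(x, y)      # measured distance to the face
              z_target = distance - SAFE_DISTANCE_MM
              move_to(x, y, z_target)             # approach while keeping clearance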
  • After a coloring instruction is executed (that is, after a coloring action is completed), the control unit 210 continues to execute a next coloring instruction, until all coloring instructions are executed (Step S51).
  • In some embodiments, the coloring application has a coordinate system conversion step, so that an image coordinate system of the outline image Pp corresponds to the movement coordinate system of the moving module 250.
  • In some embodiments, the coordinate system conversion step may use features or edges obtained in the feature analysis step (Step S25) as corresponding points, so that the image coordinate system of the outline image Pp corresponds to the movement coordinate system of the moving module 250. That is to say, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250.
  • In some embodiments, the coordinate system conversion step may be implemented by using a scaling object with known actual size.
  • When the user uses the image capturing module 13 to capture the appearance image Pf of the three-dimensional object 14, the scaling object is in the coverage of the capture at the same time. In other words, the image capturing module 13 is used to capture the appearance image Pf including the image of the three-dimensional object 14 and including the image of the scaling object. According to the known actual size and the image size of the scaling object in the appearance image Pf, the scale between the image coordinate system of the outline image Pp and the movement coordinate system of the moving module 250 is calculated. Then, by using the features or edges obtained in the feature analysis step (Step S25) as the corresponding points and by using the calculated scale, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250.
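  • The scale computation described above reduces to a ratio between the scaling object's known physical size and its size in pixels; the sketch below shows that calculation under the simplifying assumption of a uniform scale and a fixed offset, with function names invented for illustration.
      def compute_scale(known_size_mm, measured_size_px):
          """Millimetres per pixel, from a scaling object of known actual size
          captured in the same appearance image Pf."""
          return known_size_mm / measured_size_px

      def image_to_movement(point_px, scale_mm_per_px, origin_offset_mm=(0.0, 0.0)):
          """Map a positioning point from the image coordinate system of the
          outline image Pp into the movement coordinate system of the moving
          module 250; in practice the corresponding points or edges from
          Step S25 fix the offset."""
          x_px, y_px = point_px
          ox, oy = origin_offset_mm
          return (x_px * scale_mm_per_px + ox, y_px * scale_mm_per_px + oy)

      # Example: a 50 mm scaling object imaged as 200 px gives 0.25 mm per pixel.
      scale = compute_scale(50.0, 200.0)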
  • In some embodiments, the coordinate system conversion step may be implemented by using camera parameters (such as a focal length of the lens and an image format) of the image capturing module 13 and specifications of the screen in the user interface 120. The coloring application may calculate a scale between the image size of the outline image Pp and the actual size of the three-dimensional object 14 according to the camera parameters of the image capturing module 13 and the specifications of the screen in the user interface 120. Then, by using the features or edges obtained in the feature analysis step (Step S25) as the corresponding points and by using the calculated scale, the outline image Pp is mapped to the actual position of the three-dimensional object 14 in the movement coordinate system of the moving module 250.
  • After the coordinate system conversion step is completed, the coloring action applied to the outline image Pp on the user interface 120 can enable the coloring application to generate corresponding track information based on the movement coordinate system of the moving module 250.
  • In some embodiments, the coloring application may be implemented as a computer program product, so that after a computer (that is, the electronic device) is loaded with and executes the coloring application, the automatic coloring method according to any embodiment of the present invention may be performed. In some embodiments, the computer program product may be a readable recording medium, and the coloring application is stored in the readable recording medium to be loaded into a computer. In some embodiments, the coloring application may be a computer program product transmitted to the computer in a wired manner or a wireless manner.
  • In view of the above, the automatic coloring system and method according to the present invention have the coloring design process (executed by the electronic device) and the actual coloring process (executed by the automatic coloring machine) that are separable, so that the user can design and exchange a colored pattern anytime and anywhere. Further, the automatic coloring system and method according to the present invention enable an external device to provide a coloring procedure to be directly executed by the automatic coloring machine, thereby facilitating simplification of the structure of the automatic coloring machine. For example, the control unit of the automatic coloring machine does not need to have a powerful processing function, and may be implemented by, for example, a microcontroller, or the automatic coloring machine does not need to be provided with the image capturing module. In some embodiments, by directly providing track information represented by three-dimensional coordinates, the automatic coloring machine can execute the actual coloring process more precisely. In some embodiments, by directly presenting a three-dimensional simulated image, the coloring action in the coloring design process is closer to that in the actual coloring process.
  • While the present invention has been described by the way of example and in terms of the preferred embodiments, it is to be understood that the invention need not be limited to the disclosed embodiments. On the contrary, it is intended to cover various modifications and similar arrangements included within the spirit and scope of the appended claims, the scope of which should be accorded the broadest interpretation so as to encompass all such modifications and similar structures.

Claims (21)

What is claimed is:
1. An automatic coloring system, used for coloring a three-dimensional object, the automatic coloring system comprising:
an automatic coloring machine, comprising:
a first connecting interface, used for receiving a coloring procedure in a wireless manner or in a wired manner, wherein the coloring procedure has a plurality of coloring instructions, and the coloring instructions are sequenced according to each individual generation order thereof;
a material supply module, having at least one pigment;
a moving module;
at least one coloring tool, disposed on the moving module; and
a control unit, connected electrically to the first connecting interface, the material supply module, and the moving module, so as to sequentially execute the coloring instructions in the coloring procedure, and according to the executed coloring instruction control the material supply module to select at least one of the at least one pigment and control the moving module to move one of the at least one coloring tool to apply the selected at least one pigment to the three-dimensional object.
2. The automatic coloring system according to claim 1, wherein each of the coloring instructions comprises track information represented by two-dimensional coordinates, and the control unit controls, according to the track information, the moving module to move.
3. The automatic coloring system according to claim 1, wherein each of the coloring instructions comprises track information represented by three-dimensional coordinates, and the control unit controls, according to the track information, the moving module to move.
4. The automatic coloring system according to claim 1, further comprising:
an electronic device, comprising:
a processing unit, used for receiving an appearance image of the three-dimensional object, and generating an outline image through feature analysis of the appearance image;
a user interface, connected electrically to the processing unit, so as to display the outline image and sequentially output at least one edit instruction corresponding to the outline image, so that the processing unit obtains the coloring procedure in response to the at least one edit instruction; and
a second connecting interface, connected electrically to the processing unit, so as to output the coloring procedure to the first connecting interface in a wireless manner or in a wired manner.
5. The automatic coloring system according to claim 4, further comprising:
an image capturing module, used for capturing the appearance image of the three-dimensional object;
wherein the electronic device further comprises:
a third connecting interface, connected electrically to the second connecting interface, wherein the image capturing module is connected to the second connecting interface in a wireless manner or in a wired manner, so that the processing unit receives the appearance image from the image capturing module through the second connecting interface and the third connecting interface.
6. The automatic coloring system according to claim 4, wherein the automatic coloring machine further comprises:
an image capturing module, connected electrically to the first connecting interface, so as to capture the appearance image of the three-dimensional object, and transmit the appearance image to the processing unit through the first connecting interface and the second connecting interface.
7. The automatic coloring system according to claim 4, wherein the electronic device further comprises:
an image capturing module, connected electrically to the processing unit, so as to capture the appearance image of the three-dimensional object.
8. The automatic coloring system according to claim 4, wherein the processing unit further executes coordinate system conversion by using camera parameters of an image capturing module and display specifications of the outline image in the user interface, so as to correspondingly convert coordinates of the outline image into coordinates for moving the moving module, and therefore obtain track information in each of the coloring instructions.
9. The automatic coloring system according to claim 4, wherein the outline image is a three-dimensional simulated image.
10. The automatic coloring system according to claim 4, wherein the number of the at least one edit instruction is multiple, and the edit instructions correspond to the coloring instructions respectively.
11. An automatic coloring method, comprising:
receiving an appearance image of a three-dimensional object;
generating an outline image through feature analysis on the appearance image;
displaying the outline image on a user interface;
using the user interface to sequentially output at least one edit instruction corresponding to the outline image;
in response to the at least one edit instruction, obtaining a coloring procedure, wherein the coloring procedure has a plurality of coloring instructions, and the coloring instructions are sequenced according to each individual generation order thereof; and
outputting the obtained coloring procedure in a wireless manner or in a wired manner.
12. The automatic coloring method according to claim 11, wherein each of the coloring instructions comprises track information represented by two-dimensional coordinates.
13. The automatic coloring method according to claim 11, wherein each of the coloring instructions comprises track information represented by three-dimensional coordinates.
14. The automatic coloring method according to claim 11, further comprising:
executing coordinate system conversion by using camera parameters used when capturing the outline image and display specifications of the outline image in the user interface, so as to correspondingly convert coordinates of the outline image into coordinates for moving a moving module of an automatic coloring machine, and therefore obtain a track information in each of the coloring instructions.
15. The automatic coloring method according to claim 11, further comprising:
an automatic coloring machine receiving the coloring procedure and sequentially executing the coloring instructions in the coloring procedure, which comprises:
according to the executed coloring instruction, controlling a material supply module of the automatic coloring machine to select at least one pigment; and
according to the executed coloring instruction, controlling a moving module of the automatic coloring machine to move a coloring tool, so as to apply the selected pigment to the three-dimensional object.
16. The automatic coloring method according to claim 15, wherein each of the coloring instructions comprises track information represented by two-dimensional coordinates, and a control step of the moving module comprises: controlling, according to the track information, the moving module to move.
17. The automatic coloring method according to claim 15, wherein each of the coloring instructions comprises track information represented by three-dimensional coordinates, and a control step of the moving module comprises: controlling, according to the track information, the moving module to move.
18. The automatic coloring method according to claim 11, further comprising:
capturing the appearance image of the three-dimensional object.
19. The automatic coloring method according to claim 11, wherein the outline image is a three-dimensional simulated image.
20. The automatic coloring method according to claim 11, wherein the number of the at least one edit instruction is multiple, and the edit instructions correspond to the coloring instructions respectively.
21. A computer program product, capable of implementing the automatic coloring method according to claim 11 after a computer is loaded with and executes the program.
US13/829,526 2012-12-07 2013-03-14 Automatic coloring system and method Abandoned US20140161507A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW101146207 2012-12-07
TW101146207A TWI543726B (en) 2012-12-07 2012-12-07 Automatic coloring system and method thereof

Publications (1)

Publication Number Publication Date
US20140161507A1 true US20140161507A1 (en) 2014-06-12

Family

ID=49000368

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/829,526 Abandoned US20140161507A1 (en) 2012-12-07 2013-03-14 Automatic coloring system and method

Country Status (4)

Country Link
US (1) US20140161507A1 (en)
EP (1) EP2740386A3 (en)
JP (1) JP5814986B2 (en)
TW (1) TWI543726B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016014132A1 (en) * 2014-07-23 2016-01-28 Preemadonna Inc. Apparatus for applying coating to nails
US9607347B1 (en) 2015-09-04 2017-03-28 Qiang Li Systems and methods of 3D scanning and robotic application of cosmetics to human
US9687059B2 (en) * 2013-08-23 2017-06-27 Preemadonna Inc. Nail decorating apparatus
US9811717B2 (en) 2015-09-04 2017-11-07 Qiang Li Systems and methods of robotic application of cosmetics
US9814297B1 (en) * 2017-04-06 2017-11-14 Newtonoid Technologies, L.L.C. Cosmetic applicator
EP3396585A1 (en) * 2017-04-27 2018-10-31 Cal-Comp Big Data, Inc. Lip gloss guide device and method thereof
US10750838B2 (en) 2018-04-13 2020-08-25 Coral Labs, Inc. System and method for accurate application and curing of nail polish
EP3708029A1 (en) * 2019-03-13 2020-09-16 Cal-Comp Big Data, Inc. Virtual make-up system and virtual make-up coloring method
US11103041B2 (en) 2017-10-04 2021-08-31 Preemadonna Inc. Systems and methods of adaptive nail printing and collaborative beauty platform hosting
US11265444B2 (en) 2013-08-23 2022-03-01 Preemadonna Inc. Apparatus for applying coating to nails

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7079810B2 (en) * 2014-07-23 2022-06-02 プリマドンナ,インコーポレイテッド A device for applying a coating to the nails
CN104382327A (en) * 2014-12-03 2015-03-04 曹乃承 Manicure device and manicure, health management and information pushing method
GB201603495D0 (en) * 2016-02-29 2016-04-13 Virtual Beautician Ltd Image processing system and method
TWI573100B (en) * 2016-06-02 2017-03-01 Zong Jing Investment Inc Method for automatically putting on face-makeup
TWI608809B (en) * 2017-03-03 2017-12-21 致伸科技股份有限公司 Electronic false nail device
US11568675B2 (en) * 2019-03-07 2023-01-31 Elizabeth Whitelaw Systems and methods for automated makeup application

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040078278A1 (en) * 2000-06-26 2004-04-22 Christophe Dauga Cosmetic treatment method and device, in particular for care, make-up or colouring
US20120158184A1 (en) * 2010-12-17 2012-06-21 Electronics And Telecommunications Research Institute Method for operating makeup robot based on expert knowledge and system thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI227444B (en) * 2003-12-19 2005-02-01 Inst Information Industry Simulation method for make-up trial and the device thereof
JP2008017936A (en) * 2006-07-11 2008-01-31 Fujifilm Corp Makeup apparatus and method
JP2009064423A (en) * 2007-08-10 2009-03-26 Shiseido Co Ltd Makeup simulation system, makeup simulation device, makeup simulation method, and makeup simulation program
TW201212852A (en) * 2010-09-21 2012-04-01 Zong Jing Investment Inc Facial cosmetic machine

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040078278A1 (en) * 2000-06-26 2004-04-22 Christophe Dauga Cosmetic treatment method and device, in particular for care, make-up or colouring
US20120158184A1 (en) * 2010-12-17 2012-06-21 Electronics And Telecommunications Research Institute Method for operating makeup robot based on expert knowledge and system thereof

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11082582B2 (en) * 2013-08-23 2021-08-03 Preemadonna Inc. Systems and methods to initiate and perform the painting of an area of interest on a finger
US10470546B2 (en) * 2013-08-23 2019-11-12 Preemadonna Inc. Systems, methods and apparatuses for decorating nails
US10477937B2 (en) * 2013-08-23 2019-11-19 Preemadonna Inc. Systems and apparatuses to apply a material to a nail
US20220256054A1 (en) * 2013-08-23 2022-08-11 Preemadonna Inc. Systems and methods to initiate and perform the painting of an area of interest on a finger
US10653225B2 (en) 2013-08-23 2020-05-19 Preemadonna Inc. Apparatus for applying coating to nails
US20170347770A1 (en) * 2013-08-23 2017-12-07 Preemadonna Inc. Systems, methods and apparatuses for decorating nails
US11290615B2 (en) * 2013-08-23 2022-03-29 Preemadonna Inc. Systems and methods to initiate and perform the painting of an area of interest on a finger
US10972631B2 (en) 2013-08-23 2021-04-06 Preemadonna, Inc. Apparatus for applying coating to nails
US20180255902A1 (en) * 2013-08-23 2018-09-13 Preemadonna Inc. Systems and apparatuses to apply a material to a nail
US11265444B2 (en) 2013-08-23 2022-03-01 Preemadonna Inc. Apparatus for applying coating to nails
US9687059B2 (en) * 2013-08-23 2017-06-27 Preemadonna Inc. Nail decorating apparatus
CN106998870A (en) * 2014-07-23 2017-08-01 普瑞曼多娜有限公司 Equipment for being applied to coating on nail
GB2546672B (en) * 2014-07-23 2019-01-02 Preemadonna Inc Apparatus for applying coating to nails
GB2546672A (en) * 2014-07-23 2017-07-26 Preemadonna Inc Apparatus for applying coating to nails
WO2016014132A1 (en) * 2014-07-23 2016-01-28 Preemadonna Inc. Apparatus for applying coating to nails
US9607347B1 (en) 2015-09-04 2017-03-28 Qiang Li Systems and methods of 3D scanning and robotic application of cosmetics to human
US9811717B2 (en) 2015-09-04 2017-11-07 Qiang Li Systems and methods of robotic application of cosmetics
US9814297B1 (en) * 2017-04-06 2017-11-14 Newtonoid Technologies, L.L.C. Cosmetic applicator
US10783802B2 (en) 2017-04-27 2020-09-22 Cal-Comp Big Data, Inc. Lip gloss guide device and method thereof
EP3396585A1 (en) * 2017-04-27 2018-10-31 Cal-Comp Big Data, Inc. Lip gloss guide device and method thereof
US11717070B2 (en) 2017-10-04 2023-08-08 Preemadonna Inc. Systems and methods of adaptive nail printing and collaborative beauty platform hosting
US11103041B2 (en) 2017-10-04 2021-08-31 Preemadonna Inc. Systems and methods of adaptive nail printing and collaborative beauty platform hosting
US10750838B2 (en) 2018-04-13 2020-08-25 Coral Labs, Inc. System and method for accurate application and curing of nail polish
US11096466B2 (en) 2018-04-13 2021-08-24 Coral Labs, Inc. System and method for accurate application and curing of nail polish
EP3708029A1 (en) * 2019-03-13 2020-09-16 Cal-Comp Big Data, Inc. Virtual make-up system and virtual make-up coloring method

Also Published As

Publication number Publication date
TW201422173A (en) 2014-06-16
JP2014113445A (en) 2014-06-26
EP2740386A2 (en) 2014-06-11
TWI543726B (en) 2016-08-01
JP5814986B2 (en) 2015-11-17
EP2740386A3 (en) 2017-06-28

Similar Documents

Publication Publication Date Title
US20140161507A1 (en) Automatic coloring system and method
US20210177124A1 (en) Information processing apparatus, information processing method, and computer-readable storage medium
US20190347865A1 (en) Three-dimensional drawing inside virtual reality environment
AU2012213943B2 (en) Eye make-up application machine
US10479109B2 (en) Automatic facial makeup method
TW201212852A (en) Facial cosmetic machine
CN106339929A (en) 3D fitting system
US10755489B2 (en) Interactive camera system with virtual reality technology
KR20120066773A (en) A method of virtual make-up using mobile device
Treepong et al. Makeup creativity enhancement with an augmented reality face makeup system
CN110503707A A kind of true man's motion capture real-time animation system and method
CN109584361A (en) A kind of equipment cable is virtually pre-installed and trajectory measurement method and system
CN103853067B (en) Automatic colouring system and method thereof
KR101719927B1 (en) Real-time make up mirror simulation apparatus using leap motion
CN107481325A (en) A kind of house decoration scheme system based on virtual reality technology
AU2006330460B2 (en) Eye movement data replacement in motion capture
CN105976655B (en) A kind of three-dimension virtual reality device for assiatant robot
CN107710305A (en) Augmented reality method of servicing and system for game of tinting
CN108970109A (en) A kind of methods, devices and systems based on idiodynamic reality-virtualizing game
CN214751793U (en) Virtual reality remote clothing perception system
Reepen et al. Mixed reality embodiment platform–Visual coherence for volumetric mixed reality scenes-Recording and replay of volumetric characters
CN106060501A (en) Multi-channel software imaging system

Legal Events

Date Code Title Description
AS Assignment

Owner name: ZONG JING INVESTMENT,INC., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WONG, CHARLENE HSUEH-LING;REEL/FRAME:030108/0975

Effective date: 20121129

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION