WO2018117022A1 - Coating control device, coating device, coating control method, and recording medium - Google Patents
Coating control device, coating device, coating control method, and recording medium
- Publication number
- WO2018117022A1 (PCT/JP2017/045303)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- coating
- skin
- application
- head
- Prior art date
Classifications
- A45D44/005—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms, for selecting or displaying personal cosmetic colours or hairstyle
- A45D34/04—Appliances specially adapted for applying liquid, e.g. using roller or ball
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
- B41J2/04501—Ink jet generating single droplets on demand by pressure; control methods or devices therefor, e.g. driver circuits, control circuits
- B41J3/407—Selective printing mechanisms constructed for marking on special material
- B41J3/445—Printers integrated in other types of apparatus, e.g. printers integrated in cameras
- B41J3/46—Printing mechanisms combined with apparatus providing a visual indication
- G06K15/22—Arrangements for producing a permanent visual presentation of the output data using plotters
- G06T7/11—Image analysis; region-based segmentation
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G06V10/7515—Template matching; shifting the patterns to accommodate for positional errors
- G06V20/10—Scenes; terrestrial scenes
- A45D2044/007—Devices for determining the condition of hair or skin or for selecting the appropriate cosmetic or hair treatment
- G06T2207/20021—Dividing image into blocks, subimages or windows
- G06T2207/30196—Subject of image: human being; person
Definitions
- The present invention relates to a coating control device, a coating device, a coating control method, and a recording medium.
- Patent Document 1 discloses a cosmetic device that includes a head that ejects cosmetic ink and a moving device that can move the head.
- The cosmetic device of Patent Document 1 measures data such as the unevenness and brightness distribution of the skin to be made up, and determines the positions of spots, wrinkles, and the like on the skin based on the data.
- However, Patent Document 1 does not specifically disclose a method of positioning the head with respect to the spots, wrinkles, and the like to be coated.
- The moving device of Patent Document 1 can move the head along the skin, but in order to perform positioning, it is necessary to know where on the skin the head is located.
- The present invention has been made in view of such problems, and an object thereof is to provide a coating control device, a coating device, a coating control method, and a recording medium capable of recognizing the position of the coating head on the skin.
- An application control device includes: a storage unit that stores a first image of a predetermined range of skin; an acquisition unit that acquires, from an imaging unit connected to an application head capable of applying cosmetics to the skin, a second image of the skin in a range smaller than the range of the first image; and a recognition unit that recognizes the position of the second image in the first image.
- An application control method includes: a step of storing a first image of a predetermined range of skin; a step of acquiring, from an imaging unit connected to an application head capable of applying cosmetics to the skin, a second image of the skin in a range smaller than the range of the first image; and a step of recognizing the position of the second image in the first image.
- The recording medium records a program for causing a computer to execute: a step of storing a first image of a predetermined range of skin; a step of acquiring, from an imaging unit connected to an application head capable of applying cosmetics to the skin, a second image of the skin in a range smaller than the range of the first image; and a step of recognizing the position of the second image in the first image.
- According to the present invention, it is possible to provide a coating control device capable of recognizing the position of the coating head on the skin.
- FIG. 1 is a schematic diagram illustrating a configuration of a coating system 10 according to the present embodiment.
- the coating system 10 includes a coating device 100, an imaging unit 200, and a coating control device 300.
- The application apparatus 100 is a device for applying cosmetics to the skin and is held by the user.
- the coating apparatus 100 has a prismatic casing, and a coating head 101 is provided on one end surface of the casing.
- the shape of the coating apparatus 100 is not limited as long as it is a shape that can be easily gripped by the user, and may be cylindrical or hemispherical. Further, the coating apparatus 100 may include a gripping member such as a handle.
- the application head 101 is composed of, for example, an inkjet head and includes a plurality of nozzles that discharge cosmetics.
- the plurality of nozzles are two-dimensionally arranged and can apply cosmetics to a predetermined region of the skin.
- a cosmetic tank 102 is attached to the coating apparatus 100, and cosmetics are supplied from the cosmetic tank 102 to the coating head 101.
- the cosmetic tank 102 may be provided in the coating apparatus 100.
- As the cosmetic, a liquid concealer for hiding skin spots, freckles, pores, and the like can be used.
- the imaging unit 200 is provided on the side surface (upper surface) of the coating apparatus 100 in the same direction as the coating head 101.
- the imaging unit 200 includes a lens, an imaging device, and the like, and can capture an image of a narrow range of skin (second image) on which application is performed by the application head 101.
- the imaging unit 200 is connected to the coating head 101, and the relative position of the imaging unit 200 with respect to the coating head 101 is fixed.
- the imaging unit 200 may be formed integrally with the application head 101.
- the coating apparatus 100 and the imaging unit 200 are controlled by the coating control apparatus 300.
- the application control device 300 is connected to the application device 100 and the imaging unit 200 via a wired connection such as a USB (Universal Serial Bus) cable or a wireless connection such as Bluetooth (registered trademark) or WiFi.
- the coating control device 300 may be built in the coating device 100.
- the application control device 300 stores in advance an image (first image) of a wide range of skin including spots, freckles, pores, and the like to be applied.
- the application control apparatus 300 can grasp the position of the application head 101 on the skin by comparing the second image acquired from the imaging unit 200 with the first image.
- the application control apparatus 300 includes a display 301, and various information such as a skin image and the state of the application head 101 are displayed on the display 301.
- the coating control apparatus 300 recognizes the position of the coating head 101 on the skin and displays the current position of the coating head 101 on the display 301.
- the user moves the application head 101 along the skin while checking the display 301.
- When the application head 101 arrives at a position to be coated, the cosmetic is applied automatically.
- FIG. 2 is a block diagram of the coating apparatus 100 and the coating control apparatus 300 according to the present embodiment.
- the coating apparatus 100 includes a coating head 101, a cosmetic tank 102, a moving mechanism 103, an operation unit 104, a distance sensor 105, and a motion sensor 106.
- the application control apparatus 300 includes a display 301, an image processing circuit 302, a gap control circuit 303, a head control circuit 304, a CPU 305, a RAM 306, a ROM 307, a storage device 308, and a speaker 309.
- the coating head 101 is, for example, a piezo ink jet head, and includes a nozzle, a pressure chamber, a piezoelectric element, a drive circuit, and the like.
- When the pressure chamber is filled with the cosmetic and a voltage is applied from the drive circuit to the piezoelectric element, the volume of the pressure chamber changes due to the deformation of the piezoelectric element.
- As a result, the cosmetic is discharged from a nozzle as a droplet.
- the application head 101 may be a thermal inkjet head that heats the cosmetic with a heating body and discharges the cosmetic with the pressure of the generated bubbles.
- the coating head 101 operates based on a control signal from the head control circuit 304.
- the cosmetic tank 102 stores the cosmetic and supplies the cosmetic to the application head 101.
- the cosmetic tank 102 can be a cartridge type that can be easily replaced.
- The cosmetic is a liquid having a predetermined viscosity that can be discharged from the application head 101, and may include a concealer, a foundation, a blusher, an eye shadow, and the like.
- a plurality of cosmetic tanks 102 may be provided so as to accommodate a plurality of cosmetics of different types or colors. For example, four cosmetic tanks 102 can be provided so that four colors of cosmetics can be applied, and four nozzle groups corresponding to the respective colors can be provided in the application head 101.
- The moving mechanism 103 includes an actuator, a guide member, and the like, and can drive the application head 101 forward and backward in the longitudinal direction of the application apparatus 100, that is, in a direction perpendicular to the skin when the application head 101 faces the skin.
- the moving mechanism 103 controls the position of the coating head 101 in accordance with a control signal from the gap control circuit 303.
- the operation unit 104 includes operation members such as a power switch, a menu button, and a coating button for executing coating, and is used by the user to give an instruction to the coating device 100.
- the application control device 300 controls the operation of the application device 100 in accordance with a user instruction input from the operation unit 104.
- The application button is preferably arranged at a position where the user can easily operate it while holding the application apparatus 100.
- For example, the application button is arranged at a position touched by the user's finger when holding the application apparatus 100. Thereby, even when the user moves the application apparatus 100 to a part that cannot be seen directly (such as a cheek), the user can operate the application button by touch.
- The distance sensor 105 is, for example, an infrared sensor or an ultrasonic sensor, and emits a detection wave such as infrared light or an ultrasonic wave toward an object and receives the reflected wave.
- The distance sensor 105 can detect the distance to the object based on the time from emitting the detection wave to receiving the reflected wave.
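The time-of-flight computation described above can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the function name and the assumed wave speed (343 m/s, the speed of sound in air for an ultrasonic sensor) are our own assumptions.

```python
def distance_from_echo(round_trip_s: float, wave_speed_m_s: float = 343.0) -> float:
    """Distance to the object from the round-trip time of the detection wave.

    The wave travels to the object and back, so the one-way distance is half
    of speed * time. 343 m/s assumes an ultrasonic sensor in air at ~20 C;
    an infrared sensor would use the speed of light instead.
    """
    return wave_speed_m_s * round_trip_s / 2.0

# An echo received 1 ms after emission puts the object roughly 17 cm away.
print(distance_from_echo(0.001))  # ~0.1715 m
```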
- a plurality of distance sensors 105 are provided around the application head 101, and the inclination of the application head 101 with respect to the skin can be detected.
- Based on the detection signal from the distance sensor 105, the application control device 300 keeps the distance between the skin and the application head 101 constant, and can control the application head 101 so that, for example, the cosmetic is not discharged when the application head 101 is inclined with respect to the skin.
- the motion sensor 106 includes an acceleration sensor, a gyro sensor, and a geomagnetic sensor, and detects movement such as movement and rotation of the coating head 101.
- the acceleration sensor is composed of, for example, a capacitance detection element, and can detect acceleration applied to the coating head 101.
- the gyro sensor is composed of, for example, a piezoelectric vibration element and has a function of detecting the direction of the coating head 101.
- the geomagnetic sensor can grasp the orientation of the coating head 101 by detecting the geomagnetism. Based on the detection signal from the motion sensor 106, the application control device 300 can control the application head 101 so that the cosmetic is not discharged, for example, when the application head 101 moves quickly.
- the imaging unit 200 includes an optical system, an imaging device, and an A / D (Analog / Digital) converter.
- the optical system includes an optical filter, a fixed lens, and a focus lens, and images light from a subject (skin) on an imaging surface of an imaging element to form a subject image.
- A polarizing filter can be attached to the optical system to reduce specular reflection.
- the image pickup device is, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor, and includes a plurality of pixels arranged two-dimensionally, a color filter, and a microlens.
- the plurality of pixels may include imaging pixels and focus detection pixels.
- the image sensor has an electronic shutter function for controlling the charge accumulation time.
- Each of the plurality of pixels outputs a pixel signal based on incident light from the optical system.
- the A / D converter includes a comparison circuit, a latch circuit, and the like, and converts an analog pixel signal from the image sensor into digital RAW data.
- the imaging unit 200 can output a moving image having a predetermined frame rate in addition to a still image.
- the display 301 includes, for example, a liquid crystal display and an organic EL (Electro Luminescence) display.
- the display 301 performs various displays such as an image from the imaging unit 200, an image stored in the storage device 308, status information of the coating head 101, and a menu screen based on data from the CPU 305.
- the display 301 may be a touch panel and may function as the operation unit 104.
- The image processing circuit 302 includes a numerical operation circuit, performs demosaic processing on the RAW data from the imaging unit 200, and generates image data (an RGB image) having R (red), G (green), and B (blue) color values for each pixel.
- the image processing circuit 302 also has a function of performing digital image processing such as white balance adjustment, gamma correction, contour enhancement, tone conversion, noise reduction, and compression on image data.
- The gap control circuit 303 controls the gap between the skin and the application head 101 by outputting a control signal to the moving mechanism 103. Based on the detection signal from the distance sensor 105, the gap control circuit 303 can control the position of the application head 101 so as to maintain a constant distance from the skin. Based on an instruction from the CPU 305, the head control circuit 304 outputs to the application head 101 a control signal indicating which nozzles discharge the cosmetic, the application amount, and the like.
- a CPU (Central Processing Unit) 305 includes a CPU core, a cache memory, and the like, and comprehensively controls each unit of the coating control apparatus 300.
- a RAM (Random Access Memory) 306 is, for example, a DRAM (Dynamic RAM), and is used as a work area of the CPU 305, a program load area, and the like. The RAM 306 temporarily stores data necessary for processing by the CPU 305, image data generated by the image processing circuit 302, image data read from the storage device 308, and the like.
- a ROM (Read Only Memory) 307 is, for example, an EEPROM (Electrically Erasable Programmable ROM), and stores various setting files, a basic program such as an OS (Operating System), and a control program for controlling the operation of the coating apparatus 100.
- The storage device (storage unit) 308 is, for example, a flash memory or a hard disk, and stores RAW data from the imaging unit 200, image data generated by the image processing circuit 302, and the like.
- the storage device 308 can also store image data acquired by an external imaging device.
- the storage device 308 may be a portable storage medium, and may be configured to be detachable from the application control device 300 via a memory card slot, a USB connector, or the like.
- the speaker 309 includes a piezoelectric vibration unit, a drive circuit, and the like, and outputs a sound wave signal based on data from the CPU 305.
- the speaker 309 can reproduce voice messages, sound effects, and the like, and is used, for example, to notify the user of the operating state of the coating apparatus 100.
- FIG. 3 is a functional block diagram of the coating control apparatus 300 according to the present embodiment.
- the application control apparatus 300 has functions of a storage unit 308, an acquisition unit 310, a recognition unit 311, and a calculation unit 312.
- the function of the application control apparatus 300 is realized by the CPU 305 reading out a predetermined control program stored in the ROM 307 to the RAM 306 and executing it.
- the storage unit 308 stores a first image of a predetermined range of skin.
- the first image is an image obtained by capturing a wide range of skin including spots, freckles, pores and the like to be applied.
- the first image can be captured using a dedicated imaging device or a general digital camera, and is stored in the storage unit 308 in advance.
- the first image can be captured using the imaging unit 200.
- the acquisition unit 310 acquires the second image from the imaging unit 200.
- the second image is a skin image in a range smaller than the range of the first image.
- The acquisition unit 310 acquires the second image at predetermined intervals and delivers it to the recognition unit 311.
- Because the imaging unit 200 is connected to the application head 101, the position of the application head 101 can be obtained from the position of the imaging unit 200.
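Because the camera-to-head offset is fixed, converting the recognized image position into a head position reduces to adding a constant offset. A minimal sketch, assuming positions are expressed in pixel coordinates of the first image; the function name and the offset value are hypothetical:

```python
def head_position(image_pos, head_offset):
    """Position of the application head in first-image coordinates, given the
    recognized position of the second image and the fixed offset between the
    imaging unit and the head (calibrated once, since they move together)."""
    return (image_pos[0] + head_offset[0], image_pos[1] + head_offset[1])

# Second image recognized at (alpha, beta) = (120, 80); head assumed to sit
# 30 px to the right of the camera's field of view:
print(head_position((120, 80), (30, 0)))  # (150, 80)
```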
- the recognition unit 311 compares the second image from the acquisition unit 310 with the first image stored in the storage unit 308, and recognizes the position of the second image in the first image.
- the recognition unit 311 can perform correlation calculation of image data, and can extract an area having the highest correlation with the second image from the first image.
- the calculation unit 312 divides the first image or the second image into a plurality of segments, and calculates the required cosmetic application amount for each segment.
- the method of calculating the application amount is arbitrary, and for example, the application amount can be calculated so that the luminance distribution of the skin after application is uniform.
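One way to realize "an application amount such that the luminance distribution after application is uniform" is a simple per-segment linear model: darker segments receive proportionally more cosmetic. The following sketch is illustrative only; the patent does not commit to a particular formula.

```python
def application_amounts(segment_luminance, target_luminance):
    """Per-segment application amount so that the post-application luminance
    approaches a uniform target: the darker a segment (e.g. a spot), the more
    cosmetic it receives; segments at or above the target get none. The
    linear model amount = max(0, target - luminance) is an assumption."""
    return [max(0.0, target_luminance - lum) for lum in segment_luminance]

# Three segments with luminance 100, 140 and 160 against a target of 150:
print(application_amounts([100.0, 140.0, 160.0], 150.0))  # [50.0, 10.0, 0.0]
```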
- the application head 101 can apply cosmetics to the skin according to the application amount from the calculation unit 312.
- FIG. 4 is a flowchart of the application control method according to the present embodiment.
- the CPU 305 reads an image including a plurality of spots (first image) from the storage device 308 and stores it in the RAM 306 (step S401).
- the first image is an image obtained by capturing a wide range of cheeks (for example, 3 cm long ⁇ 4 cm wide).
- the first image may be captured using a polarizing filter.
- the CPU 305 displays the first image on the display 301.
- the CPU 305 may identify the positions of a plurality of spots from the first image, and display an icon or the like indicating the positions of the spots to the user on the display 301.
- The position of a spot can be specified based on, for example, a feature amount of the image (such as the luminance distribution).
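As an illustration of specifying spot positions from the luminance distribution, one simple approach is to flag pixels that are noticeably darker than the mean skin luminance. The threshold of 20 luminance levels below the mean is an arbitrary assumption, not a value from the patent.

```python
import numpy as np

def spot_mask(gray, drop=20.0):
    """Boolean mask of pixels noticeably darker than the mean skin luminance.
    Thresholding against the mean is one simple realization of spot
    detection from the luminance distribution."""
    return gray < (gray.mean() - drop)

skin = np.full((4, 4), 150.0)  # uniform skin luminance
skin[1, 2] = 90.0              # one dark spot
print(np.argwhere(spot_mask(skin)))  # [[1 2]]
```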
- the user grasps the coating apparatus 100 and brings the coating head 101 close to the skin.
- the user moves the application head 101 along the skin (step S402).
- the user can move the coating head 101 in a direction where there is a spot while looking at the display 301.
- the imaging unit 200 captures a skin moving image at a predetermined frame rate.
- the CPU 305 acquires the second image from the imaging unit 200 (step S403).
- the second image is one frame of a moving image and is an image obtained by capturing a narrow range of cheeks (for example, 1 cm in length ⁇ 1 cm in width). Similar to the first image, the second image may be captured using a polarizing filter.
- the CPU 305 generates image data for the second image in the image processing circuit 302 and stores the image data in the RAM 306. Note that the second image may be a single still image, and the data format of the second image is not particularly limited.
- The CPU 305 calculates the correlation between the first image and the second image (step S404). Specifically, the CPU 305 calculates the value of the evaluation function F(α, β) expressed by the following formula (1): F(α, β) = Σ_{x=0}^{N-1} Σ_{y=0}^{M-1} f(x, y) g(x + α, y + β) … (1)
- Alternatively, feature points may be extracted and the correlation of their feature amounts may be checked.
- In formula (1), f(x, y) represents the pixel value at the coordinates (x, y) of the second image 502, and g(x, y) represents the pixel value at the coordinates (x, y) of the first image 501.
- the hatched portion represents a spot.
- the origin O of the first image 501 and the second image 502 is set to the pixel in the lower left corner.
- The size of the second image 502 is M pixels high × N pixels wide.
- The size of the first image 501 is larger than that of the second image 502 and is M1 pixels high × N1 pixels wide (M < M1 and N < N1).
- The parameters α and β represent the shift amount of the window area 501a in the first image 501.
- The window area 501a has the same size as the second image 502.
- The window area 501a can be scanned over the first image 501 by changing the values of the parameters α and β.
- The evaluation function F(α, β) corresponds to the product-sum operation of the corresponding pixel values of the second image 502 and the window area 501a.
- The evaluation function F(α, β) is not limited to formula (1), and may be the sum of absolute differences of the corresponding pixel values, the sum of squared differences, a correlation coefficient, or the like.
- The pixel value used for the correlation may be an RGB color value (the R value, G value, or B value), a value obtained by appropriately weighting and adding the color values, a value expressed in a color system other than RGB, or the like.
- The correlation of the images may also be calculated using a specific wavelength component. For example, it is preferable to use a wavelength component of 540 nm or less (the B signal).
- In step S404, the CPU 305 calculates the value of the evaluation function F(α, β) while changing the parameters α and β within the ranges 0 ≤ α ≤ N1 − N and 0 ≤ β ≤ M1 − M, respectively. That is, the CPU 305 scans the entire area of the first image 501 while shifting the position of the window area 501a by one pixel at a time in the vertical or horizontal direction, and calculates the correlation between the second image 502 and the window area 501a.
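The window scan and the product-sum evaluation function can be sketched as follows, using the product-sum form F(α, β) = ΣΣ f(x, y)·g(x + α, y + β) over the shift ranges described above. This is a brute-force illustrative sketch, not an optimized implementation.

```python
import numpy as np

def recognize_position(first, second):
    """Scan a window the size of the second image over the first image and
    return the shift (alpha, beta) that maximizes the product-sum
    evaluation F(alpha, beta) = sum_xy f(x, y) * g(x + alpha, y + beta)."""
    M1, N1 = first.shape
    M, N = second.shape
    best_score, best_pos = float("-inf"), (0, 0)
    for beta in range(M1 - M + 1):       # vertical shift, 0 <= beta <= M1 - M
        for alpha in range(N1 - N + 1):  # horizontal shift, 0 <= alpha <= N1 - N
            window = first[beta:beta + M, alpha:alpha + N]
            score = float(np.sum(second * window))
            if score > best_score:
                best_score, best_pos = score, (alpha, beta)
    return best_pos

# A bright 3x4 patch at (alpha, beta) = (3, 2) on a dim background:
first = np.full((8, 10), 0.1)
first[2:5, 3:7] = 2.0
second = first[2:5, 3:7].copy()
print(recognize_position(first, second))  # (3, 2)
```

Note that the unnormalized product-sum can be fooled by uniformly bright regions, which is presumably why the text also mentions difference sums and correlation coefficients as alternative evaluation functions.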
- the CPU 305 recognizes the position of the second image in the first image (step S405).
- The CPU 305 determines that the correlation is higher as the value of the evaluation function F(α, β) is larger, and recognizes the position (α, β) of the window area 501a at which the evaluation function F(α, β) takes its maximum value as the position of the second image 502.
- The CPU 305 calculates the position of the application head 101 based on the position of the second image 502 and displays the position information of the application head 101 on the display 301. For example, the CPU 305 displays, superimposed on the first image, a frame indicating the region where the application head 101 can currently apply the cosmetic.
- The CPU 305 determines whether or not there is an application target (a spot) at the position of the application head 101 (step S406). For example, the CPU 305 refers to the positions of the spots specified in step S401 and determines whether or not a spot is included in the region where the application head 101 can currently apply the cosmetic. If the CPU 305 determines that there is no spot at the position of the application head 101 (NO in step S406), the CPU 305 returns to step S402 and waits until the application head 101 is moved to another position by the user. If the CPU 305 determines that there is a spot at the position of the application head 101 (YES in step S406), the CPU 305 executes the application process of FIG. 6 (step S407). The CPU 305 may notify the user that the application process is to be executed, using the display 301 and the speaker 309.
- the CPU 305 determines whether or not the coating has been completed for all the coating targets (step S408). That is, the CPU 305 determines whether or not the coating process has been executed for all the spots included in the first image. If the CPU 305 determines that the application has not been completed (NO in step S408), the CPU 305 returns to step S402 and waits until the application head 101 is moved to another position by the user. If the CPU 305 determines that the application has been completed (YES in step S408), the CPU 305 displays an indication that the application has been completed on the display 301, and ends the process.
- FIG. 6 is a flowchart showing details of the coating process (step S407) according to the present embodiment.
- the CPU 305 calculates the cosmetic application amount based on the second image acquired from the imaging unit 200 (step S601). For example, the CPU 305 calculates a necessary application amount for each of a plurality of segments obtained by dividing the second image.
- Each segment can be, for example, a square pixel block composed of a plurality of pixels, and the CPU 305 calculates the application amount independently for each segment.
- the CPU 305 may calculate the application amount for the target segment based on a plurality of segments including the target segment and the surrounding segments.
- Alternatively, the cosmetic application amount may be calculated based on the first image in step S401 of FIG. 4. As with the second image, the CPU 305 can calculate a necessary application amount for each of a plurality of segments obtained by dividing the first image.
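A minimal sketch of the per-segment calculation of step S601, under the assumption that the application amount grows with a block's darkness relative to a target skin tone; the patent does not specify the actual formula, and the function name, block size, and amount scale are all illustrative:

```python
import numpy as np

def application_amounts(img, block=8, target=None):
    """Divide a grayscale skin image into square pixel blocks and compute an
    application amount per block, here simply the block's mean darkness
    relative to a target skin tone, clipped at zero."""
    if target is None:
        target = float(img.mean())          # baseline skin tone
    h, w = img.shape
    rows, cols = h // block, w // block
    amounts = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            seg = img[r * block:(r + 1) * block, c * block:(c + 1) * block]
            amounts[r, c] = max(0.0, target - float(seg.mean()))
    return amounts
```

A block containing a dark spot thus receives a positive amount, while blocks matching the surrounding skin tone receive none.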
- the CPU 305 adjusts the distance between the application head 101 and the skin (step S602).
- the CPU 305 outputs a control signal based on the value detected by the distance sensor 105 to the moving mechanism 103 via the gap control circuit 303.
- The moving mechanism 103 moves the coating head 101 forward and backward so that the gap with the skin is kept constant. If the size of the gap exceeds the range adjustable by the moving mechanism 103, the CPU 305 can use the display 301, the speaker 309, etc. to prompt the user to move the application head 101 slightly closer to or farther away from the skin.
- the gap adjustment process may be executed in parallel with the above-described application amount calculation process (step S601).
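The gap adjustment of step S602 can be sketched as a simple decision rule: the moving mechanism absorbs small errors, and the user is prompted when the error exceeds the mechanism's stroke. The numbers and names below are illustrative assumptions, not values from the patent:

```python
def gap_command(measured_mm, target_mm=2.0, max_stroke_mm=1.0):
    """Compute how far to advance (+) or retreat (-) the head to restore the
    target gap, or ask the user to reposition when the error exceeds the
    moving mechanism's stroke."""
    error = measured_mm - target_mm       # positive: head is too far away
    if abs(error) <= max_stroke_mm:
        return ("move", error)            # mechanism absorbs the error
    direction = "closer" if error > 0 else "away"
    return ("warn_user", direction)       # prompt via display / speaker
```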
- The CPU 305 causes the application head 101 to discharge cosmetics (step S603). That is, the CPU 305 outputs control information, such as the identification information of the nozzles to perform ejection and the coating amount, to the coating head 101 via the head control circuit 304.
- the coating head 101 discharges cosmetics from the nozzles in response to a control signal from the head control circuit 304.
- The CPU 305 may permit the discharge of the cosmetic only when there is an execution instruction from the user. For example, the CPU 305 performs the cosmetic discharge process (step S603) when the user operates the application button, and may return without performing the discharge process when the application button is not operated for a predetermined time.
- the CPU 305 may acquire state information such as the movement and posture of the coating head 101 based on detection signals from the distance sensor 105 and the motion sensor 106.
- The CPU 305 can control the ejection timing based on the state information of the coating head 101. For example, when the shake amount of the application head 101 or its inclination with respect to the skin exceeds a predetermined range, the CPU 305 warns the user via the display 301 and the speaker 309, and delays the discharge timing until the application head 101 assumes an appropriate posture with respect to the skin.
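The gating described above can be sketched as a predicate on the head's state information; the threshold values are illustrative assumptions, since the patent gives no concrete figures:

```python
def may_discharge(shake, tilt_deg, shake_limit=0.5, tilt_limit=10.0):
    """Gate ejection on the head's state: discharge only while both the
    shake amount and the tilt relative to the skin stay inside their
    predetermined ranges."""
    return shake <= shake_limit and abs(tilt_deg) <= tilt_limit
```

When the predicate is false, the controller would warn the user and retry once the head settles.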
- As described above, the second image acquired from the imaging unit 200 connected to the application head 101 is compared with the first image obtained by imaging the entire skin area including the application target, whereby the position of the application head 101 on the skin can be accurately recognized. This clarifies the positional relationship between the application head 101 and the application target on the skin, so that the user can guide the application head 101 to the application target without hesitation.
- The first image and the second image are images of the same part of the skin captured at different sizes. They differ, however, not only in size but also in imaging timing, skin swelling, skin tone, and specular reflection. For this reason, it may be difficult to detect the position of the second image in the first image by simple image matching; according to the present embodiment, however, the position of the second image in the first image can be detected accurately by using the correlation between the first image and the second image.
- a coating apparatus 700 according to a second embodiment of the present invention will be described. Since the coating apparatus 700 according to the present embodiment is configured in the same manner as the coating apparatus 100 according to the first embodiment, the description will focus on differences from the first embodiment.
- FIG. 7A is a schematic view showing the appearance of the coating apparatus 700
- FIG. 7B is a cross-sectional view taken along the line A-A 'of FIG. 7A.
- the coating apparatus 700 includes a guide member (holding unit) 710 on an end surface (hereinafter referred to as a head surface) on which the coating head 101 and the imaging unit 200 are provided.
- The guide member 710 has a hollow prismatic shape and is formed of a transparent or translucent synthetic resin or the like.
- the guide member 710 is disposed along the edge of the head surface and is fixed to the casing 711 of the coating apparatus 700.
- The distal end portion of the guide member 710 is bent inward so that it can be pressed against the skin easily.
- the user grasps the coating apparatus 700 and lightly presses the tip portion of the guide member 710 against the skin.
- the application head 101 is held by the guide member 710 so as to maintain a constant distance D in the z-axis direction with respect to the skin.
- Since the guide member 710 is provided on the head surface of the coating apparatus 700, it is not necessary to adjust the gap with the moving mechanism 103. Further, pressing the guide member 710 against the skin suppresses swelling of the skin and increases its flatness. This makes it possible to improve the coating accuracy of the coating head 101.
- Next, a coating system 80 according to a third embodiment of the present invention will be described. FIG. 9 is a schematic diagram showing the configuration of the coating system 80 according to the present embodiment.
- the coating system 80 includes a coating device 800, an imaging unit 200, and a coating control device 900.
- the coating apparatus 800 includes a coating head 801 and a robot arm 810, and the imaging unit 200 is connected to the coating head 801.
- the coating head 801 is attached to the tip of the robot arm 810.
- the robot arm 810 is an articulated arm type robot, and the position and orientation of the coating head 801 can be freely changed by changing the posture of the arm.
- the robot arm 810 can move the application head 801 in an arbitrary direction along the skin based on a drive command from the application control device 900.
- An application control device 900 is connected to the robot arm 810, and the application control device 900 can control the operation of the robot arm 810 using a drive command.
- The application control apparatus 900 includes a display 301, and various information such as a skin image and the state of the application head 801 is displayed on the display 301.
- FIG. 10 is a block diagram of the coating apparatus 800 and the coating control apparatus 900 according to the present embodiment.
- the coating apparatus 800 includes a coating head 801, a robot arm 810, a cosmetic tank 102, an operation unit 104, and a distance sensor 105.
- the application control apparatus 900 includes a robot controller 910, a display 301, an image processing circuit 302, a head control circuit 304, a CPU 305, a RAM 306, a ROM 307, a storage device 308, and a speaker 309.
- the coating head 801 is configured in the same manner as the coating head 101 according to the first embodiment.
- a plurality of nozzles are arranged in one dimension (for example, in a line shape).
- the robot arm 810 can apply the cosmetic by sweeping the application head 801 in a direction perpendicular to the direction in which the plurality of nozzles are arranged.
- the coating apparatus 800 may include a plurality of cosmetic tanks 102 that store cosmetics of different colors, and a plurality of nozzle rows corresponding to the respective colors may be provided in the coating head 801.
- the robot controller 910 outputs a drive command generated based on an instruction from the CPU 305 to the robot arm 810.
- The cosmetic tank 102, operation unit 104, distance sensor 105, display 301, image processing circuit 302, head control circuit 304, CPU 305, RAM 306, ROM 307, storage device 308, and speaker 309 are the same as those in the first embodiment, and their description is omitted.
- the operation unit 104 is preferably provided on a base that supports the robot arm 810, the display 301, or the like.
- the skin shape associated with the first image may be stored in advance in the storage device 308 in a format such as three-dimensional coordinates.
- the robot controller 910 can control the robot arm 810 so as to hold the application head 801 at a certain distance from the skin based on the shape of the skin.
- In this embodiment, the application control apparatus 900 performs the same processing as in the flowchart of FIG. 4, except that in step S402 the robot arm 810 moves the application head 801. That is, the CPU 305 determines the direction and distance in which the application head 801 should be moved based on the position of the second image in the first image recognized in step S405, and drives the robot arm 810 via the robot controller 910.
- In this way, the CPU 305 can recognize the position of the application head 801 on the skin and move the application head 801 to the position of the application target using the robot arm 810. Even if the user moves while the application head 801 is moving, the CPU 305 recognizes the position of the application head 801 in real time, so that the moving direction and amount of the application head 801 can be corrected immediately.
- the present invention is not limited to the above-described embodiment, and can be modified without departing from the spirit of the present invention.
- For example, when calculating the correlation between the first image and the second image (step S404), feature points such as spots, moles, and pores may be extracted from each image, and the correlation calculated only for the feature points. This makes it possible to greatly reduce the amount of calculation of the evaluation function F(τ, υ).
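A sketch of this feature-point variant, with a deliberately naive extractor (thresholding dark pixels) standing in for real spot/mole/pore detection; all names and the threshold are assumptions. Instead of evaluating F at every offset, only offsets that align a feature of the second image with a feature of the first are considered:

```python
import numpy as np

def dark_feature_points(img, thresh=0.4):
    """Naive feature extractor: pixels darker than a threshold stand in for
    spots, moles and pores."""
    ys, xs = np.nonzero(img < thresh)
    return list(zip(ys.tolist(), xs.tolist()))

def candidate_offsets(first_img, second_img, thresh=0.4):
    """Return only the offsets (tau, upsilon) that align a feature of the
    second image with a feature of the first, so that F need be evaluated
    at a small candidate set rather than over the full scan range."""
    f1 = dark_feature_points(first_img, thresh)
    f2 = dark_feature_points(second_img, thresh)
    N1, M1 = first_img.shape
    N, M = second_img.shape
    cands = set()
    for (y1, x1) in f1:
        for (y2, x2) in f2:
            tau, ups = y1 - y2, x1 - x2
            if 0 <= tau <= N1 - N and 0 <= ups <= M1 - M:
                cands.add((tau, ups))
    return cands
```

With a handful of feature points per image, the candidate set is orders of magnitude smaller than the (N1−N)·(M1−M) offsets of the exhaustive scan.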
- The control processing of the above-described embodiments can also be applied to a beauty device that acts on a specific location on the skin, such as a device that injects medicine with a needle or irradiates a mole on the skin with a laser.
Abstract
Description
FIG. 1 is a schematic diagram showing the configuration of a coating system 10 according to the present embodiment. The coating system 10 includes a coating apparatus 100, an imaging unit 200, and a coating control apparatus 300. The coating apparatus 100 is a device for applying cosmetics to the skin and is grasped by the user. The coating apparatus 100 has a prismatic casing, and a coating head 101 is provided on one end surface of the casing. The shape of the coating apparatus 100 is not limited as long as it is easy for the user to grasp, and may be cylindrical or hemispherical. The coating apparatus 100 may also include a grasping member such as a handle.
As the method of calculating the correlation, in addition to the pixel-by-pixel method described below, feature points may be extracted and the correlation of their feature amounts examined.
101 coating head
200 imaging unit
300 coating control device
308 storage device (storage unit)
310 acquisition unit
311 recognition unit
312 calculation unit
501 first image
502 second image
Claims (11)
- A coating control device comprising: a storage unit that stores a first image of a predetermined area of skin; an acquisition unit that acquires, from an imaging unit connected to a coating head capable of applying cosmetics to the skin, a second image of an area of the skin smaller than the area of the first image; and a recognition unit that recognizes a position of the second image in the first image.
- The coating control device according to claim 1, wherein the recognition unit recognizes the position by extracting, from the first image, a region having the highest correlation with the second image.
- The coating control device according to claim 2, wherein the recognition unit calculates the correlation using a specific wavelength component of the first image and the second image.
- The coating control device according to claim 3, wherein the specific wavelength component has a wavelength of 540 nm or less.
- The coating control device according to any one of claims 1 to 4, wherein the first image and the second image are captured using a polarizing filter.
- The coating control device according to any one of claims 1 to 5, further comprising a calculation unit that divides the first image or the second image into a plurality of segments and calculates a necessary application amount for each segment, wherein the coating head applies the cosmetic according to the application amount.
- The coating control device according to any one of claims 1 to 6, further comprising a holding unit that holds the coating head so as to maintain a constant distance from the skin.
- The coating control device according to any one of claims 1 to 7, wherein the coating head is an inkjet head that discharges the cosmetic in droplet form.
- A coating apparatus comprising: the coating head; the imaging unit; and the coating control device according to any one of claims 1 to 8.
- A coating control method comprising: storing a first image of a predetermined area of skin; acquiring, from an imaging unit connected to a coating head capable of applying cosmetics to the skin, a second image of an area of the skin smaller than the area of the first image; and recognizing a position of the second image in the first image.
- A recording medium recording a program that causes a computer to execute: storing a first image of a predetermined area of skin; acquiring, from an imaging unit connected to a coating head capable of applying cosmetics to the skin, a second image of an area of the skin smaller than the area of the first image; and recognizing a position of the second image in the first image.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/470,406 US10799009B2 (en) | 2016-12-20 | 2017-12-18 | Application control device, application device, application control method and storage medium |
JP2018557762A JPWO2018117022A1 (ja) | 2016-12-20 | 2017-12-18 | 塗布制御装置、塗布装置、塗布制御方法および記録媒体 |
KR1020197019418A KR102555594B1 (ko) | 2016-12-20 | 2017-12-18 | 도포 제어 장치, 도포 장치, 도포 제어 방법 및 기록매체 |
CN201780083304.0A CN110191661B (zh) | 2016-12-20 | 2017-12-18 | 涂布控制装置、涂布装置、涂布控制方法以及记录介质 |
EP17882888.5A EP3560375B1 (en) | 2016-12-20 | 2017-12-18 | Application control device, application device, application control method, and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016246656 | 2016-12-20 | ||
JP2016-246656 | 2016-12-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018117022A1 true WO2018117022A1 (ja) | 2018-06-28 |
Family
ID=62626310
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/045303 WO2018117022A1 (ja) | 2016-12-20 | 2017-12-18 | 塗布制御装置、塗布装置、塗布制御方法および記録媒体 |
Country Status (7)
Country | Link |
---|---|
US (1) | US10799009B2 (ja) |
EP (1) | EP3560375B1 (ja) |
JP (1) | JPWO2018117022A1 (ja) |
KR (1) | KR102555594B1 (ja) |
CN (1) | CN110191661B (ja) |
TW (1) | TW201826969A (ja) |
WO (1) | WO2018117022A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111216451A (zh) * | 2018-11-27 | 2020-06-02 | 卡西欧计算机株式会社 | 涂布装置、便携终端以及涂布系统 |
JP2021000761A (ja) * | 2019-06-21 | 2021-01-07 | ロレアル | 化粧インク塗布装置および化粧インクを塗布する方法 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3840646A1 (en) * | 2018-08-21 | 2021-06-30 | The Procter & Gamble Company | Methods for identifying pore color |
US11501457B2 (en) | 2020-05-08 | 2022-11-15 | The Procter & Gamble Company | Methods for identifying dendritic pores |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006271654A (ja) | 2005-03-29 | 2006-10-12 | Canon Inc | 液体吐出法を用いた化粧システム及び化粧方法 |
JP2008234362A (ja) * | 2007-03-20 | 2008-10-02 | Sharp Corp | 画像位置合わせ方法、画像位置合わせ装置、画像処理システム、制御プログラム、及び該プログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2015159975A (ja) * | 2014-02-27 | 2015-09-07 | カシオ計算機株式会社 | 肌処理装置、肌処理方法及びプログラム |
JP2016127333A (ja) * | 2014-12-26 | 2016-07-11 | 株式会社リコー | 撮像素子および撮像装置および撮像情報認識システム |
JP2016166853A (ja) * | 2015-03-04 | 2016-09-15 | パナソニックIpマネジメント株式会社 | 位置推定装置および位置推定方法 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2810761B1 (fr) * | 2000-06-26 | 2003-09-05 | Oreal | Procede et dispositif de traitement cosmetique,notamment de soin, de maquillage ou de coloration |
JP4118270B2 (ja) * | 2004-11-18 | 2008-07-16 | 花王株式会社 | 被験部位の特定方法 |
JP4919028B2 (ja) * | 2006-03-03 | 2012-04-18 | 富士ゼロックス株式会社 | 画像処理装置および画像処理プログラム |
JP5095644B2 (ja) * | 2009-01-23 | 2012-12-12 | 株式会社キーエンス | 画像計測装置及びコンピュータプログラム |
JP5521888B2 (ja) * | 2010-08-19 | 2014-06-18 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム、及び電子装置 |
JP5733032B2 (ja) * | 2011-06-06 | 2015-06-10 | ソニー株式会社 | 画像処理装置および方法、画像処理システム、プログラム、および、記録媒体 |
WO2014168930A1 (en) * | 2013-04-09 | 2014-10-16 | University Of Washington Through Its Center For Commercialization | Methods and systems for determining hemodynamic properties of a tissue |
JP6295561B2 (ja) * | 2013-09-17 | 2018-03-20 | 株式会社リコー | 画像検査結果判断装置、画像検査システム及び画像検査結果の判断方法 |
US10098545B2 (en) * | 2013-10-23 | 2018-10-16 | Maxell Holdings, Ltd. | Lens information management system for surface condition measurement and analysis and information management method for surface condition measurement and analysis |
US20160259034A1 (en) * | 2015-03-04 | 2016-09-08 | Panasonic Intellectual Property Management Co., Ltd. | Position estimation device and position estimation method |
US10535131B2 (en) * | 2015-11-18 | 2020-01-14 | Kla-Tencor Corporation | Systems and methods for region-adaptive defect detection |
JP6650819B2 (ja) * | 2016-04-15 | 2020-02-19 | 株式会社 資生堂 | 色ムラ部位の評価方法、色ムラ部位評価装置及び色ムラ部位評価プログラム |
-
2017
- 2017-12-18 EP EP17882888.5A patent/EP3560375B1/en active Active
- 2017-12-18 JP JP2018557762A patent/JPWO2018117022A1/ja active Pending
- 2017-12-18 WO PCT/JP2017/045303 patent/WO2018117022A1/ja unknown
- 2017-12-18 US US16/470,406 patent/US10799009B2/en active Active
- 2017-12-18 CN CN201780083304.0A patent/CN110191661B/zh active Active
- 2017-12-18 KR KR1020197019418A patent/KR102555594B1/ko active IP Right Grant
- 2017-12-20 TW TW106144766A patent/TW201826969A/zh unknown
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111216451A (zh) * | 2018-11-27 | 2020-06-02 | 卡西欧计算机株式会社 | 涂布装置、便携终端以及涂布系统 |
JP2020081983A (ja) * | 2018-11-27 | 2020-06-04 | カシオ計算機株式会社 | 塗布装置及び塗布システム |
JP7024695B2 (ja) | 2018-11-27 | 2022-02-24 | カシオ計算機株式会社 | 塗布装置及び塗布システム |
CN114347656A (zh) * | 2018-11-27 | 2022-04-15 | 卡西欧计算机株式会社 | 涂布装置以及涂布系统 |
US11416721B2 (en) | 2018-11-27 | 2022-08-16 | Casio Computer Co., Ltd. | Hand-held printer having a camera for obtaining image data used to notify a user that a position has been reached |
JP2021000761A (ja) * | 2019-06-21 | 2021-01-07 | ロレアル | 化粧インク塗布装置および化粧インクを塗布する方法 |
Also Published As
Publication number | Publication date |
---|---|
TW201826969A (zh) | 2018-08-01 |
EP3560375A1 (en) | 2019-10-30 |
US10799009B2 (en) | 2020-10-13 |
US20190307231A1 (en) | 2019-10-10 |
KR20190099228A (ko) | 2019-08-26 |
EP3560375B1 (en) | 2023-08-16 |
KR102555594B1 (ko) | 2023-07-14 |
CN110191661B (zh) | 2022-07-05 |
EP3560375A4 (en) | 2020-07-15 |
CN110191661A (zh) | 2019-08-30 |
JPWO2018117022A1 (ja) | 2019-10-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018117022A1 (ja) | 塗布制御装置、塗布装置、塗布制御方法および記録媒体 | |
US9687059B2 (en) | Nail decorating apparatus | |
US20160088197A1 (en) | Nail information detection device, drawing apparatus, and nail information detection method | |
US20140037135A1 (en) | Context-driven adjustment of camera parameters | |
US20140060560A1 (en) | Nail printing apparatus and printing control method | |
US20160054859A1 (en) | User interface apparatus and control method | |
JP2011095985A (ja) | 画像表示装置 | |
JP6627476B2 (ja) | 液体吐出装置、液体吐出方法、プログラム | |
WO2013187282A1 (ja) | 撮像画像表示装置、撮像画像表示方法、記録媒体 | |
KR101542671B1 (ko) | 공간 터치 방법 및 공간 터치 장치 | |
US20230333677A1 (en) | Optical stylus for optical position determination device | |
US10116809B2 (en) | Image processing apparatus, control method, and computer-readable storage medium, which obtains calibration image information with which to correct image data | |
JP5691736B2 (ja) | 読取装置 | |
JP2019111004A (ja) | 描画システム、描画装置及び端末装置 | |
JP2015184701A (ja) | 画像処理装置、画像処理方法及びプログラム | |
US11494945B2 (en) | Image analysis device, image analysis method, and program | |
JP2014048565A (ja) | 画像表示装置 | |
JP2019016843A (ja) | 原稿読取装置、原稿読取装置の制御方法、及びプログラム | |
JP7135800B2 (ja) | 塗布装置、塗布システム、塗布方法及びプログラム | |
JP2020054966A (ja) | 塗布装置、塗布システム、塗布方法及びプログラム | |
US11816873B2 (en) | Image analysis device, image analysis method, and program | |
JP2019130694A (ja) | 塗布装置、塗布方法及びプログラム | |
JP7173256B2 (ja) | 描画システム及び描画制御方法 | |
WO2021065176A1 (ja) | 処理装置、電子機器、処理方法、及びプログラム | |
JP2023038725A (ja) | 電子機器、判定方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17882888 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2018557762 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20197019418 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2017882888 Country of ref document: EP Effective date: 20190722 |