EP3835010A1 - Hair removal instructions - Google Patents


Info

Publication number
EP3835010A1
Authority
EP
European Patent Office
Prior art keywords
hair removal
user
skin
hair
removal unit
Prior art date
Legal status
Withdrawn
Application number
EP19215228.8A
Other languages
English (en)
French (fr)
Inventor
Jonathan Alambra PALERO
Babu Varghese
Yannyk Parulian Julian Bourquin
Steffie Akkermans
Vincentius Paulus Buil
Lucie Duracher
Current Assignee
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to EP19215228.8A (EP3835010A1)
Priority to JP2022535104A (JP7371259B2)
Priority to EP20816264.4A (EP4072798B1)
Priority to CN202080086177.1A (CN114786893A)
Priority to PCT/EP2020/084757 (WO2021115976A1)
Priority to US17/779,182 (US20230001593A1)
Publication of EP3835010A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B26 HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26B HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
    • B26B19/00 Clippers or shavers operating with a plurality of cutting edges, e.g. hair clippers, dry shavers
    • B26B19/38 Details of, or accessories for, hair clippers, or dry shavers, e.g. housings, casings, grips, guards
    • B26B19/3873 Electric features; Charging; Computing devices
    • B26B19/388 Sensors; Control
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45D HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D26/00 Hair-singeing apparatus; Apparatus for removing superfluous hair, e.g. tweezers
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45D HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00 Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/445 Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45D HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00 Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D2044/007 Devices for determining the condition of hair or skin or for selecting the appropriate cosmetic or hair treatment
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B26 HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26B HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
    • B26B21/00 Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor
    • B26B21/40 Details or accessories
    • B26B21/405 Electric features; Charging; Computing devices
    • B26B21/4056 Sensors or controlling means

Definitions

  • the invention relates to a method, apparatus and tangible machine-readable medium for providing hair removal instructions.
  • Hair removal techniques such as shaving may cause skin irritation.
  • a user may not be aware of the optimal hair removal technique for reducing skin irritation due to hair removal.
  • a user may have difficulty establishing the optimal hair removal technique since they may find it difficult or be unable to determine whether or not their hair removal technique delivers optimal results and/or they may not be aware of any better techniques for delivering optimal results.
  • an object is to provide user guidance to improve hair removal results. Another object is to reduce skin irritation due to hair removal.
  • aspects or embodiments described herein relate to providing user guidance to improve hair removal results and/or reduce skin irritation due to hair removal. Aspects or embodiments described herein may obviate one or more problems associated with hair removal.
  • a computer-implemented method comprises obtaining an indication comprising a skin parameter of a user.
  • the indication further comprises an interaction between the user's skin and a hair removal unit.
  • the method further comprises determining a position of the hair removal unit relative to the user's skin.
  • the method further comprises determining a hair removal instruction for the user based on the indication and the position.
  • the method further comprises causing a user interface to provide the hair removal instruction for the user.
  • obtaining the indication comprising the skin parameter of the user comprises accessing skin parameter data for the user determined based on imaging data of the user's skin.
  • the method comprises causing an imaging device to acquire the imaging data prior to a user hair removal session, to determine pre-hair removal skin parameter data.
  • the method comprises causing the imaging device to acquire the imaging data during and/or after the user hair removal session, to determine present and/or post-hair removal skin parameter data.
  • the method may further comprise generating skin parameter data based on the imaging data.
  • the method may further comprise determining a skin parameter map for the user based on a comparison between the pre-hair removal and present and/or post-hair removal skin parameter data.
  • the skin parameter comprises a visible skin irritation indicator.
  • the skin parameter may be based on whether or not the comparison identifies any change in the visible skin irritation indicator between the pre-hair removal and present and/or post-hair removal skin parameter data.
  • the indication further comprises a hair parameter of the user.
  • the method may further comprise determining the hair removal instruction taking into account the hair parameter.
  • determining the hair removal instruction comprises accessing an optimal hair removal map of the user's skin.
  • a spatial location of the optimal hair removal map may be associated with an optimal hair removal technique.
  • the optimal hair removal technique may be determined based on at least one of: pre-hair removal skin parameter data; historical data for the user; and predetermined knowledge regarding hair removal.
  • the method may further comprise determining the hair removal instruction for the spatial location based on the optimal hair removal map.
  • the method comprises determining, in real-time, the position of the hair removal unit relative to the user's skin. In some embodiments, the method comprises determining, in real-time, the interaction between the user's skin and the hair removal unit. In some embodiments, the method comprises determining, in real-time, the skin parameter. The method may further comprise determining a real-time hair removal instruction for the user based on at least one of: the position; the interaction; the skin parameter; historical hair removal performance data for the user; and pre-determined hair removal performance data.
  • the method comprises causing the user interface to provide, in real-time, the hair removal instruction for the user.
  • the historical hair removal performance data for the user comprises at least one of: user skin type; user skin condition; pressure applied between the hair removal unit and the user's skin; user hair removal behavior; visible skin irritation; hair removal results; hair removal unit motion and hair removal unit operational performance.
  • the historical hair removal performance data may be determined from at least one previous user hair removal session.
  • the pre-determined hair removal performance data may comprise knowledge acquired from other users and/or clinical data regarding at least one of: skin type; skin condition; pressure applied between the hair removal unit and the other user's skin; hair removal behavior; visible skin irritation; hair removal results; hair removal unit motion and hair removal unit operational performance.
  • a recommended hair removal instruction for the user can be determined in order to provide an improved hair removal experience as compared to a previous user hair removal session.
  • the recommended hair removal instruction may be based on at least one of the historical hair removal performance data for the user and the pre-determined hair removal performance data.
  • the hair removal instruction is configured to provide a personalized recommendation for the user regarding at least one of: pressure to apply between the hair removal unit and the user's skin; hair removal unit positioning relative to the user's skin and hair removal unit motion.
  • the method may further comprise causing the user interface to provide the hair removal instruction for the user based on whether or not the user has deviated from a previously-recommended hair removal instruction.
  • determining the position of the hair removal unit relative to the user's skin comprises acquiring at least one of: imaging data of the user's skin and the hair removal unit; and motion data from a sensor on-board the hair removal unit.
  • the position of the hair removal unit relative to the user's skin may comprise at least one of: a position of a hair removal device of the hair removal unit on the user's skin; and an orientation of the hair removal device relative to the user's skin.
  • an apparatus comprising processing circuitry is provided.
  • the processing circuitry comprises an obtaining module, a determining module and a user instruction module.
  • the obtaining module is configured to obtain an indication comprising a skin parameter of a user.
  • the indication further comprises an interaction between the user's skin and a hair removal unit.
  • the determining module is configured to determine a position of the hair removal unit relative to the user's skin.
  • the determining module is further configured to determine a hair removal instruction for the user based on the indication and the position.
  • the user instruction module is configured to cause a user interface to provide the hair removal instruction for the user.
  • the apparatus further comprises at least one of: an imaging device and the user interface.
  • the imaging device may be for acquiring imaging data of the user's skin and the hair removal unit.
  • a tangible machine-readable medium stores instructions which, when executed by at least one processor, cause the at least one processor to obtain an indication comprising a skin parameter of a user.
  • the indication further comprises an interaction between the user's skin and a hair removal unit.
  • the instructions further cause the at least one processor to determine a position of the hair removal unit relative to the user's skin.
  • the instructions further cause the at least one processor to determine a hair removal instruction for the user based on the indication and the position.
  • the instructions further cause the at least one processor to cause a user interface to provide the hair removal instruction for the user.
  • Figure 1 shows a method 100 (e.g., a computer-implemented method) of providing hair removal instructions (e.g., guidance) for a user.
  • the method 100 may allow the user to improve and/or optimize hair removal results and/or reduce skin irritation due to hair removal.
  • the method 100 comprises, at block 102, obtaining an indication.
  • the indication comprises a skin parameter of a user.
  • the skin parameter may refer to a characteristic of the skin that may affect hair removal and/or be affected by hair removal.
  • a spatial location on the user's skin may be associated with the skin parameter.
  • a map of the user's skin may comprise a plurality of spatial locations where each spatial location has an associated skin parameter.
  • a plurality of skin parameters may be associated with each spatial location.
  • the skin parameter may refer to, for example, skin type, skin health status, skin moisture, skin roughness, after-hair removal irritation (e.g., skin redness) of the user and/or any skin characteristic associated with the user's skin.
  • the skin parameter may be indicative of certain information regarding the user's skin.
  • the information may comprise a calculated or estimated value indicating, for example, a skin irritation level, where different values may indicate different levels of skin irritation.
  • where a plurality of skin parameters are associated with a spatial location on the skin, each skin parameter may convey different information about that spatial location. At least one of the skin parameters may provide certain information to facilitate, e.g., improved hair removal results and/or a reduction in skin irritation from hair removal.
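  • purely as an illustration, such a per-location skin parameter map could be represented by a small data structure like the Python sketch below; the field names, locations and values are assumptions for the sketch, not part of this disclosure.

      from dataclasses import dataclass, field
      from typing import Dict, Tuple

      # A spatial location on the skin, e.g. (row, column) in a face-map grid.
      Location = Tuple[int, int]

      @dataclass
      class SkinParameterMap:
          """Associates each spatial location with one or more skin parameters."""
          parameters: Dict[Location, Dict[str, float]] = field(default_factory=dict)

          def set_parameter(self, location: Location, name: str, value: float) -> None:
              self.parameters.setdefault(location, {})[name] = value

          def get_parameter(self, location: Location, name: str, default: float = 0.0) -> float:
              return self.parameters.get(location, {}).get(name, default)

      # Example: record two parameters for one spatial location.
      skin_map = SkinParameterMap()
      skin_map.set_parameter((12, 34), "irritation_level", 0.7)
      skin_map.set_parameter((12, 34), "moisture", 0.4)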
  • the obtained indication further comprises an interaction between the user's skin and a hair removal unit.
  • the interaction may comprise a pressure applied by the user on the user's skin by the hair removal unit.
  • the method 100 further comprises, at block 104, determining a position of the hair removal unit relative to the user's skin.
  • the method 100 may allow the position of the hair removal unit to be tracked in relation to the user's skin. A more detailed description of the hair removal unit position determination is provided below.
  • the method 100 further comprises, at block 106, determining a hair removal instruction for the user based on the indication and the position.
  • the method 100 may take into account certain information derived from the indication and/or the position of the hair removal unit to determine the hair removal instruction.
  • the hair removal instruction may be used to provide guidance for the user in terms of how to remove hair from their skin using the hair removal unit.
  • the hair removal instruction may indicate that the user should apply more or less pressure using the hair removal unit.
  • the hair removal instruction may provide guidance in terms of the direction, hair removal unit orientation and/or speed with which the user is to use the hair removal unit.
  • the method 100 may determine a hair removal instruction to provide to the user that is indicative of a skin and/or hair treatment regime, hair removal unit charge level and any other factor which may affect whether or not the hair removal session provides improved/optimal hair removal results and/or reduces skin irritation.
  • Determining the position of the hair removal unit relative to the user's skin may provide certain information which may be used to determine the hair removal instruction. For example, a determination may be made that the hair removal unit is at or about to arrive at a certain spatial location on the user's skin. A determination may be made, based on the indication and/or the position, regarding a recommended hair removal unit action/technique for that spatial location on the skin and/or the next/predicted spatial location on the skin.
  • the method 100 further comprises, at block 108, causing a user interface to provide the hair removal instruction for the user.
  • the hair removal instruction may be provided in an appropriate format to enable the user to identify the hair removal instruction and attempt to follow it.
  • the hair removal instruction may be provided in any appropriate format for the user (e.g., via a visual and/or audible format).
  • the user interface may comprise a device capable of providing the hair removal instruction for the user in a visual and/or audible manner.
  • the user interface may be provided by a user equipment such as a mobile phone, tablet, mirror, smart device or any other device capable of conveying visual and/or audible instructions.
  • the user may possess a user equipment capable of providing a visualization of the user's skin (e.g., via a graphical user interface such as a screen) and corresponding hair removal instructions.
  • an arrow, moving indicator or other hair removal instruction may be visualized on the screen, which the user can interpret and follow.
  • the user equipment may provide an audible hair removal instruction. For example, if too much or too little pressure is applied by the hair removal unit, an audible warning such as a beep or verbal instruction may be played for the user. Any combination of visual and/or audible hair removal instructions may be provided by the user interface.
  • the method 100 may provide a user with hair removal guidance, which may help the user to achieve improved and/or optimal hair removal results and/or reduce skin irritation due to hair removal.
  • a user may be made aware of the optimal hair removal technique for reducing skin irritation from hair removal.
  • the hair removal instruction may be tailored to the user's individual needs e.g., to provide improved hair removal results and/or reduce skin irritation.
  • the method 100 may enable the user to be trained so that their future hair removal sessions provide improved/optimal results and/or reduce skin irritation.
  • data from different hair removal sessions may be compared to determine whether or not hair removal performance could be improved and/or whether or not skin irritation could be reduced.
  • the hair removal instruction may take into account such a comparison e.g., to learn how the user could improve their hair removal results.
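  • the flow of blocks 102 to 108 can be summarized in the Python sketch below; the sensor, localizer and user-interface helpers (read_skin_parameter, read_contact_pressure, locate_unit, show) are hypothetical placeholders for the components described in relation to Figure 2, and the threshold logic is illustrative only.

      def method_100(sensors, localizer, ui):
          """Sketch of blocks 102-108 of the method 100 (illustrative only)."""
          # Block 102: obtain an indication comprising a skin parameter and the
          # skin/hair-removal-unit interaction (e.g., applied pressure).
          indication = {
              "skin_parameter": sensors.read_skin_parameter(),   # hypothetical
              "interaction": sensors.read_contact_pressure(),    # hypothetical
          }
          # Block 104: determine the position of the hair removal unit
          # relative to the user's skin.
          position = localizer.locate_unit()                     # hypothetical
          # Block 106: determine a hair removal instruction from the
          # indication and the position.
          instruction = determine_instruction(indication, position)
          # Block 108: cause a user interface to provide the instruction.
          ui.show(instruction)                                   # hypothetical

      def determine_instruction(indication, position):
          # Placeholder logic: recommend less pressure on irritated skin.
          if indication["skin_parameter"] > 0.5 and indication["interaction"] > 2.0:
              return f"Reduce pressure near {position}"
          return "Continue with current technique"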
  • Figure 2 shows a system 200 for implementing certain methods described herein.
  • a user 202 removes hair from their skin (e.g., from their face or another part of their body) with a hair removal unit 204.
  • the hair removal unit 204 may comprise an electric shaver (e.g., a motorized rotary blade or a foil-based razor), an epilator, a manual razor, a smart razor or indeed any type of hair removal unit capable of removing hair from skin, whether by cutting, pulling or otherwise removing at least a portion of the hair from the skin.
  • Any reference to shaving may refer to any form of hair removal e.g., by any type of hair removal unit.
  • the hair removal unit 204 may comprise at least one sensor for determining the interaction between the user's skin and the hair removal unit 204.
  • the hair removal unit 204 may comprise a pressure sensor for measuring contact pressure between the user's skin and the hair removal unit 204.
  • a user measurement unit 206 of the system 200 is configured to determine the indication of the skin parameter and/or an indication of any other parameters described herein (for example, a hair parameter).
  • the user measurement unit 206 may acquire information relating to the skin parameter and/or any other parameters (e.g., from imaging data acquired by a user equipment) in order to determine the indication.
  • the imaging data may indicate that a hair removal session has caused some skin irritation, which may be apparent by comparing the redness in the skin between different images.
  • the user measurement unit 206 may perform measurements which can be used to determine the indication of the skin parameter.
  • the user measurement unit 206 may map the skin in order to determine the indication of the skin parameter (and/or any other parameters) for different spatial locations on the user's skin.
  • the user measurement unit 206 may perform processing (e.g., using on-board processing circuitry of a user equipment) to determine the skin parameter and/or any other parameters described herein. Additionally or alternatively, the user measurement unit 206 may send its acquired data to an online service so that the online service may determine the skin parameter and/or any other parameters. Further, the processing may facilitate mapping of the user's skin such that a spatial location is associated with a certain skin parameter and/or any other parameter described herein.
  • the user measurement unit 206 is configured to map certain relevant skin and/or hair parameters such as skin moisture, skin roughness, hair growth orientation, hair length, hair density and post-hair removal irritation/redness of the user.
  • the user measurement unit 206 may be configured to cause an on-board sensor of the user equipment (e.g., an imaging device such as a camera) to acquire data such as imaging data to be used (e.g., by the user measurement unit 206 itself or an online service) to determine the skin parameter and/or any other parameter.
  • the position of the hair removal unit 204 relative to the user's skin may be determined by a hair removal unit localization unit 208.
  • the position of the hair removal unit 204 may refer to or be indicative of a position of a hair removal device (e.g., a blade) of the hair removal unit 204 on the user's skin. Additionally or alternatively, the position of the hair removal unit 204 may refer to or be indicative of an orientation of the hair removal device relative to the user's skin.
  • the hair removal unit 204 may comprise the hair removal device and other components such as a handle. While performing hair removal, the position of the hair removal device itself may be used to determine the hair removal instruction.
  • the hair removal unit localization unit 208 may determine the position of the hair removal device itself, which may provide accurate information regarding the spatial location on the user's skin where the hair removal device (e.g., the blade) is located. Additionally or alternatively, the position of any other part of the hair removal unit 204 may be determined, which may infer or be used to determine the position of the hair removal device relative to the user's skin.
  • the hair removal unit localization unit 208 may be implemented by any user equipment (e.g., a smart device) or other user device depending on the procedure used to determine the position of the hair removal unit 204.
  • the hair removal unit 204 itself may comprise an on-board sensor such as a motion-sensitive detector (e.g., accelerometer) and/or an imaging device to determine its position relative to the user's skin.
  • the user equipment may comprise or at least partially implement the hair removal unit localization unit 208.
  • the user equipment may comprise an imaging device such as a camera for acquiring imaging data, which can be used to track the position of the hair removal unit 204 and/or the user's hand relative to the user's skin.
  • the imaging data may be processed by on-board processing circuitry of the user equipment and/or may be communicated to an online service or other processing apparatus to be processed.
  • the tracking of the position of the hair removal unit 204 relative to the user's skin may, for example, involve a machine vision-based tracking procedure.
  • the tracking procedure may also take into account the user's skin using a skin recognition algorithm. For example, if tracking the hair removal unit 204 on the user's face, a facial recognition algorithm in combination with a tracking algorithm may be used to determine where, on the user's face, the hair removal unit 204 is positioned.
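  • one possible realization of such tracking (an assumption for illustration, not the algorithm prescribed by this disclosure) combines OpenCV's stock Haar-cascade face detector with colour segmentation of a marker assumed to be fitted to the hair removal unit:

      import cv2

      # Standard face detector shipped with OpenCV.
      face_cascade = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

      def locate_unit_on_face(frame_bgr, marker_hsv_lo=(100, 120, 60),
                              marker_hsv_hi=(130, 255, 255)):
          """Return the unit-marker centre relative to the detected face, or None.

          The HSV thresholds assume a blue marker on the hair removal unit;
          they are illustrative values, not taken from the patent.
          """
          gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
          faces = face_cascade.detectMultiScale(gray, 1.1, 5)
          if len(faces) == 0:
              return None
          fx, fy, fw, fh = faces[0]

          hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
          mask = cv2.inRange(hsv, marker_hsv_lo, marker_hsv_hi)
          moments = cv2.moments(mask)
          if moments["m00"] == 0:
              return None
          cx = moments["m10"] / moments["m00"]
          cy = moments["m01"] / moments["m00"]
          # Normalize the marker position to the face bounding box, so the
          # result is a spatial location on the user's face.
          return ((cx - fx) / fw, (cy - fy) / fh)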
  • the system 200 comprises a processing unit 210 for implementing certain methods described herein, such as the method 100 of Figure 1 .
  • the processing unit 210 comprises processing circuitry for implementing the method.
  • the processing unit 210 obtains the indication comprising the skin parameter and/or any other parameter of the user from the user measurement unit 206 (e.g., in accordance with block 102 of the method 100).
  • the processing unit 210 further obtains the indication comprising the interaction between the user's skin and the hair removal unit 204 from the hair removal unit 204 (e.g., also in accordance with block 102 of the method 100).
  • the processing unit 210 determines the position of the hair removal unit 204 relative to the user's skin based on data provided by the hair removal unit localization unit 208 (e.g., in accordance with block 104 of the method 100).
  • the processing unit 210 determines a hair removal instruction for the user 202 based on the indication and the position (e.g., in accordance with block 106 of the method 100).
  • the processing unit 210 may determine a map of optimal hair removal instructions for at least one spatial location on the user's skin.
  • the hair removal instruction may be indicative of at least one of: an applied pressure, hair removal unit 204 motion direction, hair removal unit 204 motion speed that is recommended for the user 202 based on at least one of: the skin parameter (e.g., as provided in a skin measurement map), any other parameters as described herein; and the position of the hair removal unit 204 relative to the user's skin.
  • the processing unit 210 causes a user interface 212 of the system 200 to provide the hair removal instruction to the user (e.g., in accordance with block 108 of the method 100).
  • the user interface 212 comprises a display for visualization of the recommended hair removal instruction mapped on to a skin and/or hair parameter map to provide real-time guidance (e.g., visual guidance) to the user for hair removal that may lead to improved and/or optimal hair removal results and/or reduced/minimized skin irritation.
  • the user interface 212 may be provided by a user equipment that is the same as or different to the user equipment or user device providing the user measurement unit 206 and/or the hair removal unit localization unit 208.
  • a smart phone or other smart device may perform imaging via its on-board camera to obtain the indication and the position.
  • a display screen and/or speaker of the smart device may be configured to provide the hair removal instruction (e.g., in a format appropriate for the user to interpret).
  • the processing unit 210 further comprises or can access a memory unit for storing at least one of: certain measurements (such as obtained by the hair removal unit 204, user measurement unit 206 and/or hair removal unit localization unit 208), a skin and/or hair parameter map (e.g., comprising a skin, hair and/or other parameter for a corresponding spatial location on the skin), a derived skin parameter and/or any other parameter, hair removal unit position relative to the user's skin, a map of hair removal unit 204 usage (e.g., previous usage) and a map of optimal hair removal unit 204 instructions.
  • the processing unit 210 may calculate an optimal hair removal unit 204 configuration.
  • the hair removal unit 204 configuration may refer to, for example, a blade speed, blade rotation speed, cutting force and/or power for the hair removal unit 204.
  • the processing unit 210 may provide feedback to the hair removal unit 204 such that the hair removal unit 204 adapts in real-time for optimal hair removal.
  • the calculation of the optimal hair removal unit 204 configuration may be provided in addition to certain methods described herein or may replace certain blocks of certain methods. For example, the calculation of the optimal hair removal unit 204 configuration may be provided in addition to the blocks of the method 100.
  • blocks 106 and 108 of the method 100 may be omitted and the calculation of the optimal hair removal unit 204 configuration may be implemented in combination with blocks 102 and 104 of the method 100.
  • blocks 104, 106 and 108 of the method 100 may be omitted and the calculation of the optimal hair removal unit 204 configuration may be implemented in combination with block 102 of the method 100.
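  • a minimal sketch of such a configuration feedback loop is given below; the send_config channel to the hair removal unit 204, the configuration fields and the numeric thresholds are all assumptions for illustration.

      def optimal_unit_config(skin_parameter, applied_pressure):
          """Illustrative mapping from measurements to a unit configuration.

          The thresholds and the fields (blade speed, cutting power) are
          assumptions for the sketch only.
          """
          config = {"blade_speed_rpm": 6000, "cutting_power": 0.8}
          if skin_parameter > 0.5:            # irritated skin: be gentler
              config["blade_speed_rpm"] = 4500
              config["cutting_power"] = 0.6
          if applied_pressure > 2.0:          # excessive pressure: back off power
              config["cutting_power"] -= 0.1
          return config

      def feedback_loop(unit, sensors):
          # 'unit.send_config' stands in for whatever real-time channel the
          # hair removal unit exposes; it is hypothetical.
          config = optimal_unit_config(sensors.read_skin_parameter(),
                                       sensors.read_contact_pressure())
          unit.send_config(config)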
  • the processing unit 210 for implementing certain methods described herein may be provided by a user equipment such as described above.
  • the processing unit 210 may be provided by an online service (e.g., at a server or cloud-based service).
  • Figure 3 shows a method 300 (e.g., a computer-implemented method) of providing hair removal instructions for a user.
  • the method 300 may be implemented by processing circuitry such as provided by the processing unit 210 of Figure 2 or any other processing apparatus or circuitry described herein. As will be described in more detail herein, the method 300 may allow the user to improve and/or optimize hair removal results and/or reduce skin irritation due to hair removal.
  • the method 300 may comprise certain blocks corresponding to certain blocks of the method 100. Certain blocks of the method 300 may be omitted and/or modified.
  • a hair removal unit such as a shaver may be communicatively coupled to a smart device (e.g., a smartphone loaded with a 'Real Time Shaving Guidance' application, or 'app') to assist a user with hair removal.
  • the hair removal unit may be used for shaving/removing facial hair and/or for removing hair from any other part of the body.
  • the method 300 comprises, at block 302, acquiring imaging data (e.g., at least one image) of the user's skin.
  • the block 302 may cause an imaging device (e.g., of a user equipment) to obtain the imaging data.
  • the imaging data is acquired prior to a user hair removal session.
  • the method 300 comprises, at block 304, determining certain data regarding a skin and/or hair parameter associated with the user.
  • the data regarding the skin and/or hair parameter may be referred to as pre-hair removal skin parameter data (i.e., the data may relate to the skin and/or hair parameter).
  • an algorithm may determine the skin and/or hair parameter based on a machine learning model which has been trained to identify and/or classify certain skin parameters (e.g., skin redness) from the imaging data.
  • a corresponding skin and/or hair parameter map may be generated by the processing circuitry based on the skin and/or hair parameter data.
  • the skin and/or hair parameter map may comprise at least one skin and/or hair parameter (and/or any other parameter) associated with at least one spatial location of the user's skin.
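  • as an illustration of how such parameter data could be derived from imaging data, a trained classifier (assumed here to be a scikit-learn model saved as skin_model.joblib; the disclosure does not fix a model type) could be applied patch-wise, yielding one label per spatial cell of the map:

      import numpy as np
      from joblib import load

      # Hypothetical pre-trained classifier mapping patch features to a
      # skin-parameter class such as "redness" vs "normal".
      model = load("skin_model.joblib")

      def classify_patches(image_rgb, patch=32):
          """Slide a patch grid over the image and classify each patch."""
          h, w, _ = image_rgb.shape
          results = {}
          for y in range(0, h - patch + 1, patch):
              for x in range(0, w - patch + 1, patch):
                  region = image_rgb[y:y + patch, x:x + patch]
                  # Simple illustrative features: mean R, G, B of the patch.
                  features = region.reshape(-1, 3).mean(axis=0)
                  results[(y // patch, x // patch)] = model.predict([features])[0]
          return results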
  • a user may initially capture an image of their skin (e.g., their face) using their smart device to obtain a baseline skin parameter map (e.g., a baseline facial skin map). This can be done via certain facial tracking techniques (e.g., based on machine learning or another algorithm).
  • the image capture may also be used to determine the present skin and/or hair conditions (e.g., length and/or type of hair).
  • This image capture and present skin/hair condition determination procedure may be performed before each hair removal session and/or may be performed before the first (i.e., first ever) hair removal session with the hair removal unit.
  • the baseline skin parameter map may comprise any relevant skin and/or hair parameters (e.g., skin moisture, skin roughness, hair growth orientation, hair length, hair density, post-hair removal irritation/redness of the user).
  • the method 300 further comprises, at block 306, providing or accessing historical data for the user (e.g., from a memory such as the memory unit referred to in relation to Figure 2 ).
  • a previous hair removal session may have yielded certain data regarding hair removal performance and/or at least one skin and/or hair parameter associated with the user. For example, if the previous hair removal session caused skin irritation, this may be reflected by the corresponding skin and/or hair parameter for the spatial location(s) affected by the skin irritation.
  • the historical data may comprise or be used to calculate a comparison of an outcome of a hair removal session (e.g., a comparison of the post-hair removal skin parameter data with the pre-hair removal skin parameter data).
  • the historical data may comprise or be referred to as a post-hair removal skin parameter map (i.e., the historical data may relate to the skin and/or hair parameter).
  • the post-hair removal skin parameter map may have been obtained previously after a previous hair removal session.
  • the post-hair removal skin parameter map may comprise the comparison of a skin and/or hair parameter map obtained before and after the hair removal session.
  • the method comprises, at block 308, providing or accessing predetermined knowledge regarding hair removal (which knowledge may be stored in a memory e.g., of an online service or of a user equipment).
  • the predetermined knowledge may comprise general (e.g., clinical) knowledge on hair removal techniques and/or the skin-hair interaction.
  • Such knowledge may comprise, for example, at least one of: an optimal hair removal unit pressure to apply on the skin (e.g., for certain skin types and/or position of the hair removal unit); optimal hair removal unit speed across the user's skin; optimal hair removal unit direction and/or motion pattern for certain spatial locations on the skin and/or hair lengths/types.
  • the predetermined knowledge may be used for providing an initial recommendation on the hair removal technique.
  • further recommendations may be personalized based on data obtained from subsequent hair removal sessions.
  • the method 300 comprises, at block 310, generating an optimal hair removal map of the user's skin.
  • a spatial location of the optimal hair removal map may be associated with an optimal hair removal technique that is determined based on at least one of: historical data for the user (e.g., from block 306); pre-hair removal skin parameter data (e.g., from block 304) and predetermined knowledge regarding hair removal (e.g., from block 308).
  • the optimal hair removal map may be stored in a memory (e.g., of a user equipment or an online service), for example, to allow the optimal hair removal map to be accessed during or after a hair removal session.
  • the spatial location may be associated with at least one of: a skin parameter and a hair parameter for the user's skin at that spatial location.
  • the optimal hair removal map may provide an indication of the skin and/or hair parameter at the spatial location.
  • the indication of the skin and/or hair parameter provided by the optimal hair removal map may be referred to as pre-hair removal skin and/or hair parameter data.
  • the method 300 comprises accessing the optimal hair removal map of the user's skin (e.g., as generated according to block 310); and determining the hair removal instruction for the spatial location based on the optimal hair removal map.
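  • for instance, the optimal hair removal map could pair each spatial location with a recommended technique; the concrete types and units in the sketch below are assumptions, though the parameters themselves (pressure, direction, speed, pattern) follow those named in this disclosure.

      from dataclasses import dataclass
      from typing import Dict, Tuple

      @dataclass
      class Technique:
          pressure_n: float          # recommended contact pressure (newtons, assumed unit)
          direction_deg: float       # recommended motion direction on the skin
          speed_mm_s: float          # recommended motion speed
          pattern: str = "straight"  # e.g. "straight" or "circular"

      # One optimal technique per spatial location of the skin map.
      OptimalHairRemovalMap = Dict[Tuple[int, int], Technique]

      def instruction_for(location, hr_map: OptimalHairRemovalMap) -> str:
          t = hr_map.get(location)
          if t is None:
              return "No recommendation for this region yet"
          return (f"Apply ~{t.pressure_n:.1f} N, move {t.pattern} at "
                  f"{t.speed_mm_s:.0f} mm/s along {t.direction_deg:.0f} degrees")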
  • the method 300 comprises, at block 312, starting a hair removal session in which the optimal hair removal map is accessed.
  • obtaining the indication (e.g., in accordance with block 102 of the method 100) comprising the skin parameter of the user comprises accessing skin parameter data for the user determined based on imaging data of the user's skin.
  • the imaging data of the user's skin may refer to the imaging data acquired at block 302 of the method 300.
  • the optimal hair removal map may provide the indication of the skin parameter.
  • obtaining the indication comprising the skin parameter of the user may comprise accessing the optimal hair removal map (e.g., as described in relation to blocks 310/312 of the method 300).
  • the imaging data of the user's skin may refer to imaging data acquired at another block of the method 300, which may be used to provide the indication comprising the skin parameter of the user (for example, in real-time, as will be described in more detail herein).
  • the optimal hair removal map may be accessed in order to allow determination of a hair removal instruction for the user based on the indication comprising the skin parameter of the user (e.g., as referred to in block 106 of the method 100).
  • the method 300 may use information derived from the optimal hair removal map in conjunction with other information obtained during the hair removal session in order to determine the hair removal instruction.
  • the method 300 comprises, at block 314, determining, in real-time, the position of the hair removal unit relative to the user's skin.
  • the data for determining the position of the hair removal unit in block 314 may be obtained by the hair removal unit localization unit 208 described in relation to Figure 2. This data may be obtained from at least one of: imaging data acquired from an imaging device for acquiring images of the skin; and an on-board sensor of the hair removal unit. The position of the hair removal unit may then be determined from this data (e.g., using the processing unit 210 of Figure 2).
  • determining the position of the hair removal unit relative to the user's skin comprises acquiring at least one of: imaging data of the user's skin and the hair removal unit; and motion data from a sensor on-board the hair removal unit.
  • the position of the hair removal unit relative to the user's skin may comprise at least one of: a position of a hair removal device of the hair removal unit on the user's skin; and an orientation of the hair removal device relative to the user's skin.
  • Positioning data may be obtained from the imaging data and/or the on-board sensor. This positioning data may be used to track the position of the hair removal unit relative to the user's skin as the user moves the hair removal unit across their skin.
  • the app may determine the real-time motion (e.g., position and/or orientation) of the hair removal unit relative to the user's skin, using a series of images captured by a camera of the user equipment.
  • This determination of the real-time motion can be performed by tracking the hair removal unit and/or the user's hand within the image series, for example, using a computer vision algorithm.
  • the determination may be supported by motion and/or orientation tracking within the hair removal unit itself (e.g., using an on-board sensor of the hair removal unit).
  • the method 300 further comprises, at block 316, determining, in real-time, the interaction between the user's skin and the hair removal unit.
  • the hair removal unit 204 may comprise at least one sensor for determining the interaction (e.g., applied pressure) between the user's skin and the hair removal unit 204.
  • data for determining the interaction may be obtained from the hair removal unit 204.
  • the applied pressure may be recorded in real-time and may be linked to the position of the hair removal unit.
  • the amount of hair cut/removed may be recorded or inferred.
  • the amount of hair cut/removed may be determined from at least one of: a sound analysis (e.g., using a microphone of a user equipment or of the hair removal unit itself to determine how many hairs are being cut or removed); a motor resistance observed by a hair removal device (e.g., motorized blade) of the hair removal unit (i.e., the motor resistance may be affected by pressure on skin and/or number of hairs cut/removed); and a computer vision analysis (e.g., using imaging data obtained from a camera of the user equipment) of the results obtained during the hair removal session.
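  • as a toy illustration of the motor-resistance route, the number of hairs engaged could be inferred from the excess motor current; the baseline and per-hair values below are invented for the sketch and would need calibration in practice.

      def estimate_hairs_cut(motor_current_ma, baseline_ma=120.0, per_hair_ma=1.5):
          """Infer how many hairs are being cut from excess motor current.

          baseline_ma is the assumed no-load current; per_hair_ma the assumed
          extra draw per engaged hair. Both values are illustrative only.
          """
          excess = max(0.0, motor_current_ma - baseline_ma)
          return int(excess / per_hair_ma)

      # Example: 150 mA measured while cutting -> roughly 20 hairs engaged.
      print(estimate_hairs_cut(150.0))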
  • the method 300 comprises, at block 318, determining, in real-time, the skin parameter.
  • the indication comprising the skin parameter of the user may be determined from imaging data acquired before the hair removal session.
  • the skin parameter may be determined from imaging data (e.g., at least one image) acquired during the hair removal session.
  • the imaging data may be acquired by an imaging device of a user equipment and this imaging data may be processed in order to provide the indication comprising the skin parameter of the user (e.g., in a similar manner to block 304 determining certain data regarding the skin parameter associated with the user).
  • block 318 may cause the imaging device to acquire the imaging data during the user hair removal session, to determine present skin parameter data.
  • the skin parameter of the user may be based on a comparison between the pre-hair removal skin parameter data (e.g., as referred to in block 310) and the present skin parameter data.
  • the method 300 comprises, at block 320, determining a real-time hair removal instruction for the user based on at least one of: the position; the interaction; the skin parameter; historical hair removal performance data for the user; and pre-determined hair removal performance data.
  • at least one of blocks 314, 316 and 318 may be implemented in order to determine the real-time hair removal instruction.
  • the real-time hair removal instruction may provide at least one of: a recommended pressure, hair removal unit motion direction and/or pattern, and/or hair removal unit motion speed, which may be calculated for each position in the skin and/or hair parameter map.
  • the historical hair removal performance data for the user may comprise at least one of: user skin type; user skin condition; pressure applied between the hair removal unit and the user's skin; user hair removal behavior; visible skin irritation (e.g., skin redness); hair removal results (e.g., hair cutting effectiveness); hair removal unit motion (e.g., direction and type of movement) and hair removal unit operational performance (e.g., battery level, cutting speed).
  • the historical hair removal performance data may be determined from at least one previous user hair removal session.
  • This historical hair removal performance data may be an example of the historical data provided at block 306 of the method 300.
  • the term 'visible' in relation to the skin irritation may refer to skin irritation that is detectable by a machine vision system or any other system capable of detecting skin irritation, whether or not the irritation is visible to the human eye.
  • the pre-determined hair removal performance data may comprise knowledge acquired from other users and/or clinical data regarding at least one of: skin type; skin condition; pressure applied between the hair removal unit and the other user's skin; hair removal behavior; visible skin irritation; hair removal results; hair removal unit motion and hair removal unit operational performance.
  • This knowledge may be used to determine a recommended hair removal instruction for the user in order to provide an improved hair removal experience (e.g., more efficient cutting, less time, less skin irritation) as compared to a previous user hair removal session.
  • This pre-determined hair removal performance data may be an example of the predetermined knowledge regarding hair removal provided at block 308 of the method 300.
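  • blocks 314 to 320 could be combined along the following lines; the map fields, units and tolerance in this sketch are illustrative assumptions rather than values from the disclosure.

      def real_time_instruction(location, measured_pressure, measured_speed,
                                hr_map, tolerance=0.2):
          """Compare live measurements with the per-location recommendation.

          hr_map maps a spatial location to a dict with the recommended
          'pressure_n' and 'speed_mm_s' (illustrative fields and units).
          """
          rec = hr_map.get(location)
          if rec is None:
              return None  # no recommendation for this spatial location yet
          cues = []
          if measured_pressure > rec["pressure_n"] * (1 + tolerance):
              cues.append("decrease pressure")
          elif measured_pressure < rec["pressure_n"] * (1 - tolerance):
              cues.append("increase pressure")
          if measured_speed > rec["speed_mm_s"] * (1 + tolerance):
              cues.append("slow down")
          elif measured_speed < rec["speed_mm_s"] * (1 - tolerance):
              cues.append("speed up")
          return "; ".join(cues) if cues else "good technique"

      # Example: too much pressure and too slow at one cheek location.
      hr_map = {(3, 5): {"pressure_n": 1.5, "speed_mm_s": 40.0}}
      print(real_time_instruction((3, 5), 2.1, 25.0, hr_map))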
  • the method 300 further comprises, at block 322, causing a user interface (e.g., the user interface 212 of Figure 2) to provide, in real-time, the real-time hair removal instruction for the user.
  • the user ends their hair removal session.
  • Certain blocks described below may be used to evaluate the results of the hair removal session, which may provide certain information that can be used in a subsequent hair removal session (e.g., the information may be stored in a memory (e.g., of a user equipment or an online service) so as to be provided at block 306).
  • the method 300 comprises, at block 326, causing the imaging device to acquire the imaging data (i.e., after the user hair removal session), to determine post-hair removal skin parameter data.
  • skin and/or hair parameter data is generated (which may be referred to as post-hair removal skin parameter data).
  • This skin and/or hair parameter data may be used to generate a skin and/or hair parameter map.
  • the post-hair removal skin parameter data may relate to a skin and/or hair parameter and/or any other parameter.
  • the method 300 further comprises, at block 330, determining a skin and/or hair parameter map for the user based on a comparison between the pre-hair removal and present and/or post-hair removal skin parameter data.
  • the comparison may be made between a map derived from the present and/or post-hair removal skin parameter data and the baseline skin parameter map.
  • the skin parameter comprises a visible skin irritation indicator (e.g., skin redness).
  • the skin parameter may be based on whether or not the comparison (e.g., at block 330) identifies any change in the visible skin irritation indicator between the pre-hair removal and present and/or post-hair removal skin parameter data.
  • the skin redness is measured and a map of the redness is saved (e.g., the map may correspond to the skin parameter map).
  • This measurement may be performed by analyzing data such as imaging data acquired from a camera of a user equipment and/or from a separate skin analysis device.
  • the analysis may comprise a comparison with the image(s) captured before the hair removal session began and/or use data provided by the separate skin analysis device.
  • the separate skin analysis device may refer to any other device capable of measuring a certain property of the skin such as hydration, gloss/oiliness, spots and redness, among other properties.
  • Certain examples of such skin analysis devices may illuminate the skin with radiation (e.g., ultraviolet, visible and/or infrared) and detect characteristics (such as a change in spectral content and/or intensity) from the radiation reflected by the skin.
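  • one simple colour metric for such a comparison (an assumption for illustration; the disclosure does not prescribe a specific metric) is the per-region excess of the red channel over the other channels, computed for the pre- and post-session images:

      import numpy as np

      def redness_map(image_rgb, grid=16):
          """Mean per-cell redness: R minus the average of G and B."""
          img = image_rgb.astype(np.float32)
          h, w, _ = img.shape
          ch, cw = h // grid, w // grid
          out = np.zeros((grid, grid), dtype=np.float32)
          for i in range(grid):
              for j in range(grid):
                  cell = img[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
                  out[i, j] = (cell[..., 0] - 0.5 * (cell[..., 1] + cell[..., 2])).mean()
          return out

      def irritation_change(pre_rgb, post_rgb):
          """Positive values indicate regions that became redder after the session."""
          return redness_map(post_rgb) - redness_map(pre_rgb)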
  • certain data such as the recorded hair removal unit applied pressure, skin redness, hair removal results and/or the hair removal unit motion data may be processed to identify hair removal actions/techniques related to a position on the skin where those actions/techniques resulted in optimal, sub-optimal or poor hair removal results.
  • the data acquired at certain blocks may provide an indication that can be used to update the hair removal instruction.
  • too much pressure applied by the hair removal unit on the skin may result in skin irritation and/or suboptimal hair cutting.
  • insufficient skin contact (e.g., including suboptimal orientation of the hair removal unit in relation to the skin) and/or a suboptimal direction or motion pattern (e.g., straight vs circular) of the hair removal unit may result in suboptimal hair cutting.
  • suboptimal hair removal unit motion speeds may result in skin irritation, suboptimal hair cutting and/or shaving inefficiency.
  • too many passes of the hair removal unit over a particular spatial location of the skin may be a result of suboptimal hair cutting, resulting in skin irritation and/or shaving inefficiency.
  • other suboptimal hair removal techniques may be identified that can be improved by different user and/or hair removal unit behavior. Any combination of these and other examples may be identified from the acquired data and used to recommend a technique (e.g., hair removal instruction) to the user which may result in improved and/or optimal hair removal and/or reduced skin irritation.
  • certain blocks of the method 300 may be implemented.
  • the real-time motion (e.g., position and/or orientation) of the hair removal unit relative to the skin may be determined using a series of images captured by the camera (e.g., in accordance with block 314).
  • other parameters such as the skin and/or hair parameter may be determined and/or other sensors may be used as well to provide data which can be used to determine the recommendation for the user.
  • the recommended hair removal unit pressure, direction and speed may be visually shown to the user in real-time.
  • the recommended hair removal guidance may be shown (e.g., by a display of a user equipment) in relation to the actual shaving behavior of the user, which may provide direct cues to the user such as indicating: an increase or decrease of applied pressure, a change in the motion direction and/or a certain motion pattern (e.g., straight or circular motion and/or a different diameter of circular motion) and/or an increase or decrease in motion speed.
  • the applied hair removal pressure (and/or other parameters) may be recorded in real-time and linked to the position of the hair removal unit.
  • the skin irritation (e.g., skin redness) may be measured, and a map of the skin irritation may be saved in a memory (and used to update the hair removal instruction).
  • the recorded pressure, skin irritation, position data and/or any previous recommendations may be processed to calculate personalized recommendations. These calculated personalized recommendations may be based on the level of adherence to the guidance indicated by the cues described above.
  • these recommendations may be based on the results in terms of skin irritation and/or hair removal efficiency.
  • perfect adherence by the user can still lead to suboptimal results, which may indicate that the general knowledge may not apply to this user and may need to be personalized for the user by learning from data acquired from the user's hair removal session(s).
  • the recommended pressure, motion direction and/or motion speed may be recalculated for each position in the skin and/or hair parameter map for the next hair removal session.
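  • a minimal sketch of such a recalculation for the recommended pressure, assuming a simple proportional correction driven by the observed irritation change (the disclosure leaves the learning rule open; the step size and floor below are illustrative):

      def update_recommended_pressure(recommended_n, irritation_delta,
                                      learning_rate=0.1, min_n=0.5):
          """Lower the recommended pressure where irritation increased.

          irritation_delta > 0 means the location got redder after the
          session; the step size and floor are illustrative assumptions.
          """
          adjusted = recommended_n - learning_rate * max(0.0, irritation_delta)
          return max(min_n, adjusted)

      # Example: a 1.8 N recommendation at a location that showed more redness.
      print(update_recommended_pressure(1.8, irritation_delta=2.0))  # -> 1.6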
  • the hair removal instruction is configured to provide a personalized recommendation for the user regarding at least one of: pressure to apply between the hair removal unit and the user's skin; hair removal unit positioning (e.g., including orientation of the hair removal unit) relative to the user's skin and hair removal unit motion (e.g., including direction, speed and motion pattern of the hair removal unit).
  • Certain embodiments described herein refer to a user's skin parameter, an indication of which is obtained by certain methods described herein.
  • the indication further comprises a hair parameter of the user.
  • certain methods described herein may further comprise determining the hair removal instruction taking into account the hair parameter.
  • the hair removal instruction may be based on an analysis of the skin and/or hair parameter acquired from imaging data, which may have been obtained prior to a hair removal session (e.g., at block 302 or from a previous hair removal session's block 326) or during a hair removal session (e.g., at block 318).
  • the method 300 may further comprise causing the user interface to provide the hair removal instruction for the user based on whether or not the user has deviated from a previously-recommended hair removal instruction. For example, during a hair removal session, a user may deviate from the recommended hair removal instruction. The method 300 may then update the hair removal instruction to accommodate/correct for the user's deviation.
  • Figure 4 shows an apparatus 400, which may be used for implementing certain methods described herein such as the methods 100, 300.
  • the apparatus 400 comprises processing circuitry 402.
  • the processing circuitry 402 may correspond to the processing circuitry of the processing unit 210 described in relation to Figure 2 .
  • the processing circuitry 402 comprises an obtaining module 404.
  • the obtaining module 404 is configured to obtain an indication comprising a skin parameter of a user; and an interaction between the user's skin and a hair removal unit (such as described in relation to block 102 of the method 100).
  • the processing circuitry 402 further comprises a determining module 406.
  • the determining module 406 is configured to determine a position of the hair removal unit relative to the user's skin (such as described in relation to block 104 of the method 100).
  • the determining module 406 is further configured to determine a hair removal instruction for the user based on the indication and the position (such as described in relation to block 106 of the method 100).
  • the processing circuitry 402 further comprises a user instruction module 408.
  • the user instruction module 408 is configured to cause a user interface to provide the hair removal instruction for the user (such as described in relation to block 108 of the method 100).
  • Figure 5 shows an apparatus 500, which may be used for implementing certain methods described herein such as the methods 100, 300.
  • the apparatus 500 comprises processing circuitry 502.
  • the processing circuitry 502 comprises the processing circuitry 402 of the apparatus 400 of Figure 4 .
  • the apparatus 500 further comprises an imaging device 504 such as a camera of a user equipment for acquiring imaging data of the user's skin and the hair removal unit (e.g., hair removal unit 204 of Figure 2 ).
  • the apparatus 500 further comprises the user interface 506 (e.g., as referred to in the user instruction module 408) of a user equipment.
  • the apparatus 500 comprises both the imaging device 504 and the user interface 506.
  • Figure 6 shows a tangible machine-readable medium 600 storing instructions 602 which, when executed by at least one processor 604, cause the at least one processor 604 to implement certain methods described herein (such as the methods 100, 300).
  • The instructions 602 comprise instructions 606 that cause the at least one processor 604 to obtain an indication comprising a skin parameter of a user and an interaction between the user's skin and a hair removal unit (such as described in relation to block 102 of the method 100).
  • The instructions 602 further comprise instructions 608 that cause the at least one processor 604 to determine a position of the hair removal unit relative to the user's skin (such as described in relation to block 104 of the method 100).
  • The instructions 602 further comprise instructions 610 that cause the at least one processor 604 to determine a hair removal instruction for the user based on the indication and the position (such as described in relation to block 106 of the method 100).
  • The instructions 602 further comprise instructions 612 that cause the at least one processor 604 to cause a user interface to provide the hair removal instruction for the user (such as described in relation to block 108 of the method 100).
  • One or more features described in one embodiment may be combined with or replace features described in another embodiment.
  • For example, the methods 100 and 300 of Figures 1 and 3 may be modified based on features described in relation to the system 200 and the apparatus 400, 500 of Figures 2, 4 and 5, and vice versa.
  • Certain methods described herein may be implemented by processing circuitry of a user equipment such as a mobile phone, tablet, mirror, smart device or any other device.
  • Certain methods described herein may be implemented by processing circuitry of an online service, such as provided by a server or cloud-based service.
  • The user equipment and the online service may exchange information as part of the implementation of certain methods described herein; a sketch of such an exchange follows.
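As a minimal sketch of such an exchange (the endpoint URL and the payload format are invented for illustration; nothing in the application prescribes them), the user equipment could offload the instruction determination of block 106 to the online service while keeping blocks 102 and 104 on-device:

```python
import json
import urllib.request

SERVICE_URL = "https://example.com/hair-removal/instruction"  # hypothetical endpoint

def request_instruction(indication, position):
    """Send the on-device indication and position; receive the instruction."""
    payload = json.dumps({
        "skin_parameter": indication.skin_parameter,
        "interaction": indication.interaction,
        "position": list(position),
    }).encode("utf-8")
    request = urllib.request.Request(SERVICE_URL, data=payload,
                                     headers={"Content-Type": "application/json"},
                                     method="POST")
    with urllib.request.urlopen(request, timeout=5) as response:
        return json.load(response)["instruction"]
```

The server side would then run the same kind of determining logic as module 406 and return, for example, a JSON object with an "instruction" field.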
  • Embodiments in the present disclosure can be provided as methods, systems, or a combination of machine-readable instructions and processing circuitry.
  • Such machine-readable instructions may be included on a non-transitory machine-readable (for example, computer-readable) storage medium (including but not limited to disc storage, CD-ROM, optical storage, etc.) having computer-readable program codes therein or thereon.
  • The machine-readable instructions may, for example, be executed by a general-purpose computer, a special-purpose computer, an embedded processor or processors of other programmable data processing devices to realize the functions described in the description and diagrams.
  • A processor or processing circuitry, or a module thereof, may execute the machine-readable instructions.
  • Functional modules of the apparatus 400, 500 (for example, the obtaining module 404, the determining module 406 and/or the user instruction module 408) and other devices described herein may be implemented by a processor executing machine-readable instructions stored in a memory, or by a processor operating in accordance with instructions embedded in logic circuitry.
  • The term 'processor' is to be interpreted broadly to include a CPU, processing unit, ASIC, logic unit, programmable gate array, etc.
  • The methods and functional modules may all be performed by a single processor or divided amongst several processors.
  • Such machine-readable instructions may also be stored in a computer-readable storage that can guide the computer or other programmable data processing devices to operate in a specific mode.
  • Such machine-readable instructions may also be loaded onto a computer or other programmable data processing devices, so that the computer or other programmable data processing devices perform a series of operations to produce computer-implemented processing; the instructions executed on the computer or other programmable devices thereby realize the functions specified by the block(s) in the flow charts and/or block diagrams.
  • Teachings herein may be implemented in the form of a computer program product, the computer program product being stored in a storage medium and comprising a plurality of instructions for making a computer device implement the methods recited in the embodiments of the present disclosure.
  • A computer program may be stored or distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Forests & Forestry (AREA)
  • Mechanical Engineering (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Dermatology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Processing (AREA)
  • Dry Shavers And Clippers (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
EP19215228.8A 2019-12-11 2019-12-11 Hair removal instructions Withdrawn EP3835010A1 (de)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP19215228.8A EP3835010A1 (de) 2019-12-11 2019-12-11 Hair removal instructions
JP2022535104A JP7371259B2 (ja) 2019-12-11 2020-12-04 Hair removal instructions
EP20816264.4A EP4072798B1 (de) 2019-12-11 2020-12-04 Hair removal instructions
CN202080086177.1A CN114786893A (zh) 2019-12-11 2020-12-04 Hair removal instructions
PCT/EP2020/084757 WO2021115976A1 (en) 2019-12-11 2020-12-04 Hair removal instructions
US17/779,182 US20230001593A1 (en) 2019-12-11 2020-12-04 Hair removal instructions

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP19215228.8A EP3835010A1 (de) 2019-12-11 2019-12-11 Hair removal instructions

Publications (1)

Publication Number Publication Date
EP3835010A1 (de) 2021-06-16

Family

ID=69063602

Family Applications (2)

Application Number Title Priority Date Filing Date
EP19215228.8A Withdrawn EP3835010A1 (de) 2019-12-11 2019-12-11 Hair removal instructions
EP20816264.4A Active EP4072798B1 (de) 2019-12-11 2020-12-04 Hair removal instructions

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP20816264.4A Active EP4072798B1 (de) 2019-12-11 2020-12-04 Hair removal instructions

Country Status (5)

Country Link
US (1) US20230001593A1 (de)
EP (2) EP3835010A1 (de)
JP (1) JP7371259B2 (de)
CN (1) CN114786893A (de)
WO (1) WO2021115976A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023118461A1 (de) * 2021-12-23 2023-06-29 Erten Ayhan System for removing body hair in hard-to-reach places

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015068068A1 (en) * 2013-11-05 2015-05-14 Koninklijke Philips N.V. Programmable hair trimming system
WO2017148941A1 (en) * 2016-03-01 2017-09-08 Koninklijke Philips N.V. System and method for automated hairstyle processing and hair cutting device
WO2018149738A1 (en) * 2017-02-20 2018-08-23 Koninklijke Philips N.V. Operating a personal care device
WO2019011523A1 (en) * 2017-07-14 2019-01-17 Bic Violex S.A. APPARATUSES AND METHODS FOR MEASURING SKIN CHARACTERISTICS AND IMPROVING SHAVING EXPERIENCES

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4141260B2 (ja) * 2001-03-30 2008-08-27 Koninklijke Philips Electronics N.V. Skin treatment device having a protected radiation output aperture
RU2665443C2 (ru) * 2013-11-06 2018-08-29 Koninklijke Philips N.V. System and method for guiding the movements of a user during a shaving procedure
WO2015067498A1 (en) * 2013-11-06 2015-05-14 Koninklijke Philips N.V. A system and a method for treating a part of a body
WO2015067484A1 (en) * 2013-11-06 2015-05-14 Koninklijke Philips N.V. A system and a method for treating a part of a body
KR102072284B1 (ko) * 2015-05-22 2020-01-31 L'Oréal Applicator for treating a skin condition
CN109744701A (zh) * 2018-02-09 2019-05-14 Shenzhen Yangwo Electronics Co., Ltd. Hair removal system, hair removal cloud system and hair removal method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023118461A1 (de) * 2021-12-23 2023-06-29 Erten Ayhan System for removing body hair in hard-to-reach places
DE102021134444B4 (de) 2021-12-23 2024-06-20 Ayhan Erten System for removing body hair in hard-to-reach places

Also Published As

Publication number Publication date
EP4072798C0 (de) 2023-07-26
JP2022553431A (ja) 2022-12-22
CN114786893A (zh) 2022-07-22
US20230001593A1 (en) 2023-01-05
JP7371259B2 (ja) 2023-10-30
EP4072798B1 (de) 2023-07-26
EP4072798A1 (de) 2022-10-19
WO2021115976A1 (en) 2021-06-17

Similar Documents

Publication Publication Date Title
JP6297687B2 (ja) System and method for guiding a user during a shaving process
RU2754195C2 (ru) System for measuring a set of clinical parameters of visual function
US20220176575A1 (en) Determining a device location on a body part
RU2719426C2 (ру) Device and method for determining the position of a mobile device relative to a subject
CN109890289A (zh) Emotion estimation device, method and program
JP2018531437A5 (de)
US10573026B2 (en) Analysis unit and system for assessment of hair condition
US20190325256A1 (en) Tracking a head of a subject
EP4072798B1 (de) Hair removal instructions
WO2015087323A1 (en) Emotion based 3d visual effects
JP7561317B2 (ja) Determination of operating parameters of a hair cutting device
US20240265533A1 (en) Computer-based body part analysis methods and systems
US20240269873A1 (en) Determining a beard growth distribution for a subject
EP4409547A1 (de) Machine learning for determining facial measurements via captured images
JP2023532841A (ja) Prediction of a user's appearance after treatment
MX2013013238A (es) Technology to support the semi-automatic characterization and classification of dermatological conditions.

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20211217