EP3936000A1 - Personal care device - Google Patents
- Publication number
- EP3936000A1 (Application EP20185356.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- handle
- sensors
- orientation
- function
- personal care
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A46—BRUSHWARE
- A46B—BRUSHES
- A46B15/00—Other brushes; Brushes with additional arrangements
- A46B15/0002—Arrangements for enhancing monitoring or controlling the brushing process
- A46B15/0004—Arrangements for enhancing monitoring or controlling the brushing process with a controlling means
- A46B15/0006—Arrangements for enhancing monitoring or controlling the brushing process with a controlling means with a controlling brush technique device, e.g. stroke movement measuring device
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B26—HAND CUTTING TOOLS; CUTTING; SEVERING
- B26B—HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
- B26B19/00—Clippers or shavers operating with a plurality of cutting edges, e.g. hair clippers, dry shavers
- B26B19/38—Details of, or accessories for, hair clippers, or dry shavers, e.g. housings, casings, grips, guards
- B26B19/3873—Electric features; Charging; Computing devices
- B26B19/388—Sensors; Control
-
- A—HUMAN NECESSITIES
- A46—BRUSHWARE
- A46B—BRUSHES
- A46B2200/00—Brushes characterized by their functions, uses or applications
- A46B2200/10—For human or animal care
- A46B2200/1066—Toothbrush for cleaning the teeth or dentures
Definitions
- the present invention resides in a discovery, resulting from the inventors' extensive investigations, that the operator's hand grip physically differs depending on which part of the oral cavity is brushed, since the wrist must be angled differently to reach different areas. The same applies to other personal care devices to be used on other parts of the body.
- the inventors have realized that by providing the handle with a plurality of sensors, proximity data referring to the proximity between each sensor and the operator's hand may be determined. Especially, the proximity data between the plurality of sensors and one or more of the carpus of the hand, the palm and the fingers may be detected.
- the plurality of sensors being distributed along the circumference, the sensors will provide a substantially 360 degree detection area around the gripping portion of the handle in order to allow categorization and distinguishing between different types of hand grips.
- These categorized and distinguished hand grips may be linked to different localizations of a tooling configured to be supported by the handle, such as a brush head in the event the device is a toothbrush.
- with the invention there is no impact on the user's routine of using the handle, since the plurality of sensors are configured to interact with the operator's hand when holding the handle. Accordingly, there is no need to adjust the style of brushing and hence the movement pattern. Also, the invention is applicable no matter which type of environment the handle is used in.
- the invention may be implemented at a low cost.
- the proximity data may be collected in real time, whereby the detection may be made without any undue time lag. This in turn allows immediate feedback to the operator when using the device.
- the plurality of sensors may be equidistantly distributed along the circumference of the handle. As an advantage thereof, the plurality of sensors will allow detection of the proximity data referring to a proximity between an operator's hand and the respective sensors no matter how the operator holds the handle.
- a substantially 360 degree detection area around a gripping portion of the handle may be provided for to allow categorization and distinguishing between different types of hand grips and hence allow determining of localization.
- the plurality of sensors may be arranged, as a non-limiting example, adjacent a rear end portion of the handle opposite an end configured to support a tooling.
- One and the same handle may be provided with two or more sets of sensors, where each set comprises a plurality of sensors which are equidistantly distributed along the circumference of the handle. The number of sensors in each set may differ.
- the plurality of sensors may be at least four sensors and more preferred at least six sensors. The more sensors, the better resolution of the proximity data may be provided for.
- the plurality of sensors may be capacitive sensors or photocells.
- a capacitive sensor allows detecting and measuring anything that is conductive or has a dielectric different from air.
- a photocell is a sensor that changes resistance depending on the light incident on it.
- the detection, no matter the type of sensor, may be made without any contact between the operator's hand and the plurality of sensors.
- the detected signal strength is in the context of the invention referred to as proximity data.
- the proximity data from a first sensor may be compared with the proximity data from adjacent sensors to thereby allow determining which sensor or sensors have the closest proximity to the operator's hand at a given time.
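As a minimal sketch of this comparison, assuming a normalized signal strength per sensor where a stronger capacitive signal means closer proximity (the sensor labels A-F follow Fig. 2a of the document, but the values are invented for illustration):

```python
# Hypothetical, normalized readings (0..1), one per sensor A-F.
# A stronger signal is assumed to mean the hand is closer to that sensor.
readings = {"A": 0.92, "B": 0.55, "C": 0.08, "D": 0.07, "E": 0.10, "F": 0.90}

# The sensor(s) reporting the strongest signal are taken to be the
# closest to the operator's hand at this instant.
strongest = max(readings.values())
closest_sensors = sorted(s for s, v in readings.items() if v >= strongest)
print(closest_sensors)  # ['A']
```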
- the handle may further comprise a circuitry configured to execute an orienting function configured to determine an orientation of the handle by comparing the proximity data of the plurality of sensors with a set of predetermined proximity intervals representative for specific orientations of the handle.
- the predetermined proximity intervals may be information stored in an array, where each cell in such an array comprises a predetermined proximity interval for a given sensor and for a given localization, such as, in the context of a toothbrush, a brush head localization.
- the predetermined proximity intervals may by way of example be denoted "very close", "quite close" and "not close". Alternatively, they may be denoted by letters or numbers.
- as a pre-requisite for a specific predetermined toothbrush localization to apply, a predetermined combination of proximity data for the plurality of sensors must be fulfilled.
- the array may comprise the following predetermined toothbrush localizations in an oral cavity: the outer portion of the upper right jaw; the inner portion of the upper right jaw; the outer portion of the lower right jaw; the inner portion of the lower right jaw; the outer portion of the upper middle portion of the jaw; the inner portion of the upper middle portion of the jaw; the outer portion of the lower middle portion of the jaw; the inner portion of the lower middle portion of the jaw; the outer portion of the upper left jaw; the inner portion of the upper left jaw; the outer portion of the lower left jaw; and the inner portion of the lower left jaw.
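A minimal sketch of such an array follows. Only the row for the outer portion of the upper right jaw is filled in, since that combination is given by the worked example of Fig. 3; the remaining eleven rows are omitted rather than guessed, and the variable name is an assumption:

```python
# Hypothetical representation of the array: localization -> expected
# proximity interval per sensor A-F. The single row shown follows the
# "upper right outer" example of Fig. 3; the other eleven localizations
# of the patent's list are not specified here.
LOCALIZATIONS = {
    "outer portion of the upper right jaw": {
        "A": "very close", "B": "quite close", "C": "not close",
        "D": "not close", "E": "not close", "F": "very close",
    },
    # ... eleven further rows, one per remaining localization
}
```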
- the orienting function may be configured to determine for how long the tooling which is configured to be supported by the handle has been in contact with a specific portion of the oral cavity during an operation cycle.
- the execution of the orienting function may be made in real-time or after a completed operation cycle.
- the orienting function may further be configured to categorize the determined orientation of the handle as a function of time representative for specific orientations of the handle. This allows monitoring of the operator's performance during an operation cycle. This may be used to provide immediate feedback during operation of the handle to teach/maintain/promote a good operation technique. It may also be used in an evaluation made afterwards, e.g. by the operator herself or by a third party such as a dentist.
- the handle may further comprise an internal data storage, wherein the circuitry is further configured to execute a storing function configured to store the categorized orientation of the handle as a function of time in the data storage.
- the handle may further comprise a communication unit, wherein the communication unit is configured to communicate the categorized orientation of the handle as a function of time, or the determined orientation of the handle as a function of time to an external data storage separate from the handle.
- the communication unit may be wired or wireless. In the event of a wireless communication unit, the communication may be made via Wi-Fi or Bluetooth.
- the invention provides an assembly comprising a handle with the features described above and a cradle configured to support the handle when not in use, the cradle comprising circuitry configured to execute an extraction function configured to extract the stored categorized orientation of the handle as a function of time from the internal data storage of the handle.
- the thus extracted information may by way of example be used for presentation to the user or to any third party.
- the cradle may be the same cradle that is used to store and/or charge the personal care device.
- the cradle may further comprise a display, wherein the circuitry of the cradle is further configured to execute a display function configured to display a representation of the categorized orientation of the handle on the display.
- the representation may be made by using one or more of the features from a group consisting of photos, illustrations, icons, colours, text, numbers and audial signals.
- the cradle may further comprise a communication unit configured to communicate the categorized orientation of the handle as a function of time to an external data storage separate from the cradle.
- the external data storage may by way of example be a cloud-based solution through which a third party has remote access to the stored information.
- the invention refers to a method of detecting an orientation of a handle of a personal care device, comprising: receiving from a plurality of sensors on the handle proximity data of a proximity between a user's hand and the plurality of sensors; and determining the orientation of the handle by comparing the proximity data with a set of predetermined proximity data representative for specific orientations of the handle.
- the invention provides the advantage that, by using a plurality of sensors, the proximity data from a first sensor may be compared with the proximity data from adjacent sensors to thereby allow determining which sensor or sensors have the closest proximity to the operator's hand at a given time. By comparing the proximity data received from the plurality of sensors with predetermined proximity intervals, it is possible to determine localization data, such as a brush head localization in the event the personal care device is a toothbrush.
- the method may further comprise categorizing the determined orientation of the handle as a function of time representative for specific orientations of the handle.
- the invention refers to a non-transitory computer readable recording medium having computer readable program code recorded thereon which when executed on a device having processing capability is configured to perform the method of the previous aspect.
- Fig. 1 schematically shows an exemplary personal care device 100 in the form of an electrical tooth brush in which the teaching of the present disclosure may be implemented. It is however to be emphasized that the teaching of the present disclosure may be implemented in other devices as well where localization sensing is required. For example, the teachings may be applied to personal care devices such as tongue cleaners, shavers, hair clippers or trimmers, hair removal devices, or skin care devices.
- the localization which is to be determined may be in relation to a specific portion of the body.
- the electric toothbrush in Fig. 1 comprises a handle 1 to which a tooling in the form of a brush head 2 is removably or non-removably mounted.
- the handle 1 includes a housing 3, at least a portion of which is hollow, to contain components of the device, for example a drive assembly in the form of a motor M, a circuitry 4 supported by a printed circuit board 5, an internal data storage 7, a control unit 8, a communication unit 9 and a power source 10.
- the particular configuration and arrangement shown in Fig. 1 is by way of example only and does not limit the scope of the embodiments disclosed below.
- the handle 1 comprises a plurality of sensors 6.
- the plurality of sensors 6 are equidistantly distributed along the circumference of the handle 1.
- the sensors 6 are disclosed as being arranged along an end portion of the handle 1 opposite to the end supporting the brush head 2. It is to be understood that other positions are possible.
- the handle 1 may be provided with two or more sets of sensors 6, wherein each set comprises a plurality of sensors 6 which are equidistantly distributed along the circumference of the handle 1.
- the plurality of sensors 6 may be at least four sensors and more preferred at least six sensors. The more sensors 6, the better resolution of the proximity data may be provided for. In the event of two or more sets of sensors 6, the number of sensors 6 in each set may be different.
- the plurality of sensors 6 are embodied as capacitive sensors.
- a capacitive sensor allows detecting and measuring anything that is conductive or has a dielectric different from air.
- the sensors 6 may be photocells.
- a photocell changes resistance depending on the light incident on it.
- the detected signal strength no matter type of sensor is in the context of the invention referred to as proximity data.
- the detection may be made without any contact between the operator's hand and the plurality of sensors 6. Thus, the detection may be provided for in a contact-less manner.
- Each sensor 6 is connected to the circuitry 4 to be discussed below.
- FIG. 2a one set of sensors 6 is schematically illustrated equidistantly distributed along the circumference of the handle 1.
- Each sensor 6 is provided with an identification, in the example A, B, C, D, E and F.
- each sensor 6 has a given position in view of the circumference of the handle 1 and as seen in view of the longitudinal extension of the handle 1.
- three sensors 6 denoted as A, F and E are arranged on a first side of a virtual line extending between a front portion of the handle 1 and a back portion of the handle 1 and three sensors 6 denoted as B, C and D are arranged on the other side of the virtual line.
- each sensor 6 is configured to detect, in a contact-less manner, a proximity to a user's hand in a condition when the operator grips the handle.
- the detected signal strength, i.e. the proximity data, is in the present example categorized into three different proximity intervals, see table of Fig. 2b, denoted as "very close", "quite close" and "not close".
- the proximity intervals are schematically illustrated by different shadings which shadings are used also in Fig. 3 .
- the proximity data detected by each individual sensor A, B, C, D, E, F is categorized as representing one of these three proximity intervals. It is to be understood that more or fewer proximity intervals may be used depending on desired resolution.
- the proximity data from a first sensor A may be compared with the proximity data from adjacent sensors B, C, D, E, F to thereby allow determining which sensor or sensors have the closest proximity to the operator's hand at a given time.
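The categorization into the three intervals could be sketched as simple thresholding on a normalized signal strength; the threshold values below are assumptions, not taken from the document:

```python
def categorize(signal, very_close_at=0.8, quite_close_at=0.3):
    """Map a normalized signal strength (0..1) to one of the three
    proximity intervals of Fig. 2b; the thresholds are assumed values."""
    if signal >= very_close_at:
        return "very close"
    if signal >= quite_close_at:
        return "quite close"
    return "not close"

print(categorize(0.92))  # very close
```

More (or fewer) intervals, as mentioned above, would simply mean more (or fewer) thresholds.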
- Fig. 3 discloses twelve different positions of a brush head in view of the oral cavity.
- the twelve positions are, starting from the upper left corner: "outer portion of the upper right jaw"; "inner portion of the upper right jaw"; "outer portion of the lower right jaw"; "inner portion of the lower right jaw"; "outer portion of the upper middle part of the jaw"; "inner portion of the upper middle part of the jaw"; "outer portion of the lower middle part of the jaw"; "inner portion of the lower middle part of the jaw"; "outer portion of the upper left jaw"; "inner portion of the upper left jaw"; "outer portion of the lower left jaw"; "inner portion of the lower left jaw".
- the operator's wrist is angled differently in view of the longitudinal extension of the handle 1 depending on the position of the brush head in view of the oral cavity.
- the position of the carpus, the fingers and the hand palm differ depending on the position of the brush head in view of the oral cavity.
- the inventors have discovered that by arranging the plurality of sensors 6 equidistantly around the circumference of the handle 1, these sensors 6 may be used to categorize and thereby identify different grips in real time.
- Fig. 3 the proximity intervals of Fig. 2b are applied to a first example shown in the upper left corner in Fig. 3 .
- sensors A and F both provide a proximity data that is categorized to be within the proximity interval "very close”.
- Sensor B provides a proximity data that is within the proximity interval "quite close" and sensors C, D and E provide a proximity data that is within the proximity interval "not close".
- sensors A, B and C do all provide a proximity data that is within the proximity interval "not close”.
- Sensors D and F provide a proximity data that is within the proximity interval "quite close” and sensor E provides a proximity data that is within the proximity interval "very close”.
- this pre-determined categorized information is represented in the form of an array with the twelve different possible localizations vs proximity data for each sensor A, B, C, D, E, F in the plurality of sensors 6.
- the proximity intervals are stored in cells of this array. Each cell comprises a predetermined proximity interval for a given sensor and for a given categorized localization.
- the proximity data for all sensors A, B, C, D, E, F in the set of sensors 6 must exhibit a certain combination of proximity intervals. As seen in Fig. 3 and Fig. 4 in combination, for the localization "upper right outer” to apply, sensors A and F must both provide a proximity data that is within the proximity interval "very close”. Sensor B must provide a proximity data that is within the proximity interval "quite close” and sensors C, D and E must provide a proximity data that is within the proximity interval "not close”.
- sensor E must provide a proximity data that is within the proximity interval "very close”.
- Sensors D and F must provide a proximity data that is within the proximity interval "quite close” and sensors A, B and C must provide a proximity data that is within the proximity interval "not close”.
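The matching rule described above (every sensor must fall in its predetermined interval for the localization to apply) can be sketched as follows. The expected row for "upper right outer" follows Figs. 2b-4; the helper and variable names are hypothetical:

```python
# Expected intervals for the localization "upper right outer" (Figs. 2b-4).
EXPECTED_UPPER_RIGHT_OUTER = {
    "A": "very close", "B": "quite close", "C": "not close",
    "D": "not close", "E": "not close", "F": "very close",
}

def localization_applies(categorized, expected):
    """True only if every sensor's categorized proximity interval equals
    the predetermined interval stored for this localization."""
    return all(categorized[sensor] == interval
               for sensor, interval in expected.items())

observed = {"A": "very close", "B": "quite close", "C": "not close",
            "D": "not close", "E": "not close", "F": "very close"}
print(localization_applies(observed, EXPECTED_UPPER_RIGHT_OUTER))  # True
```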
- this array of pre-determined information for different localizations is used by the circuitry 4 of the handle 1 during operation to determine brush head localization.
- the array may by way of example be stored in the internal data storage 7 of the handle 1 or in an internal or external data storage external from the handle 1.
- the internal data storage 7 may be one or more of a buffer, a flash memory, a hard drive, a removable media, a volatile memory, a non-volatile memory, a random access memory (RAM), or another suitable device.
- the internal data storage 7 may include a non-volatile memory for long term data storage and a volatile memory that functions as system memory for the control unit 8.
- the circuitry 4 of the handle 1 is configured to execute an orienting function configured to determine an orientation of the handle 1 by comparing the proximity data of the plurality of sensors 6 with a set of predetermined proximity intervals representative for specific orientations of the handle 1. Accordingly, by comparing the proximity data received from the plurality of sensors 6 with the predetermined proximity intervals given in the array, it is possible to determine by way of example which portions of the oral cavity have been brushed or not, for how long and in which order. More precisely, the orienting function may be configured to determine for how long the brush head which is supported by the handle 1 has been in contact with a specific portion of the oral cavity during an operation cycle.
- the execution of the orienting function may be made in real-time or after a completed operation cycle.
- the orienting function may be configured to categorize the determined orientation of the handle 1 as a function of time representative for specific orientations of the handle. Thus, it is made possible to determine e.g. for how long during an operation cycle a specific part of the jaw has been brushed. This may be presented as a percentage of the time of a complete operation cycle. Alternatively, it may be presented so as to visualize a pattern illustrating how the handle 1 was oriented during the operation cycle.
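Assuming the orienting function emits one localization label per fixed sampling step, the per-location time and percentage could be accumulated as in this sketch (the function name and sampling step are assumptions):

```python
from collections import Counter

def brushing_summary(samples, step_s=0.5):
    """samples: one localization label per step_s seconds of an operation
    cycle. Returns {localization: (seconds, percent of cycle)}."""
    counts = Counter(samples)
    return {loc: (n * step_s, round(100.0 * n / len(samples), 1))
            for loc, n in counts.items()}

summary = brushing_summary(["upper right outer"] * 3 + ["upper right inner"])
print(summary)  # {'upper right outer': (1.5, 75.0), 'upper right inner': (0.5, 25.0)}
```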
- the orienting function allows monitoring of the operator's performance during an operation cycle. This may be used to provide immediate feedback during operation of the handle 1 to teach/maintain/promote a good operation technique. It may also be used in an evaluation made afterwards, e.g. by the operator herself or by a third party.
- the circuitry 4 may further be configured to execute a storing function configured to store the categorized orientation of the handle 1 in the internal data storage 7 as a function of time.
- the plurality of sensors 6 are configured to communicate with the control unit 8 and the internal data storage 7.
- the control unit 8 is configured to carry out overall control of functions and operations of the localization arrangement/localization method of the handle 1.
- the control unit 8 may include a processor, such as a central processing unit (CPU), microcontroller, or microprocessor.
- the processor is configured to execute program code stored in the internal storage device to carry out the localization.
- the handle 1 further comprises the communication unit 9, wherein the communication unit 9 is configured to communicate the categorized orientation of the handle 1 as a function of time, or the determined orientation of the handle 1 as a function of time to an external data storage separate from the handle 1.
- the communication unit 9 may be wired or wireless. In the event of a wireless communication unit, the communication may be made via Wi-Fi or Bluetooth.
- the handle 1 may form part of an assembly.
- such an assembly may also comprise a cradle 20, see Fig. 1, which is configured to support the handle 1 when not in use.
- a cradle 20 may comprise a circuitry 21 configured to execute an extraction function configured to extract the stored categorized orientation of the handle 1 as a function of time from the internal data storage 7 of the handle 1. The thus extracted information may by way of example be used for presentation to the user or to any third party.
- the presentation to the operator may by way of example be made via a display 22 which forms part of or which is connected to the cradle 20.
- the circuitry 21 of the cradle 20 is configured to execute a display function configured to display a representation of the categorized orientation of the handle 1 on the display 22.
- the representation may be made by using one or more of features from a group consisting of photos, illustrations, icons, colours, text, numbers and audial signals.
- the cradle 20 may further comprise a communication unit 23 configured to communicate the categorized orientation of the handle 1 as a function of time to an external data storage 30 separate from the cradle 20.
- the external data storage 30 may by way of example be a cloud-based solution through which a third party has remote access to the stored information.
- the method comprises the following acts: Receiving 1000 from a plurality of sensors 6 on the handle 1 proximity data of a proximity between a user's hand and the plurality of sensors 6.
- Determining 2000 the orientation of the handle 1 by comparing the proximity data with a set of predetermined proximity data representative for specific orientations of the handle.
- the method may further comprise the act of categorizing 3000 the determined orientation of the handle 1 as a function of time representative for specific orientations of the handle 1.
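The acts 1000 and 2000 can be combined into one sketch. Apart from the "upper right outer" row, which follows Fig. 3, the thresholds, labels and function names are assumptions:

```python
def categorize(signal):
    # Assumed thresholds on a normalized 0..1 signal strength.
    if signal >= 0.8:
        return "very close"
    if signal >= 0.3:
        return "quite close"
    return "not close"

# One predetermined row (Fig. 3); further localizations would follow.
TABLE = {
    "outer portion of the upper right jaw": {
        "A": "very close", "B": "quite close", "C": "not close",
        "D": "not close", "E": "not close", "F": "very close",
    },
}

def detect_orientation(raw):
    """Act 1000: receive raw per-sensor proximity data.
    Act 2000: categorize it and compare with the predetermined
    combinations; return the first matching localization, else None."""
    cats = {s: categorize(v) for s, v in raw.items()}
    for localization, expected in TABLE.items():
        if all(cats[s] == expected[s] for s in expected):
            return localization
    return None

print(detect_orientation(
    {"A": 0.9, "B": 0.5, "C": 0.1, "D": 0.1, "E": 0.1, "F": 0.85}))
```

Act 3000 would then timestamp each returned localization, as sketched earlier for the per-location summary.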
- the result of such a categorization may by way of example be used to analyse which portions of the oral cavity have been brushed, for how long and in which sequence.
- the method may be provided with a calibration routine to allow that the signal strength and hence the proximity data resulting from a water film or a water droplet may be differentiated from that of the operator's hand.
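One way such a calibration routine might be realized, purely as an assumed sketch (the document does not specify a mechanism): record a baseline with a wet but ungripped handle, and suppress signal changes small enough to be explained by the water film.

```python
def calibrated(reading, wet_baseline, margin=0.05):
    """Hypothetical calibration: signal within `margin` of the wet-handle
    baseline is attributed to a water film or droplet and suppressed;
    anything above it is attributed to the operator's hand."""
    delta = reading - wet_baseline
    return delta if delta > margin else 0.0

print(calibrated(0.18, wet_baseline=0.15))  # only water: 0.0
print(calibrated(0.90, wet_baseline=0.15))  # hand present
```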
- a non-transitory computer readable recording medium having computer readable program code recorded thereon which when executed on a device having processing capability is configured to perform the method as presented above.
- the invention resides in the discovery, resulting from the inventors' extensive investigations, that the hand grip physically differs depending on which part of the oral cavity is brushed.
- the discovery is equally applicable to other parts of the body to be treated by using a personal care device.
- by the plurality of sensors 6 being distributed along the circumference of the handle 1, the sensors 6 will provide a substantially 360 degree detection area around the gripping portion of the handle 1 in order to allow categorization of and distinction between different types of hand grips. These categorized and distinguished hand grips may be equated with different localizations of a tooling supported by the handle 1, such as a brush head in the event of the device being a toothbrush.
- the categorization has been exemplified based on twelve localizations.
- the skilled person realizes that the number of localizations is adapted to the intended body part to be subjected to by the personal care device.
- the number of localizations may be fewer or more.
- the number of localizations may be extended to sixteen to thereby also cover the chewing surfaces of the upper and lower molars in the left and the right jaw.
- the device has been explained based on one set of sensors comprising six sensors.
- the skilled person realizes that the number of sets may be increased within the scope of the invention. Also, the number of sets of sensors and their relative position may be changed. Also, in the event of two or more sets of sensors, the number of sensors in each set may differ. Further, it is to be understood that the number of proximity intervals may be increased to more than three or decreased to fewer than three.
- the sensors have been exemplified as capacitive sensors or photocells. It is to be understood that also other types of sensors may be used with the same principle. It is also to be understood that two types of sensors may be combined.
Abstract
A personal care device (100) comprising a handle (1) is provided. The handle (1) comprises a plurality of sensors (6) configured for detecting proximity data between each of the plurality of sensors (6) and a user's hand when gripping the handle (1). Also, an assembly comprising a personal care device and a cradle (20) is provided, and a method of detecting an orientation of a handle (1) of a personal care device (100).
Description
- The present invention generally relates to a personal care device comprising a handle, a method of detecting an orientation of a handle of a personal care device and also a non-transitory computer readable storage medium.
- One well known example of a personal care device is an electrical tooth brush. A well known problem related thereto is determining which part(s) of the mouth a user has brushed and for how long. This is often referred to as brush head localization. Solutions to the problem are typically based on sensor modalities that can be broadly categorized by the use of proprioceptive stimuli or exteroceptive stimuli. Proprioceptive stimuli are provided for by the use of passive sensors. Exteroceptive stimuli are provided for by using active and/or passive sensors, where the passive sensor is a camera.
- From a development standpoint, a successful brush head localization implementation aims to satisfy some key criteria. One criterion is to achieve localization in all brushing scenarios with limited to no impact on the user routine. Examples of typical user routines are brushing the teeth in the shower or while doing activities of daily living. Further, there should be no need to adjust the style of brushing to accommodate localization, e.g. brushing teeth in a particular order. Another criterion is limited lag in determining the brushing location. Yet another criterion is that the cost of implementation should be low.
- The most common implementations fall under the "proprioceptive/passive" category, where the sensor signal is measured internal to the device's frame of reference. Known applications typically use sensors such as accelerometers to extract features which are then mapped to locations in the oral cavity. A fundamental limitation of this approach is that the algorithms designed to extract the features, i.e. to map the correct movement signals to a mouth location, are based on the exploitation of spatiotemporal input data. Therefore, there must first be available a set of input signals collected over a variable amount of time from which the various features, such as movement signatures, can be extracted. This means that there can be no guarantee as to the time the algorithm will take to determine a location. This makes the approach less suitable for applications where fast real-time detection of the brush head location is required, such as the case where a toothbrush needs to adjust its behaviour in real time depending on which location the user is brushing. Thus, this approach does not meet the first mentioned criterion. A further limitation of this approach is that the determined location suffers from ambiguities due to a lack of reference to the mouth's frame of reference.
- The second most common implementations fall under the "exteroceptive/passive" category, where the sensor signal is directly influenced by aspects of the mouth and its surroundings. Known implementations of this category make use of a smartphone camera to extract the position of the toothbrush relative to the face. While such an implementation allows for faster detection of the brush head location from a single image frame, its application is strictly limited to cases where the user's face is directly within the camera's field of view. Such an implementation is not suitable since it has an impact on the user routines and also entails the relative cost of camera means.
- Another approach known in the art is similar in modality to the implementation just discussed, with the difference that the sensors used are non-camera optical sensors such as infrared sensors. While lower in cost, such sensors are more complex to implement in a product since they require a direct line of sight. For example, embedding infrared sensors within a toothbrush for sensing of location requires that a scratch-free cover is used to protect the sensors against water. This increases the cost of the product and restricts the usability in terms of where and how the product is used.
- There is hence a need for a solution providing an improved brush head localization which does not negatively impact the user routine, which exhibits no or limited lag and which can be implemented into an electric toothbrush for a low cost.
- The solution should not only be applicable to electrical toothbrushes per se but to personal care devices as such.
- These and other objectives are achieved by providing a handle of a personal care device having the features of the independent claim. Preferred embodiments are defined in the dependent claims.
- According to a first aspect of the present invention, there is provided a personal care device, comprising a handle comprising a plurality of sensors configured for detecting proximity data between each of the plurality of sensors and a user's hand when gripping the handle.
- The present invention resides in a discovery resulting from the inventors' extensive investigations that the operator's hand grip physically differs depending on which part of the oral cavity is brushed, since the wrist must be angled differently to reach different areas. The same applies to other personal care devices to be used on other parts of the body. The inventors have realized that by providing the handle with a plurality of sensors, proximity data referring to the proximity between each sensor and the operator's hand may be determined. Especially, the proximity data between the plurality of sensors and one or more of the carpus of the hand, the palm and the fingers may be detected. By the plurality of sensors being distributed along the circumference of the handle, the sensors will provide a substantially 360 degree detection area around the gripping portion of the handle in order to allow categorization and distinguishing between different types of hand grips. These categorized and distinguished hand grips may be linked to different localizations of a tooling configured to be supported by the handle, such as a brush head in the case of a toothbrush.
- As an advantage of the invention, there is no impact on the user's routine of using the handle since the plurality of sensors are configured to interact with the operator's hand when holding the handle. Accordingly, there is no need to adjust the style of brushing and hence movement pattern. Also, the invention is applicable no matter which type of environment the handle is used in.
- As another advantage, the invention may be implemented at a low cost.
- As yet another advantage, the proximity data may be collected in real time, whereby the detection may be made without any undue time lag. This in turn allows immediate feedback to the operator when using the device.
- The plurality of sensors may be equidistantly distributed along the circumference of the handle. As an advantage thereof, the plurality of sensors will allow detection of the proximity data referring to a proximity between an operator's hand and the respective sensors no matter how the operator holds the handle. A substantially 360 degree detection area around a gripping portion of the handle may be provided for to allow categorization and distinguishing between different types of hand grips and hence allow determining of localization.
- The plurality of sensors may be arranged, as a non-limiting example, adjacent a rear end portion of the handle opposite an end configured to support a tooling. One and the same handle may be provided with two or more sets of sensors, where each set comprises a plurality of sensors which are equidistantly distributed along the circumference of the handle. The number of sensors in each set may differ.
- The plurality of sensors may be at least four sensors and more preferably at least six sensors. The more sensors, the better the resolution of the proximity data that may be provided for.
- The plurality of sensors may be capacitive sensors or photocells. A capacitive sensor allows detecting and measuring anything that is conductive or has a dielectric different from air. A photocell is a sensor that changes resistance depending on the light incident on it. The detection, no matter the type of sensor, may be made without any contact between the operator's hand and the plurality of sensors. The detected signal strength is in the context of the invention referred to as proximity data. By using a plurality of sensors, the proximity data from a first sensor may be compared with the proximity data from adjacent sensors to thereby allow determining which sensor or sensors have the closest proximity to the operator's hand at a given time.
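By way of a non-limiting sketch, the comparison between the proximity data of a first sensor and that of the adjacent sensors may be illustrated as follows. The sensor labels A to F and the numeric signal scale (higher values meaning closer) are assumptions made for the purpose of the example only.

```python
# Illustrative sketch only: the labels A-F and the signal scale are
# assumptions, not values given in the present disclosure.

def closest_sensors(proximity_data):
    """Return the label(s) of the sensor(s) reporting the strongest
    signal, i.e. the closest proximity to the operator's hand."""
    strongest = max(proximity_data.values())
    return [label for label, value in proximity_data.items()
            if value == strongest]

# A hypothetical grip in which sensors A and F face the palm.
readings = {"A": 0.92, "B": 0.55, "C": 0.12, "D": 0.10, "E": 0.08, "F": 0.90}
print(closest_sensors(readings))
```

The same comparison may be repeated at every sampling instant, so that the closest sensor or sensors are known at any given time.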
- The handle may further comprise a circuitry configured to execute an orienting function configured to determine an orientation of the handle by comparing the proximity data of the plurality of sensors with a set of predetermined proximity intervals representative for specific orientations of the handle.
- The predetermined proximity intervals may be information which is stored in an array, where each cell in such array comprises a predetermined proximity interval for a given sensor and for a given localization, such as, in the context of a toothbrush, a brush head localization. The predetermined proximity intervals may by way of example be denoted "very close", "quite close" and "not close". Other alternatives of denoting are by using letters or numbers. As a prerequisite for a specific predetermined toothbrush localization to apply, a predetermined combination of proximity data for the plurality of sensors must be fulfilled. As a non-limiting example, the array may comprise the following predetermined toothbrush localizations in an oral cavity: the outer portion of the upper right jaw; the inner portion of the upper right jaw; the outer portion of the lower right jaw; the inner portion of the lower right jaw; the outer portion of the upper middle portion of the jaw; the inner portion of the upper middle portion of the jaw; the outer portion of the lower middle portion of the jaw; the inner portion of the lower middle portion of the jaw; the outer portion of the upper left jaw; the inner portion of the upper left jaw; the outer portion of the lower left jaw; and the inner portion of the lower left jaw.
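By way of a non-limiting sketch, the mapping of a detected signal strength onto such denoted proximity intervals may be illustrated as follows. The numeric thresholds are purely illustrative assumptions; the disclosure does not specify any particular values.

```python
# Illustrative sketch: map a raw signal strength onto one of the three
# denoted proximity intervals. The thresholds 0.7 and 0.3 are
# hypothetical assumptions.

def categorize(signal_strength):
    if signal_strength >= 0.7:
        return "very close"
    if signal_strength >= 0.3:
        return "quite close"
    return "not close"

readings = {"A": 0.92, "B": 0.45, "C": 0.05, "D": 0.08, "E": 0.11, "F": 0.88}
intervals = {label: categorize(value) for label, value in readings.items()}
print(intervals)
```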
- Accordingly, by comparing the proximity data received from the plurality of sensors with the predetermined proximity intervals, it may be possible to determine the brush head localization. It is by way of example possible to determine which portions of the oral cavity have been brushed or not, for how long and in which order. More precisely, the orienting function may be configured to determine for how long the tooling which is configured to be supported by the handle has been in contact with a specific portion of the oral cavity during an operation cycle.
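By way of a non-limiting sketch, the comparison of the categorized proximity data with the predetermined proximity intervals stored in the array may be illustrated as follows. Only two of the twelve localizations are filled in, using the combinations spelled out in the detailed description; the remaining rows would be populated analogously.

```python
# Illustrative sketch of the comparison step. The two rows shown follow
# the combinations given in the description for the upper right jaw.

LOCALIZATION_ARRAY = {
    "upper right outer": {"A": "very close", "B": "quite close",
                          "C": "not close", "D": "not close",
                          "E": "not close", "F": "very close"},
    "upper right inner": {"A": "not close", "B": "not close",
                          "C": "not close", "D": "quite close",
                          "E": "very close", "F": "quite close"},
}

def determine_localization(intervals):
    """Return the localization whose predetermined combination of
    proximity intervals matches the categorized readings, else None."""
    for localization, pattern in LOCALIZATION_ARRAY.items():
        if all(intervals.get(sensor) == interval
               for sensor, interval in pattern.items()):
            return localization
    return None

grip = {"A": "very close", "B": "quite close", "C": "not close",
        "D": "not close", "E": "not close", "F": "very close"}
print(determine_localization(grip))
```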
- The execution of the orienting function may be made in real-time or after a completed operation cycle.
- The orienting function may further be configured to categorize the determined orientation of the handle as a function of time representative for specific orientations of the handle. This allows monitoring of the operator's performance during an operation cycle. This may be used to provide immediate feedback during operation of the handle to teach/maintain/promote a good operation technique. It may also be used in an evaluation made afterwards, e.g. by the operator herself or by a third party such as a dentist.
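By way of a non-limiting sketch, the categorization of the determined orientation as a function of time may be illustrated as follows, by accumulating the dwell time per localization over an operation cycle. The sampling period and the sample data are hypothetical assumptions.

```python
# Illustrative sketch: accumulate, per localization, how long the
# handle dwelt in each orientation, and express the result as a
# percentage of the complete operation cycle.
from collections import defaultdict

def dwell_times(samples, sample_period_s=0.1):
    """samples: one determined localization per sampling tick."""
    dwell = defaultdict(float)
    for localization in samples:
        dwell[localization] += sample_period_s
    return dict(dwell)

def as_percentages(dwell):
    total = sum(dwell.values())
    return {loc: 100.0 * t / total for loc, t in dwell.items()}

cycle = ["upper right outer"] * 150 + ["upper right inner"] * 50
print(as_percentages(dwell_times(cycle)))
```

Such a summary may be presented to the operator directly after the operation cycle, or stored for a later evaluation.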
- The handle may further comprise an internal data storage, wherein the circuitry is further configured to execute a storing function configured to store the categorized orientation of the handle as a function of time in the data storage.
- The handle may further comprise a communication unit, wherein the communication unit is configured to communicate the categorized orientation of the handle as a function of time, or the determined orientation of the handle as a function of time to an external data storage separate from the handle. The communication unit may be wired or wireless. In the event of a wireless communication unit, the communication may be made via Wi-Fi or Bluetooth.
- According to another aspect, the invention provides an assembly comprising a handle with the features described above and a cradle configured to support the handle when not in use, the cradle comprising circuitry configured to execute an extraction function configured to extract the stored categorized orientation of the handle as a function of time from the internal data storage of the handle. The thus extracted information may by way of example be used for presentation to the user or to any third party. The cradle may be the same cradle that is used to store and/or charge the personal care device.
- The cradle may further comprise a display, wherein the circuitry of the cradle is further configured to execute a display function configured to display a representation of the categorized orientation of the handle on the display. The representation may be made by using one or more of the features from a group consisting of photos, illustrations, icons, colours, text, numbers and audial signals.
- The cradle may further comprise a communication unit configured to communicate the categorized orientation of the handle as a function of time to an external data storage separate from the cradle. Such external data storage may by way of example be a cloud-based solution to which a third party has remote access to the stored information.
- According to another aspect, the invention refers to a method of detecting an orientation of a handle of a personal care device, comprising:
receiving from a plurality of sensors on the handle proximity data of a proximity between a user's hand and the plurality of sensors; and determining the orientation of the handle by comparing the proximity data with a set of predetermined proximity data representative for specific orientations of the handle. - The advantages of the handle as such have been thoroughly discussed above and these advantages are equally applicable to the method of detecting an orientation of such a handle. To avoid undue repetition, reference is made to the previous discussion. In short, the invention provides the advantage that by using a plurality of sensors, the proximity data from a first sensor may be compared with the proximity data from adjacent sensors to thereby allow determining which sensor or sensors have the closest proximity to the operator's hand at a given time. By comparing the proximity data received from the plurality of sensors with predetermined proximity intervals, it may be possible to determine localization data such as a brush head localization in the event the personal care device is a toothbrush.
- The method may further comprise categorizing the determined orientation of the handle as a function of time representative for specific orientations of the handle.
- The advantage is provided that it is made possible to analyse the performance of the operation cycle. Parameters that may be analysed are by way of example which portions of the oral cavity have been brushed, for how long and in which sequence.
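By way of a non-limiting sketch, the brushing sequence may be derived from such time-categorized data by collapsing consecutive identical localizations. The sample data and sampling period are hypothetical assumptions.

```python
# Illustrative sketch: derive which portions were brushed, for how
# long and in which sequence, from a sampled series of localizations.
from itertools import groupby

def brushing_sequence(samples, sample_period_s=0.1):
    """Return (localization, duration in seconds) in brushing order."""
    return [(localization, sum(1 for _ in run) * sample_period_s)
            for localization, run in groupby(samples)]

cycle = (["upper right outer"] * 100 + ["upper right inner"] * 80
         + ["upper right outer"] * 20)
print(brushing_sequence(cycle))
```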
- As yet another aspect, the invention refers to a non-transitory computer readable recording medium having computer readable program code recorded thereon which when executed on a device having processing capability is configured to perform the method of the previous aspect. The above-mentioned features of the method, when applicable, apply to this third aspect as well. To avoid undue repetition, reference is made to the above.
- Further objectives of, features of, and advantages with, the present invention will become apparent when studying the following detailed disclosure, the drawings and the appended claims. Those skilled in the art will realize that different features of the present invention can be combined to create embodiments other than those described in the following.
- These and other aspects of the present invention will now be described in more detail, with reference to the appended drawings showing embodiment(s) of the invention.
-
Fig. 1 schematically discloses one embodiment of a personal care device in the form of an electrical tooth brush and a cradle. -
Fig. 2a schematically discloses a plurality of sensors distributed along the circumference of a handle. -
Fig. 2b highly schematically discloses an array of proximity data and categorization thereof. -
Fig. 3 discloses schematically predetermined proximity intervals representative for specific orientations of the handle in view of the oral cavity. -
Fig. 4 discloses an array of proximity data representative for specific orientations of the handle in view of the oral cavity. -
Fig. 5 discloses a flow chart of a method of detecting an orientation of a handle of a personal care device. -
Fig. 1 schematically shows an exemplary personal care device 100 in the form of an electric toothbrush in which the teaching of the present disclosure may be implemented. It is however to be emphasized that the teaching of the present disclosure may be implemented in other devices as well where localization sensing is required. For example, the teachings may be applied to personal care devices such as tongue cleaners, shavers, hair clippers or trimmers, hair removal devices, or skin care devices. The localization which is to be determined may be in relation to a specific portion of the body. - The electric toothbrush in
Fig. 1 comprises a handle 1 to which a tooling in the form of a brush head 2 is removably or non-removably mounted. The handle 1 includes a housing 3, at least a portion of which is hollow, to contain components of the device, for example, a drive assembly in the form of a motor M, a circuitry 4 supported by a printed circuitry board 5, an internal data storage 7, a control unit 8, a communication unit 9 and a power source 10. The particular configuration and arrangement shown in Fig. 1 is by way of example only and does not limit the scope of the embodiments disclosed below. - The handle 1 comprises a plurality of
sensors 6. The plurality of sensors 6 are equidistantly distributed along the circumference of the handle 1. The sensors 6 are disclosed as being arranged along an end portion of the handle 1 opposite to the end supporting the brush head 2. It is to be understood that other positions are possible. It is also to be understood that the handle 1 may be provided with two or more sets of sensors 6, wherein each set comprises a plurality of sensors 6 which are equidistantly distributed along the circumference of the handle 1. - The plurality of
sensors 6 may be at least four sensors and more preferably at least six sensors. The more sensors 6, the better the resolution of the proximity data that may be provided for. In the event of two or more sets of sensors 6, the number of sensors 6 in each set may be different. - The plurality of
sensors 6 are embodied as capacitive sensors. A capacitive sensor allows detecting and measuring anything that is conductive or has a dielectric different from air. Alternatively, thesensors 6 may be photocells. A photocell changes resistance depending on the light incident on it. - The detected signal strength no matter type of sensor is in the context of the invention referred to as proximity data.
- No matter if the
sensors 6 are capacitive sensors or photocells, the detection may be made without any contact between the operator's hand and the plurality of sensors 6. Thus, the detection may be provided for in a contact-less manner. - Each
sensor 6 is connected to the circuitry 4 to be discussed below. - Now turning to
Fig. 2a, one set of sensors 6 is schematically illustrated equidistantly distributed along the circumference of the handle 1. Each sensor 6 is provided with an identification, in the example A, B, C, D, E and F. Also, each sensor 6 has a given position in view of the circumference of the handle 1 and as seen in view of the longitudinal extension of the handle 1. In the given example three sensors 6 denoted as A, F and E are arranged on a first side of a virtual line extending between a front portion of the handle 1 and a back portion of the handle 1 and three sensors 6 denoted as B, C and D are arranged on the other side of the virtual line. - As given above, each
sensor 6 is configured to detect, in a contact-less manner, a proximity to a user's hand in a condition when the operator grips the handle. The detected signal strength, i.e. the proximity data, is in the present example categorized into three different proximity intervals, see the table of Fig. 2b, denoted as "very close", "quite close" and "not close". In the table the proximity intervals are schematically illustrated by different shadings, which shadings are used also in Fig. 3. Accordingly, the proximity data detected by each individual sensor A, B, C, D, E, F is categorized as representing one of these three proximity intervals. It is to be understood that more or fewer proximity intervals may be used depending on the desired resolution. - By using a plurality of
sensors 6, the proximity data from a first sensor A may be compared with the proximity data from adjacent sensors B, C, D, E, F to thereby allow determining which sensor or sensors have the closest proximity to the operator's hand at a given time. By the known identification of each sensor A, B, C, D, E, F and also the known position of each sensor 6 on the handle 1, not only circumferentially but also longitudinally, it is made possible to determine which part of the hand is close to the plurality of sensors 6 and hence correlate that information to how the operator holds the handle 1 at a given time. - This is exemplified in
Fig. 3 which discloses twelve different positions of a brush head in view of the oral cavity. The twelve positions are, starting from the upper left corner: "outer portion of the upper right jaw"; "inner portion of the upper right jaw"; "outer portion of the lower right jaw"; "inner portion of the lower right jaw"; "outer portion of the upper middle part of the jaw"; "inner portion of the upper middle part of the jaw"; "outer portion of the lower middle part of the jaw"; "inner portion of the lower middle part of the jaw"; "outer portion of the upper left jaw"; "inner portion of the upper left jaw"; "outer portion of the lower left jaw"; "inner portion of the lower left jaw". - As is seen from these illustrations in
Fig. 3, the operator's wrist is angled differently in view of the longitudinal extension of the handle 1 depending on the position of the brush head in view of the oral cavity. Thus, the position of the carpus, the fingers and the hand palm differ depending on the position of the brush head in view of the oral cavity. The inventors have discovered that by arranging the plurality of sensors 6 equidistantly around the circumference of the handle 1, these sensors 6 may be used to categorize and thereby identify different grips in real time. - Now turning to
Fig. 3, the proximity intervals of Fig. 2b are applied to a first example shown in the upper left corner in Fig. 3. In this first example, sensors A and F both provide a proximity data that is categorized to be within the proximity interval "very close". Sensor B provides a proximity data that is within the proximity interval "quite close" and sensors C, D and E provide a proximity data that is within the proximity interval "not close". - Correspondingly, in the example in the upper right corner in
Fig. 3, sensors A, B and C do all provide a proximity data that is within the proximity interval "not close". Sensors D and F provide a proximity data that is within the proximity interval "quite close" and sensor E provides a proximity data that is within the proximity interval "very close". - Now turning to
Fig. 4, this pre-determined categorized information is represented in the form of an array with the twelve different possible localizations vs the proximity data for each sensor A, B, C, D, E, F in the plurality of sensors 6. The proximity intervals are stored in cells of this array. Each cell comprises a predetermined proximity interval for a given sensor and for a given categorized localization. - For a certain pre-determined localization to apply, the proximity data for all sensors A, B, C, D, E, F in the set of
sensors 6 must exhibit a certain combination of proximity intervals. As seen in Fig. 3 and Fig. 4 in combination, for the localization "upper right outer" to apply, sensors A and F must both provide a proximity data that is within the proximity interval "very close". Sensor B must provide a proximity data that is within the proximity interval "quite close" and sensors C, D and E must provide a proximity data that is within the proximity interval "not close".
- As will be describe below, this array of pre-determined information for different localizations is used by the
circuitry 4 of the handle 1 during operation to determine the brush head localization. The array may by way of example be stored in the internal data storage 7 of the handle 1 or in a data storage external to the handle 1. The internal data storage 7 may be one or more of a buffer, a flash memory, a hard drive, a removable media, a volatile memory, a non-volatile memory, a random access memory (RAM), or another suitable device. In a typical arrangement, the internal data storage 7 may include a non-volatile memory for long-term data storage and a volatile memory that functions as system memory for the control unit 8. - The
circuitry 4 of the handle 1 is configured to execute an orienting function configured to determine an orientation of the handle 1 by comparing the proximity data of the plurality of sensors 6 with a set of predetermined proximity intervals representative for specific orientations of the handle 1. Accordingly, by comparing the proximity data received from the plurality of sensors 6 with the predetermined proximity intervals given in the array, it may be possible to determine by way of example which portions of the oral cavity have been brushed or not, for how long and in which order. More precisely, the orienting function may be configured to determine for how long the brush head which is supported by the handle 1 has been in contact with a specific portion of the oral cavity during an operation cycle.
- The orienting function may be configured to categorize the determined orientation of the handle 1 as a function of time representative for specific orientations of the handle. Thus, it is made possible to determine e.g. for how long time during an operation cycle a specific part of the jaw has been brushed. This may be presented as a percentage of the time of a complete operation cycle. Alternatively, it may be presented to visualize a pattern to illustrate how the handle 1 was oriented during the operation cycle. Thus, the orienting function allows monitoring of the operator's performance during an operation cycle. This may be used to provide an immediate feedback during operation of the handle 1 to teach/maintain/promote a good operation technique. It may also be used in an evaluation made afterwards, by e.g. by the operator herself or by a third party.
- The
circuitry 4 may further be configured to execute a storing function configured to store the categorized orientation of the handle 1 in the internal data storage 7 as a function of time. - The plurality of
sensors 6 are configured to communicate with the control unit 8 and the internal data storage 7. The control unit 8 is configured to carry out overall control of functions and operations of the localization arrangement/localization method of the handle 1. The control unit 8 may include a processor, such as a central processing unit (CPU), microcontroller, or microprocessor. The processor is configured to execute program code stored in the internal data storage 7 to carry out the localization. - The handle 1 further comprises the
communication unit 9, wherein the communication unit 9 is configured to communicate the categorized orientation of the handle 1 as a function of time, or the determined orientation of the handle 1 as a function of time, to an external data storage separate from the handle 1. The communication unit 9 may be wired or wireless. In the event of a wireless communication unit, the communication may be made via Wi-Fi or Bluetooth. - The handle 1 may form part of an assembly. In addition to the handle 1, such assembly may also comprise a
cradle 20, see Fig. 1, which is configured to support the handle 1 when not in use. Such cradle 20 may comprise a circuitry 21 configured to execute an extraction function configured to extract the stored categorized orientation of the handle 1 as a function of time from the internal data storage 7 of the handle 1. The thus extracted information may by way of example be used for presentation to the user or to any third party. - The presentation to the operator may by way of example be made via a
display 22 which forms part of, or which is connected to, the cradle 20. The circuitry 21 of the cradle 20 is configured to execute a display function configured to display a representation of the categorized orientation of the handle 1 on the display 22. The representation may be made by using one or more features from a group consisting of photos, illustrations, icons, colours, text, numbers and audial signals. - The
cradle 20 may further comprise a communication unit 23 configured to communicate the categorized orientation of the handle 1 as a function of time to an external data storage 30 separate from the cradle 20. Such external data storage 30 may by way of example be a cloud-based solution to which a third party has remote access to the stored information. - Now turning to
Fig. 5, the invention and its operation may be described in terms of a method of detecting an orientation of a handle 1 of a personal care device. The method comprises the following acts:
Receiving 1000 from a plurality of sensors 6 on the handle 1 proximity data of a proximity between a user's hand and the plurality of sensors 6. - Determining 2000 the orientation of the handle 1 by comparing the proximity data with a set of predetermined proximity data representative for specific orientations of the handle.
- The method may further comprise the act of categorizing 3000 the determined orientation of the handle 1 as a function of time representative for specific orientations of the handle 1. The result of such categorization may by way of example be used to analyse which portions of the oral cavity have been brushed, for how long and in which sequence.
- While capacitive sensors are sensitive to water, the method may be provided with a calibration routine allowing the signal strength, and hence the proximity data, resulting from a water film or a water droplet to be differentiated from that of the operator's hand.
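By way of a non-limiting sketch, such a calibration routine may be illustrated as follows: a per-sensor baseline is recorded while the wet handle is not gripped, and subtracted from subsequent readings so that a water film is not mistaken for the operator's hand. All numeric values are hypothetical assumptions.

```python
# Illustrative sketch of a baseline-subtraction calibration. The
# baseline and readings below are hypothetical values.

def make_compensator(baseline):
    def compensated(raw):
        return {label: max(0.0, raw[label] - baseline.get(label, 0.0))
                for label in raw}
    return compensated

# Baseline recorded while the handle is wet but not held.
compensate = make_compensator({s: 0.15 for s in "ABCDEF"})
wet_grip = {"A": 0.95, "B": 0.60, "C": 0.20,
            "D": 0.18, "E": 0.17, "F": 0.93}
print(compensate(wet_grip))
```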
- According to yet another aspect, a non-transitory computer readable recording medium having computer readable program code recorded thereon which when executed on a device having processing capability is configured to perform the method as presented above.
- To sum up, the invention resides in the discovery resulting from the inventors' extensive investigations that the hand grip physically differs depending on which part of the oral cavity is brushed. In a broader perspective, the discovery is equally applicable to other parts of the body to be treated by using a personal care device. By providing the handle 1 with a plurality of
sensors 6, proximity data referring to the proximity between each sensor 6 and the operator's hand may be determined. Especially, the proximity data between the plurality of sensors 6 and one or more of the carpus of the hand, the palm and the fingers may be detected. By the plurality of sensors 6 being distributed along the circumference of the handle 1, the sensors 6 will provide a substantially 360 degree detection area around the gripping portion of the handle 1 in order to allow categorization and distinguishing between different types of hand grips. These categorized and distinguished hand grips may be equated with different localizations of a tooling supported by the handle 1, such as a brush head in the event of the device being a toothbrush. - The person skilled in the art realizes that the present invention by no means is limited to the preferred embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims. For example, the categorization has been exemplified based on twelve localizations. The skilled person realizes that the number of localizations is adapted to the intended body part to be treated by the personal care device. The number of localizations may be fewer or more. In the context of the device being an electric toothbrush, the number of localizations may be extended to sixteen to thereby also cover the chewing surfaces of the upper and lower molars in the left and the right jaw.
- The device has been explained based on one set of sensors comprising six sensors. The skilled person realizes that the number of sets may be increased within the scope of the invention. Also, the number of sets of sensors and their relative position may be changed. Also, in the event of two or more sets of sensors, the number of sensors in each set may differ. Further, it is to be understood that the number of proximity intervals may be increased to more than three or decreased to fewer than three.
- The sensors have been exemplified as capacitive sensors or photocells. It is to be understood that also other types of sensors may be used with the same principle. It is also to be understood that two types of sensors may be combined.
Claims (14)
- A personal care device, comprising a handle (1) comprising a plurality of sensors (6) configured for detecting proximity data between each of the plurality of sensors (6) and a user's hand when gripping the handle (1).
- The personal care device according to claim 1, wherein the plurality of sensors (6) are equidistantly distributed along the circumference of the handle (1).
- The personal care device according to claim 1 or 2, wherein the plurality of sensors (6) is at least four sensors and more preferably at least six sensors.
- The personal care device according to any of the preceding claims, wherein the plurality of sensors (6) are capacitive sensors or photocells.
- The personal care device according to any of the preceding claims, further comprising a circuitry (4) configured to execute an orienting function configured to determine an orientation of the handle (1) by comparing the proximity data of the plurality of sensors (6) with a set of predetermined proximity intervals representative of specific orientations of the handle (1).
- The personal care device according to claim 5, wherein the orienting function is further configured to categorize the determined orientation of the handle (1) as a function of time representative of specific orientations of the handle (1).
- The personal care device according to claim 6, further comprising an internal data storage (7), wherein the circuitry (4) is further configured to execute a storing function configured to store the categorized orientation of the handle (1) as a function of time in the internal data storage (7).
- The personal care device according to claim 5 or 6, further comprising a communication unit (9), wherein the communication unit (9) is configured to communicate the categorized orientation of the handle (1) as a function of time, or the determined orientation of the handle (1) as a function of time to an external data storage (30) separate from the handle (1).
- An assembly comprising a personal care device according to claim 7 and a cradle (20) configured to support the handle (1) when not in use, the cradle (20) comprising circuitry (21) configured to execute an extraction function configured to extract the stored categorized orientation of the handle (1) as a function of time from the internal data storage (7) of the handle (1).
- The assembly according to claim 9, wherein the cradle (20) further comprises a display (22), wherein the circuitry (21) of the cradle (20) is further configured to execute a display function configured to display a representation of the categorized orientation of the handle (1) on the display (22).
- The assembly according to claim 9 or 10, wherein the cradle (20) further comprises a communication unit (23) configured to communicate the categorized orientation of the handle (1) as a function of time to an external data storage (30) separate from the cradle (20).
- Method of detecting an orientation of a handle (1) of a personal care device (100), comprising:
receiving (1000), from a plurality of sensors (6) on the handle (1), proximity data of a proximity between a user's hand and the plurality of sensors (6); and
determining (2000) the orientation of the handle (1) by comparing the proximity data with a set of predetermined proximity data representative of specific orientations of the handle (1).
- The method of claim 12, further comprising categorizing (3000) the determined orientation of the handle (1) as a function of time representative of specific orientations of the handle (1).
- A non-transitory computer readable recording medium having computer readable program code recorded thereon which when executed on a device having processing capability is configured to perform the method of any one of claims 12-13.
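The categorizing step of claims 6 and 13 — treating the determined orientation as a function of time — can be sketched as a simple time-series aggregation. This is an illustrative assumption about one possible realization; the sampling scheme, labels, and timings below are hypothetical and not specified by the claims.

```python
# Sketch: aggregate timestamped orientation classifications into time
# spent per orientation, assuming each classified sample holds until
# the next sample arrives. Labels and timestamps are hypothetical.
from collections import defaultdict

def categorize_over_time(samples):
    """samples: list of (timestamp_seconds, orientation_label) pairs
    in chronological order. Returns a dict of seconds spent in each
    orientation."""
    totals = defaultdict(float)
    for (t0, label), (t1, _) in zip(samples, samples[1:]):
        totals[label] += t1 - t0
    return dict(totals)

log = [(0.0, "upper-left"), (5.0, "upper-left"),
       (10.0, "lower-right"), (18.0, "upper-left")]
print(categorize_over_time(log))
# → {'upper-left': 10.0, 'lower-right': 8.0}
```

Such a per-orientation time record is the kind of data the storing function (claim 7) could keep in the internal data storage and the cradle's extraction function (claim 9) could later retrieve for display.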
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20185356.1A EP3936000A1 (en) | 2020-07-10 | 2020-07-10 | Personal care device |
PCT/EP2021/067939 WO2022008308A1 (en) | 2020-07-10 | 2021-06-30 | Personal care device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20185356.1A EP3936000A1 (en) | 2020-07-10 | 2020-07-10 | Personal care device |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3936000A1 (en) | 2022-01-12 |
Family
ID=71607712
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20185356.1A Withdrawn EP3936000A1 (en) | 2020-07-10 | 2020-07-10 | Personal care device |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3936000A1 (en) |
WO (1) | WO2022008308A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160317267A1 (en) * | 2013-12-12 | 2016-11-03 | Koninklijke Philips N.V. | Toothbrush with variable touch selection system and method of operation thereof |
US20190299437A1 (en) * | 2018-03-27 | 2019-10-03 | Braun Gmbh | Personal care device |
- 2020-07-10: EP EP20185356.1A patent/EP3936000A1/en not_active Withdrawn
- 2021-06-30: WO PCT/EP2021/067939 patent/WO2022008308A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2022008308A1 (en) | 2022-01-13 |
Similar Documents
Publication | Title |
---|---|
EP3525621B1 (en) | Connected hairbrush |
Hoelscher et al. | Evaluation of tactile feature extraction for interactive object recognition |
WO2020131471A2 (en) | Apparatus and method for operating a personal grooming appliance or household cleaning appliance |
CN113710196A (en) | Oral care system for interdental space detection |
CN108066030A (en) | Oral care systems and method |
US20190209077A1 (en) | Grooming instrument configured to monitor hair loss/growth |
JP6647438B1 (en) | Head sensing device, information processing device, head measurement method, information processing method, program |
US20210022840A1 (en) | Systems and methods for measuring an oral arch geometry and selecting a mouthpiece based on oral arch measurements |
CN107920790A (en) | Detection and assessment of a user's grip on a handheld tool |
CN112884132B (en) | Tooth brushing detection method and device based on neural network, electric toothbrush and medium |
JP2016073526A (en) | Hair brush |
EP3936000A1 (en) | Personal care device |
KR20190114120A (en) | Smart toothbrush and mobile terminal |
US20200085313A1 (en) | Thermal field scanner |
CN115300155A (en) | Method and device for identifying the working area of a toothbrush, electric toothbrush and storage medium |
JP2012063844A (en) | System, device and program for identifying a user |
TWI693039B (en) | Toothbrush with physiological sensing function |
WO2020237494A1 (en) | Toothbrush having physiological sensing function |
WO2018166902A1 (en) | An apparatus and a method of estimating the position of a handheld personal care device relative to a user |
KR20200034673A (en) | Shaver handle and method of use |
US20220225928A1 (en) | Detection of paralysis, weakness and/or numbness in a part of a body of a subject |
JPWO2020054468A1 (en) | Hairdressing and beauty leather |
JP2021522936A (en) | System that determines the brushing parameters of an electric toothbrush |
US20210401360A1 (en) | Hair analysing device |
CN113588144B (en) | Stress distribution detection system, method and device |
Legal Events
- PUAI — Public reference made under article 153(3) EPC to a published international application that has entered the European phase — Free format text: ORIGINAL CODE: 0009012
- STAA — Information on the status of an EP patent application or granted EP patent — Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED
- AK — Designated contracting states — Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
- B565 — Issuance of search results under rule 164(2) EPC — Effective date: 20210114
- STAA — Information on the status of an EP patent application or granted EP patent — Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN
- 18D — Application deemed to be withdrawn — Effective date: 20220713