EP3522753B1 - Smart toothbrush - Google Patents

Smart toothbrush (Brosse à dents intelligente)

Info

Publication number
EP3522753B1
Authority
EP
European Patent Office
Prior art keywords
toothbrush
sensor
user
brushing
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP17783796.0A
Other languages
German (de)
English (en)
Other versions
EP3522753A1 (fr)
Inventor
Adam Thomas Russell
Derek Guy Savill
Katharine Jane SHAW
Robert Lindsay Treloar
Ruediger ZILLMER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Unilever Global IP Ltd
Unilever IP Holdings BV
Original Assignee
Unilever Global IP Ltd
Unilever IP Holdings BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Unilever Global IP Ltd and Unilever IP Holdings BV
Publication of EP3522753A1
Application granted
Publication of EP3522753B1
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A46 BRUSHWARE
    • A46B BRUSHES
    • A46B15/00 Other brushes; Brushes with additional arrangements
    • A46B15/0002 Arrangements for enhancing monitoring or controlling the brushing process
    • A46B15/0004 Arrangements for enhancing monitoring or controlling the brushing process with a controlling means
    • A46B15/0006 Arrangements for enhancing monitoring or controlling the brushing process with a controlling means with a controlling brush technique device, e.g. stroke movement measuring device
    • A HUMAN NECESSITIES
    • A46 BRUSHWARE
    • A46B BRUSHES
    • A46B11/00 Brushes with reservoir or other means for applying substances, e.g. paints, pastes, water
    • A46B11/0006 Brushes with reservoir or other means for applying substances, e.g. paints, pastes, water specially adapted to feed the bristle upper surface
    • A HUMAN NECESSITIES
    • A46 BRUSHWARE
    • A46B BRUSHES
    • A46B15/00 Other brushes; Brushes with additional arrangements
    • A46B15/0002 Arrangements for enhancing monitoring or controlling the brushing process
    • A46B15/0004 Arrangements for enhancing monitoring or controlling the brushing process with a controlling means
    • A46B15/0008 Arrangements for enhancing monitoring or controlling the brushing process with a controlling means with means for controlling duration, e.g. time of brushing
    • A HUMAN NECESSITIES
    • A46 BRUSHWARE
    • A46B BRUSHES
    • A46B2200/00 Brushes characterized by their functions, uses or applications
    • A46B2200/10 For human or animal care
    • A46B2200/1066 Toothbrush for cleaning the teeth or dentures

Definitions

  • the present invention relates to a smart toothbrush, more particularly to a smart toothbrush capable of generating a map of the journey taken by the toothbrush during a toothbrushing session.
  • the efficiency of toothbrushing depends not only on the regions of the mouth visited, but also on the path taken between these regions, the order in which they are visited, and the time spent at each region. These features together define the journey of a toothbrushing session. Often, users will unwittingly follow a journey through force of habit or routine, with no particular goal in mind. This can lead to some areas of the mouth receiving better or worse treatment.
  • toothbrushes with sensors to determine and/or track the position within a user's mouth are known.
  • however, such sensor-based toothbrushes can suffer from problems of accuracy and inconvenience for the user. It is known that magnetic fields and magnetometers can be used to determine precise positions of the toothbrush during brushing in a lab or dental surgery.
  • Methods for tracking and assessing brushing activity may be affected by strong noise and variability.
  • a significant part of the noise can be removed by accurately detecting the onset of brushing action, i.e., contact of the bristles with a tooth or gum surface.
  • this can be achieved by a combination of activity and sound-level measurement ( US20130166220 A1 ).
  • US2015/044629A1 describes oral care systems having an oral care tool and a first software application.
  • the oral care tool includes a cleaning module, a sensing module and a communications module.
  • the first software application is run on a computing device, such as a mobile computing device, that receives the data from the sensing module and reproduces a simulated image reflecting the result of brushing.
  • the mobile computing device can be linked to a cloud server that receives data from the first software application and saves detailed brushing data for each user.
  • the first software application visualizes brushing and transforms brushing into a fun game.
  • the cloud server collects and stores the detailed brushing data and provides it to users and care providers.
  • US2009/044356A1 describes a toothbrush that includes a brush head having a capsule receiving zone sized to receive a dentifrice capsule, and a plurality of angled bristles adapted to retain the capsule in such zone. Loading of such toothbrush with a capsule may be performed with a dispenser including a toothbrush manipulation element adapted to move a retention structure associated with the brush head, and a dispensation element adapted to deliver the capsule into the receiving zone.
  • a dispenser may include a desiccant material disposed to receive moisture from a capsule storage container.
  • a dispenser may further include a plurality of toothbrush insertion apertures, each adapted to receive a portion of a different toothbrush, so as to enable multiple users each having different toothbrushes to use the same dispenser without requiring contact between multiple brushes and a single capsule interface surface of the dispenser.
  • the present invention comprises a smart toothbrush and system as defined by the claims. Embodiments that do not fall within the scope of the claims are to be interpreted as examples useful for understanding the invention.
  • a method of monitoring toothbrushing comprising:
  • the method provides two separate phases: a coaching phase and a monitoring phase.
  • by producing an adapted statistical model during the coaching phase and using it during the monitoring phase, it is possible to carry out the monitoring phase at home while still benefiting from the more accurate monitoring usually restricted to systems at the dentist and/or hygienist.
  • both steps can be carried out during a user's free brushing. Not only does this result in a more user-friendly device, but it also eliminates erroneous readings that can arise when the user follows "guide"-type instructions.
  • the proposed technology helps customers to brush their teeth better by improving the quality and user experience during the coaching phase and thus the value of the free brushing assessment and feedback.
  • the coaching phase comprises the steps of:
  • the step of detecting position information of the toothbrush may include recording accelerometer data from an accelerometer located on or within the toothbrush and also recording of positional data from the external position sensor.
  • Each of the toothbrush and the external sensor may send their data separately to an external computer which processes the information received and correlates acceleration measurements from the toothbrush with the positional information from the external sensor. For example, in some embodiments, a timestamp created upon turning on the toothbrush or in response to any specific stimulus at the toothbrush provides a reference point for the comparison and combination of the two sets of data.
  • the external computer takes the form of a mobile device.
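As a minimal sketch of the timestamp-based correlation described above, the following illustrative snippet merges the accelerometer stream from the toothbrush with the external position stream by nearest timestamp, assuming both streams are expressed relative to a shared reference event such as toothbrush switch-on. All names, values and the data layout are assumptions for illustration, not taken from the patent.

```python
# Sketch: merging toothbrush accelerometer samples with external position samples
# by nearest timestamp. Assumes both devices measure time against a shared
# reference event (e.g. toothbrush switch-on); all names are illustrative.
from bisect import bisect_left

def merge_streams(accel_samples, position_samples):
    """accel_samples: list of (t_seconds, (ax, ay, az));
    position_samples: list of (t_seconds, (x, y, z)); both sorted by time.
    Returns a list of (t, accel, nearest_position)."""
    pos_times = [t for t, _ in position_samples]
    merged = []
    for t, accel in accel_samples:
        i = bisect_left(pos_times, t)
        # pick whichever neighbouring position sample is closer in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(position_samples)]
        j = min(candidates, key=lambda k: abs(pos_times[k] - t))
        merged.append((t, accel, position_samples[j][1]))
    return merged

# Example usage with two short synthetic streams
accel = [(0.00, (0.1, 0.0, 9.8)), (0.02, (0.2, 0.1, 9.7))]
pos = [(0.01, (12.0, 3.0, 5.0)), (0.03, (12.5, 3.1, 5.0))]
print(merge_streams(accel, pos))
```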
  • the step of generating a user-specific statistical model from the combined data may take the form of multivariate classification models which may be trained for each user.
  • a multivariate normal approximation may be applied together with a Mahalanobis distance or a multimodal probability in order either to select the most likely position or to assign a weight (likelihood) to particular positions (which may correspond to particular segments at the user interface).
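As an illustration of this classification step, the sketch below assigns a sensor feature vector to the most likely mouth segment using per-segment means and covariances (as would be learned during the coaching phase) and the Mahalanobis distance. The cluster values, segment names and use of NumPy are illustrative assumptions, not taken from the patent.

```python
# Sketch: classifying a sensor feature vector against per-segment clusters using
# the Mahalanobis distance under a multivariate normal approximation.
# Cluster parameters (mean, covariance) would come from the coaching phase.
import numpy as np

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

def classify(x, clusters):
    """clusters: {segment_name: (mean, covariance)}; returns (best_segment, weights)."""
    distances = {name: mahalanobis(x, mean, np.linalg.inv(cov))
                 for name, (mean, cov) in clusters.items()}
    # convert distances to likelihood-style weights
    # (ignores each cluster's normalising constant; illustrative only)
    weights = {name: np.exp(-0.5 * d ** 2) for name, d in distances.items()}
    total = sum(weights.values())
    weights = {name: w / total for name, w in weights.items()}
    best = min(distances, key=distances.get)
    return best, weights

# Illustrative two-segment example in a 2-D feature space
clusters = {
    "upper_left_outer": (np.array([1.0, 0.0]), np.eye(2) * 0.2),
    "lower_left_outer": (np.array([-1.0, 0.0]), np.eye(2) * 0.2),
}
print(classify(np.array([0.8, 0.1]), clusters))
```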
  • the monitoring phase comprises the steps of:
  • free brushing is simply referring to the user brushing their teeth as normal with no instructions.
  • the opposite of free brushing would be following instructions, e.g. "brush top left teeth inner surface".
  • the coaching phase includes an alternative coaching model in which the user follows an instruction guide on a mobile device, wherein the user-specific statistical model generated by the alternative coaching model maps the positional information displayed on the mobile device onto the movement information received simultaneously from the one or more additional sensors.
  • this "guide mode" may be less accurate than the "free brushing" coaching mode; however, the provision of both modes in one system allows for a more flexible device.
  • the position sensor includes an external head-mounted sensor.
  • the transmitter produces a magnetic field in the frame of reference of the jaw.
  • the position sensor in the brush samples the field, and the data generated at a given sample rate is sent back to the base station.
  • Communication between the external position sensor, the toothbrush and the external computer is preferably wireless.
  • the position sensor includes an external wall-mountable sensor.
  • the position sensor is located within a wall mountable mirror.
  • the position sensor is a magnetic field generator and a corresponding magnetic field sensor, the magnetic field sensor being located on the toothbrush.
  • the magnetic generator is wearable by the user.
  • the magnetic generator is wall mounted, for example as part of, or attached to, a mirror.
  • the magnetic field generator may take the form of a DC magnetic field generator, such as the switching DC magnetic field technology from Ascension Technology Corporation. However, the skilled person would appreciate that other commercial magnetic field generators are available.
  • a system for coaching a smart toothbrush comprising:
  • a smart toothbrush comprising:
  • the recordal of the journey may indicate that certain areas of the mouth are missed out altogether or very rarely visited. This kind of neglect can lead to an increase in plaque and staining, and potentially to a deteriorating appearance of teeth and/or gum disease.
  • the user can adapt their brushing journey accordingly, to find a more optimal path, with optimal transitions from one area of the mouth to another. In this way, it is possible for the user to minimise any adverse effect on their oral health caused by accidental neglect and/or poor brushing technique.
  • the computing module is programmed to use a travelling salesman machine learning algorithm to process the recorded position and time information and produce a map of an optimal path to be taken by the toothbrush during the toothbrushing session.
  • a travelling salesman approach to optimal route finding is one example of possible route/path finding algorithms that may be applicable for determining an optimal path to be taken by the toothbrush during a toothbrushing session.
  • a travelling salesman machine learning algorithm may take the form of any algorithm which solves "the travelling salesman problem".
  • the algorithm receives a number of data points (a "transition matrix"); in this case, the data points may correspond to a plurality of positions in the mouth and associated timings. This information is converted into a cost function for each possible transition between data points.
  • the algorithm determines the minimum cost route between the different data points. Along the route, each of the positions in the mouth may be “visited” by the toothbrush more than once. In other words, the route may include brushing one or more of the positions in the mouth more than once.
  • an example method of optimizing the path for a given individual is provided. This may take into account user-specific issues such as sensitivity, missing or misaligned teeth, and other problem areas. By inverting the transition matrix, a least optimal path could be calculated.
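A minimal sketch of this route-finding step is shown below: it searches a transition cost matrix exhaustively for the minimum-cost ordering of segments (feasible for 10 segments, though a heuristic solver could equally be used). The cost values are illustrative; in practice they could be derived from transition probabilities, for example as -log(p), and inverting or negating them would yield a least optimal path.

```python
# Sketch: finding a minimum-cost brushing route over a transition cost matrix.
# For 10 segments an exhaustive search is feasible; real implementations may
# prefer heuristics. Cost values are illustrative.
from itertools import permutations

def best_route(cost, start=0):
    """cost[i][j]: cost of moving from segment i to segment j."""
    n = len(cost)
    best_path, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        if perm[0] != start:
            continue
        c = sum(cost[perm[k]][perm[k + 1]] for k in range(n - 1))
        if c < best_cost:
            best_path, best_cost = perm, c
    return best_path, best_cost

# Tiny 4-segment example
cost = [
    [0.0, 1.0, 4.0, 3.0],
    [1.0, 0.0, 2.0, 5.0],
    [4.0, 2.0, 0.0, 1.5],
    [3.0, 5.0, 1.5, 0.0],
]
print(best_route(cost))
```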
  • the toothbrush body further comprises:
  • the "smart" toothbrush is capable of dispensing an active ingredient based on information received by the toothbrush.
  • this enables an active ingredient to be more precisely dispensed.
  • This avoids many of the problems associated with regular toothbrushing in which an active ingredient is pre-loaded onto the head of the brush before the brushing episode begins.
  • One such disadvantage is the fact that users often start a brushing episode at a particular location and follow an automatic brushing pattern or journey, meaning that certain areas of the mouth (usually those contacted at the start of the brushing episode) routinely receive more of the active ingredient than others.
  • By selectively dispensing the active ingredient in response to sensor information it is also possible to better control wastage of the active ingredient (e.g. due to spitting out or undissolved ingredients). It is also possible to avoid over application of active ingredients such as bleach which could be damaging to the user if applied in excess.
  • the activation signal is triggered in response to information from the sensor.
  • the information detected by the sensor includes:
  • the predefined map of the user's mouth may correspond to the 10 segments described herein.
  • the information detected by the sensor includes timing information.
  • the start of timing may be triggered by a single event, such as a sensor threshold being reached, a change in direction being detected, or a change in acceleration being detected. In other cases, the start of timing may be triggered when two or more sensors simultaneously record a threshold value (e.g. the microphone and accelerometer simultaneously recording sound and movement measurements).
  • a timestamp may be made for each record, starting either at brush turn-on or at the start of brushing. Where the timestamps (and associated recorded data) are recorded from brush turn-on, an extra step of determining the start of brushing may also be incorporated. Suitable mechanisms for detecting the start of brushing are described herein.
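The following sketch illustrates one possible start-of-brushing trigger of the kind described above, requiring two sensor channels to exceed their thresholds in the same sample. The threshold values and data layout are assumptions for illustration only.

```python
# Sketch: detecting the start of brushing when two sensor channels exceed their
# thresholds in the same sample. Threshold values are illustrative only.
SOUND_THRESHOLD = 0.6      # normalised microphone level (assumed)
MOTION_THRESHOLD = 1.5     # m/s^2 deviation from rest (assumed)

def brushing_started(samples, sound_thr=SOUND_THRESHOLD, motion_thr=MOTION_THRESHOLD):
    """samples: iterable of (timestamp, sound_level, motion_magnitude).
    Returns the timestamp at which brushing is deemed to start, or None."""
    for t, sound, motion in samples:
        if sound >= sound_thr and motion >= motion_thr:
            return t
    return None

samples = [(0.0, 0.1, 0.2), (0.5, 0.7, 0.4), (1.0, 0.8, 2.1)]
print(brushing_started(samples))  # -> 1.0
```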
  • the smart toothbrush further comprises at least one additional reservoir.
  • the active located within the additional reservoir may provide an additional care step which complements a first active, such as toothpaste located within the first reservoir.
  • An example of such an additional care step is gum conditioning serum.
  • the different actives may be dispensed at the same time as one another, or at different times to one another. Where different actives are applied at different times, each active will be triggered by a different sensor reading. For example, when the sensor detects that the toothbrush head is in a first predetermined location within the mouth, a first active may be dispensed from the first reservoir. When the sensor detects that the toothbrush head is in a second predetermined location within the mouth, a second active may be dispensed from the second reservoir.
  • a second active may be dispensed from the at least one additional reservoir simultaneously with the dispensed first active from the first reservoir.
  • the smart toothbrush further comprises a sound sensor for detecting contact between the head of the toothbrush and the jaw of the user.
  • the map of the journey is displayed on a mobile device.
  • raw data is processed on the brush, then the position and time stamp are sent via wireless communication to the external computer (which may take the form of a mobile device such as a mobile phone or tablet but may also take the form of a smart mirror or wall-mounted computer screen).
  • the raw data itself is transmitted to the external computer and subsequently processed at the external computer (again, this may take the form of a mobile device such as a mobile phone or tablet).
  • Wireless communication may take the form of Bluetooth, Wi-Fi or similar.
  • a tracking system for recording the position of a toothbrush during toothbrushing comprising:
  • the use of a combination of inertial measurement sensors, sound, and a bespoke magnetic field achieves high-resolution toothbrush tracking.
  • the magnetic field is generated by wearable magnets or field generators, allowing low-cost design and easy use of the system in dentist practices or at home.
  • the orientation sensor is an accelerometer or an inertial measurement unit for measuring the rotational position of the toothbrush.
  • the sensor for detecting contact between the toothbrush and the jaw of the user is a sound sensor; and wherein, upon detection by the sound sensor of contact between the toothbrush and the jaw, the measurement of position within the magnetic field is restricted to a measurement of position along a known, pre-calibrated, one-dimensional jaw line.
  • the magnetic field generator is a wearable magnetic field generator; wherein the magnetic sensor detects the absolute position of the toothbrush within the field created by the magnetic field generator.
  • the magnetic field is tuned to generate iso-surfaces of magnitude which lie perpendicular to the jaw.
  • the magnetic sensor comprises a processor which detects the absolute position of the toothbrush by measuring the field magnitude M at the brush and the vertical field component M_z at the brush.
  • an initial calibration step is carried out to create a look-up table for correlating measured magnetic field values with position information.
  • One such calibration step is carried out by initially moving a magnetometer probe in a controlled way, tooth-by-tooth along predetermined points along the jaw of the user, for example along the centres of the biting surfaces of each tooth. The resulting magnetometer readings are stored in the look-up table alongside the predetermined position points.
  • the magnetometer will have a known separation from the head of the toothbrush which contacts the surface of the user's teeth. This separation can be adjusted for, by using a pre-determined model. Since the level of adjustment may depend upon the orientation of the brush, the orientation of the brush can be taken from accelerometer data and incorporated into the model.
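A minimal sketch of the calibration look-up described above follows: magnetometer readings recorded tooth-by-tooth during the guided sweep are stored against position labels, and a live reading is resolved to the nearest calibrated entry. The head-offset and orientation correction mentioned above is omitted for brevity; all labels and values are illustrative assumptions.

```python
# Sketch: building a calibration look-up table of magnetometer readings taken
# tooth-by-tooth, then resolving a live reading to the nearest calibrated tooth.
# The fixed magnetometer-to-bristle offset correction is omitted for brevity.
import math

def build_lookup(calibration):
    """calibration: list of (tooth_label, (mx, my, mz)) from the guided sweep."""
    return list(calibration)

def nearest_tooth(lookup, reading):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(lookup, key=lambda entry: dist(entry[1], reading))[0]

lookup = build_lookup([
    ("upper_incisor", (12.0, 0.5, 30.0)),
    ("upper_left_molar", (4.0, 8.0, 22.0)),
    ("lower_left_molar", (4.5, 7.5, -20.0)),
])
print(nearest_tooth(lookup, (4.2, 7.8, 21.0)))  # -> "upper_left_molar"
```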
  • Values of acceleration due to gravity are needed to obtain information about the orientation of the brush (rather than about its kinetic acceleration). These may be obtained by applying low-pass filtering to the overall output of the accelerometer, which contains contributions from both gravitational acceleration and kinetic acceleration.
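As an illustration of this filtering step, the sketch below applies a simple exponential low-pass filter to raw accelerometer samples to isolate the gravity component, then derives pitch and roll from it. The smoothing factor and sample values are assumptions, not taken from the patent.

```python
# Sketch: separating the gravity component from raw accelerometer output with a
# simple exponential low-pass filter, then deriving pitch and roll. The smoothing
# factor alpha is illustrative and would be tuned to the sample rate.
import math

def gravity_filter(samples, alpha=0.1):
    """samples: list of (ax, ay, az). Returns the low-passed gravity vectors."""
    gravity = list(samples[0])
    out = []
    for ax, ay, az in samples:
        gravity[0] = alpha * ax + (1 - alpha) * gravity[0]
        gravity[1] = alpha * ay + (1 - alpha) * gravity[1]
        gravity[2] = alpha * az + (1 - alpha) * gravity[2]
        out.append(tuple(gravity))
    return out

def pitch_roll(gx, gy, gz):
    pitch = math.degrees(math.atan2(-gx, math.sqrt(gy * gy + gz * gz)))
    roll = math.degrees(math.atan2(gy, gz))
    return pitch, roll

samples = [(0.2, 0.1, 9.7), (0.4, 0.0, 9.8), (3.0, 0.2, 9.3)]  # last sample includes a kinetic spike
print(pitch_roll(*gravity_filter(samples)[-1]))
```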
  • the toothbrush comprises:
  • the toothbrush comprises a data logger for recording data from the magnetic sensor and/or the orientation sensor; wherein the data logger is only activated when the sound receiver detects a value which meets a predetermined threshold criterion.
  • the data logger comprises a memory for recording the detected orientation information and a processor programmed to convert this orientation information into positional information.
  • the processor is configured to carry out multivariate classification which typically involves a trained lookup of cluster parameters used to classify new data.
  • the sound receiver includes a frequency filter for detecting changes in frequency of the electric motor and/or vibrator when the toothbrush is in contact with a tooth; and wherein the data logger is only activated when the sound receiver detects a value which meets a predetermined threshold criterion.
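A hedged sketch of such a frequency-based gate is shown below: the dominant motor frequency is estimated from a microphone window with an FFT, and the data logger is enabled only when that frequency has dropped by more than a threshold from an assumed free-running value. The sample rate, nominal frequency and threshold are illustrative assumptions, not values from the patent.

```python
# Sketch: gating the data logger on a drop in the dominant motor frequency, which
# falls under load when the bristles contact a tooth. The sample rate, free-running
# frequency and trigger band are illustrative values.
import numpy as np

SAMPLE_RATE = 8000          # Hz, microphone sampling rate (assumed)
FREE_RUNNING_HZ = 250.0     # nominal motor frequency when not in contact (assumed)
SHIFT_THRESHOLD_HZ = 10.0   # required downward shift before logging starts

def dominant_frequency(window):
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    return freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

def logger_enabled(window):
    return FREE_RUNNING_HZ - dominant_frequency(window) >= SHIFT_THRESHOLD_HZ

# Synthetic one-second window of a 235 Hz tone (motor slowed by tooth contact)
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
window = np.sin(2 * np.pi * 235.0 * t)
print(logger_enabled(window))  # -> True
```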
  • the sensor for detecting contact between the toothbrush and the jaw of a user is:
  • the pressure sensor may take the form of a piezoelectric material which generates a voltage when pressure applied to the head of the toothbrush is transferred to the piezoelectric material.
  • the bristles of the toothbrush may directly transfer pressure from the interaction between the user's jaw and the toothbrush head to the piezoelectric material.
  • pressure may be applied to the piezoelectric material by a flexible portion of the body of the toothbrush, usually its neck.
  • a toothbrush comprising:
  • a toothbrush comprising:
  • the smart toothbrush 1 comprises a toothbrush body, which is made up of a head 2 for contacting the teeth of a user and a handle 3 for contacting the hand of a user.
  • the head includes an array of bristles 21 which contact the teeth and gums of the user during toothbrushing in order to clean away plaque and debris.
  • the array of bristles 21 is typically mounted upon a neck piece 22 which connects the head to the handle 3 of the toothbrush body.
  • the toothbrush body is hollow, with an outer shell defining one or more cavities within which smart components such as sensors, a power source, a display, and communication means are located.
  • components located within the cavity include: a display 41; sensors 42 (including one or more of the following: an accelerometer, a gyroscope, and a magnetometer 42); a microphone 43; a power source (such as a rechargeable battery); a computing module including a processor; and a communication module.
  • the smart toothbrush is able to determine a number of parameters including: orientation (via an accelerometer and/or gyroscope); average sound level (via a microphone); axis correlations (via a magnetometer); and phase velocity (via the accelerometer).
  • the handle has a curved exterior with ergonomic curves, including an external convex portion 31 which acts as a thumb rest but also provides a corresponding cavity for larger components such as the display 41.
  • the computing module typically includes a memory and a processor, the processor of the computing module configured to perform one or more desired functions.
  • the processor is programmed to record the position of the toothbrush as detected by one or more of the sensors, at various points during a toothbrushing session. This recordal may take place at repeated time intervals throughout use of the toothbrush by the user and the processor may also be programmed to record the time (in the memory) at which each detected position was detected.
  • the processor is also programmed to produce a map of the journey taken by the toothbrush during the toothbrushing session using the recorded position and time information.
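The following sketch illustrates one way such a journey map could be assembled from the recorded (time, position) pairs, collapsing consecutive readings in the same segment into a single leg with a dwell time. The segment numbering follows the 10-segment scheme described later; the data values and structure are illustrative assumptions.

```python
# Sketch: recording detected positions at regular intervals and collapsing the
# resulting time series into a journey (the ordered list of segments visited,
# with a dwell time per segment). Values are illustrative.
def build_journey(records):
    """records: list of (timestamp_seconds, segment) in chronological order."""
    journey = []
    for i, (t, segment) in enumerate(records):
        t_next = records[i + 1][0] if i + 1 < len(records) else t
        if journey and journey[-1]["segment"] == segment:
            journey[-1]["dwell"] += t_next - t
        else:
            journey.append({"segment": segment, "start": t, "dwell": t_next - t})
    return journey

records = [(0, 5), (1, 5), (2, 3), (3, 3), (4, 2)]   # segment numbers as in Fig. 13b
for leg in build_journey(records):
    print(leg)
```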
  • the toothbrush includes an on/off switch (not shown), typically in the form of a capacitive switch.
  • the processor can be programmed to initiate an auto switch-off after the timer detects a period of inactivity which extends beyond a given threshold.
  • the prolonged period of inactivity may be measured in relation to a specific sensor, e.g. when no activity is detected by the accelerometer for the given threshold of time. In other cases, the period of inactivity may require that two or more sensors show no activity over the same time threshold.
  • the timer (which may be a part of the computing module) may also include a feature for providing an indication to the user once a predetermined period of activity has been measured. For example, the indication may provide a signal to the user once a continuous recording of activity has been measured by one sensor, or by two or more sensors simultaneously, for a predetermined brushing time (a sensible value being 2 minutes).
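A minimal sketch of this timer behaviour, combining the inactivity switch-off and the target brushing-time indication described above, is given below. The threshold values and one-second sampling are assumptions for illustration.

```python
# Sketch: a timer that switches the brush off after prolonged inactivity and
# signals the user once a target brushing time has accumulated. Thresholds are
# illustrative (2 minutes target, 30 seconds inactivity timeout).
TARGET_BRUSHING_S = 120
INACTIVITY_TIMEOUT_S = 30

def run_timer(activity_samples, dt=1.0):
    """activity_samples: iterable of booleans, one per dt seconds, True = activity."""
    active_total, idle = 0.0, 0.0
    notified = False
    for active in activity_samples:
        if active:
            active_total += dt
            idle = 0.0
        else:
            idle += dt
        if not notified and active_total >= TARGET_BRUSHING_S:
            notified = True
            print("signal user: target brushing time reached")
        if idle >= INACTIVITY_TIMEOUT_S:
            print("auto switch-off")
            break

run_timer([True] * 125 + [False] * 35)
```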
  • the communication module may take the form of a wireless communication module, such as a Bluetooth module from which data can be transmitted and/or received.
  • the communication module also provides an interface across which automatic downloads of data may be carried out following periods when the brush was not connected to the app.
  • Figures 15a, 15b, 15c, 15d and 15e each show respective embodiments of a smart toothbrush, where the toothbrush includes one or more reservoirs for active ingredient(s).
  • the reservoirs and associated dosing mechanisms described below in relation to Figures 15a-15e could each be applied to the smart toothbrush described above in relation to Figures 1-3 .
  • the active ingredients are typically liquid based but could take other forms (e.g. gas, gel, foam or another gas/liquid mixture). Examples of active ingredients include a serum or a hardener.
  • the active material may include anti-microbial properties, or cosmetic properties such as tooth whitening.
  • the active ingredient is a solid ingredient and a reservoir which is in fluid communication with the solid ingredient contains a solvent. When the solvent comes into contact with a portion of the solid ingredient, it will dissolve an amount of the solid ingredient, which can then be transported to the head of the toothbrush as the active ingredient to be dispensed.
  • One advantage of such an embodiment is the fact that the frequency at which the solid ingredient needs to be replaced will be less than the frequency by which a liquid or gas ingredient would need to be replaced. If the solvent stored in the reservoir is water, it will be readily available to the user.
  • the reservoir and dosing mechanism enables adverse ingredients to be kept apart (e.g. an active component that would interact badly with toothpaste may only be dispensed once toothpaste has been washed away). It can also provide a mechanism for adding extra ingredients that will interact positively with other ingredients such as toothpaste.
  • some embodiments comprise a reservoir 1506 for the active ingredient.
  • the reservoir 1506 is in fluid communication with the head of the toothbrush via a fluid channel 1506b.
  • the release of an active ingredient is achieved by a biomechanical pumping system, triggered by manual actuation by the user.
  • at least a portion of the reservoir may be formed of a flexible material which the user pushes upon or squeezes in order to release the active material.
  • the toothbrush may be configured to issue an alert, usually an audio alert, to indicate to the user that it is time for them to engage with the brush (e.g. to squeeze a portion of the brush) in order to release the active ingredient.
  • the reservoir includes an electronically controlled actuator 1507, 1508 such as a pumping system or a valve, which is directly controlled by the computing module on the toothbrush.
  • the pumping system could be one or more of the following: microfluidic; piezoelectric (similar to an inkjet printer).
  • the reservoir 1506 for the active ingredient and the electronic control (and pump) 1507 are located within the handle of the toothbrush.
  • the electronics, pump and active ingredient are all located in a removable module 1508 located within the head of the body of the toothbrush.
  • this removable module could be easily relocated in a new toothbrush once the bristles of the old toothbrush become worn.
  • the active ingredient is delivered from the reservoir 1506 to the user via apertures located at the base of the bristles.
  • the head of the toothbrush 1502 comprises bristles and one or more ingredient delivery structures 1510 located within the bristles.
  • the ingredient delivery structures 1510 transport the ingredient in a direction parallel to the longitudinal axis of the bristles before releasing the active ingredient. In this way, the user is able to more accurately target specific areas of the mouth for applying the dose of the active ingredient.
  • the ingredient delivery structure(s) 1510 may take the form of hollow bristles, hollow blades or of micro pipes where the hollow blades or micro pipes may be formed from a material such as a rubber-based material which is different from the material of the bristles. In other embodiments, the ingredient delivery structure(s) may take the form of a sponge.
  • the head of the toothbrush 1503 does not contain any conventional bristles. Instead, it comprises only ingredient delivery structures 1510 such as hollow bristles and/or blades.
  • the smart toothbrush may include an electric motor and/or a vibrator for providing movement and/or rotation of the head of the toothbrush.
  • the sound receiver may be utilised to measure the frequency of the electric motor and/or vibrator.
  • a program run by the processor records this frequency over time and detects changes in frequency of the electric motor and/or vibrator which occur when the toothbrush is in contact with a tooth. This detection of contact between the bristles of the head and the tooth can be used as a trigger for other events (e.g. dosing, measurement, activation of sensors).
  • Figure 4 shows a schematic diagram of an example system capable of carrying out a method of monitoring toothbrushing.
  • the system comprises the toothbrush 1 and a mobile device 20 upon which an app 2 can be stored and run, the mobile device being communicably connected to the toothbrush.
  • the mobile device can be connected to one or more computers via a network 40.
  • Each of the mobile device and the toothbrush can communicate wirelessly with additional items in the system such as external sensors.
  • such additional items include an external position sensor or a magnetic field generator. This communication could be via a one-to-one channel such as Bluetooth, or by providing the additional items with access to the network 40.
  • the mobile device can be replaced with a different type of external computer such as a smart mirror or a wall mounted computer.
  • Wireless communications 4a between the toothbrush 1 and the mobile device 20 enable the flow of information therebetween.
  • This flow of information could be a flow of data from the toothbrush to the app running on the mobile device, such as raw data from one or more of the sensors on the toothbrush, or data which has been recorded by one or more of the sensors and then processed by the processor of the toothbrush before being forwarded via the communication module of the toothbrush to the mobile device.
  • Information passed between the communication module of the toothbrush and one or more interfaces of the mobile device may include toothbrush parameter updates.
  • Wireless communications 4b between the mobile device 20 and the network 40 may also include the flow of data. This is usually processed by either the toothbrush or the app 2 on the mobile device, but could also take the form of raw data. Brush parameter updates and software updates for the app 2 may also be sent from the network 40 to the mobile device 20.
  • the network may include the internet and/or one or more local area networks (LANs) or wide area networks (WANs).
  • One or more external computer(s) 30 such as servers may have connections 4c to the network in order to communicate with the mobile device 20 over the network 40.
  • One or more of the external computer(s) includes data storage and data processing.
  • the data stored and processed at the external computer(s) typically derives from a plurality of different users (via a plurality of different mobile devices and smart toothbrushes) and may include personal data, although this may be anonymous.
  • the external computer(s) therefore provide a platform for data analytics taken from all or at least a proportion of the total number of users using their respective toothbrushes in their own homes.
  • the communication channel between the company user 50 and the external computer(s) 30 therefore includes the flow of data-mining insights from data analytics from the external computer to the company user. It will also include the flow of settings and other information from the company user to the external computers. For example, software updates may originate from the company user 50 and reach the mobile device 20 via the external computer 30 and the network 40.
  • Data may be pushed to the server after a brushing session has finished.
  • External parties such as dentists may be provided with access to the data stored on the external computer 30 via the network 40.
  • the user themselves may relay information from the toothbrush or from the mobile device to the dentist.
  • FIG. 5 shows an illustrative embodiment of a mobile device which is suitable for practicing the various aspects and embodiments.
  • the mobile device is suitable for use with the method of monitoring toothbrushing described herein and also for communication with any one of the toothbrushes described herein.
  • the mobile device may include all of the components shown, or may contain more or fewer. It should be understood that in embodiments where the mobile device is replaced by alternative external computers such as a smart mirror or a wall-mounted computer, the alternative external computer will include the same features as those described below in relation to the mobile device.
  • the mobile device 20 typically includes a digital imaging device 60 such as a digital camera for recording digital photographs. These photographs may then be stored in a data storage section of the memory 22.
  • the mobile device 20 shown includes a central processing unit (CPU) 21 in communication with a memory 22 and various other components.
  • These other components include a power supply 23, a network interface 24, a display 25, an input/output interface 26, an audio interface 27, a flash 28 and user controls 29.
  • the power supply 23 provides the power used by the mobile device and may take the form of a rechargeable battery and/or an external power source.
  • the network interface 24 provides a mechanism for the mobile device to communicate directly or indirectly with any compatible smart device and includes circuitry configured for use with one or more communication protocols and technologies including but not limited to: GPRS; GSM; TDMA; transmission control protocol/Internet protocol (TCP/IP); CDMA; WCDMA; Wi-Fi; 3G, 4G, Bluetooth or any other wireless communication protocols.
  • the display 25 may be an LCD (Liquid crystal display), a plasma display or any other suitable electronic display and may be touch sensitive in that it may include a screen configured to receive an input from a human digit or a stylus.
  • Input/output interface(s) 26 may include one or more ports for outputting information e.g. audio information via headphones, but may also be an input port configured to receive signals including remote control signals.
  • the audio interface 27 typically includes a speaker which enables the mobile device to output signals and a microphone which enables the mobile device to receive audio signals including voice control inputs for use in controlling applications.
  • the mobile device 20 shown includes a flash 28 which may be used in conjunction with the digital imaging device to illuminate an object of which a photograph is being taken.
  • User controls 29 may take the form of external buttons or sliders which allow a user to control various functions of the mobile device.
  • An application saved on the device may be configured to interact with the various components of the device, for example responding when an input is received from one or more of the user controls.
  • the computer program described herein may take the form of an application stored in the memory 22.
  • the mobile device may be connected to an external computer 30 either directly or via a network 40 so that computationally extensive calculations can be carried out by a computational module on the external computer, the external computer being more powerful than the mobile device and therefore capable of performing the calculations more quickly.
  • the mobile device 20 may also be configured to exchange information with other computers via the network 40.
  • Figures 6 to 8 show examples of interactive displays provided to the user on the app 2 of the mobile device 20.
  • the first example, shown in Figure 6, is a graphical user interface (GUI) 21 presented to the user by the app 2, the GUI including two "segments".
  • a first segment provides an indicator for brushing measured by the app as having taken place on the upper jaw, and a second segment provides an indicator for brushing measured by the app as having taken place on the lower jaw.
  • A further improved example is shown in Figure 8, where a total of 10 segments 6i, 6j, 6k, 6l, 6m, 6n, 6o, 6p, 6q and 6r correspond to:
  • the position of the segments mimics the position of the various surfaces around the mouth, with the segment corresponding to the upper biting surface located at the upper portion of the GUI and the segment corresponding to the lower biting surface located at the lower portion of the GUI.
  • segments for "left" surfaces are located on the left side of the GUI and segments for "right" surfaces are located at the right side of the GUI.
  • the segments for the remaining tooth surfaces are grouped in pairs. For example, the upper right outer surface and upper right inner surface are located adjacent one another as sub segments of a larger segment.
  • a warning may be displayed to the user to inform them that they are brushing too hard.
  • This warning may take the form of a colour change on the visual display of the mobile device, particularly a colour change which overlays or highlights a graphic corresponding to the portion of the mouth in which brushing was taking place when the threshold was exceeded.
  • the visual warning may or may not be accompanied by an audio warning and/or a displayed message such as "brushing too hard". Consumer studies have shown that audio warnings are particularly effective in the field of toothbrushing.
  • Figure 9 shows a high level flow diagram depicting example steps in the coaching phase of the method of monitoring toothbrushing.
  • the user must open the app on the mobile device (s2) and ensure that the mobile device is connected to the toothbrush (s3) by Bluetooth or other wireless communication.
  • Figure 10 shows a further example of an interactive display provided to the user on a mobile device, particularly an image which may be presented to the user during a guided mode.
  • the image shows a schematic of the jaws, with a section of teeth highlighted, the highlighted section corresponding to a segment of the 10 segment display.
  • an interactive display is provided to the user on a mobile device, giving them the option (s4) to select one of two brushing modes: a coaching mode (111) or free brushing mode (112).
  • the link to the free brushing function (s6c) of the app is non-actionable, appearing as greyed out text. Only once the coaching mode (s6a) has received and stored enough information to produce a personal model in the server (s6b) does the link for free brushing become actionable.
  • data received by the one or more sensors of the toothbrush is processed by the computing module of the toothbrush and/or external processors on the mobile device or on an external computer. This is described in more detail below, with reference to Figures 12a-12c .
  • Figure 12a depicts a user interface displaying live feedback to the user at the beginning of a toothbrushing session, where all segments are empty 121.
  • the sensor data is processed and the performance of the user displayed in the form of "filling up" segments 122; a full segment 123 indicating that a pre-set brushing threshold (e.g. number of brush strokes) has been met.
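As an illustration of this live feedback, the sketch below counts detected brush strokes per segment and reports each segment's fill level, treating a segment as full once an assumed per-segment stroke threshold is reached. The threshold and segment indexing are illustrative assumptions.

```python
# Sketch: live "filling up" of the 10 display segments, marking a segment full
# once a pre-set threshold of brush strokes has been counted in it.
# The threshold value is illustrative only.
STROKES_PER_SEGMENT = 20

def update_segments(stroke_events, n_segments=10):
    """stroke_events: iterable of segment indices, one per detected brush stroke."""
    counts = [0] * n_segments
    for segment in stroke_events:
        counts[segment] = min(counts[segment] + 1, STROKES_PER_SEGMENT)
    return [c / STROKES_PER_SEGMENT for c in counts]   # 1.0 means "full"

print(update_segments([0, 0, 3, 3, 3]))
```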
  • An example of the process followed by the toothbrush during toothbrushing is shown in Figure 12c. These steps will be carried out regardless of whether the coaching mode (i.e. guide mode) or the "free brushing" mode is selected. The difference between the two modes is that, in the coaching (guide) mode, the mobile device will provide a guide to the user for the user to follow during brushing. The way in which data is processed and provided to the user will remain the same.
  • In a first step, the toothbrush is turned on. This may be via a switch or button controlled by the user or may be in response to detection of motion or touch via one or more of the sensors.
  • data from the one or more sensors is pre-processed (s122) at the processor of the computing module.
  • This pre-processed data may be stored in memory on the toothbrush itself or may be sent to the mobile device and/or another device such as a smart mirror or other computer.
  • the data from one or more of the sensors is used to determine whether or not toothbrushing has started. For example, in some embodiments, toothbrushing is detected when the toothbrush receives a combined detection of sound and motion via the microphone and accelerometer. Each of these values may have to exceed a pre-defined threshold in order to determine that toothbrushing has started. In other embodiments, a different combination of sensor triggers may be used to determine whether toothbrushing has started, such as a force sensor in combination with a motion sensor (e.g. the accelerometer).
  • if toothbrushing is not detected, the smart toothbrush and any connected devices will be placed in a standby mode (s124). After a pre-defined amount of time ("timeout"), the system, including the smart toothbrush, will turn off (s125).
  • live feedback is presented to the user by the mobile device.
  • This live feedback may include one or more of the following:
  • After a predetermined amount of time in which no activity has been measured by the sensors, or in response to a manual switch, the toothbrush sends a signal to the mobile device to indicate that the toothbrushing session has ended, and the mobile device (or other device such as a screen or smart mirror) displays end-of-session feedback and/or recommendations to the user (s128).
  • the toothbrush is configured to operate in a standalone mode when no contact can be made with a mobile device (or other device). This is described in more detail in relation to Figure 12d below.
  • the toothbrush is turned on (s131), either by way of a user operated switch or button or in response to detection of motion or touch by one or more of the one or more sensors.
  • the toothbrush attempts to connect (s132) with an external device such as a mobile device 20 or other device (e.g. smart mirror, wall mounted computer or smart cup).
  • the connection is typically a wireless connection such as Bluetooth.
  • if no connection can be made, the toothbrush starts operation in a "data mode" (s133).
  • in data mode, sensor data is pre-processed and saved in the memory of the computing module of the toothbrush.
  • the processor of the toothbrush is programmed to provide basic feedback via visual or audio signals emitted from a light source or speaker on the toothbrush in response to the processor detecting that particular criteria have been met. For example, audio or visual signals may be emitted in response to:
  • once a connection with the external device is subsequently established, a data transfer will take place (s135). This will include synchronising data between the toothbrush and the device as well as sending sensor data from the toothbrush to the device. Where historic data has been stored on the brush but not transferred to the external device, this historic data will be transferred.
  • the toothbrushing session will carry on as usual (s136), either in coaching mode or in free brushing mode, as described in relation to steps s126-s129 of Figure 12c above.
  • the device with which the toothbrush interacts is itself configured to connect with an external database such as an external computer or server 30.
  • the device should be switched on (s141) and will then attempt to form a connection to the database (s142). This connection may be via a network 40.
  • once a connection is made, the application (i.e. the program) on the device will trigger data transfer (s143) between the device and the external computer at which the database is located, and may also trigger analytics to be carried out by the external computer. For example, upon achieving a successful connection, the following steps may be carried out:
  • Figure 13a shows a map of the overall journey taken by the toothbrush during a toothbrushing session using the recorded position and time information from the session.
  • Each step in the journey is set out in Figure 13b , where the numbers in the sequence 5>3>2>8>1>4>6>10>9>7 correspond to particular parts of the jaw, as shown in Figure 13c .
  • Each of these numbered parts of the jaw correspond to a respective one of the 10 segments 121, 122, 123 displayed to the user by the mobile device during brushing as depicted in Figures 12a and 12b .
  • Figure 13d shows an example of a multivariate analysis used to match the detected position of the toothbrush by the one or more sensors to a relevant segment on the display shown to the user at the external device.
  • the clusters (corresponding to the 10 segments) are shown in a feature space (projected onto a suitable 2-D subspace).
  • Quadratic discriminant analysis was used to estimate the normal approximation (circles around each cluster of points) for each segment cluster.
  • the journey is produced by a smart toothbrush which includes at least a toothbrush body, comprising a head 2 for brushing and a handle 3 as well as at least one sensor for detecting the position of the toothbrush in the mouth at various points during a toothbrushing session.
  • the smart toothbrush further includes a timer for recording the time at which each detected position occurs; and a computing module with a memory, the computing module configured to (i.e. programmed to) record the detected position at various points during a toothbrushing session along with recording the time at which each detected position was detected. From these recorded points, the processor of the computing module produces time series data which corresponds to a map of the journey taken by the toothbrush around the jaw during the toothbrushing session.
  • the computing module is programmed to use a machine learning algorithm, e.g. solving a travelling salesman problem, to process the recorded position and time information and produce a map of an optimal path to be taken by the toothbrush during the toothbrushing session.
  • One or more journeys made by the user's toothbrush may be recorded at the database of the external computer 30. Newly recorded journeys may be checked with previously recorded journeys. If a deviation from a previously made (and stored) journey is detected, a processor at the mobile device may trigger a warning signal to indicate to the user that unusual behaviour has been detected. This warning may take the form of a message displayed on the screen of the mobile device and may or may not be accompanied by an audio signal. The amount of deviation required before the warning signal will be triggered is pre-set, either as a standard pre-defined value built into the software run on the device, or by a pre-defined amount which is input by the user or a health professional.
  • a distance can be calculated between these and other paths to show how similar or different they are to each other.
  • the distance may be the geometric distance, or may, for example, be a measure of distance between symbol strings associated with the journeys, for example the Levenshtein distance. Using the distance enables clustering of brushing paths, to see how similar one brushing path is to another.
  • the distance metrics can be used to determine how close a recorded brushing journey is to that recommended by a dentist, as well as to determine whether the user is consistent in their brushing habits.
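The sketch below illustrates one such string-based comparison: two journeys encoded as one symbol per segment are compared with the Levenshtein distance, and a warning is raised when the distance exceeds a pre-set threshold. The symbol encoding (segments 1 to 10 mapped to letters A to J) and the threshold are illustrative assumptions; the example journeys correspond to the sequences shown in Figures 13b and 13e.

```python
# Sketch: comparing two brushing journeys encoded as symbol strings with the
# Levenshtein distance, and flagging a deviation above a pre-set threshold.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

# Segments 1..10 encoded as letters A..J
recorded = "ECBHADFJIG"    # 5>3>2>8>1>4>6>10>9>7, the journey of Fig. 13a/13b
reference = "EIACJGHDBF"   # 5>9>1>3>10>7>8>4>2>6, the optimised path of Fig. 13e
DEVIATION_THRESHOLD = 3    # illustrative value

d = levenshtein(recorded, reference)
print("distance:", d)
if d > DEVIATION_THRESHOLD:
    print("warn user: unusual brushing behaviour detected")
```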
  • a deviation from the "regular" already recorded journeys can be an early indicator of a problem such as tooth sensitivity or gum disease. By triggering a warning upon such a deviation, the user can be made aware of potential problems before they would have been otherwise.
  • Figure 13e shows a map of an optimal path created using a travelling salesman machine learning algorithm.
  • a more efficient brushing technique can be implemented by training the user to follow this calculated optimised journey.
  • the path followed during the journey has been optimised by minimising an appropriate cost function related to the transition probabilities: the less likely a transition, the higher the associated cost.
  • the transition probabilities are a function of spatial distance and biomechanical constraints such as dexterity. In this case, it is found that brushing the regions in the order of 5>9>1>3>10>7>8>4>2>6 is more efficient than in the path taken by the user in Figure 13a .
  • the position data from the one or more sensors and the respective time information from the data are used to produce a time series (i.e. a path or journey) for that toothbrushing session (s142).
  • the time series is sent from the device of the user to a database, typically on an external computer, where it is analysed (s143) and may be compared with previous journeys by the same user or other users.
  • the external computer may be accessed by the user over the network 40.
  • the database may also receive data input from other users over the network (s144), thereby enabling analytics for an entire population of users rather than just a single user. Additional inputs may also be received at the database via the network. These may include information entered by the users themselves, such as whether they are left- or right-handed, whether they have any problem areas in the mouth that they are aware of, or extra information they may have received from their dentist such as the need to focus more on a particular area of the mouth during toothbrushing.
  • data mining and statistical analysis (s146) of the data may be used to carry out one or more particular functions. Examples of such functions include:
  • the above steps may be carried out at the database regardless of whether the device is still “online” and therefore connected to the database over the network.
  • Feedback updates may include one or more of the following:
  • the ability to calculate and present a journey to the user and the ability to analyse time series data from a user may be used in conjunction with the dosing mechanisms previously described in relation to Figures 15a-e . This is described in more detail below in relation to Figure 16 .
  • a smart toothbrush with a dosing mechanism such as any of those shown in Figures 15a-e is provided (s151) and turned on.
  • data is obtained from the one or more sensors (s153).
  • This data is processed (s153), either in the internal processor of the computing module of the toothbrush, or at the mobile device.
  • This processing includes computing the location of the toothbrush in real time.
  • features of the teeth may be monitored in real time such as plaque strength or tooth whiteness, both of which could be detected by a camera on the toothbrush.
  • a dosing control algorithm is applied (s154) to the processed data, the algorithm including set conditions which have to be met for dosing to take place. These conditions may be set by the user via the device (e.g. mobile device) before brushing takes place.
  • the dosing algorithm tracks the position of the toothbrush within the jaw and detects when a condition is met, for example when the location data from the sensors shows that the head of the toothbrush is in a particular location, which may correspond to a particular one of the 10 segments.
  • Further sensors that may be used to provide data suitable for generating a trigger or activation signal for dispensing one or more active ingredients from one or more reservoirs may include any of the following types: (i) light- or camera-based sensors for detecting live and dead plaque (e.g.
  • the dosing mechanism may be primed ready to dose ahead of reaching the location at which dosing is to take place, in a predictive step. This would allow an increase in the effective dosing time or amount in the required location.
  • the dosing control algorithm may also update a log of dosage history; logging the positions and times at which dosing occurred along the path of a particular journey. This data may later be fed back to a database on the external computer 30.
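A minimal sketch of such a dosing control loop is given below: it tracks the estimated segment, triggers the dispenser when an assumed per-segment dosing condition is met, and logs the position and time of each dose. The condition format, segment names and dispense callback are illustrative assumptions, not the patent's defined algorithm.

```python
# Sketch: a dosing control loop that tracks the estimated segment, triggers a dose
# when an assumed per-segment condition is met, and logs position and time of each
# dose. All names and values are illustrative.
def dosing_loop(position_stream, dose_conditions, dispense):
    """position_stream: iterable of (timestamp, segment);
    dose_conditions: {segment: max_doses_per_session};
    dispense: callable that actually triggers the pump, valve or user alert."""
    dose_log, doses_given = [], {}
    for t, segment in position_stream:
        allowed = dose_conditions.get(segment, 0)
        if doses_given.get(segment, 0) < allowed:
            dispense(segment)
            doses_given[segment] = doses_given.get(segment, 0) + 1
            dose_log.append((t, segment))
    return dose_log

stream = [(0, "upper_left_outer"), (5, "upper_left_inner"), (9, "upper_left_outer")]
log = dosing_loop(stream, {"upper_left_outer": 1}, lambda seg: print("dose at", seg))
print(log)
```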
  • External input may be provided (s155) to the dosing algorithm.
  • this external input will include one or more of:
  • the processor on the toothbrush will control release (s156) of the ingredient by initiating a trigger for the dosage to occur.
  • this trigger could be in the form of a warning provided to the user (e.g. audio or visual signal) to alert the user that they need to actuate the biomechanical mechanism themselves.
  • the trigger may simply be an electrical signal which actuates the pump or valve to eject the required dosage from the reservoir via the head of the brush.
  • When and/or what is dosed by the toothbrush may be influenced or driven by data that is provided to the toothbrush from an outside source rather than data that is measured by sensors in the toothbrush.
  • advice from a dentist or user preferences could be used as data to influence and/or drive dosing.
  • data from other diagnostic sensors which may be internal or external to the toothbrush, may be used to provide data that can be used to influence or drive dosing.
  • Figure 17 shows an example of a tracking system for recording the position of a toothbrush during toothbrushing.
  • the tracking system 1701 includes a smart toothbrush 1702 which may contain one or more of the features set out above, in particular those features set out in relation to the embodiments shown in Figures 1-3 and also Figures 15a-e .
  • the toothbrush includes a head portion 2 and a handle portion 3.
  • the body of the toothbrush comprises one or more sensors, including at least an orientation sensor such as an accelerometer or a gyroscope.
  • the smart toothbrush 1702 of this embodiment includes a sensor for detecting contact between the toothbrush and the jaw of a user.
  • this may, for example, take the form of a sound sensor.
  • the system 1701 includes one or more magnetic field generators and the smart toothbrush also includes a magnetic sensor for detecting location within a magnetic field.
  • the magnetic sensor detects the absolute position of the toothbrush within the field created by the magnetic field generator.
  • the magnetic field generator 1703 takes the form of a plurality of magnets located on a pair of glasses 1704.
  • the system further comprises an external device 1705 which in this embodiment takes the form of a mobile device containing the application described in relation to Figure 9.
  • the toothbrush 1702 interacts with the mobile device 1705 in the same way as described in relation to previous embodiments, with the added benefit of a magnetic sensor output recorded on the toothbrush by the magnetic sensor; this output can be processed either on the toothbrush itself, or sent to the mobile device for processing, in order to give an output of the location of the toothbrush.
  • the presence of the magnetic field generator and magnetic field sensor enables an actual position of the toothbrush to be measured, rather than relying upon a determination of the likely position based on movement and orientation sensors such as accelerometers. In this way, it is possible to provide the user with much more accurate information about where the toothbrush is at any specific time.
  • the magnetic systems described herein can be used in two different ways.
  • in the first, toothbrushing always occurs in the presence of a magnetic field, the magnetic field providing a mechanism for increasing the resolution of the system.
  • the magnetic field measurements also remove the need for a coaching phase since the actual position of the toothbrush can be measured in real time during free brushing.
  • the magnetic system can provide a resolution great enough to provide tooth-by-tooth tracking of the toothbrush.
  • the size of the toothbrush head is often larger than the size of one tooth and therefore limits the resolution of measurements that can be obtained.
  • the system could be utilised in a second way in which the magnetic field is used in the coaching phase to provide a link between real measurements which can be detected by the magnetic sensors and extrapolated measurements that are made by evaluating the accelerometer data.
  • Values of M (Mx, My, Mz) are measured at each tooth and used to build a lookup table.
  • alternatively, measurements may be taken at only a subset of teeth (e.g. one front tooth, one back-left tooth, one back-right tooth), and a model of the jaw (see WO2008116743A1) together with a model of the field can be used to extrapolate calibration values for the other teeth (see the calibration sketch after this list).
  • Figure 18 shows a further example of a tracking system. This embodiment differs from that of Figure 17 only in that the external device is in the form of a mirror 660 and the magnetic field generator 116 is mounted on or integral with the mirror 660.
  • Figure 19 illustrates how the magnetic measurements and other sensor measurements are processed in order to determine which tooth surface is being brushed.
  • One or more of the sensors on the toothbrush is used to detect contact between the head of the toothbrush and the teeth in a user's jaw (s1901). This enables a determination to be made that brushing is active (s1902). Measurements recorded by particular sensors such as the accelerometer or gyroscope (s1903) can be used to determine the orientation of the brush (s1904), including its pitch and/or its roll.
  • the magnetic sensor on the toothbrush is used to measure the x, y, and z components of the magnetic field.
  • the measurement of the z component (Mz, the vertical field component) (s1905) is used to determine whether it is the upper jaw or the lower jaw which is being brushed (s1906) and also whether the left or right side of the mouth is being brushed (s1910).
  • the absolute magnitude M of the magnetic field may be obtained (s1907), with or without consulting a lookup table (s1908), and enables the position of the toothbrush to be determined; in combination with the determination of which side of the mouth (s1910) and which jaw (s1906) is being brushed, this allows the processor to calculate which tooth or teeth are being brushed (see the tooth-classification sketch after this list).
  • Figure 20 shows an example of magnetic sensor data taken using the system of Figure 17.
  • the magnetic sensor located within the toothbrush was moved along an artificial jaw in the presence of the field generated by the two magnets.
  • Each number on the plot corresponds to a particular tooth; number 1 corresponding to the back molar left, going around clockwise to number 14 which corresponds to the back molar right.
  • the points for the different teeth are well separated, showing that each tooth has a unique Mz and field magnitude M.
  • An initial calibration is typically carried out to calibrate the magnetic readings. Magnetic readings are taken in at least three locations along the jaw.
  • Figures 21a and 21b provide a simple illustration of how magnetic field measurements, particularly the vertical field component Mz, can be used to capture the location of the toothbrush with increased resolution.
  • the vertical field component Mz breaks the left/right symmetry and its sign can be used to determine whether the device is on the left (teeth 1-7) or right (teeth 8-14) side.
  • Figure 22 shows an example of further calibration to correct for arbitrary device orientation.
  • the probe was held in different orientations for different teeth.
  • the field magnitude is invariant; the measured Mz component, however, is strongly affected (Figure 22a).
  • the gravitational acceleration can be separated from the overall acceleration data using a low-pass filter.
  • the acceleration due to gravity can then be used as a reference point against which the co-ordinate system is updated to retrieve only the vertical field component (Figure 22b); see the gravity-correction sketch after this list.
  • the sound sensor simplifies interpretation of the data since it can detect when contact is made between the toothbrush head and the jaw, so that any measured location can be constrained to lie on the jaw.
  • the accelerometer enables the locational data to be improved further because it can be used to correct for the rotational position (orientation) of the toothbrush.
  • Figure 23 shows an example of an output from a sound receiver of a toothbrush.
  • the toothbrush detects changes in the continuous sound spectrum generated by an electric or sonic toothbrush.
  • the sound profile (pitch, magnitude) of an electric or sonic (i.e. vibrating) toothbrush changes when the brush head is in contact with a surface.
  • the sound is picked up by the sound sensor on the toothbrush and recorded.
  • the processor then applies a combination of simple filters and efficient classification techniques (e.g. a decision tree) to detect brushing contact based on sound profile changes; in this way it is possible to avoid resource-hungry sound processing (see the contact-detection sketch after this list).
  • Figure 23a depicts a sound frequency spectrum recorded by a sound sensor of the toothbrush during "free vibrations", that is to say, the head of the toothbrush is not in contact with the jaw.
  • when the head of the toothbrush is brought into contact with a surface, the additional strain on the electric generator/vibrator leads to shifts in the emitted sound; in this case, from 137 Hz to 120 Hz. This shift in frequency can be picked up efficiently by the computing module of the toothbrush.
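
The dosing-control sketch below illustrates, by way of example only, the condition-checking, predictive priming and dosage-logging steps described in the bullets above. It is a minimal sketch, not the patented implementation: the class and function names (DosingController, dispenser.prime(), dispenser.dispense()) and the representation of a condition as a target segment plus a dose amount are assumptions introduced here for illustration.

    import time
    from dataclasses import dataclass

    @dataclass
    class DoseCondition:
        """A user-set rule: dose amount_ml when the brush reaches the given jaw segment."""
        segment: int          # one of the 10 segments tracked by the dosing algorithm
        amount_ml: float
        dosed: bool = False   # each condition fires at most once per brushing session

    @dataclass
    class DoseEvent:
        """Entry in the dosage history log (position and time of each dose)."""
        segment: int
        timestamp: float
        amount_ml: float

    class DosingController:
        def __init__(self, conditions, dispenser):
            self.conditions = conditions    # set before brushing, e.g. via the mobile device
            self.dispenser = dispenser      # assumed to expose prime() and dispense(amount_ml)
            self.history = []               # dosage log, later fed back to the external computer

        def update(self, segment, predicted_next=None):
            """Called whenever the position tracker reports the current jaw segment."""
            # Predictive step: prime the pump before the target location is reached.
            if predicted_next is not None and any(
                    c.segment == predicted_next and not c.dosed for c in self.conditions):
                self.dispenser.prime()
            # Trigger release when a set condition is met at the current location.
            for c in self.conditions:
                if c.segment == segment and not c.dosed:
                    self.dispenser.dispense(c.amount_ml)
                    c.dosed = True
                    self.history.append(DoseEvent(segment, time.time(), c.amount_ml))

In a variant where the user actuates a mechanical dispenser themselves, dispenser.dispense() would instead raise the audio or visual warning mentioned above.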
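
The calibration sketch below shows one way the per-tooth magnetic readings could be turned into a lookup table, assuming each tooth is characterised by its field magnitude |M| and vertical component Mz. The linear interpolation used for uncalibrated teeth is only a placeholder for the jaw-model and field-model extrapolation referred to above (WO2008116743A1); the function names and the (|M|, Mz) key are assumptions.

    import math

    def field_magnitude(mx, my, mz):
        """Absolute magnitude |M| of a measured field vector (Mx, My, Mz)."""
        return math.sqrt(mx * mx + my * my + mz * mz)

    def build_lookup_table(calibration):
        """calibration maps a tooth number (1-14) to the (Mx, My, Mz) reading taken at it."""
        return {tooth: (field_magnitude(mx, my, mz), mz)
                for tooth, (mx, my, mz) in calibration.items()}

    def extrapolate_missing(table, all_teeth=range(1, 15)):
        """Fill in (|M|, Mz) for teeth that were not measured by interpolating linearly
        between the nearest calibrated teeth (placeholder for the model-based step)."""
        known = sorted(table)
        for tooth in all_teeth:
            if tooth in table:
                continue
            lo = max((t for t in known if t < tooth), default=known[0])
            hi = min((t for t in known if t > tooth), default=known[-1])
            w = 0.5 if hi == lo else (tooth - lo) / (hi - lo)
            table[tooth] = tuple((1 - w) * a + w * b
                                 for a, b in zip(table[lo], table[hi]))
        return table

Calibrating at, for example, one front tooth and one back tooth on each side and then calling extrapolate_missing yields an entry for all 14 teeth, which the tooth-classification sketch below can match against.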
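
The tooth-classification sketch below follows the processing chain of Figure 19: confirm contact, read the orientation, use the sign of Mz for the left/right decision, and match |M| and Mz against the calibration lookup table. Which sign of Mz corresponds to which side, and the threshold rule used here for the upper/lower jaw decision, are assumptions made purely for illustration; the description above only states that the Mz measurement feeds both decisions.

    import math

    def classify_tooth(sample, lookup, mz_jaw_threshold):
        """sample: dict with 'contact' (bool, from the sound sensor), 'pitch' and 'roll'
        (from the accelerometer/gyroscope), and the field components 'mx', 'my', 'mz'.
        lookup: per-tooth (|M|, Mz) table as built in the calibration sketch above.
        Returns a dict describing the estimate, or None if brushing is not active."""
        if not sample["contact"]:                           # s1901/s1902: no brushing contact
            return None
        mx, my, mz = sample["mx"], sample["my"], sample["mz"]
        side = "left" if mz > 0 else "right"                # s1910: sign of Mz (assumed mapping)
        jaw = "upper" if abs(mz) > mz_jaw_threshold else "lower"   # assumed rule for s1906
        magnitude = math.sqrt(mx * mx + my * my + mz * mz)  # s1907: |M|
        # s1908: pick the calibrated tooth whose (|M|, Mz) pair is closest to the reading
        tooth = min(lookup, key=lambda t: (lookup[t][0] - magnitude) ** 2
                                          + (lookup[t][1] - mz) ** 2)
        return {"tooth": tooth, "side": side, "jaw": jaw,
                "pitch": sample["pitch"], "roll": sample["roll"]}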
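
The gravity-correction sketch below illustrates the idea behind Figure 22: extract the gravity direction from the accelerometer with a low-pass filter, then project the measured field vector onto that direction so that the vertical component Mz is recovered however the handle is being held. The first-order exponential filter and its smoothing constant are assumptions; the description does not specify the filter design.

    import math

    class GravityTracker:
        """Tracks the gravity vector with a first-order low-pass filter on accelerometer data."""
        def __init__(self, alpha=0.05):
            self.alpha = alpha              # small alpha = heavier smoothing (lower cut-off)
            self.gravity = (0.0, 0.0, 1.0)  # initial guess for the gravity direction

        def update(self, ax, ay, az):
            gx, gy, gz = self.gravity
            self.gravity = (gx + self.alpha * (ax - gx),
                            gy + self.alpha * (ay - gy),
                            gz + self.alpha * (az - gz))
            return self.gravity

    def vertical_field_component(m, gravity):
        """Project the measured field vector m = (Mx, My, Mz) onto the gravity direction,
        giving the vertical component regardless of the device's orientation."""
        gx, gy, gz = gravity
        norm = math.sqrt(gx * gx + gy * gy + gz * gz) or 1.0
        return (m[0] * gx + m[1] * gy + m[2] * gz) / norm

Gravity here serves only as a fixed vertical reference: the device-frame magnetic reading is re-expressed relative to it, which is the role attributed to the accelerometer in the bullets above.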
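
The contact-detection sketch below stands in for the "simple filters and efficient classification techniques" described above: it finds the dominant frequency of a short audio window and classifies the window as brushing contact when the peak sits closer to the loaded frequency (about 120 Hz in the example of Figure 23) than to the free-vibration frequency (137 Hz). The sampling rate, window handling and one-rule decision are assumptions.

    import numpy as np

    SAMPLE_RATE = 4000      # Hz; assumed microphone sampling rate
    FREE_HZ = 137.0         # dominant frequency with the head vibrating freely
    CONTACT_HZ = 120.0      # dominant frequency with the head loaded against a surface

    def dominant_frequency(window, sample_rate=SAMPLE_RATE):
        """Return the frequency (Hz) of the strongest spectral peak in a short audio window."""
        windowed = np.asarray(window, dtype=float) * np.hanning(len(window))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(window), d=1.0 / sample_rate)
        spectrum[0] = 0.0                   # ignore the DC component
        return float(freqs[int(np.argmax(spectrum))])

    def is_brushing_contact(window):
        """One-node decision rule: contact if the peak is nearer CONTACT_HZ than FREE_HZ."""
        peak = dominant_frequency(window)
        return abs(peak - CONTACT_HZ) < abs(peak - FREE_HZ)

Run on successive short windows, the boolean output of is_brushing_contact could feed the "brushing is active" determination (s1902) used by the tracking system.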

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Brushes (AREA)

Claims (12)

  1. Smart toothbrush (1) comprising:
    - a toothbrush body comprising a head (2) for brushing and a handle (3);
    - a position sensor for detecting the position of the toothbrush (1) in the mouth at different points during a toothbrushing session;
    - a timer for recording the time at which each detected position occurs; and
    - a computing module with a memory (U9), the computing module configured to:
    - record the detected position at different points during a toothbrushing session;
    - record the time at which each detected position was detected; and
    - produce a map of the journey taken by the toothbrush during the toothbrushing session using the recorded position and time information,
    wherein the toothbrush body further comprises:
    - a reservoir (1506) for an active ingredient, the reservoir being in fluid communication with the head portion (2); and
    - a dispenser for dispensing the active ingredient from the reservoir (1506) via the head (2) upon receipt of an activation signal.
  2. Smart toothbrush (1) according to claim 1, wherein the computing module is programmed to use a travelling-salesman machine learning algorithm to process the recorded position and time information and to produce a map of an optimal path to be taken by the toothbrush (1) during the toothbrushing session.
  3. Smart toothbrush (1) according to claim 1 or claim 2, further including at least one further sensor comprising any one of the following types: (i) a light- or camera-based sensor for detecting any of live and/or dead plaque (e.g. stained), tooth staining, tooth colour, fluorescence, implants, dental filling material, shine, gloss, blood, and the presence of toothpaste; (ii) ion-selective electrodes for fluoride, zinc, calcium, or tin; (iii) a sensor for dissolved gases; (iv) a gas sensor for odours; (v) a voltmeter for redox potential; (vi) a sensor for hydrogen peroxide; (vii) a viscosity sensor; (viii) a hormone sensor; (ix) a temperature sensor (U12); (x) a conductivity sensor, wherein the activation signal is derived from one or more of the further sensors.
  4. Smart toothbrush (1) according to claim 1, wherein the activation signal is triggered in response to information from the position sensor for detecting the position of the toothbrush (1) in the mouth.
  5. Smart toothbrush (1) according to claim 4, wherein the information detected by the position sensor includes:
    - a position within a predefined map of the user's mouth.
  6. Smart toothbrush (1) according to claim 4 or claim 5, wherein the information detected by the position sensor includes timing information.
  7. Smart toothbrush (1) according to any one of claims 3 to 6, further comprising at least one additional reservoir.
  8. Smart toothbrush (1) according to any one of the preceding claims, further comprising a sound sensor (U3) for detecting contact between the head (2) of the toothbrush (1) and the jaw of the user.
  9. Smart toothbrush (1) according to any one of the preceding claims, wherein the map of the journey is displayed on a mobile device (20).
  10. Smart toothbrush (1) according to any one of the preceding claims, further comprising a communication module (U7) for communicating with an external device (20).
  11. System comprising the smart toothbrush (1) according to any one of the preceding claims and an external device (20) which includes or is in contact with an external computer (30) having a database;
    - wherein the external device (20) sends recorded journeys from the smart toothbrush (1) to the database and compares newly recorded journeys with previously stored journeys.
  12. System according to claim 11, wherein an alert signal is triggered on the mobile device (20) if the newly recorded journey differs from a previously recorded journey by more than a predefined amount.
EP17783796.0A 2016-10-07 2017-10-02 Brosse à dents intelligente Active EP3522753B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP16192821 2016-10-07
PCT/EP2017/075000 WO2018065374A1 (fr) 2016-10-07 2017-10-02 Brosse à dents intelligente

Publications (2)

Publication Number Publication Date
EP3522753A1 EP3522753A1 (fr) 2019-08-14
EP3522753B1 true EP3522753B1 (fr) 2022-07-13

Family

ID=57113190

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17783796.0A Active EP3522753B1 (fr) 2016-10-07 2017-10-02 Brosse à dents intelligente

Country Status (4)

Country Link
EP (1) EP3522753B1 (fr)
CN (1) CN109788845B (fr)
BR (1) BR112019006175B1 (fr)
WO (1) WO2018065374A1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019120648A1 (de) 2018-08-02 2020-02-20 Ranir, Llc Drucksensorsystem und Verfahren für eine elektrische Zahnbürste
WO2020131473A1 (fr) * 2018-12-21 2020-06-25 The Procter & Gamble Company Appareil et procédé pour faire fonctionner un appareil de toilette personnelle ou un appareil de nettoyage domestique
US20200411161A1 (en) * 2019-06-25 2020-12-31 L'oreal User signaling through a personal care device
CA3147865A1 (fr) * 2019-08-02 2021-02-11 Colgate-Palmolive Company Brosse a dents, systeme et procede de detection de sang dans une cavite buccale pendant le brossage des dents
EP4203747A1 (fr) * 2020-11-03 2023-07-05 Colgate-Palmolive Company Système de détection de sang dans une cavité buccale pendant le brossage des dents
GB2602088B (en) * 2020-12-17 2023-06-14 Dyson Technology Ltd Oral treatment device
CN113317899B (zh) * 2021-05-31 2023-05-23 余姚永耀光电科技有限公司 一种科教儿童电动牙刷

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080109973A1 (en) * 2006-11-15 2008-05-15 Farrell Mark E Personal care products and methods
GB0706048D0 (en) 2007-03-28 2007-05-09 Unilever Plc A method and apparatus for generating a model of an object
US8881332B2 (en) * 2007-08-17 2014-11-11 Lise W. Noble Toothbrush system utilizing oral care capsule
WO2009151455A1 (fr) * 2008-06-13 2009-12-17 Colgate-Palmolive Company Instrument de soins buccaux à libération active
EP2512290B1 (fr) * 2009-12-17 2018-04-18 Unilever PLC Système de suivi de mouvement d'une brosse à dents
WO2012034786A1 (fr) 2010-09-15 2012-03-22 Unilever Plc Surveillance de l'utilisation d'une brosse à dents
US10172443B2 (en) * 2013-08-11 2019-01-08 Yong-Jing Wang Oral care tools and systems
EP2896319B1 (fr) * 2014-01-21 2018-04-18 Braun GmbH Brosse à dents électrique ou rasoir électrique
CA2937083C (fr) * 2014-02-20 2018-09-25 Braun Gmbh Systeme de soins bucco-dentaires
CN105528519B (zh) * 2015-12-08 2019-02-12 小米科技有限责任公司 智能协助清洁牙齿的方法及装置

Also Published As

Publication number Publication date
WO2018065374A1 (fr) 2018-04-12
CN109788845B (zh) 2021-03-09
EP3522753A1 (fr) 2019-08-14
CN109788845A (zh) 2019-05-21
BR112019006175B1 (pt) 2022-12-20
BR112019006175A2 (pt) 2019-06-18

Similar Documents

Publication Publication Date Title
EP3522752B1 (fr) Brosse à dents intelligente
EP3522753B1 (fr) Brosse à dents intelligente
EP3522751B1 (fr) Brosse à dents intelligente
JP6925343B2 (ja) フィードバック手段により最適な口腔衛生を得るための方法及びシステム
CA2985287C (fr) Systeme de brosse a dents dote d'un magnetometre pour assurer la surveillance de l'hygiene dentaire
EP3010441B1 (fr) Système de brosse à dents doté de capteurs pour un système de surveillance de l'hygiène dentaire
CN108066030A (zh) 口腔护理系统和方法
JP6684834B2 (ja) 口腔清掃装置の位置特定のための方法及びシステム
CN108430264B (zh) 用于提供刷洗过程反馈的方法和系统
JP7454376B2 (ja) 歯の清掃中のユーザの頭の向きを特定する方法
JP2019534094A (ja) 口腔クリーニング装置の位置特定のための方法およびシステム
JP2018531053A (ja) 口腔清掃装置の位置特定のための方法及びシステム
JP2018531053A6 (ja) 口腔清掃装置の位置特定のための方法及びシステム
CN110049743B (zh) 用于校准口腔清洁设备的方法及系统
CN108601445A (zh) 用于定位个人护理设备的方法和系统
JP2020501628A (ja) ガイド付き洗浄セッションに対する追従を決定するためのシステム及び方法

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190401

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

TPAC Observations filed by third parties

Free format text: ORIGINAL CODE: EPIDOSNTIPA

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RIN1 Information on inventor provided before grant (corrected)

Inventor name: SHAW, KATHARINE JANE

Inventor name: TRELOAR, ROBERT, LINDSAY

Inventor name: ZILLMER, RUEDIGER

Inventor name: SAVILL, DEREK, GUY

Inventor name: RUSSELL, ADAM, THOMAS

17Q First examination report despatched

Effective date: 20191127

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: UNILEVER GLOBAL IP LIMITED

Owner name: UNILEVER IP HOLDINGS B.V.

RAP3 Party data changed (applicant data changed or rights of an application transferred)

Owner name: UNILEVER GLOBAL IP LIMITED

Owner name: UNILEVER IP HOLDINGS B.V.

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20220301

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017059469

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1503837

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220815

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20220713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221114

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221013

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1503837

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221113

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221014

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017059469

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602017059469

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

26N No opposition filed

Effective date: 20230414

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20221031

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20221013

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221002

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221031

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230503

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221002

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221013

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20231023

Year of fee payment: 7

Ref country code: FR

Payment date: 20231026

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20171002

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220713