EP3522751A1 - Intelligent toothbrush - Google Patents

Intelligent toothbrush

Info

Publication number
EP3522751A1
Authority
EP
European Patent Office
Prior art keywords
toothbrush
user
brushing
sensor
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP17780070.3A
Other languages
English (en)
French (fr)
Other versions
EP3522751B1 (de)
Inventor
Adam Thomas Russell
Derek Guy Savill
Katharine Jane SHAW
Robert Lindsay Treloar
Ruediger ZILLMER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Unilever PLC
Unilever NV
Original Assignee
Unilever PLC
Unilever NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Unilever PLC and Unilever NV
Publication of EP3522751A1
Application granted
Publication of EP3522751B1
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A46 BRUSHWARE
    • A46B BRUSHES
    • A46B15/00Other brushes; Brushes with additional arrangements
    • A46B15/0002Arrangements for enhancing monitoring or controlling the brushing process
    • A46B15/0004Arrangements for enhancing monitoring or controlling the brushing process with a controlling means
    • A46B15/0006Arrangements for enhancing monitoring or controlling the brushing process with a controlling means with a controlling brush technique device, e.g. stroke movement measuring device
    • A HUMAN NECESSITIES
    • A46 BRUSHWARE
    • A46B BRUSHES
    • A46B2200/00Brushes characterized by their functions, uses or applications
    • A46B2200/10For human or animal care
    • A46B2200/1066Toothbrush for cleaning the teeth or dentures

Definitions

  • the present invention relates to a method of monitoring toothbrushing, more particularly to a method comprising a coaching phase and monitoring phase; both phases involving free-brushing by the toothbrush user.
  • the efficiency of toothbrushing depends not only on the regions of the mouth visited, but also on the path taken between these regions, the order in which they are visited, and the time spent at each region. These features together define the journey of a toothbrushing session. Often, users will unwittingly follow a journey through force of habit or routine, with no particular goal in mind. This can lead to some areas of the mouth receiving better or worse treatment.
  • toothbrushes with sensors to determine and/or track the position within a user's mouth are known.
  • known systems, however, suffer from problems of accuracy and inconvenience for the user. It is known that magnetic fields and magnetometers can be used to determine precise positions of the toothbrush during brushing in a lab or dental surgery.
  • the present invention aims to solve the above problems by providing, according to a first aspect, a method of monitoring toothbrushing comprising: a coaching phase in which:
  • a position sensor detects position information of a toothbrush during a first instance of free brushing;
  • one or more additional sensors record movement information of the toothbrush during the first instance of free brushing
  • a user-specific statistical model is generated which maps the positional information onto the movement information; and a monitoring phase in which:
  • the one or more additional sensors record movement information of the toothbrush during a second instance of free brushing; and
  • the toothbrush compares the movement information received during the second instance of free brushing with the user-specific statistical model to calculate positional information.
  • the method provides two separate phases: a coaching phase and a monitoring phase.
  • By producing an adapted statistical model during the coaching phase and using it during the monitoring phase, it is possible to carry out the monitoring phase at home while still benefiting from the more accurate monitoring usually restricted to systems at the dentist and/or hygienist.
  • both steps can be carried out during a user's free brushing. Not only does this result in a more user-friendly device, but it also eliminates erroneous readings that can arise due to the user following "guide"-type instructions.
  • the proposed technology helps customers to brush their teeth better by improving the quality and user experience during the coaching phase and thus the value of the free brushing assessment and feedback.
  • Optional features of the invention will now be set out. These are applicable singly or in any combination with any aspect of the invention.
  • the coaching phase comprises the steps of: - providing a smart toothbrush, the smart toothbrush comprising at least one position sensor and at least one accelerometer;
  • the step of detecting position information of the toothbrush may include recording accelerometer data from an accelerometer located on or within the toothbrush and also recording of positional data from the external position sensor.
  • Each of the toothbrush and the external sensor may send their data separately to an external computer which processes the information received and correlates acceleration measurements from the toothbrush with the positional information from the external sensor. For example, in some embodiments, a timestamp created upon turning on the toothbrush or in response to any specific stimulus at the toothbrush provides a reference point for the comparison and combination of the two sets of data.
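By way of illustration only, the sketch below shows how time-stamped accelerometer samples from the toothbrush might be paired with time-stamped samples from an external position sensor using a shared reference point such as brush turn-on. The function name, data layout and nearest-in-time matching strategy are assumptions for the example, not the disclosed implementation.

```python
# Minimal sketch of correlating toothbrush accelerometer samples with external
# position-sensor samples using a shared reference timestamp (e.g. brush
# turn-on). All names and the data layout are illustrative assumptions.
from bisect import bisect_left

def align_streams(accel_samples, position_samples):
    """Pair each accelerometer sample with the nearest-in-time position sample.

    accel_samples:    list of (t_since_reference, (ax, ay, az))
    position_samples: list of (t_since_reference, (x, y, z)), sorted by time
    Returns a list of ((ax, ay, az), (x, y, z)) training pairs.
    """
    if not position_samples:
        return []
    pos_times = [t for t, _ in position_samples]
    pairs = []
    for t, accel in accel_samples:
        i = bisect_left(pos_times, t)
        # pick whichever neighbouring position sample is closer in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(position_samples)]
        j = min(candidates, key=lambda k: abs(pos_times[k] - t))
        pairs.append((accel, position_samples[j][1]))
    return pairs
```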
  • the external computer takes the form of a mobile device.
  • the step of generating a user-specific statistical model from the combined data may take the form of multivariate classification models which may be trained for each user.
  • a multivariate normal approximation may be applied together with a Mahalanobis distance or multimodal probability, either to select the most likely position or to assign a weight (likelihood) to particular positions (which may correspond to particular segments at the user interface).
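A minimal sketch of such a classification step is given below, assuming per-segment mean vectors and covariance matrices have already been fitted from coaching-phase data. The Mahalanobis distance under a multivariate normal approximation is used both to pick the most likely segment and to assign a normalised weight to each segment; the names and data structures are illustrative.

```python
# Illustrative sketch (not the patented algorithm): classify a movement-feature
# vector into a mouth segment using per-segment multivariate normal models and
# the Mahalanobis distance. Segment models are assumed to come from the
# coaching phase.
import numpy as np

def mahalanobis_sq(x, mean, cov_inv):
    d = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    return float(d @ cov_inv @ d)

def classify_segment(x, segment_models):
    """segment_models: {segment_id: (mean_vector, covariance_matrix)}.
    Returns (most_likely_segment, weights) where the weights sum to 1."""
    scores = {}
    for seg, (mean, cov) in segment_models.items():
        cov = np.asarray(cov, dtype=float)
        cov_inv = np.linalg.inv(cov)
        # multivariate-normal likelihood up to a shared constant factor
        scores[seg] = np.exp(-0.5 * mahalanobis_sq(x, mean, cov_inv)) / np.sqrt(np.linalg.det(cov))
    total = sum(scores.values())
    weights = {seg: float(s / total) for seg, s in scores.items()}
    best = max(weights, key=weights.get)
    return best, weights
```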
  • the monitoring phase comprises the steps of:
  • free brushing is simply referring to the user brushing their teeth as normal with no instructions.
  • the opposite of free brushing would be following instructions e.g. "brush top left teeth inner surface”.
  • the coaching phase includes an alternative coaching model in which the user follows an instruction guide on a mobile device, wherein the user-specific statistical model generated by the alternative coaching model maps the positional information displayed on the mobile device onto the movement information received simultaneously from the one or more additional sensors.
  • this "guide mode” may be less accurate than the "free brushing" coaching mode; however the provision of both modes in one system allows for a more flexible device.
  • the position sensor includes an external head-mounted sensor.
  • the transmitter produces a magnetic field in the frame of reference of the jaw.
  • the position sensor in the brush samples the field and data generated at a given sample rate is sent back to the base station.
  • Communication between the external position sensor, toothbrush and external computer is preferably wireless.
  • the position sensor includes an external wall-mountable sensor.
  • the position sensor is located within a wall mountable mirror.
  • the position sensor is a magnetic field generator and a corresponding magnetic field sensor, the magnetic field sensor being located on the toothbrush.
  • the magnetic generator is wearable by the user.
  • the magnetic generator is wall mounted, for example as part of, or attached to, a mirror.
  • the magnetic field generator may generate a DC magnetic field, for example using the switching DC magnetic field technology from Ascension Technology Corporation. However, the skilled person would appreciate that other commercial magnetic field generators are available.
  • a system for coaching a smart toothbrush comprising:
  • a position sensor for detecting position information of a toothbrush during a first instance of free brushing;
  • a computer comprising a memory and a processor, the processor programmed to generate a user-specific statistical model by mapping the positional information from the position sensor onto the movement information from the one or more additional sensors to create a look-up table stored in the memory.
  • a smart toothbrush comprising:
  • toothbrush body comprising a head for brushing and a handle
  • computing module with a memory, the computing module configured to:
  • By a "map of the journey" it should be understood that the spatiotemporal path taken by the toothbrush is produced from the recorded data. In this way, it is possible to generate far more useful data than could be generated by simply recording individual positions.
  • the recordal of the temporal order in which different mouth regions are brushed enables provision of more meaningful and personalized feedback and recommendations. For example, by recording an entire "journey" or "walk" of the toothbrush around the mouth it is possible to determine whether parts of the mouth are being neglected. Where the recorded path shows that a particular user tends to end their brushing session at a particular location of their mouth, it is likely that this area will not be receiving the same level of attention as those areas of the mouth which are targeted at the beginning of the brushing journey. It is therefore useful for the user to be alerted to this.
  • the recordal of the journey may indicate that certain areas of the mouth are missed out altogether or very rarely visited. This kind of neglect can lead to an increase in plaque and staining, and potentially to a deteriorating appearance of teeth and/or gum disease.
  • the user can adapt their brushing journey accordingly, to find a more optimal path, with optimal transitions from one area of the mouth to another. In this way, it is possible for the user to minimise any adverse effect on their oral health caused by accidental neglect and/or poor brushing technique.
  • the computing module is programmed to use a travelling salesman machine learning algorithm to process the recorded position and time information and produce a map of an optimal path to be taken by the toothbrush during the toothbrushing session.
  • a travelling salesman approach to optimal route finding is one example of possible route/path finding algorithms that may be applicable for determining an optimal path to be taken by the toothbrush during a toothbrushing session.
  • a travelling salesman machine learning algorithm may take the form of any algorithm which solves "the travelling salesman problem".
  • the algorithm receives a number of data points (a "transition matrix"); in this case, the data points may correspond to a plurality of positions in the mouth and associated timings. This information is converted into a cost function for each possible transition between data points.
  • the algorithm determines the minimum cost route between the different data points. Along the route, each of the positions in the mouth may be "visited" by the toothbrush more than once. In other words, the route may include brushing one or more of the positions in the mouth more than once.
  • an example method of optimizing the path for a given individual is provided. This may take into account user-specific issues such as sensitivity, missing or misaligned teeth, and other problem areas. By inverting the transition matrix, a least optimal path could be calculated.
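As a hedged illustration of this route-finding idea, the sketch below applies a simple greedy nearest-neighbour heuristic to a transition cost matrix. This is only one of many possible solvers for travelling-salesman-type problems; the cost values and segment indices are invented for the example, and inverting the matrix could be used to bias the search towards a least optimal path instead.

```python
# Sketch of a simple route finder over a transition cost matrix, in the spirit
# of the travelling-salesman approach described above. A greedy
# nearest-neighbour heuristic is used purely for illustration; an exact or
# learned solver could be substituted. Costs and labels are hypothetical.
def greedy_route(cost, start=0):
    """cost[i][j] = cost of moving the brush from segment i to segment j.
    Returns an ordering of all segments starting at `start`."""
    n = len(cost)
    unvisited = set(range(n)) - {start}
    route = [start]
    while unvisited:
        here = route[-1]
        nxt = min(unvisited, key=lambda j: cost[here][j])
        route.append(nxt)
        unvisited.remove(nxt)
    return route

# Usage example: 4 segments with symmetric transition costs
example_cost = [
    [0, 2, 9, 4],
    [2, 0, 6, 3],
    [9, 6, 0, 5],
    [4, 3, 5, 0],
]
print(greedy_route(example_cost))  # [0, 1, 3, 2]
```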
  • the toothbrush body further comprises:
  • a dispenser for dispensing the active ingredient from the reservoir via the head upon receipt of an activation signal.
  • the "smart" toothbrush is capable of dispensing an active ingredient based on information received by the toothbrush.
  • this enables an active ingredient to be more precisely dispensed.
  • This avoids many of the problems associated with regular toothbrushing in which an active ingredient is pre-loaded onto the head of the brush before the brushing episode begins.
  • One such disadvantage is the fact that users often start a brushing episode at a particular location and follow an automatic brushing pattern or journey, meaning that certain areas of the mouth (usually those contacted at the start of the brushing episode) routinely receive more of the active ingredient than others.
  • By selectively dispensing the active ingredient in response to sensor information it is also possible to better control wastage of the active ingredient (e.g. due to spitting out or undissolved ingredients). It is also possible to avoid over application of active ingredients such as bleach which could be damaging to the user if applied in excess.
  • the activation signal is triggered in response to information from the sensor.
  • the information detected by the sensor includes:
  • the predefined map of the user's mouth may correspond to the 10 segments described herein.
  • the information detected by the sensor includes timing information.
  • the start of timing may be triggered by a single event such as a sensor threshold being reached, a change in direction being detected, or a change in acceleration being detected. In other cases, the start of timing may be triggered when two or more sensors simultaneously record a threshold value (e.g. the microphone and accelerometer simultaneously recording sound and movement measurements).
  • a timestamp may be made for each record, starting either at brush turn-on or at start of brushing. Where the timestamps (and associated recorded data) are recorded from the brush-turn-on, an extra step of determining the start of brushing may also be incorporated. Suitable mechanisms for starting the brushing are described herein.
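A possible form of the two-sensor trigger described above is sketched here: timing starts only when a microphone level and an accelerometer magnitude simultaneously exceed their thresholds. The threshold values, field names and sample layout are assumptions for illustration.

```python
# Sketch of a two-sensor "start of timing" trigger: timing begins only when a
# microphone level and an accelerometer magnitude simultaneously exceed their
# thresholds. Threshold values and field layout are illustrative assumptions.
SOUND_THRESHOLD = 0.2     # assumed normalised microphone level
MOTION_THRESHOLD = 1.5    # assumed acceleration magnitude threshold

def brushing_started(sound_level, accel_magnitude):
    return sound_level >= SOUND_THRESHOLD and accel_magnitude >= MOTION_THRESHOLD

def find_start_timestamp(samples):
    """samples: iterable of (timestamp, sound_level, accel_magnitude).
    Returns the first timestamp at which both thresholds are met, or None."""
    for t, sound, accel in samples:
        if brushing_started(sound, accel):
            return t
    return None
```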
  • the smart toothbrush further comprises at least one additional reservoir.
  • the active located within the additional reservoir may provide an additional care step which complements a first active, such as toothpaste located within the first reservoir.
  • An example of such an additional care step is gum conditioning serum.
  • the different actives may be dispensed at the same time as one another, or at different times to one another. Where different actives are applied at different times, each active will be triggered by a different sensor reading. For example, when the sensor detects that the toothbrush head is in a first predetermined location within the mouth, a first active may be dispensed from the first reservoir. When the sensor detects that the toothbrush head is in a second predetermined location within the mouth, a second active may be dispensed from the second reservoir.
  • a second active may be dispensed from the at least one additional reservoir simultaneously with the dispensed first active from the first reservoir.
  • the smart toothbrush further comprises a sound sensor for detecting contact between the head of the toothbrush and the jaw of the user.
  • the map of the journey is displayed on a mobile device.
  • raw data is processed on the brush, then the position and time stamp are sent via wireless communication to the external computer (which may take the form of a mobile device such as a mobile phone or tablet, but may also take the form of a smart mirror or wall-mounted computer screen).
  • the raw data itself is transmitted to the external computer and subsequently processed at the external computer (again, this may take the form of a mobile device such as a mobile phone or tablet).
  • Wireless communication may take the form of Bluetooth, Wi-Fi or similar.
  • a tracking system for recording the position of a toothbrush during toothbrushing comprising: a toothbrush, the toothbrush comprising:
  • the magnetic sensor detects the absolute position of the toothbrush within the field created by the magnetic field generator.
  • the use of a combination of inertial measurement sensors, sound, and a bespoke magnetic field achieves high-resolution toothbrush tracking.
  • the magnetic field is generated by wearable magnets or field generators, allowing low-cost design and easy use of the system in dentist practices or at home.
  • the orientation sensor is an accelerometer or an inertial measurement unit for measuring the rotational position of the toothbrush.
  • the sensor for detecting contact between the toothbrush and the jaw of the user is a sound sensor
  • the magnetic field generator is a wearable magnetic field generator
  • the magnetic sensor detects the absolute position of the toothbrush within the field created by the magnetic field generator.
  • the magnetic field is tuned to generate iso-surfaces of magnitude which lie perpendicular to the jaw.
  • the magnetic sensor comprises a processor which detects the absolute position of the toothbrush by measuring the field magnitude M of the brush and the vertical field component Mz of the brush.
  • an initial calibration step is carried out to create a look-up table for correlating measured magnetic field values with position information.
  • One such calibration step is carried out by initially moving a magnetometer probe in a controlled way, tooth-by-tooth along predetermined points along the jaw of the user, for example along the centres of the biting surfaces of each tooth. The resulting magnetometer readings are stored in the look-up table alongside the predetermined position points.
  • the magnetometer will have a known separation from the head of the toothbrush which contacts the surface of the user's teeth. This separation can be adjusted for, by using a pre-determined model. Since the level of adjustment may depend upon the orientation of the brush, the orientation of the brush can be taken from accelerometer data and incorporated into the model.
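The calibration look-up described above could, for example, be implemented along the following lines: the (field magnitude M, vertical component Mz) pair measured tooth-by-tooth is stored against each known position, and live readings are mapped to the nearest calibrated entry. All names are hypothetical and the orientation correction described above is omitted for brevity.

```python
# Sketch of a calibration look-up: magnetometer readings taken tooth-by-tooth
# at known positions are stored, and live readings are mapped to the nearest
# calibrated entry using the features (field magnitude M, vertical component
# Mz) described above. All names are illustrative assumptions.
import math

def field_features(bx, by, bz):
    magnitude = math.sqrt(bx * bx + by * by + bz * bz)
    return magnitude, bz                     # (M, Mz)

def build_lookup(calibration_readings):
    """calibration_readings: list of (tooth_label, (bx, by, bz)) taken during
    the controlled tooth-by-tooth sweep. Returns [(tooth_label, (M, Mz)), ...]."""
    return [(label, field_features(*b)) for label, b in calibration_readings]

def lookup_position(lookup_table, bx, by, bz):
    """Return the calibrated tooth label whose (M, Mz) pair is closest to the
    live magnetometer reading."""
    m, mz = field_features(bx, by, bz)
    return min(lookup_table,
               key=lambda entry: (entry[1][0] - m) ** 2 + (entry[1][1] - mz) ** 2)[0]
```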
  • Values of acceleration such as acceleration due to gravity are desired to obtain information about the orientation of the brush (rather than information about the kinetic acceleration of the brush) and may be obtained by applying low-pass filtering to the overall output obtained from the accelerometer, the overall output containing contributions from gravitational acceleration and also contributions from kinetic acceleration.
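One conventional way to obtain the gravitational component is a first-order low-pass filter, sketched below. The smoothing factor alpha is an assumed tuning parameter, and the recovery of the kinetic contribution by subtraction is noted in a comment.

```python
# Sketch of extracting the slowly varying gravitational component from raw
# accelerometer output with a first-order low-pass filter. The smoothing
# factor alpha is an assumed tuning parameter.
def low_pass_gravity(raw_samples, alpha=0.1):
    """raw_samples: iterable of (ax, ay, az) containing gravity + kinetic terms.
    Yields a gravity estimate for each incoming sample."""
    gravity = None
    for sample in raw_samples:
        if gravity is None:
            gravity = list(sample)
        else:
            gravity = [g + alpha * (s - g) for g, s in zip(gravity, sample)]
        yield tuple(gravity)

# The kinetic contribution is then approximately (raw sample - gravity estimate).
```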
  • the toothbrush comprises:
  • a sound receiver for detecting changes in frequency of the electric motor and/or vibrator when the toothbrush is in contact with a tooth.
  • the toothbrush comprises a data logger for recording data from the magnetic sensor and/or the orientation sensor;
  • the data logger is only activated when the sound receiver detects a value which meets a predetermined threshold criterion.
  • the data logger comprises a memory for recording the detected orientation information and a processor programmed to convert this orientation information into positional information.
  • the processor is configured to carry out multivariate classification which typically involves a trained lookup of cluster parameters used to classify new data.
  • the sound receiver includes a frequency filter for detecting changes in frequency of the electric motor and/or vibrator when the toothbrush is in contact with a tooth; and - wherein the data logger is only activated when the sound receiver detects a value which meets a predetermined threshold criterion.
  • the sensor for detecting contact between the toothbrush and the jaw of a user is:
  • the pressure sensor may take the form of a piezoelectric material which generates a voltage when pressure applied to the head of the toothbrush is transferred to the piezoelectric material.
  • the bristles of the toothbrush may directly transfer pressure from the interaction between the user's jaw and the toothbrush head to the piezoelectric material.
  • pressure may be applied to the piezoelectric material by a flexible portion of the body of the toothbrush, usually its neck.
  • a toothbrush comprising:
  • a body including: a head portion and a handle portion; the body further comprising: at least one sensor and a reservoir for an active ingredient, the reservoir in fluid communication with the head portion;
  • a dispenser for dispensing the active ingredient from the reservoir via the head upon receipt of an activation signal
  • a toothbrush comprising:
  • - a body having a head portion and a handle portion
  • an electric motor and/or vibrator for providing movement and/or rotation of the head of the toothbrush
  • a sound receiver for detecting changes in frequency of the electric motor and/or vibrator when the toothbrush is in contact with a tooth
  • the data logger is activated when one or more of the sensors or sound receiver detects a value which meets a predetermined threshold criterion.
  • Figure 1 shows an example of a smart toothbrush according to any one of the embodiments described herein
  • Figure 2 shows a schematic diagram of a smart toothbrush according to any one of the embodiments described herein
  • Figure 3 shows further schematic diagrams of a smart toothbrush according to any one of the embodiments described herein.
  • Fig 3a shows an overview of the smart toothbrush including the location of a printed circuit board (PCB) within the body of the toothbrush
  • Fig 3b shows a top view of the PCB
  • Fig 3c shows a bottom view of the PCB
  • Figure 4 shows a schematic diagram of an example system capable of carrying out a method of monitoring toothbrushing according to the present invention
  • Figure 5 shows an example of an external device such as a mobile device for use with the method of monitoring toothbrushing and/or for communication with any one of the toothbrushes described herein;
  • Figure 6 shows an example of an interactive display provided to the user on a mobile device
  • Figure 7 shows a further example of an improved interactive display provided to the user on a mobile device
  • Figure 8 shows an example of a further improved interactive display provided to the user on a mobile device
  • Figure 9 shows a flow diagram depicting example steps in the method of monitoring toothbrushing
  • Figure 10 shows a further example of an interactive display provided to the user on a mobile device
  • Figure 11 shows a further example of an interactive display provided to the user on a mobile device
  • Figure 12a shows a further example of an interactive display provided to the user on a mobile device before brushing
  • Figure 12b shows an example of the same display viewed during or after brushing
  • Figure 12c shows an example of the steps taken to provide feedback in the form of user displays to the user during and after brushing
  • Figure 12d shows an example of the steps taken when the toothbrush is unable to connect with the external device
  • Figure 12e shows connections between the external device and an external database
  • Figure 13a shows a map of the journey taken by the toothbrush during a toothbrushing session using the recorded position and time information from the session;
  • Figure 13b shows the same data, broken down into each step of the journey;
  • Figure 13c is a key explaining which parts of the jaw, and in particular which tooth surfaces correspond to which of the numbers 1 to 10 used throughout this application to label the sections of the jaw corresponding to the 10 segments on the user display at the external device;
  • Figure 13d shows an example of a multivariate analysis used to match the detected position of the toothbrush by the one or more sensors to a relevant segment on the display shown to the user at the external device;
  • Figure 13e shows a map of an optimal path created using a travelling salesman machine learning algorithm
  • Figure 14 shows example steps taken to track the journey (i.e. the path) taken by the toothbrush during a brushing session in order to provide feedback to the user;
  • Figures 15a, 15b, 15c, 15d and 15e each show respective embodiments of a smart toothbrush according to the present invention
  • Figure 16 shows an example of the steps taken to track the journey (i.e. the path) taken by the toothbrush during a brushing session in order to provide dosing triggers to the user and/or to control dosing
  • Figure 17 shows an example of a tracking system according to an embodiment of the present invention
  • Figure 18 shows a further example of a tracking system according to an embodiment of the present invention.
  • Figure 19 shows an example of the process followed by a smart toothbrush with a magnetic sensor according to an embodiment of the present invention
  • Figure 20 shows an example of data taken using the system of Figure 17
  • Figures 21a and 21b show examples of data from calibration measurements carried out by a tracking system for recording the position of a toothbrush during toothbrushing;
  • Figures 22a and 22b show examples of further calibration to correct for arbitrary device orientation carried out by a tracking system for recording the position of a toothbrush during toothbrushing according to the present invention.
  • Figures 23a and 23b show examples of an output from a sound receiver of a toothbrush according to or for use with the present invention.
  • the smart toothbrush 1 comprises a toothbrush body, which is made up of a head 2 for contacting the teeth of a user and a handle 3 for contacting the hand of a user.
  • the head includes an array of bristles 21 which contact the teeth and gums of the user during toothbrushing in order to clean away plaque and debris.
  • the array of bristles 21 is typically mounted upon a neck piece 22 which connects the head to the handle 3 of the toothbrush body.
  • the toothbrush body is hollow, with an outer shell defining one or more cavities within which smart components such as sensors, a power source, a display, and communication means are located.
  • components located within the cavity include: a display 41; sensors 42 (including one or more of the following: an accelerometer, a gyroscope, and a magnetometer); a microphone 43; a power source (such as a rechargeable battery); a computing module including a processor; and a communication module.
  • the smart toothbrush is able to determine a number of parameters including: orientation (via an accelerometer and/or gyroscope); average sound level (via a microphone); axis correlations; and phase velocity (via the accelerometer).
  • the handle has an ergonomically curved exterior, including an external convex portion 31 which acts as a thumb rest but which also provides a corresponding cavity for larger components such as the display 41.
  • the computing module typically includes a memory and a processor, the processor of the computing module configured to perform one or more desired functions.
  • the processor is programmed to record the position of the toothbrush as detected by one or more of the sensors, at various points during a toothbrushing session. This recordal may take place at repeated time intervals throughout use of the toothbrush by the user and the processor may also be programmed to record the time (in the memory) at which each detected position was detected. In some embodiments, the processor is also programmed to produce a map of the journey taken by the toothbrush during the toothbrushing session using the recorded position and time information.
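A minimal sketch of this recording behaviour, with hypothetical names, is shown below: detected segments are logged with timestamps at whatever interval the processor samples them, and consecutive repeats can be collapsed to give the journey as an ordered sequence of segments.

```python
# Sketch of journey recording: the detected segment is logged together with a
# timestamp, producing a time series that represents the brushing journey.
# Class and method names are illustrative assumptions.
import time

class JourneyRecorder:
    def __init__(self):
        self.journey = []                     # list of (timestamp, segment_id)

    def record(self, segment_id, timestamp=None):
        t = timestamp if timestamp is not None else time.time()
        self.journey.append((t, segment_id))

    def as_segment_sequence(self):
        """Collapse consecutive repeats, e.g. 5,5,3,3,2 -> [5, 3, 2]."""
        sequence = []
        for _, seg in self.journey:
            if not sequence or sequence[-1] != seg:
                sequence.append(seg)
        return sequence
```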
  • the toothbrush includes an on/off switch (not shown), typically in the form of a capacitive switch.
  • the processor can be programmed to initiate an auto switch-off after the timer detects a period of inactivity which extends beyond a given threshold.
  • the prolonged period of inactivity may be measured in relation to a specific sensor, e.g. when no activity is detected by the accelerometer for the given threshold of time. In other cases, the period of inactivity may require that two or more sensors show no activity over the same time threshold.
  • the timer (which may be a part of the computing module) may also include a feature for providing an indication to the user once a predetermined period of activity has been measured. For example, the indication may provide a signal to the user once a continuous recording of activity has been measured by one sensor or by two or more sensors simultaneously for a predetermined brushing time (a sensible value for which would be 2 minutes).
  • the communication module may take the form of a wireless communication module, such as a Bluetooth module from which data can be transmitted and/or received.
  • the communication module also provides an interface across which automatic downloads of data may be carried out following periods when the brush was not connected to the app.
  • U3 microphone or sound receiver
  • U7 communication module such as a Bluetooth module, specifically Bluetooth low energy (BTLE);
  • U2 temperature sensor and/or relative humidity sensor (i.e. a thermometer and/or hygrometer);
  • U12 temperature sensor (i.e. a thermometer);
  • D1 LED (light emitting diode). This may take the form of a tricolour LED and may be used as a warning indicator and/or as a status indicator. Warning indications may be related to brushing technique or may be related to toothbrush operation. For example, the LED may give a visual indication of when charging is taking place, or when the power source requires recharging;
  • the bend sensor may take the form of a piezoelectric bending sensor. In this way, it is possible to detect when the user is applying too much force to the teeth/gums via the head of the toothbrush.
  • the processor may be configured to detect when a predetermined "bend threshold" is reached and to issue a warning to the user.
  • the warning may take the form of a visual indication (e.g. lighting up or flashing of the LED) or an audio warning (via a speaker);
  • P1 speaker connector, for sending signals to the speaker, the speaker also being housed inside of the shell of the body of the toothbrush;
  • U1 processor, more specifically a microprocessor
  • U9 memory in the form of a flash drive. Combined with the processor, this memory forms the "computing module”.
  • the memory may function as a “data logger” which records information received from one or more of the sensors;
  • U11 voltage controller
  • U13 voltage controller
  • Figures 15a, 15b, 15c, 15d and 15e each show respective embodiments of a smart toothbrush according to the present invention, where the toothbrush includes one or more reservoirs for active ingredient(s).
  • the reservoirs and associated dosing mechanisms described below in relation to Figures 15a-15e could each be applied to the smart toothbrush described above in relation to Figures 1-3.
  • the active ingredients are typically liquid based but could take other forms (e.g. gas, gel, foam or another gas/liquid mixture). Examples of active ingredients include a serum or a hardener.
  • the active material may include anti-microbial properties, or cosmetic properties such as tooth whitening.
  • the active ingredient is a solid ingredient and a reservoir which is in fluid communication with the solid ingredient contains a solvent. When the solvent comes into contact with a portion of the solid ingredient, it will dissolve an amount of the solid ingredient, which can then be transported to the head of the toothbrush as the active ingredient to be dispensed.
  • One advantage of such an embodiment is the fact that the frequency at which the solid ingredient needs to be replaced will be less than the frequency by which a liquid or gas ingredient would need to be replaced.
  • the solvent stored in the reservoir is water, it will be readily available to the user.
  • the reservoir and dosing mechanism enables adverse ingredients to be kept apart (e.g. an active component that would interact badly with toothpaste may only be dispensed once toothpaste has been washed away). It can also provide a mechanism for adding extra ingredients that will interact positively with other ingredients such as toothpaste.
  • some embodiments comprise a reservoir 1506 for the active ingredient.
  • the reservoir 1506 is in fluid communication with the head of the toothbrush via a fluid channel 1506b.
  • the release of an active ingredient is achieved by a biomechanical pumping system, triggered by manual actuation by the user.
  • the reservoir may be formed of a flexible material which the user pushes upon or squeezes in order to release the active material.
  • the toothbrush may be configured to issue an alert, usually an audio alert, to indicate to the user that it is time for them to engage with the brush (e.g. to squeeze a portion of the brush) in order to release the active ingredient.
  • the reservoir includes an electronically controlled actuator 1507, 1508 such as a pumping system or a valve, which is directly controlled by the computing module on the toothbrush.
  • the pumping system could be one or more of the following: microfluidic; piezoelectric (similar to an inkjet printer).
  • the reservoir 1506 for the active ingredient and the electronic control (and pump) 1507 are located within the handle of the toothbrush.
  • the electronics, pump and active ingredient are all located in a removable module 1508 located within the head of the body of the toothbrush.
  • this removable module could be easily relocated in a new toothbrush once the bristles of the old toothbrush become worn.
  • the active ingredient is delivered from the reservoir 1506 to the user via apertures located at the base of the bristles.
  • the head of the toothbrush 1502 comprises bristles and one or more ingredient delivery structures 1510 located within the bristles.
  • the ingredient delivery structures 1510 transport the ingredient in a direction parallel to the longitudinal axis of the bristles before releasing the active ingredient. In this way, the user is able to more accurately target specific areas of the mouth for applying the dose of the active ingredient.
  • the ingredient delivery structure(s) 1510 may take the form of hollow bristles, hollow blades or of micro pipes where the hollow blades or micro pipes may be formed from a material such as a rubber-based material which is different from the material of the bristles. In other embodiments, the ingredient delivery structure(s) may take the form of a sponge.
  • the head of the toothbrush 1503 does not contain any conventional bristles. Instead, it comprises only ingredient delivery structures 1510 such as hollow bristles and/or blades. Whilst the heads of Figures 15a, 15b and 15c, and more particularly the bristle options, are shown in combination with a manually actuated toothbrush, they could equally be applied to the toothbrush bodies of Figures 15d and 15e. Similarly, although the "smart features" of the smart toothbrush are not shown in the schematic diagrams of Figures 15a-15e, each of these dosing toothbrushes will contain the smart features described elsewhere in this application (including the one or more sensors, the computing module etc.).
  • the smart toothbrush may include an electric motor and/or a vibrator for providing movement and/or rotation of the head of the toothbrush.
  • the sound receiver may be utilised to measure the frequency of the electric motor and/or vibrator.
  • a program run by the processor records this frequency over time and detects changes in frequency of the electric motor and/or vibrator which occur when the toothbrush is in contact with a tooth. This detection of contact between the bristles of the head and the tooth can be used as a trigger for other events (e.g. activation of the data logger, as described above).
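For illustration, a frequency-shift check of this kind could be sketched as follows, comparing the dominant motor frequency in a windowed microphone signal against a free-running baseline. The sample rate, window handling and shift threshold are assumed values, not parameters taken from the disclosure.

```python
# Sketch of detecting tooth contact from a shift in the motor/vibrator
# frequency picked up by the sound receiver. Sample rate, window handling and
# the shift threshold are assumed values for illustration only.
import numpy as np

SAMPLE_RATE_HZ = 8000
FREQ_SHIFT_THRESHOLD_HZ = 15.0

def dominant_frequency(window):
    """Return the strongest non-DC frequency (Hz) in a window of audio samples."""
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE_HZ)
    return float(freqs[int(np.argmax(spectrum[1:])) + 1])   # skip the DC bin

def contact_detected(free_running_window, loaded_window):
    """Compare the motor frequency under load with the free-running baseline."""
    baseline = dominant_frequency(free_running_window)
    current = dominant_frequency(loaded_window)
    return abs(current - baseline) >= FREQ_SHIFT_THRESHOLD_HZ
```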
  • Figure 4 shows a schematic diagram of an example system capable of carrying out a method of monitoring toothbrushing according to the present invention.
  • the system comprises the toothbrush 1 and a mobile device 20 upon which an app 2 can be stored and run, the mobile device being communicably connected to the toothbrush.
  • the mobile device can be connected to one or more computers via a network 40.
  • Each of the mobile device and the toothbrush can communicate wirelessly with additional items in the system such as external sensors.
  • such additional items include, for example, an external position sensor or a magnetic field generator. This communication could be via a one-to-one channel such as Bluetooth, or by providing access for the additional items to the network 40.
  • the mobile device can be replaced with a different type of external computer such as a smart mirror or a wall mounted computer.
  • Wireless communications 4a between the toothbrush 1 and the mobile device 20 enable the flow of information therebetween.
  • This flow of information could be a flow of data from the toothbrush to the app running on the mobile device such as raw data from one or more of the sensors on the toothbrush or processed data which has been recorded by one or more of the sensors on the toothbrush and then processed by the processor of the toothbrush before being forwarded via the communication module of the toothbrush to the mobile device.
  • Information passed between the communication module of the toothbrush and one or more interfaces of the mobile device may include toothbrush parameter updates.
  • Wireless communications 4b between the mobile device 20 and the network 40 may also include the flow of data. This is usually processed by either the toothbrush or the app 2 on the mobile device, but could also take the form of raw data. Brush parameter updates and software updates for the app 2 may also be sent from the network 40 to the mobile device 20.
  • the network may include the internet and/or one or more local area networks (LANs) or wide area networks (WANs).
  • LANs local area networks
  • WANs wide area networks
  • One or more external computer(s) 30 such as servers may have connections 4c to the network in order to communicate with the mobile device 20 over the network 40.
  • One or more of the external computer(s) includes data storage and data processing.
  • the data stored and processed at the external computer(s) typically derives from a plurality of different users (via a plurality of different mobile devices and smart toothbrushes) and may include personal data, although this may be anonymous.
  • the external computer(s) therefore provide a platform for data analytics taken from all or at least a proportion of the total number of users using their respective toothbrushes in their own homes.
  • the communication channel between the company user 50 and the external computer(s) 30 therefore includes the flow of data mining insights, derived from data analytics, from the external computer to the company user. It will also include the flow of settings and other information from the company user to the external computers. For example, software updates may originate from the company user 50 and reach the mobile device 20 via the external computer 30 and the network 40.
  • Data may be pushed to the server after a brushing session has finished.
  • External parties such as dentists may be provided with access to the data stored on the external computer 30 via the network 40.
  • the user themselves may relay information from the toothbrush or from the mobile device to the dentist.
  • FIG. 5 shows an illustrative embodiment of a mobile device which is suitable for practicing the various aspects and embodiments of the present invention.
  • the mobile device is suitable for use with the method of monitoring toothbrushing described herein and also for communication with any one of the toothbrushes described herein.
  • the mobile device may include all of the components shown, or may contain more or fewer. It should be understood that in embodiments where the mobile device is replaced by an alternative external computer such as a smart mirror or a wall mounted computer, the alternative external computer will include the same features as those described below in relation to the mobile device.
  • the mobile device 20 typically includes a digital imaging device 60 such as a digital camera for recording digital photographs. These photographs may then be stored in a data storage section of the memory 22.
  • the mobile device 20 shown includes a central processing unit (CPU) 21 in communication with a power supply 23, a network interface 24, a display 25, an input/output interface 26, an audio interface 27, a flash 28 and user controls 29.
  • the power supply 23 provides the power used by the mobile device and may take the form of a rechargeable battery and/or external power source.
  • the network interface 24 provides a mechanism for the mobile device to communicate directly or indirectly with any compatible smart device and includes circuitry configured for use with one or more communication protocols and technologies including but not limited to: GPRS; GSM; TDMA; transmission control protocol/Internet protocol (TCP/IP); CDMA; WCDMA; Wi-Fi; 3G, 4G, Bluetooth or any other wireless communication protocols.
  • the display 25 may be an LCD (Liquid crystal display), a plasma display or any other suitable electronic display and may be touch sensitive in that it may include a screen configured to receive an input from a human digit or a stylus.
  • Input/output interface(s) 26 may include one or more ports for outputting information e.g. audio information via headphones, but may also be an input port configured to receive signals including remote control signals.
  • the audio interface 27 typically includes a speaker which enables the mobile device to output signals and a microphone which enables the mobile device to receive audio signals including voice control inputs for use in controlling applications.
  • the mobile device 20 shown includes a flash 28 which may be used in conjunction with the digital imaging device to illuminate an object of which a photograph is being taken.
  • User controls 29 may take the form of external buttons or sliders which allow a user to control various functions of the mobile device.
  • An application saved on the device may be configured to interact with the various components of the device such that, upon receiving an input from one or more of the user controls, a corresponding function is carried out.
  • the computer program described herein may take the form of an application stored in the memory 22.
  • the mobile device may be connected to an external computer 30 either directly or via a network 40 so that computationally extensive calculations can be carried out by a computational module on the external computer, the external computer being more powerful than the mobile device and therefore capable of performing the calculations more quickly.
  • the mobile device 20 may also be configured to exchange information with other computers via the network 40.
  • Application "App" is also be configured to exchange information with other computers via the network 40.
  • Figures 6 to 8 show examples of interactive displays provided to the user on the app 2 of the mobile device 20.
  • the first example, shown in Figure 6 is a graphical user interface (GUI) 21 presented to the user by the app 2, the GUI including two "segments".
  • a first segment provides an indicator for brushing measured by the app as having taken place on the upper jaw, and the second segment provides an indicator for brushing measured by the app as having taken place on the lower jaw.
  • the position of the segments mimics the position of the various surfaces around the mouth, with the segment corresponding to the upper biting surface located at the upper portion of the GUI and the segment corresponding to the lower biting surface located at the lower portion of the GUI.
  • segments for "left" surfaces are located on the left side of the GUI and segments for "right” surfaces are located at the right side of the GUI.
  • the segments for the remaining tooth surfaces are grouped in pairs. For example, the upper right outer surface and upper right inner surface are located adjacent one another as sub segments of a larger segment.
  • a warning may be displayed to the user to inform them that they are brushing too hard.
  • This warning may take the form of a colour change on the visual display of the mobile device, particularly a colour change which overlays or highlights a graphic corresponding to the portion of the mouth in which brushing was taking place when the threshold was exceeded.
  • the visual warning may or may not be accompanied by an audio warning and/or a displayed message such as "brushing too hard”. Consumer studies have shown that audio warnings are particularly effective in the field of toothbrushing.
  • Figure 9 shows a high level flow diagram depicting example steps in the coaching phase of the method of monitoring toothbrushing.
  • Figure 10 shows a further example of an interactive display provided to the user on a mobile device, particularly an image which may be presented to the user during a guided mode.
  • the image shows a schematic of the jaws, with a section of teeth highlighted, the highlighted section corresponding to a segment of the 10 segment display.
  • an interactive display is provided to the user on a mobile device, giving them the option (s4) to select one of two brushing modes: a coaching mode (111) or free brushing mode (112).
  • the link to the free brushing function (s6c) of the app is non-actionable, appearing as greyed out text. Only once the coaching mode (s6a) has received and stored enough information to produce a personal model in the server (s6b) does the link for free brushing become actionable.
  • data received by the one or more sensors of the toothbrush is processed by the computing module of the toothbrush and/or external processors on the mobile device or on an external computer. This is described in more detail below, with reference to Figures 12a-12c.
  • Figure 12a depicts a user interface displaying live feedback to the user at the beginning of a toothbrushing session, where all segments are empty 121.
  • the sensor data is processed and the performance of the user displayed in the form of "filling up" segments 122; a full segment 123 indicating that a pre-set brushing threshold (e.g. number of brush strokes) has been met.
  • An example of the process followed by the toothbrush during toothbrushing is shown in Figure 12c. These steps will be carried out regardless of whether the coaching mode (i.e. guide mode) or the "free brushing" mode is selected. The difference between the two modes is that, in the coaching (guide) mode, the mobile device will provide a guide to the user for the user to follow during brushing. The way in which data is processed and provided to the user remains the same.
  • In a first step, the toothbrush is turned on. This may be via a switch or button controlled by the user, or may be in response to detection of motion or touch via one or more of the sensors.
  • data from the one or more sensors is pre-processed (s122) at the processor of the computing module. This pre-processed data may be stored in memory on the toothbrush itself or may be sent to the mobile device and/or another device such as a smart mirror or other computer. The data from one or more of the sensors is used to determine whether or not toothbrushing has started. For example, in some embodiments, toothbrushing is detected when the toothbrush receives a combined detection of sound and motion via the microphone and accelerometer. Each of these values may have to exceed a pre-defined threshold in order to determine that toothbrushing has started. In other embodiments, a different combination of sensor triggers may be used to determine whether toothbrushing has started, such as a force sensor in combination with a motion sensor (e.g. the accelerometer).
  • the smart toothbrush and any connected devices will be placed in a standby mode (s124). After a pre-defined amount of time (“timeout”) the system, including the smart toothbrush will turn off (s125).
  • live feedback is presented to the user by the mobile device.
  • This live feedback may include one or more of the following:
  • After a predetermined amount of time in which no activity has been measured by the sensors, or in response to a manual switch, the toothbrush sends a signal to the mobile device to indicate that the toothbrushing session has ended, and the mobile device (or other device such as a screen or smart mirror) displays end-of-session feedback to the user and/or recommendations (s128).
  • the toothbrush is configured to operate in a standalone mode when no contact can be made with a mobile device (or other device). This is described in more detail in relation to Figure 12d below.
  • the toothbrush is turned on (s131), either by way of a user-operated switch or button or in response to detection of motion or touch by one or more of the sensors.
  • the toothbrush attempts to connect (s132) with an external device such as a mobile device 20 or other device (e.g. smart mirror, wall mounted computer or smart cup).
  • the connection is typically a wireless connection such as Bluetooth.
  • the toothbrush starts operation in a "data mode" (s133).
  • sensor data is pre-processed and saved in the memory of the computing module of the toothbrush.
  • the processor of the toothbrush is programmed to provide basic feedback via visual or audio signals emitted from a light source or speaker on the toothbrush in response to the processor detecting that particular criteria have been met. For example, audio or visual signals may be emitted in response to:
  • a data transfer will take place (s135). This will include synching data between the toothbrush and the device as well as sending sensor data from the toothbrush to the device. Where historic data has been stored on the brush but not transferred to the external device, this historic data will be transferred.
  • the toothbrushing session will carry on as usual (s136) either in coaching mode or in freebrushing mode, as described in relation to Figure 12c steps s126-s129 above.
  • statistical analysis will be carried out on the historic data corresponding to sessions carried out while the toothbrush and external device were unconnected.
  • the device should be switched on (s141) and will then attempt to form a connection to the database (s142). This connection may be via a network 40.
  • the device will trigger data transfer (s143) between the device and the external computer at which the database is located, and may also trigger analytics to be carried out by the external computer. For example, upon achieving a successful connection, the following steps may be carried out: historic data can be downloaded from the device to the database; data can be synchronised between the database and the device.
  • one or more modes in which the application can run may be updated.
  • the external computer may, based upon information from the database, allow the application to unlock certain modes such as "free brushing".
  • Figure 13a shows a map of the overall journey taken by the toothbrush during a toothbrushing session using the recorded position and time information from the session.
  • Each step in the journey is set out in Figure 13b, where the numbers in the sequence 5>3>2>8>1>4>6>10>9>7 correspond to particular parts of the jaw, as shown in Figure 13c.
  • Each of these numbered parts of the jaw corresponds to a respective one of the 10 segments 121, 122, 123 displayed to the user by the mobile device during brushing, as depicted in Figures 12a and 12b.
  • Figure 13d shows an example of a multivariate analysis used to match the detected position of the toothbrush by the one or more sensors to a relevant segment on the display shown to the user at the external device.
  • a smart toothbrush which includes at least a toothbrush body, comprising a head 2 for brushing and a handle 3, as well as at least one sensor for detecting the position of the toothbrush in the mouth at various points during a toothbrushing session.
  • the smart toothbrush further includes a timer for recording the time at which each detected position occurs; and a computing module with a memory, the computing module configured to (i.e. programmed to) record the detected position at various points during a toothbrushing session along with recording the time at which each detected position was detected. From these recorded points, the processor of the computing module produces time series data which corresponds to a map of the journey taken by the toothbrush around the jaw during the toothbrushing session. It will be appreciated that other methods of measuring or determining a path taken by a toothbrush around the jaw during a toothbrushing session are also possible. What is important is the path, rather than the method and sensors by which the path is determined.
  • the computing module is programmed to use a machine learning algorithm, e.g. solving a travelling salesman problem, to process the recorded position and time information and produce a map of an optimal path to be taken by the toothbrush during the toothbrushing session.
  • One or more journeys made by the user's toothbrush may be recorded at the database of the external computer 30. Newly recorded journeys may be checked against previously recorded journeys. If a deviation from a previously made (and stored) journey is detected, a processor at the mobile device may trigger a warning signal to indicate to the user that unusual behaviour has been detected. This warning may take the form of a message displayed on the screen of the mobile device and may or may not be accompanied by an audio warning.
  • the amount of deviation required before the warning signal will be triggered is pre-set, either as a standard pre-defined value built into the software run on the device, or by a pre-defined amount which is input by the user or a health professional.
  • a distance can be calculated between these and other paths to show how similar / different they are to each other.
  • the distance may be the geometric distance, or may, for example, be a measure of distance between symbol strings associated with the journeys, such as the Levenshtein distance.
  • Using the distance enables clustering of brushing paths, to see how similar one brushing path is to another.
  • the distance metrics can be used to determine how close a recorded brushing journey is to that recommended by a dentist, as well as to determine whether the user is consistent in their brushing habits.
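A short sketch of the Levenshtein comparison mentioned above is given below, treating each journey as a sequence of segment labels (e.g. the numbers 1 to 10 of the 10-segment display). The "recommended" journey in the usage example is hypothetical.

```python
# Sketch of comparing two brushing journeys with the Levenshtein distance.
# Journeys are assumed to be encoded as sequences of segment labels.
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two sequences."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, start=1):
        curr = [i]
        for j, y in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]

recorded    = [5, 3, 2, 8, 1, 4, 6, 10, 9, 7]   # journey from Figure 13b
recommended = [5, 3, 2, 1, 8, 4, 6, 10, 9, 7]   # hypothetical recommended order
print(levenshtein(recorded, recommended))        # small distance = similar journeys
```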
  • a deviation from the "regular" already recorded journeys can be an early indicator of a problem such as tooth sensitivity or gum disease.
  • By triggering a warning upon such a deviation, the user can be made aware of potential problems earlier than they otherwise would have been.
  • Figure 13e shows a map of an optimal path created using a travelling salesman machine learning algorithm.
  • a more efficient brushing technique can be implemented by training the user to follow this calculated optimised journey.
  • the path followed during the journey has been optimised by minimising an appropriate cost function related to the transition probabilities: the less likely a transition, the higher the associated cost.
  • the transition probabilities are a function of spatial distance and biomechanical constraints such as dexterity. In this case, it is found that brushing the regions in the order shown in Figure 13e minimises the overall cost.
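One way such a cost-based optimisation could look in outline (the transition probabilities and the exhaustive search below are placeholders; the patent refers more generally to a machine learning / travelling-salesman style solver):

```python
# Score each possible order of the segments with a cost derived from the
# transition probabilities (cost of a transition = -log(probability), so
# unlikely transitions are expensive) and keep the cheapest complete tour.
# Exhaustive search is shown only for a handful of segments.
import itertools
import math
from typing import Dict, List, Tuple

# Hypothetical transition probabilities between four segments; these stand
# in for probabilities learned from recorded journeys and biomechanical
# constraints, and are not real data.
P: Dict[Tuple[int, int], float] = {
    (0, 1): 0.6, (0, 2): 0.3, (0, 3): 0.1,
    (1, 0): 0.2, (1, 2): 0.7, (1, 3): 0.1,
    (2, 0): 0.1, (2, 1): 0.2, (2, 3): 0.7,
    (3, 0): 0.5, (3, 1): 0.3, (3, 2): 0.2,
}


def path_cost(path: List[int]) -> float:
    """Sum of -log(p) over consecutive transitions in the path."""
    return sum(-math.log(P[(a, b)]) for a, b in zip(path, path[1:]))


def optimal_path(segments: List[int], start: int) -> List[int]:
    """Cheapest order in which to visit every segment once, starting at `start`."""
    rest = [s for s in segments if s != start]
    return min(
        ([start] + list(order) for order in itertools.permutations(rest)),
        key=path_cost,
    )


print(optimal_path([0, 1, 2, 3], start=0))  # -> [0, 1, 2, 3] for the table above
```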
  • the position data from the one or more sensors and the respective time information from the data are used to produce a time series (i.e. a path or journey) for that toothbrushing session (s142).
  • the time series is sent from the device of the user to a database, typically on an external computer, where it is analysed (s143) and may be compared with previous journeys by the same user or other users.
  • the external computer may be accessed by the user over the network 40.
  • the database may also receive data input from other users over the network (s144) thereby enabling analytics for an entire population of users rather than just a single user. Additional inputs may also be received at the database via the network. This may include information entered by the user themselves such as whether they are left or right handed, whether they have any problem areas in the mouth that they are aware of, or extra information they may have received from their dentist such as the need to focus more on a particular area of the mouth during toothbrushing.
  • data mining and statistical analysis (s146) of the data may be used to carry out one or more particular functions. Examples of such functions include:
  • the average path calculated for a given user may be updated based upon the path of the most recently recorded journey
  • transition probabilities may be calculated, the transition probabilities being the probability of going to state Y given that the previous state was X
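A minimal sketch of how such transition probabilities might be estimated from recorded journeys (the function name is illustrative; the approach is simply to count and normalise observed transitions):

```python
# Count how often segment Y directly follows segment X across the recorded
# journeys, then normalise each row of counts into probabilities.
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple


def transition_probabilities(journeys: Iterable[List[int]]) -> Dict[Tuple[int, int], float]:
    """P(next segment = Y | current segment = X) estimated from journeys."""
    counts: Dict[int, Dict[int, int]] = defaultdict(lambda: defaultdict(int))
    for journey in journeys:
        for x, y in zip(journey, journey[1:]):
            counts[x][y] += 1
    probabilities: Dict[Tuple[int, int], float] = {}
    for x, nexts in counts.items():
        total = sum(nexts.values())
        for y, n in nexts.items():
            probabilities[(x, y)] = n / total
    return probabilities


# Two recorded journeys over segments 0-3:
print(transition_probabilities([[0, 1, 2, 3], [0, 2, 1, 3]]))
# -> {(0, 1): 0.5, (0, 2): 0.5, (1, 2): 0.5, (1, 3): 0.5, (2, 3): 0.5, (2, 1): 0.5}
```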
  • the above steps may be carried out at the database regardless of whether the device is still “online” and therefore connected to the database over the network.
  • the feedback that is presented to the user at their device 20 is updated based upon the results of the preceding analysis and statistics (s147).
  • Feedback updates may include one or more of the following:
  • the warning may flag up the potential problem area identified to the user, for example by highlighting the region of the jaw corresponding to the relevant "segment" on a picture of the jaw
  • a smart toothbrush with a dosing mechanism such as any of those shown in Figures 15a-e is provided (s151) and turned on.
  • data is obtained from the one or more sensors (s153).
  • This data is processed (s153), either in the internal processor of the computing module of the toothbrush, or at the mobile device.
  • This processing includes computing the location of the toothbrush in real time.
  • features of the teeth may be monitored in real time such as plaque strength or tooth whiteness, both of which could be detected by a camera on the toothbrush.
  • a dosing control algorithm is applied (s154) to the processed data, the algorithm including set conditions which have to be met for dosing to take place. These conditions may be set by the user via the device (e.g. mobile device) before brushing takes place.
  • the dosing algorithm tracks the position of the toothbrush within the jaw and detects when a condition is met, for example when the location data from the sensors shows that the head of the toothbrush is in a particular location, which may correspond to a particular one of the 10 segments.
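An illustrative sketch of such a position-triggered dosing condition (the class, names and trigger mechanism are assumptions; the patent does not prescribe a particular implementation):

```python
# A set of user-configurable conditions -- here simply "dose when the brush
# reaches one of these target segments, at most once per segment per
# session" -- is checked against the real-time location, and a trigger is
# issued when a condition is met.
from typing import Callable, Set


class DosingController:
    def __init__(self, target_segments: Set[int], trigger: Callable[[int], None]):
        self.target_segments = target_segments   # conditions set via the device
        self.trigger = trigger                   # e.g. pulse a pump/valve, or warn the user
        self.dosed: Set[int] = set()             # dosage history for this session

    def on_position(self, segment: int) -> None:
        """Called whenever the processed sensor data yields a new location."""
        if segment in self.target_segments and segment not in self.dosed:
            self.trigger(segment)
            self.dosed.add(segment)


controller = DosingController({3, 7}, trigger=lambda seg: print(f"dose released at segment {seg}"))
for seg in [0, 1, 2, 3, 3, 4, 7]:
    controller.on_position(seg)   # doses once at segment 3 and once at segment 7
```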
  • Further sensors that may be used to provide data suitable for generating a trigger or activation signal for dispensing one or more active ingredients from one or more reservoirs may include any of the following types: (i) light- or camera-based sensors for detecting live and dead plaque (e.g.
  • dosing or dispensing of one or more active ingredients from the one or more reservoirs may be performed in response to a detected level of one or more of the above sensed parameters, or combinations thereof, listed in items (i) to (x) above.
  • the dosing mechanism may be primed ready to dose ahead of reaching the location at which dosing is to take place, in a predictive step. This would allow an increase in the effective dosing time or amount in the required location.
  • the dosing control algorithm may also update a log of dosage history; logging the positions and times at which dosing occurred along the path of a particular journey. This data may later be fed back to a database on the external computer 30.
  • External input may be provided (s155) to the dosing algorithm. In some embodiments, this external input will include one or more of:
  • the processor on the toothbrush will control release (s156) of the ingredient by initiating a trigger for the dosage to occur.
  • this trigger could be in the form of a warning provided to the user (e.g. audio or visual signal) to alert the user that they need to actuate the biomechanical mechanism themselves.
  • the trigger may simply be an electrical signal which actuates the pump or valve to eject the required dosage from the reservoir via the head of the brush.
  • When and/or what is dosed by the toothbrush may be influenced or driven by data that is provided to the toothbrush from an outside source rather than data that is measured by sensors in the toothbrush.
  • advice from a dentist, or user preferences, could be used as data to influence and/or drive dosing.
  • data from other diagnostic sensors, which may be internal or external to the toothbrush, may be used to influence or drive dosing. Examples of a tracking system according to the present invention are described below with reference to Figures 17 and 18.
  • FIG 17 shows an example of a tracking system for recording the position of a toothbrush during toothbrushing according to an embodiment of the present invention.
  • the tracking system 1701 includes a smart toothbrush 1702 which may contain one or more of the features set out above, in particular those features set out in relation to the embodiments shown in Figures 1-3 and also Figures 15a-e.
  • the toothbrush includes a head portion 2 and a handle portion 3.
  • the body of the toothbrush comprises one or more sensors, including at least an orientation sensor such as an accelerometer or a gyroscope.
  • the smart toothbrush 1702 of this embodiment includes a sensor for detecting contact between the toothbrush and the jaw of a user. This may, for example, take the form of a sound sensor.
  • the system 1701 includes one or more magnetic field generators and the smart toothbrush also includes a magnetic sensor for detecting location within a magnetic field.
  • the magnetic sensor detects the absolute position of the toothbrush within the field created by the magnetic field generator.
  • the magnetic field generator 1703 takes the form of a plurality of magnets located on a pair of glasses 1704.
  • the system further comprises an external device 1705 which in this embodiment takes the form of a mobile device containing the application described in relation to Figure 9.
  • the toothbrush 1702 interacts with the mobile device 1705 in the same way as described in relation to previous embodiments, with the added benefit of a magnetic sensor output, which is recorded on the toothbrush and can be processed either on the toothbrush itself or sent to the mobile device for processing, in order to give an output of the location of the toothbrush.
  • the presence of the magnetic field generator and magnetic field sensor enables an actual position of the toothbrush to be measured, rather than relying upon a determination of the likely position based on movement and orientation sensors such as accelerometers. In this way, it is possible to provide the user with much more accurate information about where the toothbrush is at any specific time.
  • the magnetic systems described herein can be used in two different ways. In the first of the two ways, toothbrushing always occurs in the presence of a magnetic field, the magnetic field providing a mechanism for increasing the resolution of the system. In such situations, the magnetic field measurements also remove the need for a coaching phase since the actual position of the toothbrush can be measured in real time during free brushing.
  • the magnetic system can provide a resolution great enough to provide tooth-by-tooth tracking of the toothbrush.
  • the size of the toothbrush head is often larger than the size of one tooth and therefore limits the resolution of measurements that can be obtained.
  • Values of M (Mx, My, Mz) are measured at each tooth and used to build a lookup table.
  • measurements may be taken at only a subset of teeth (e.g. one front, one back left, one back right), with a model of the jaw (see WO2008116743A1) and a model of the field used to extrapolate calibration values for the other teeth.
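A sketch of how such a calibration lookup table might be built and used, with simple linear interpolation along the jaw standing in for the jaw-model/field-model extrapolation described above (all names and values are illustrative assumptions):

```python
# (Mx, My, Mz) is recorded at a few reference teeth, values for the remaining
# teeth are filled in by interpolation, and a live reading is matched to the
# nearest calibrated entry.
from typing import Dict, Tuple

Vector = Tuple[float, float, float]


def build_lookup(reference: Dict[int, Vector]) -> Dict[int, Vector]:
    """Interpolate calibration vectors for the teeth between the measured ones."""
    known = sorted(reference)
    table: Dict[int, Vector] = dict(reference)
    for tooth in range(known[0], known[-1] + 1):
        if tooth in table:
            continue
        lo = max(t for t in known if t < tooth)
        hi = min(t for t in known if t > tooth)
        w = (tooth - lo) / (hi - lo)
        table[tooth] = tuple((1 - w) * a + w * b for a, b in zip(reference[lo], reference[hi]))
    return table


def nearest_tooth(reading: Vector, table: Dict[int, Vector]) -> int:
    """Tooth whose calibrated field vector is closest to the live reading."""
    return min(table, key=lambda t: sum((r - c) ** 2 for r, c in zip(reading, table[t])))


# Hypothetical calibration at one back-left, one front and one back-right tooth:
table = build_lookup({1: (40.0, 5.0, -30.0), 7: (10.0, 20.0, 0.0), 14: (40.0, 5.0, 30.0)})
print(nearest_tooth((18.0, 16.0, -7.0), table))   # -> 5 with this illustrative table
```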
  • Figure 18 shows a further example of a tracking system according to an embodiment of the present invention. This embodiment differs from that of Figure 17 only in that the external device is in the form of a mirror 660 and the magnetic field generator 116 is mounted on or integral with the mirror 660.
  • Figure 19 illustrates how the magnetic measurements and other sensor measurements taken are processed in order to arrive at a measurement of which tooth surface is being brushed.
  • One or more of the sensors on the toothbrush is used to detect contact between the head of the toothbrush and the teeth in a user's jaw (s1901). This enables a determination to be made that brushing is active (s1902). Measurements recorded by particular sensors such as the accelerometer or gyroscope (s1903) can be used to determine the orientation of the brush (s1904), including its pitch and/or its roll.
  • the magnetic sensor on the toothbrush is used to measure the x, y, and z components of the magnetic field.
  • the measurement of the z component (Mz, the vertical field component) (s1905) is used to determine whether it is the upper jaw or lower jaw which is being brushed (s1906) and also whether the left or right side of the mouth is being brushed (s1910).
  • the absolute magnitude M of the magnetic field may be obtained (s1907), with or without consultation of a lookup table (s1908), and enables the specific tooth being brushed to be determined.
  • Figure 20 shows an example of magnetic sensor data taken using the system of Figure 17, recorded by the magnetic sensor located within the toothbrush of the present invention.
  • Each number on the plot corresponds to a particular tooth; number 1 corresponds to the back-left molar and, going around clockwise, number 14 corresponds to the back-right molar.
  • the points for the different teeth are well separated, showing that each tooth has a unique combination of Mz and field magnitude M.
  • An initial calibration is typically carried out to calibrate the magnetic readings. Magnetic readings are taken in at least three locations along the jaw.
  • Figures 21a and 21b provide a simple illustration of how magnetic field measurements, particularly the vertical field component Mz, can be used to capture the location of the toothbrush with increased resolution.
  • a magnetometer probe was moved along the jaw (in this case an upper jaw).
  • the measurement was repeated for inside, centre, and outside positions.
  • the field magnitude M allows position estimation on either the left or right jaw side (Figure 21a).
  • Each number corresponds to a specific tooth of a jaw, where 1-7 are located on the left-hand side of the jaw and 8-14 on the right-hand side of the jaw.
  • the vertical field component Mz breaks the left/right symmetry, and its sign can be used to determine whether the device is on the left (teeth 1-7) or right (teeth 8-14) side.
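A minimal sketch of this sign-plus-magnitude rule from Figures 21a/21b (the sign convention and the per-side magnitude profiles below are placeholders standing in for calibrated values):

```python
# The sign of the vertical field component Mz picks the left or right side of
# the jaw, and the field magnitude |M| then estimates how far along that side
# the brush is, by matching against a calibrated per-side profile.
import math
from typing import Sequence, Tuple


def locate_tooth(m: Tuple[float, float, float],
                 left_profile: Sequence[float],
                 right_profile: Sequence[float]) -> int:
    """Return a tooth number 1-14 (1-7 left, 8-14 right) for a field reading."""
    mx, my, mz = m
    magnitude = math.sqrt(mx * mx + my * my + mz * mz)
    if mz >= 0:                       # placeholder sign convention: Mz >= 0 -> left side
        profile, offset = left_profile, 1
    else:
        profile, offset = right_profile, 8
    # Pick the calibrated tooth whose |M| is closest to the measured |M|.
    index = min(range(len(profile)), key=lambda i: abs(profile[i] - magnitude))
    return offset + index


# Placeholder calibration: |M| decreases from the back molar towards the front.
left = [55.0, 48.0, 42.0, 37.0, 33.0, 30.0, 28.0]
right = [28.0, 30.0, 33.0, 37.0, 42.0, 48.0, 55.0]
print(locate_tooth((20.0, 10.0, 25.0), left, right))   # -> a left-side tooth (1-7)
```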
  • Figure 22 shows an example of further calibration to correct for arbitrary device orientation.
  • the probe was held in different orientations for different teeth.
  • the field magnitude is invariant; however, the measured Mz component is strongly affected (Figure 22a).
  • the gravitational acceleration can be separated from the overall acceleration data using a low pass filter.
  • the acceleration due to gravity can be used as a reference point against which the co-ordinate system is updated to retrieve only the vertical field component (Figure 22b).
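One way the gravity separation and re-projection could be sketched, using an exponential moving average as the low-pass filter (the smoothing factor, names and vector handling are illustrative assumptions, not the patent's implementation):

```python
# A simple low-pass filter estimates the gravity direction from the
# accelerometer stream; the vertical field component is then taken as the
# projection of the magnetometer reading onto that direction, independent of
# how the brush is held.
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]


class GravityTracker:
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha                 # low-pass smoothing factor (assumed value)
        self.gravity: Vec3 = (0.0, 0.0, 1.0)

    def update(self, accel: Vec3) -> Vec3:
        """Low-pass filter the raw acceleration to isolate the gravity component."""
        self.gravity = tuple(
            (1 - self.alpha) * g + self.alpha * a for g, a in zip(self.gravity, accel)
        )
        return self.gravity

    def vertical_field(self, magnetic: Vec3) -> float:
        """Project the magnetometer reading onto the gravity direction to obtain
        the vertical field component in a gravity-aligned frame."""
        norm = math.sqrt(sum(g * g for g in self.gravity)) or 1.0
        return sum(m * g for m, g in zip(magnetic, self.gravity)) / norm


tracker = GravityTracker()
for accel in [(0.1, 0.2, 9.6), (0.0, 0.3, 9.8), (0.2, 0.1, 9.7)]:   # m/s^2 samples
    tracker.update(accel)
print(tracker.vertical_field((15.0, 5.0, -20.0)))
```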
  • the sound sensor simplifies interpretation of the data, since it can detect when contact is made between the toothbrush head and the jaw, so that the measured location is constrained to lie on the jaw.
  • the accelerometer enables the locational data to be improved further because it can be used to correct for the rotational position (orientation) of the toothbrush.
  • Figure 23 shows an example of an output from a sound receiver of a toothbrush according to or for use with the present invention.
  • the toothbrush detects changes in the continuous sound spectrum generated by an electric or sonic toothbrush.
  • the sound profile (pitch, magnitude) of an electric or sonic (i.e. vibrating) toothbrush changes when the brush head is in contact with a surface.
  • the sound is measured by the sound recorder on the toothbrush and recorded.
  • the processor then applies a combination of simple filters and efficient classification techniques (e.g. decision tree) to detect brushing contact based on sound profile changes. In this way, it is possible to avoid resource-hungry sound processing.
  • Figure 23a depicts a sound frequency spectrum recorded by a sound sensor of the toothbrush during "free vibrations", that is to say, the head of the toothbrush is not in contact with the jaw.
  • the additional strain on the electric generator/vibrator leads to shifts in the emitted sound; in this case, from 137 Hz to 120 Hz. This shift in frequency can be picked up efficiently by the computing module of the toothbrush.
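A sketch of contact detection based on the frequency shift described above, using a simple nearest-frequency rule in place of the filter-plus-decision-tree classification (the sample rate, window length and decision rule are assumed values for illustration):

```python
# Estimate the dominant frequency of a short microphone window with a Fourier
# transform and flag contact when it is closer to the loaded frequency
# (about 120 Hz) than to the free-vibration frequency (about 137 Hz).
import numpy as np

SAMPLE_RATE = 4000          # Hz, assumed microphone sample rate
FREE_FREQ = 137.0           # Hz, free vibration (no contact), from the example above
CONTACT_FREQ = 120.0        # Hz, vibration under load (contact), from the example above


def dominant_frequency(window: np.ndarray, sample_rate: int = SAMPLE_RATE) -> float:
    """Frequency of the strongest spectral peak in a short sound window."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum[1:]) + 1])     # skip the DC bin


def brushing_contact(window: np.ndarray) -> bool:
    """True if the dominant frequency is closer to the loaded frequency."""
    f = dominant_frequency(window)
    return abs(f - CONTACT_FREQ) < abs(f - FREE_FREQ)


# Synthetic one-second test windows at the two frequencies:
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
print(brushing_contact(np.sin(2 * np.pi * 120.0 * t)))   # True  (in contact)
print(brushing_contact(np.sin(2 * np.pi * 137.0 * t)))   # False (free vibration)
```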

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Brushes (AREA)
EP17780070.3A 2016-10-07 2017-10-02 Intelligente zahnbürste Active EP3522751B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP16192878 2016-10-07
PCT/EP2017/074998 WO2018065372A1 (en) 2016-10-07 2017-10-02 Smart toothbrush

Publications (2)

Publication Number Publication Date
EP3522751A1 true EP3522751A1 (de) 2019-08-14
EP3522751B1 EP3522751B1 (de) 2020-09-16

Family

ID=57113208

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17780070.3A Active EP3522751B1 (de) 2016-10-07 2017-10-02 Intelligente zahnbürste

Country Status (4)

Country Link
EP (1) EP3522751B1 (de)
CN (1) CN109862807B (de)
BR (1) BR112019006426B1 (de)
WO (1) WO2018065372A1 (de)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11324307B2 (en) 2018-08-02 2022-05-10 Ranir, Llc Pressure sensing system and method for an electric toothbrush
CN112970073A (zh) 2018-11-01 2021-06-15 联合利华知识产权控股有限公司 用于提供用户反馈的方法和设备
CN111722568B (zh) * 2020-06-30 2021-09-24 广州星际悦动股份有限公司 电动牙刷的控制方法、装置和中央控制装置
CN113038073A (zh) * 2021-02-23 2021-06-25 深圳创维-Rgb电子有限公司 刷牙行为的提示方法、牙刷、显示端及存储介质

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0706048D0 (en) 2007-03-28 2007-05-09 Unilever Plc A method and apparatus for generating a model of an object
KR100947046B1 (ko) * 2007-11-19 2010-03-10 황진상 운동체 자세 추적 장치, 운동체 자세 추적 방법, 이를이용한 칫솔 자세 추적 장치 및 칫솔 자세 추적 방법
FI20085488A0 (fi) * 2008-05-23 2008-05-23 Pump & Brush Finland Oy Älykäs hammasharjamonitorointilaite
CA2762817A1 (en) * 2009-05-20 2010-11-25 Braun Gmbh Personal body cleaning device
WO2011073010A1 (en) * 2009-12-17 2011-06-23 Unilever Plc Toothbrush tracking system
WO2012034786A1 (en) 2010-09-15 2012-03-22 Unilever Plc Toothbrush usage monitoring
US20140065588A1 (en) * 2012-08-31 2014-03-06 Ideas That Work, Llc Toothbrush Training System

Also Published As

Publication number Publication date
WO2018065372A1 (en) 2018-04-12
EP3522751B1 (de) 2020-09-16
BR112019006426B1 (pt) 2023-01-17
CN109862807A (zh) 2019-06-07
BR112019006426A2 (pt) 2019-06-25
CN109862807B (zh) 2021-02-09

Similar Documents

Publication Publication Date Title
EP3522752B1 (de) Intelligente zahnbürste
EP3522753B1 (de) Intelligente zahnbürste
EP3522751B1 (de) Intelligente zahnbürste
JP6925343B2 (ja) フィードバック手段により最適な口腔衛生を得るための方法及びシステム
CN108066030A (zh) 口腔护理系统和方法
JP6684834B2 (ja) 口腔清掃装置の位置特定のための方法及びシステム
CN108430264B (zh) 用于提供刷洗过程反馈的方法和系统
JP7454376B2 (ja) 歯の清掃中のユーザの頭の向きを特定する方法
JP2019534094A (ja) 口腔クリーニング装置の位置特定のための方法およびシステム
CN107427350B (zh) 用于口腔清洁设备定位的方法及系统
JP2018531053A6 (ja) 口腔清掃装置の位置特定のための方法及びシステム
CN108601445A (zh) 用于定位个人护理设备的方法和系统
JP2020501628A (ja) ガイド付き洗浄セッションに対する追従を決定するためのシステム及び方法
EP4236864A1 (de) Systeme und verfahren zur bestimmung der position und ausrichtung einer mundpflegevorrichtung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190405

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

TPAC Observations filed by third parties

Free format text: ORIGINAL CODE: EPIDOSNTIPA

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200504

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017023831

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1313323

Country of ref document: AT

Kind code of ref document: T

Effective date: 20201015

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201216

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201216

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201217

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1313323

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200916

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20200916

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210118

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210116

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017023831

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201002

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20201031

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 602017023831

Country of ref document: DE

Owner name: UNILEVER GLOBAL IP LIMITED, WIRRAL, GB

Free format text: FORMER OWNER: UNILEVER N.V., ROTTERDAM, NL

26N No opposition filed

Effective date: 20210617

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201031

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201031

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201002

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20211020

Year of fee payment: 5

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20211002

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200916

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211002

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602017023831

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200923

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230503

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20231023

Year of fee payment: 7

Ref country code: FR

Payment date: 20231026

Year of fee payment: 7