WO2024007098A1 - Oral scanner system - Google Patents


Info

Publication number
WO2024007098A1
WO2024007098A1 (PCT/CN2022/103580)
Authority
WO
WIPO (PCT)
Prior art keywords
oral
oral health
data
condition
sensor
Prior art date
Application number
PCT/CN2022/103580
Other languages
English (en)
Inventor
Ingo Vetter
Reiner Engelmohr
Bettina ROWLANDS
Faiz Feisal Sherman
Pei Li
Xinru Cui
Original Assignee
Braun Gmbh
Priority date
Filing date
Publication date
Application filed by Braun Gmbh filed Critical Braun Gmbh
Priority to PCT/CN2022/103580 (WO2024007098A1)
Priority to PCT/CN2023/102649 (WO2024007884A1)
Priority to PCT/CN2023/102651 (WO2024007885A1)
Priority to PCT/CN2023/102655 (WO2024007887A1)
Priority to PCT/CN2023/102656 (WO2024007888A1)
Priority to PCT/CN2023/102654 (WO2024007886A1)
Priority to PCT/CN2023/102657 (WO2024007889A1)
Publication of WO2024007098A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0088 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/02055 Simultaneously evaluating both cardiovascular condition and temperature
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B5/067 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe using accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/45 For evaluating or diagnosing the musculoskeletal system or teeth
    • A61B5/4538 Evaluating a particular part of the musculoskeletal system or a particular medical condition
    • A61B5/4542 Evaluating the mouth, e.g. the jaw
    • A61B5/4547 Evaluating teeth
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4842 Monitoring progression or stage of a disease
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C9/00 Impression cups, i.e. impression trays; Impression methods
    • A61C9/004 Means or methods for taking digitized impressions
    • A61C9/0046 Data acquisition means or methods
    • A61C9/0053 Optical means or methods, e.g. scanning the teeth by a laser or light beam
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C15/00 Devices for cleaning between the teeth
    • A61C15/04 Dental floss; Floss holders
    • A61C15/046 Flossing tools
    • A61C15/047 Flossing tools power-driven
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C17/00 Devices for cleaning, polishing, rinsing or drying teeth, teeth cavities or prostheses; Saliva removers; Dental appliances for receiving spittle

Definitions

  • the present disclosure is concerned with an oral scanner system that comprises an oral scanner having an oral health sensor and a processor structured and arranged for the determination of oral health data relating to at least one oral health condition.
  • an oral scanner system comprising an oral scanner having at least one oral health sensor structured and/or arranged for outputting oral health sensor data relating to at least one oral health condition, the oral scanner being structured and/or arranged for performing a scanning procedure of at least a portion of an oral cavity of a subject using the oral health sensor for acquiring the oral health sensor data, a processor structured and/or arranged to receive the oral health sensor data, and to either process the oral health sensor data to determine oral health data relating to the at least one oral health condition, and after a completion of the scanning procedure to classify the oral health data with respect to at least two condition classes relating to the at least one oral health condition and to determine one condition class from the at least two condition classes to which the oral health data belongs, or to classify, preferably after a completion of the scanning procedure, the oral health sensor data with respect to at least two condition classes relating to the at least one oral health condition and to determine one condition class from the at least two condition classes to which the oral health sensor data belongs, and a
  • an oral scanner comprising an oral health sensor having a camera structured and/or arranged for acquiring image data from at least a portion of a dentition of a subject during an optical scanning procedure and a position sensor structured and/or arranged for acquiring position sensor data during the optical scanning procedure, wherein the position sensor comprises at least one of an accelerometer or a gyroscope, a processor structured and/or arranged to receive the image data and the position sensor data, to process the position sensor data to determine at least one position or location from at least two positions or locations of the at least portion of the oral cavity at which the oral scanner is currently performing the scanning procedure or has performed the scanning procedure at a given time instant, and either to process the image data to determine oral health data relating to the at least one oral health condition, and after a completion of the scanning procedure to classify the oral health data with respect to at least two condition classes relating to the at least one oral health condition and to determine one condition class from the at least two condition classes to which the oral health data belongs, or to classify,
  • the above aspects, which are discussed in more detail in the following, assist the user in the usage of the oral scanner system by means of a continued and/or guided human-machine interaction.
  • the feedback relating to the condition classes supports a more optimized usage of the oral scanner system.
  • a method of scanning at least a portion of the oral cavity using an oral scanner system is also considered.
  • Fig. 1 is a schematic depiction of an example oral scanner system comprising an oral scanner and a processor disposed in the oral scanner;
  • Fig. 2 is a schematic depiction of an example oral scanner system comprising an oral scanner and a separate device realizing or comprising the processor;
  • Fig. 3 is a schematic depiction of the basic components enabling a position or location determination for an oral scanner, where the outcome of the position or location determination is the position or location in the oral cavity at which a head of the oral scanner currently performs a scanning procedure;
  • Fig. 4 is a schematic depiction of another example oral scanner system comprising an oral scanner and an oral care device and further optional components such as a charger and where the oral scanner system may be structured and/or arranged to communicate data between its various components and to a remote computing instance;
  • Fig. 5 is a depiction of an example feedback screen as may be visualized on a display being part of an oral scanner system, where the feedback screen comprises the visualization of a live or conserve image taken in a scanning procedure and an abstract visualization of a dentition onto which scanning progress data and oral health data are overlaid in a live manner;
  • Fig. 6 is a depiction of another feedback screen as may be visualized on a display being part of the oral scanner system, where the feedback screen shows a summary of oral health data overlaid on an abstract depiction of a dentition and further a trend of a temporal development of an oral health condition is visualized in the center of the screen; and
  • Fig. 7 is a depiction of another feedback screen as may be visualized on a display being part of the oral scanner system, where various oral health data are visually overlaid onto an abstract depiction of the dentition and further a classification of the related oral health conditions is visually provided.
  • in the following, example oral scanner systems are described comprising example oral scanners and example processors and further optional components such as a separate device realizing at least a part of a feedback unit, e.g., comprising a display, and/or an oral care device.
  • the phrase “structured and/or arranged” used in the present disclosure refers to structural and/or computer-implemented features of the respective component and this shall imply that the respective feature or component is not only suited for something but is structurally and/or software-wise arranged to indeed perform as intended in operation.
  • the oral scanner in accordance with the present disclosure is understood to be an oral scanner that does not provide any oral care activity as such, in particular does not comprise any oral cleaning elements, i.e., is free from oral cleaning elements or other oral treatment or care elements and does not provide any oral cleaning or oral treatment or oral care.
  • the present disclosure is concerned with an oral scanner having at least one oral health sensor without any further oral cleaning/treatment/care features.
  • the present disclosure is concerned with an oral scanner system that comprises at least an oral scanner and a processor, where the processor may be physically located at or inside of the oral scanner or may be realized as a processor that is separate, i.e., remote from the oral scanner. As will be discussed in more detail further below, the processor may also be realized in a distributed manner.
  • the oral scanner system may specifically comprise at least one separate or remote device that, e.g., realizes at least a part of a feedback unit such as a display. This shall not exclude that, e.g., the oral scanner itself comprises alternatively or additionally a display and/or at least one visual feedback element.
  • a remote display and a remote processor may be arranged together in a separate device, i.e., they may have a joint outer housing.
  • the separate device may be a proprietary or custom-made device, e.g., a charger with a display, or a generally known device such as a computer, a laptop, a notebook, a tablet, a phone like a mobile phone or smartphone, or a smart watch, which may be used to realize a separate display and/or a separate processor.
  • the oral scanner system may alternatively or additionally to the separate device comprise at least one oral care device like a toothbrush, specifically an electric toothbrush that may at least for a limited time period be coupled with the oral scanner and/or the processor, preferably coupled for the exchange of data such as by wireless communication.
  • the oral scanner and the oral care device may share the same handle and only become as such realized by attaching a respective oral scanner head or an oral care head to the handle.
  • the oral scanner system may comprise at least one charger for charging a rechargeable energy storage of the oral scanner and/or of the oral care device and/or of the separate device.
  • the charger may be a wireless charger, such as an inductive charger.
  • the oral scanner may comprise at least one oral health sensor for acquiring, detecting, measuring or determining and for outputting oral health sensor data relating to at least one oral health condition; where in the following one of the terms “acquiring”, “detecting”, “measuring” or “determining” (or other forms of these verbs or nouns derived from these verbs) is used in connection with an oral health sensor, this shall include the other terms as well.
  • the oral scanner system may comprise at least one position sensor that is structured and/or arranged to provide, i.e., output, position sensor data that allows detecting, measuring or determining at least one position or location at which the oral scanner currently performs a scanning procedure or has been performing a scanning procedure at a given time instant, where the scanning procedure includes the determination of oral health sensor data.
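An accelerometer, for instance, can indicate the orientation of the scanner relative to gravity, from which a coarse position estimate may be derived. The following is a minimal Python sketch of that idea; the function names, zone labels and angle thresholds are purely hypothetical illustrations and are not specified in the disclosure:

```python
import math

def orientation_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Estimate pitch and roll (in degrees) from a static accelerometer
    reading, using the gravity vector as the reference direction."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def coarse_zone(pitch: float) -> str:
    """Map the pitch angle to a coarse scan zone (hypothetical thresholds)."""
    if pitch > 20.0:
        return "upper jaw"
    if pitch < -20.0:
        return "lower jaw"
    return "front teeth"
```

In practice an accelerometer alone cannot resolve all positions (e.g., left vs. right), which is presumably why the disclosure also mentions a gyroscope and camera-based determination.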
  • the term “sensor” shall be understood to cover sensor types that measure or determine a parameter relevant for an oral health condition based on an external measurement medium such as ambient light impinging onto the sensor or saliva available in the oral cavity being analyzed by the sensor, i.e., sensors comprising a sensor receiver.
  • the term “sensor” shall further cover sensor types comprising a sensor emitter arranged for emitting a measurement medium such as light, i.e., a light emitter, and a sensor receiver such as a light receiver so that the measurement or determination depends at least in part on a non-external measurement medium, which means a measurement medium that is provided by the respective sensor emitter.
  • the oral scanner is structured and/or arranged for performing a scanning procedure in which the oral scanner acquires oral health sensor data from at least a portion of the oral cavity via an oral health sensor, preferably oral health sensor data relevant for determining oral health data relating to the at least one oral health condition.
  • the oral health sensor data and/or the therefrom determined oral health data are acquired in a position-resolved or location-resolved manner, i.e., where the respective oral health sensor data and/or oral health data is assigned to position data or location data derived from position sensor data acquired by the position sensor with respect to the same time instant or period of time at or during which the oral health sensor data was acquired.
  • the term “oral health sensor data” refers to the essentially unprocessed data outputted by the oral health sensor during the scanning procedure (e.g., image data if the oral health sensor is a camera) and the term “oral health data” refers to processed oral health sensor data (e.g., normalized or absolute area per tooth or per position or location showing plaque). It shall be understood that in some instances the oral health sensor data is itself a direct measure for the oral health condition, e.g., the oral health sensor data from a malodor sensor may not need any further processing to determine whether or not the user has bad odor, as the malodor sensor may provide a level of sulfur emissions.
  • the processing of the oral health sensor data may then be considered a classification of the oral health sensor data into one of at least two condition classes, e.g., “no relevant level of malodor” as one condition class and “relevant level of malodor” as another condition class.
  • the classification may then be done by the processor by a comparison with at least one threshold value. More complex classification concepts are discussed below.
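The threshold comparison described above can be sketched in a few lines of Python; the class labels follow the malodor example from the disclosure, while the threshold value, units and function names are hypothetical:

```python
from bisect import bisect_right

def classify_malodor(sulfur_level: float, threshold: float = 100.0) -> str:
    """Two-class threshold classification of a raw malodor sensor reading
    (hypothetical sulfur-level scale and threshold)."""
    return ("relevant level of malodor" if sulfur_level >= threshold
            else "no relevant level of malodor")

def classify(value: float, thresholds: list[float], classes: list[str]) -> str:
    """General n-class threshold classifier: `thresholds` is sorted
    ascending and len(classes) == len(thresholds) + 1."""
    return classes[bisect_right(thresholds, value)]
```

The general `classify` helper illustrates how the same comparison scheme extends to more than two condition classes, e.g., severity grades.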
  • the processor is coupled with the oral health sensor and/or with the position sensor to receive at least one sensor datum, preferably a plurality and/or sequence of sensor data, where a single sensor datum may be received in temporal sequence to accumulate to a plurality of temporally spaced sensor data or a plurality of sensor data may be received at each measurement time instant so that this accumulates to a multiplicity of temporally spaced pluralities of sensor data.
  • Sensor data may be transmitted to the processor as sensor signals, e.g., a sensor signal may be a voltage signal as is often the output of a sensor measuring a physical, chemical or material property.
  • the sensor signals may be analog signals or digital signals.
  • the term “datum” or “data” here refers to the information content and “signal” to the physical quantity by which the sensor datum or sensor data is/are transmitted.
  • where reference is made to “sensor data”, this shall refer to “oral health sensor data” provided by the oral health sensor and to “position sensor data” provided by the position sensor. Where only one of the two types of data is intended to be meant, the respective more limited term will be used.
  • the processor is preferably arranged to process sensor data from the at least one oral health sensor and the at least one position sensor so that at least one position-resolved or location-resolved oral health datum relating to at least one oral health condition is determined.
  • the oral health sensor outputs oral health sensor data that may be processed by the processor to determine oral health data and the position sensor outputs position sensor data that is processed by the processor to determine position data and the processor may further relate or assign oral health data and position data to each other so that position-resolved or location-resolved oral health data results.
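Relating oral health data and position data to each other, as described above, amounts to pairing each health-sensor sample with the position sample closest in time. A minimal sketch, assuming both streams are timestamped and sorted; the zone labels are hypothetical:

```python
from bisect import bisect_left
from collections import defaultdict

def assign_positions(health_samples, position_samples):
    """Assign each (timestamp, value) oral health sample to the position
    label whose timestamp is nearest, yielding position-resolved data.
    Both inputs must be sorted by timestamp."""
    times = [t for t, _ in position_samples]
    resolved = defaultdict(list)
    for t, value in health_samples:
        i = bisect_left(times, t)
        # choose the neighbouring position sample closest in time
        if i == 0:
            j = 0
        elif i == len(times):
            j = len(times) - 1
        else:
            j = i if times[i] - t < t - times[i - 1] else i - 1
        resolved[position_samples[j][1]].append(value)
    return dict(resolved)
```

A real implementation would also have to handle clock offsets between the two sensors, but nearest-timestamp matching captures the basic idea of "position-resolved" acquisition.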
  • the oral health sensor data may be assigned to the position data without a need of further processing of the oral health sensor data.
  • the oral scanner may comprise a scanner head and a scanner handle, which may be detachably connected, even so this shall not exclude that the scanner head and the scanner handle may be non-detachably connected and may form one integral device.
  • the oral scanner may have a housing enveloping a hollow in which components of the oral scanner may be disposed such as an energy source, a controller, a communicator etc.
  • the housing may allow a user to grasp the oral scanner conveniently with a hand.
  • the scanner head may be sized and shaped to be conveniently inserted into the oral cavity.
  • the housing may accommodate at least one user-operable control element such as an on/off button or on/off switch or a selector button or selector switch or other such elements typically expected or found on an oral scanner.
  • the housing may further accommodate at least one feedback element of a feedback unit structured and/or arranged for providing user-perceptible feedback to the user.
  • the feedback unit may comprise one or several feedback elements, e.g., a display provided by a separate device.
  • the at least one feedback element may comprise at least one from the list comprising, in a non-limiting manner, an optical feedback element such as a light emitter or a plurality of light emitters or a display, an acoustic feedback element such as a loudspeaker or a piezo-electric speaker or buzzer, and a tactile or haptic feedback element such as a vibrator or any other type of tactile or haptic feedback generator, e.g., a refreshable braille display.
  • the oral scanner system comprises a display as an element of a feedback unit, e.g., realized at the oral scanner and/or by or at a separate device
  • the display can be arranged to visualize feedback about the oral health (sensor) data relating to at least one oral health condition for at least two positions or locations, e.g., the display may be structured and/or arranged to visualize position-resolved or location-resolved oral health (sensor) data.
  • the word sensor being in brackets in “oral health (sensor) data” shall encompass oral health sensor data and oral health data.
  • the display may be structured and/or arranged to show a depiction or visualization of at least a portion of an oral cavity, e.g., an abstract depiction or more realistic depiction of at least a part of an oral cavity such as the dentition. The display may be arranged to additionally depict at least one feedback relating to the oral health (sensor) data and/or relating to the at least one oral health condition and/or to at least one condition class into which the oral health (sensor) data may have been classified with respect to the at least one oral health condition. This feedback may be realized by a change of the depiction or visualization of the at least portion of the oral cavity, by overlaying a visual representation of the position-resolved or location-resolved oral health data onto the depiction of the at least portion of the oral cavity, or by depicting oral health data on the display, e.g., as text data, and relating it to a position or location within the depiction of at least the portion of the oral cavity.
  • the feedback and the depiction or visualization mentioned here may occur in a position-resolved manner. While the present disclosure focuses on either an abstract or a more realistic depiction of at least a portion of the oral cavity such as the complete dentition, e.g., maxilla and mandible, together with overlaid oral health data relating to one or more oral health conditions, this shall not exclude that the oral health (sensor) data is displayed in a different manner, e.g., as a table of oral health (sensor) data relating to one or several oral health conditions per position or location within the at least portion of the oral cavity.
  • the feedback relating to the oral health (sensor) data may occur “live” or in real time, e.g., while the user is using the oral scanner to perform a scanning procedure, which means that the feedback may be adaptive to the live progress of the scanning procedure, where “live” shall mean that there is only a short time delay between the acquisition step and the feedback step, e.g., a time delay of below 10 seconds or below 5 seconds or below 4 seconds or below 3 seconds or below 2 seconds or below 1 second.
  • Feedback relating to the oral health (sensor) data may alternatively or additionally occur at the end of a scanning procedure by way of a summary feedback where the accumulated oral health (sensor) data is shown as a final result.
  • feedback described herein shall be understood as including feedback that is position-resolved or location-resolved.
  • This may include a classification, preferably a position-resolved or location-resolved classification of the oral health (sensor) data with respect to at least two condition classes relating to the at least one oral health condition.
  • the oral health (sensor) data and/or the condition class determined in the classification step of the current scanning procedure may be compared with historic oral health (sensor) data and/or condition classes from a previous scanning procedure or from a sequence of previous scanning procedures, and a trend or development of the oral health (sensor) data and/or the condition class over time may be visualized as feedback. Again, this may happen in a position-resolved or location-resolved manner.
  • Such historic data may be stored in a memory that is coupled or connected with the processor. Stored historic data may include oral care activity data relating to at least one oral care activity procedure performed with an oral care device as will be discussed in more detail below.
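Deriving a trend from stored historic data can be sketched as a comparison of the current result against a baseline of earlier scanning procedures. The scoring scale (higher = worse), the tolerance band and the label strings below are assumptions for illustration only:

```python
def trend(history: list[float], current: float, tolerance: float = 0.05) -> str:
    """Compare the current oral health score with the mean of historic
    scores from previous scanning procedures and report a coarse trend
    (hypothetical score scale where higher means worse)."""
    if not history:
        return "no history"
    baseline = sum(history) / len(history)
    if current < baseline - tolerance:
        return "improving"
    if current > baseline + tolerance:
        return "worsening"
    return "stable"
```

For position-resolved feedback the same comparison would simply be run once per stored position or location.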
  • the processor may be arranged to classify the oral health (sensor) data into at least two different condition classes relating to the at least one oral health condition, e.g., two condition classes relating to the severity of the oral health condition.
  • the processor may be arranged to preferably classify the oral health (sensor) data in a position-resolved or location-resolved manner, i.e., where the classification is done for a first position or first location such as the upper right molars and also for at least a second position or second location such as the lower left molars or the front teeth. Potential subdivisions of the oral cavity into positions or locations are in more detail discussed further below.
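Position-resolved classification amounts to applying the classifier separately for each position or location. A short sketch; the severity threshold, class labels and zone names are hypothetical:

```python
def classify_severity(score: float) -> str:
    """Two hypothetical severity classes split at a threshold of 0.5."""
    return "attention needed" if score >= 0.5 else "healthy"

def classify_per_zone(zone_scores: dict[str, float]) -> dict[str, str]:
    """Classify oral health data separately per position or location,
    e.g. 'upper right molars' vs. 'lower left molars'."""
    return {zone: classify_severity(score) for zone, score in zone_scores.items()}
```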
  • the processor may be structured and/or arranged to process the sensor data in a “live” manner, e.g., during the scanning procedure so that “live” or, commonly speaking, real-time information about the progress or status of the scanning procedure and/or the progress or status of the oral health data acquisition can be visualized as feedback on a display as was already mentioned.
  • the live display may as well comprise an abstract or more realistic depiction of at least a portion of the oral cavity and of overlayed feedback relating to at least the status of the scanning procedure.
  • the various positions or locations of the oral cavity to be scanned may be individually highlighted in a graded or staged manner so that the user can easily identify where the oral scanner still needs to be moved to or positioned to complete the scanning procedure.
  • the depicted at least part of the oral cavity may be shown in a start color, e.g., dark blue, and the individual portions relating to different positions or locations of the depicted at least part of the oral cavity may gradually be depicted in a brighter color until they are essentially white to indicate to the user a partially complete or finally complete scanning procedure with respect to the indicated position or location of the oral cavity.
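The graded highlighting described above can be sketched as a simple linear color interpolation; the start and end colors and the RGB representation are illustrative assumptions, not values prescribed by the disclosure.

```python
def progress_color(completion: float) -> tuple[int, int, int]:
    """Interpolate from dark blue (scan not started) to white (scan
    complete) for one position or location of the depicted oral cavity.

    completion -- fraction in [0.0, 1.0] of the scanning progress at
                  this position; values outside the range are clamped.
    """
    start = (0, 0, 96)       # assumed dark-blue start color
    end = (255, 255, 255)    # white indicates a complete scan
    f = max(0.0, min(1.0, completion))
    return tuple(round(s + f * (e - s)) for s, e in zip(start, end))
```

A display controller could call this per location on every progress update and repaint the corresponding region of the dentition depiction.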
  • the feedback relating to scanning procedure progress may be solely derived from position sensor data, e.g., from the accumulated time the oral scanner has performed a scanning procedure at the individual positions or locations.
• the processor is structured and/or arranged to determine the scanning procedure progress in a more elaborate manner, e.g., by checking whether images taken from the respective position or location of the oral cavity by a camera, preferably being part of the oral health sensor, comprise a sufficiently complete coverage of the position or location of the oral cavity and/or whether such images have a certain image quality, e.g., are not blurred or unfocused or the like.
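One common way to check whether a frame is blurred or unfocused, as mentioned above, is a variance-of-Laplacian focus measure: sharp images retain high-frequency content, so the Laplacian response has a high variance. The sketch below assumes a grayscale image as a NumPy array and an illustrative threshold that a real oral scanner would need to calibrate.

```python
import numpy as np

def is_sharp(image: np.ndarray, threshold: float = 100.0) -> bool:
    """Crude focus check via the variance of a discrete Laplacian.

    Blurred frames have little high-frequency content, so their
    Laplacian response is flat and its variance is low.  The default
    threshold of 100.0 is an assumed value, not from the disclosure.
    """
    img = image.astype(float)
    # 4-neighbour discrete Laplacian evaluated on the image interior
    lap = (img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:]
           - 4.0 * img[1:-1, 1:-1])
    return float(lap.var()) >= threshold
```

Frames failing this check could simply be excluded from the coverage bookkeeping so that the user is prompted to rescan the affected position.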
  • the feedback relating to the scanning procedure progress may further comprise the overlay of position-resolved or location-resolved oral health (sensor) data onto the abstract or more realistic depiction of the at least portion of the oral cavity.
• overlaying visualized feedback for display means generating a single image that is then shown on the display by a display controller.
  • Overlaying means here that a base image, e.g., a depiction of a dentition, is amended to reflect the additional feedback that shall be provided.
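Such an amendment of the base image can be sketched as an alpha blend of a feedback color into the regions concerned; the mask, the color, and the alpha value below are illustrative assumptions.

```python
import numpy as np

def overlay_feedback(base: np.ndarray, mask: np.ndarray,
                     color: tuple[int, int, int],
                     alpha: float = 0.5) -> np.ndarray:
    """Blend a feedback color into a base image (e.g. a depiction of a
    dentition) wherever the boolean mask marks the affected region,
    yielding the single image handed to the display controller.

    base  -- H x W x 3 uint8 image
    mask  -- H x W boolean array selecting the region to amend
    color -- RGB feedback color; alpha sets the blend strength
    """
    out = base.astype(float).copy()
    out[mask] = (1.0 - alpha) * out[mask] + alpha * np.array(color, dtype=float)
    return out.astype(np.uint8)
```

The mask could come from a position-resolved classification result, so that, e.g., locations in a severe condition class are tinted red on the dentition depiction.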
• the various components of the oral scanner system, e.g., the oral scanner, the processor, a separate display, a charger, and/or an oral care device, may be arranged for data exchange or, more generally, for communication between at least two of these components in at least a unidirectional manner, preferably in a bidirectional manner. While such a data exchange or communication may be realized by a wired connection, e.g., when the processor is housed inside of the oral scanner, it is preferably realized by a wireless communication if the data exchange should occur between separate components.
  • one of the components of the oral scanner system comprises a communicator such as a transmitter or a transceiver and the other component, e.g., the processor realized in or by a separate device, comprises a communicator such as a receiver or a transceiver that may employ a proprietary or a standardized wireless communication protocol such as a Bluetooth protocol, a WiFi IEEE 802.11 protocol, a Zigbee protocol etc.
  • Each of the components of the oral scanner system may be arranged for communication with one or several other components of the oral scanner system and/or may be arranged for wireless communication with an Internet router or another device such as a mobile phone or tablet or a computer to establish a connection with the Internet, e.g., to send data to a cloud server that may be part of the oral scanner system and/or to receive data from a cloud server or any Internet service such as a weather channel or a news channel. That means that the oral scanner system may be arranged to communicate with the Internet directly or indirectly by a detour via a device not being a part of the oral scanner system.
  • the (position-resolved or location-resolved) oral health (sensor) data and/or the (position-resolved or location-resolved) condition classes are communicated from the oral scanner and/or the processor to an oral care device such as an electric toothbrush, a gum massager, an oral irrigator, a flossing device, a scaler, a tooth polishing device, a tooth whitening device, or the like.
  • the processor may communicate control data to the oral care device so that the oral care device is enabled to select one of at least two operational settings based on the control data, preferably in a position-resolved or location-resolved manner.
• the latter requires that a position or location at which the oral care device is currently performing an oral care activity procedure is also determined or tracked or monitored.
  • An oral care device position sensor may be used for this task and reference is made to the description of position or location determination of the oral scanner as the principles are the same.
  • the oral scanner may comprise an attachment that preferably is arranged to be replaceable so that different attachments can be used for different users or for different applications.
• an oral scanner comprising an oral health sensor that comprises a camera as sensor receiver and at least a first light source as sensor emitter (see also description further below).
  • a light inlet for the camera and a light outlet of the at least first light source may be provided at a head of the oral scanner.
  • the attachment may then be realized as a detachable distance attachment.
  • the distance attachment may be arranged to enable a scanning procedure with an essentially constant distance between the object or the objects that are being scanned, e.g., teeth, and the light inlet of the camera.
• a distance piece of the distance attachment may stay in contact with the object being scanned, specifically with an outer surface of the object, to maintain the constant distance.
  • the camera may have a focal length that creates sharp images of objects that have the distance to the light inlet of the camera that is defined by the distance piece.
  • the distance piece may be realized as a closed wall element that surrounds the light outlet of the first light source and the light inlet of the camera so that the closed wall element effectively blocks ambient light from illuminating the currently scanned object and thus from eventually reaching the camera.
• the distance attachment can thus serve two purposes, namely to maintain a constant distance during the scanning procedure and to effectively block ambient light from reaching the object surfaces to be scanned. The latter is of particular benefit for embodiments where the light emitted by the first light source shall mainly be responsible for the oral health sensor data, i.e., the image data outputted by the camera.
  • the attachment may be detachable to allow replacing the attachment when it is worn out or to allow changing the attachment if different attachments are used by different users of the oral scanner.
  • the attachment may also be detachable to improve accessibility of parts of the oral scanner that benefit from regular cleaning such as a window covering the light outlet of the first light source and/or the light inlet of the camera.
• the detachable attachment itself may benefit from regular cleaning, which becomes simpler when the attachment is detachable.
  • the attachment may be immersed into a cleaning liquid to clean and to potentially sterilize it.
  • the oral scanner as proposed herein comprises at least one oral health sensor and may comprise two or more different oral health sensors.
  • the oral health sensor is understood to be a sensor that is arranged to acquire and output oral health sensor data relating to at least one property of the oral cavity that is relevant for determining a status of an oral health condition or that may be a direct measure of an oral health condition.
• the oral health condition may relate to the presence of at least one of the following: plaque, calculus (tartar), decalcification, white spot lesions, gum inflammation, tooth discoloration, stains, gingivitis, enamel erosion and/or abrasion, cracks, fluorosis, caries lesions, molar incisor hypo-mineralization (MIH), malodor, presence of germs such as pathogenic germs or fungi causing candidiasis, tooth misalignment, periodontal disease or periodontitis, peri-implantitis, cysts, abscesses, aphthae, and any other indicator that a skilled person would understand to relate to an oral health condition.
• the oral scanner may be arranged to acquire the oral health sensor data in a position-resolved or location-resolved manner where this is possible; e.g., malodor may be an oral health condition that affects the whole oral cavity and thus may not sensibly be acquired in a position-resolved or location-resolved manner.
  • the latter shall not exclude that malodor is nonetheless acquired in a position-resolved or location-resolved manner and that also feedback relating to this oral health sensor data may be provided in a position-resolved or location-resolved manner, e.g., where then the feedback for all positions or locations has the same malodor level or respective condition class.
• a classification of an input image may be done by a neural network such as a convolutional neural network (CNN), which preferably was trained with training images and related condition class results.
  • a classifier used by the processor may be directly fed with the oral health sensor data, e.g., image data, or the oral health sensor data may first be processed by the processor to determine, e.g., one or several features that herein are also called oral health data relating to at least one oral health condition.
  • the oral health sensor may include only a sensor receiver that acquires oral health sensor data by using an external medium such as ambient light or saliva or gaseous components present in the oral cavity etc.
  • the oral health sensor may include at least one sensor emitter providing a primary medium and at least one sensor receiver that is arranged to detect at least the primary medium and/or a secondary medium created by interaction of the primary medium with the oral cavity, e.g., by interaction with oral cavity tissue. This shall not exclude that the sensor receiver is simultaneously also sensitive to an external medium as previously discussed.
• the at least one sensor emitter may be a narrow-band light source emitting light of a certain wavelength range as primary medium, and by interaction of this emitted light with certain material present in the oral cavity a secondary medium, namely fluorescence light of a longer wavelength, may be created.
  • the oral health sensor may then further include at least one sensor filter that filters out at least a portion of the primary medium and/or at least a portion of the secondary medium prior to the respective medium reaching the sensor receiver. It seems obvious that the sensor receiver may then as well be sensitive to ambient light that can pass the at least one sensor filter. The influence of the ambient light on the data acquisition can be reduced by specific measures such as a distance attachment discussed above.
• the oral health sensor may be an optical sensor such as a photodiode, an M times N array of light-sensitive elements, or a camera.
  • the oral scanner comprises an oral health sensor having at least a first light source and at least one camera, the oral scanner being structured and/or arranged for performing a scanning procedure, which typically is an optical scanning procedure, where optical scanning procedure here refers to a procedure in which a sequence of images is captured by the camera.
  • the first light source may comprise a light outlet and the camera may comprise a light inlet, the light outlet and the light inlet may be provided at a head of the oral scanner.
• a light-sensitive sensor element array of the camera, such as an M times N light-sensitive sensor element array, may be disposed at a distance to the light inlet, e.g., in the handle, and the light may then be guided from the light inlet to the light-sensitive sensor element array by means of optical elements such as one or more lenses, one or more mirrors and/or one or more prisms and/or one or more lightguides etc.
  • a user-operable input element may be provided at the oral scanner that upon operation by the user may initiate the optical scanning procedure.
  • the oral scanner may comprise two or more cameras that may be arranged to allow a three-dimensional scanning of the at least portion of the oral cavity.
  • the oral scanner may comprise a second light source and potentially further light sources.
  • the different light sources may use the same light outlet, or each light source may have its own light outlet.
  • the first light source may emit light of a first wavelength or having a first wavelength range and the second light source may emit light of a second wavelength different to the first wavelength or light having a second wavelength range that does not or only partly overlap with the first wavelength or first wavelength range of the first light source.
  • the first and the second light sources may be arranged to emit different light intensities. But this shall not exclude that a first and a second light source are provided to emit light of essentially the same wavelength or having the same wavelength range and also of essentially the same intensity.
  • the first light source may emit light having a wavelength of or comprising a dominant wavelength of about 405 nm and the second light source may emit “white” light, i.e., light essentially covering the complete visual wavelength range between 400 nm and 700 nm or comprising several dominant wavelengths so that a human may consider the color impression of the emitted light as essentially white.
  • the light sources are not limited to light sources that emit light in the visual range and any light source that emits in the infrared (IR) or ultraviolet (UV) wavelength range or that at least comprises wavelength ranges that extend into these areas is considered as well.
• the first and/or second light source may be realized by a light emitting diode (LED), but also other light sources are contemplated, e.g., laser diodes, conventional light bulbs, specifically incandescent light bulbs, halogen light sources, gas discharge lamps, arc lamps etc.
• the camera may comprise an array of light-sensitive sensor elements, where each light-sensitive sensor element may be arranged to output a signal indicative of a light intensity impinging onto a light-sensitive area of the light-sensitive sensor element. While each of the light-sensitive sensor elements may have an individual sensitivity range, i.e., individual wavelength sensitivity, the light-sensitive sensor element array may typically comprise light-sensitive sensor elements that all have about the same light sensitivity (ignoring differences in gain and the like as are typical and which are dealt with by calibration).
• the array of light-sensitive sensor elements may be realized as a regular M times N array even though this shall not exclude that the light-sensitive sensor elements are arranged in a different manner, e.g., in concentric circles or the like.
  • the array of light-sensitive sensor elements may be realized as a CCD chip or a CMOS chip as are typically used in digital cameras.
  • the number of light-sensitive sensor elements may be chosen in accordance with the needs and the processing power of the processor.
• a resolution of 640 times 480 may be one choice, but essentially all other resolutions are conceivable, e.g., the camera may be a 4K camera having a 3840 times 2160 resolution or the camera may have a lower resolution, e.g., a 320 times 240 resolution. It shall not be excluded that the camera comprises a line sensor as is typically used in a paper scanner.
• a light-sensitive sensor element encompasses RGB sensor elements, i.e., each RGB-type light-sensitive sensor element would then deliver three signals that relate to the R (red), G (green), and B (blue) colour channels.
  • the camera of the oral scanner may comprise further optical elements such as at least one sensor lens to focus the light onto the array of light-sensitive sensor elements, even though this shall not exclude that the camera is realized as a pinhole camera.
  • the camera may also comprise at least one sensor mirror that guides the light onto the array.
  • the camera may comprise at least one sensor filter to selectively absorb or transmit light of a certain wavelength or light in at least one wavelength range.
  • the at least one sensor filter may be fixed or may be moveable, i.e., the sensor filter may be arranged for being moved into and out of the light path of the camera.
  • Several sensor filters may be provided to allow selective filtering of the light that should reach the array of light-sensitive sensor elements.
  • the sensor filter may be a long-pass filter, a short-pass filter, a band-pass filter, or a monochromatic filter.
• the sensor filter may apply a wavelength-dependent filter characteristic so that a certain wavelength or wavelength range can pass but only at a reduced amplitude, while another wavelength or wavelength range may pass without attenuation and yet another wavelength or wavelength range may be completely blocked.
  • the sensor filter may be realized as a coloured filter or as a dichroic filter.
  • the first light source may be a narrow-band light source such as an LED.
• the narrow-band light source may emit light in the range between 390 nm and 410 nm (FWHM) such that a wavelength of about 405 nm is at least close to the dominant wavelength of the LED. Light of around 405 nm causes fluorescence light to be emitted by tooth enamel and by plaque, as was already mentioned.
• a sensor filter may then be used that transmits only light having a wavelength above about 430 nm; preferably the sensor filter may be a cut-off filter having a cut-off wavelength of 450 nm allowing light of greater wavelength to pass towards the light-sensitive sensor element array so that reflected light originating from the first light source is absorbed and only fluorescence light transmitted by the sensor filter is detected.
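As a sketch of how image data captured through such a cut-off filter might be turned into oral health data, the red/green fluorescence ratio per pixel can be thresholded to estimate a normalized plaque area. The threshold value is an illustrative assumption, and the premise that plaque fluoresces predominantly red (porphyrins) while sound enamel fluoresces green under ~405 nm excitation would need per-device calibration.

```python
import numpy as np

def plaque_fraction(rgb: np.ndarray, ratio_threshold: float = 1.0) -> float:
    """Estimate the normalized plaque area in a fluorescence image.

    Pixels whose red/green ratio exceeds the threshold are counted as
    plaque.  The threshold of 1.0 is an assumed value for illustration.

    rgb -- H x W x 3 array of an image taken through the cut-off
           filter, so that only fluorescence light contributes.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float) + 1e-6  # avoid division by zero
    plaque_mask = (r / g) > ratio_threshold
    return float(plaque_mask.mean())
```

The returned fraction is exactly the kind of single feature (normalized plaque area per position or location) that the classification discussion further below feeds into a condition-class decision.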
  • the camera may be realized by a camera module as is available, e.g., from Bison Electronics Inc., Taiwan.
• the camera module may comprise a light-sensitive sensor array having an M times N pixel count of 1976 times 1200 (i.e., a 2.4 megapixel chip) realized in CMOS technology, but not all pixels may necessarily be used for capturing images during a scan.
• the camera module may comprise a lens providing a 12.5 mm focal length so that sharp images can be captured of objects close to the camera. This shall not exclude that an autofocus camera is used.
  • a hyperspectral imaging camera may be used as well.
  • the examples relating to optical sensors, specifically cameras, are not to be understood as limiting.
• the at least one oral health sensor may also be realized as one from the group comprising, in a non-limiting manner, temperature sensors, pressure sensors, pH sensors, refractive index sensors, resistance sensors, impedance sensors, conductivity sensors, and bio sensors, i.e., sensors comprising a biological detection element, e.g., an immobilized biologically active system, coupled with a physical sensor (transducer) that converts the biological-chemical signal into an electrical or optical signal and typically includes an amplifier etc.
  • the oral health sensor acquires and outputs oral health sensor data that are transmitted in the form of analog or digital signals and the processor may be arranged to process the oral health sensor data to determine oral health data and/or condition class data, preferably condition class data relating to the oral health condition.
  • position sensor shall encompass all position sensor arrangements that can determine a position or location where the oral scanner head performs a scanning procedure at a given time instant in the oral cavity and may include such determination also with respect to at least one position or location that relates to the outside of the oral cavity. It shall be understood that the use of the term “position sensor” does not mean that the position sensor is itself able to directly determine a position or location inside or outside of the oral cavity, but that a position or location inside or outside of the oral cavity can be derived from the position sensor data, e.g., by a deterministic computation based on the input from the position sensor, by a decision tree, by a clustering or by a classification algorithm, to name just a few.
  • the processor may be structured and/or arranged to perform such position or location determination based on at least the position sensor data.
  • An oral health sensor such as a camera may also provide position sensor data, i.e., the oral health sensor may additionally be used as position sensor or a further camera may be provided as position sensor.
• the image data provided by a camera at a head of the oral scanner may allow the type of tooth that was imaged to be determined and thus the position or location in the mouth to be derived (see reference to EP 2 189 198 B1 below).
  • Document EP 3 141 151 A1 describes, inter alia, a location determination based on a fusion of image data from a camera acquiring images of the user while performing an oral care activity with an oral care device, which camera is separate from the oral care device, and of data from an accelerometer disposed in the oral care device to determine the orientation of the oral care device relative to Earth’s gravitational field.
  • a fused location determination result is computed.
  • the classification algorithms output values similar to probabilities for the plurality of locations within the oral cavity at which the oral care activity might be performed.
  • the position sensor in this example comprises a separate camera as a first position sensor and an accelerometer disposed in the oral care device (which might be the oral scanner in accordance with the present disclosure) as second position sensor.
• the term “position sensor” thus does not refer to a single sensor arrangement but encompasses embodiments using two or more different position sensors to provide position sensor data.
  • Document EP 3 528 172 A2 describes, inter alia, the determination of a position or location within the oral cavity at which an oral care activity is currently performed, which determination relies on the classification of position sensor data that is a temporal sequence of inertial sensor data created by, e.g., an accelerometer and/or a gyroscope located in an oral care device by means of a neural network, preferably a recurrent neural network. Based on the trained neural network, the classification of a current temporal sequence of position sensor data provides a set of values similar to probabilities with respect to the plurality of possible positions or locations within the oral cavity. The highest value typically indicates the location at which the activity is performed.
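The final step of such a classification, picking the location with the highest probability-like value, can be sketched as follows; the location names are illustrative, not the segmentation used by the cited document.

```python
def most_likely_location(scores: dict[str, float]) -> str:
    """Pick the location at which the activity is most likely performed.

    scores -- probability-like classifier outputs per candidate
              location, e.g. as produced by a (recurrent) neural
              network fed with a temporal sequence of inertial
              sensor data.  The keys here are illustrative names.
    """
    return max(scores, key=scores.get)
```

For example, scores of 0.07, 0.81, and 0.12 for three candidate locations would select the second location as the current scanning or brushing position.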
  • EP 3 528 172 A2 shall be incorporated herein by reference.
• the latter mentioned technologies can determine a position or location at which the oral care activity, e.g., toothbrushing, is performed with a relatively high precision (e.g., on the level of individual teeth), which precision may justify the use of the term “position”.
• the technologies described in the preceding paragraphs have, at least at the time of filing the present disclosure, not been developed to deliver results at such a high precision and may allow determination of one of 16 different segments of the dentition at which the oral care activity is performed.
• the term “location” may be better suited, as the determination typically relates to a group of teeth (e.g., the left upper molars) or to a group of surfaces of a group of teeth (e.g., the buccal surfaces of the right lower molars).
  • Document EP 2 189 198 B1 describes the determination of a position or location in the oral cavity by analyzing camera data from a camera located at a toothbrush head. It is described that the analysis of image data can identify the tooth that is shown on the image. One may contemplate to train a classifier with labelled images of the teeth and/or other portions of the oral cavity of the user so that the processor can reliably identify the position in the oral cavity at which the scanning procedure is currently performed.
• the processor may be any kind of general-purpose integrated circuit (IC), e.g., a CPU, or application-specific integrated circuit (ASIC) that may be realized by a microprocessor, a microcontroller, a system on chip (SOC), or an embedded system etc.
• the processor shall not be understood as necessarily being a single circuit or chip but it is contemplated to provide the processor in a distributed manner where one part of a processing task may be performed by a first processor sub-unit and one or several further processing task(s) may be performed by at least a second or several further processor sub-unit(s), where the different processor sub-units may be physically disposed at different locations, e.g., in or at the oral scanner, in or at a remote device and/or in or at a cloud computer etc.
  • the processor may be essentially completely realized by a cloud computing device. It shall also be included that the processor may comprise analog circuit elements and integrated circuit elements or only analog circuit elements.
  • the processor has at least one input and at least one output.
  • the processor receives sensor data and/or oral care activity data from an oral care device via the input and outputs oral health data and/or condition class data and/or control data, preferably position or location resolved oral health data and/or condition class data and/or control data via the output.
• Condition class data refers to data that classifies oral health (sensor) data into at least one of at least two condition classes, e.g., into a not severe class and a severe class, or into more than two classes, e.g., into a not severe class, a to be monitored class, and an oral care professional visit recommended class.
  • the latter examples are for exemplification and a skilled person may use any other number of classes and may name these classes appropriately.
  • the processor is structured and/or arranged to classify the oral health sensor data and/or oral health data into at least two condition classes.
  • the oral health (sensor) data (preferably for a given position or location) may be said to be an observation and the condition classes to be categories and a classifier algorithm may then be used to decide to which of the categories the observation belongs to.
  • the oral health (sensor) data may comprise one or several variables or features characterizing the oral health condition, e.g., the oral health data may comprise a normalized area of plaque per considered position or location.
  • the classifier may then simply label the input feature (size of plaque) into the categories by comparison with one or several threshold values.
  • the threshold value (s) themselves may be derived from expert opinions or from an analysis of oral health conditions of a plurality of subjects by means of a machine learning algorithm. Instead of using a feature or a vector of features derived from the oral health sensor data, the oral health sensor data may be used as input into a classifier without any prior processing, e.g., a neural network may be directly fed with the image data acquired by an oral health sensor comprising a camera.
  • Threshold values or other parameters affecting the classification may be set to different values for different positions or locations in the oral cavity.
• Such a position or location dependent threshold value or parameter affecting the classification for a given oral health condition may preferably be influenced by at least one from the non-limiting list including: the position or location in the oral cavity in a global sense (i.e., for all users) or for an individual, a history of evolvement of the oral health (sensor) data or the condition class relating to this position or location for the given oral health condition, or an overall or average oral health condition status for a given user.
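The threshold-based classification with location-dependent parameters might look like the following sketch. The class names echo the examples given earlier, while the numeric thresholds, the location keys, and the single plaque-area feature are invented for illustration.

```python
# Illustrative per-location thresholds for a single feature such as a
# normalized plaque area; these values are assumptions, not from the
# disclosure, and a real system would learn or adapt them per user.
THRESHOLDS = {
    "default": (0.10, 0.30),
    "lower-front-teeth": (0.15, 0.40),  # a location-specific override
}

def condition_class(feature: float, location: str = "default") -> str:
    """Classify one oral health feature into one of three condition
    classes by comparison with location-dependent threshold values."""
    low, high = THRESHOLDS.get(location, THRESHOLDS["default"])
    if feature < low:
        return "not severe"
    if feature < high:
        return "to be monitored"
    return "oral care professional visit recommended"
```

The same feature value can thus land in different classes depending on the location, which is exactly the effect of the position- or location-dependent parameters discussed above.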
  • a different classifier algorithm may be used in case the oral health data comprises a plurality of features.
  • a neural network may then be applied for the classification task, or any other classification algorithm known to a skilled person.
  • the classification algorithm may be chosen to be one from a non-limiting list comprising: linear classifiers, support vector machines, quadratic classifiers, kernel estimation, boosting, decision trees, neural networks, transformers, genetic programming and learning vector quantization.
• Condition classes may be determined for at least one of the at least two positions or locations, preferably for all the positions and locations that are used to segment the at least portion of the oral cavity that is scanned. At each such position or location at least two condition classes may be defined, preferably at least three condition classes may be used (similarly to a traffic light system that either shows a green, a yellow, or a red light).
  • the underlying threshold values or parameters used by a classifier algorithm may be adaptive and may thus change over time and may be different for different users.
  • the oral scanner system proposed herein is intended for a regular repetition of a scanning procedure such as an optical scanning procedure of at least a portion of the oral cavity of a user or of a treated subject to thereby create new oral health (sensor) data.
  • the oral scanner system may preferably be structured and/or arranged to compare newly determined oral health (sensor) data and/or condition class data with previously created oral health (sensor) data and/or condition class data and to update information about the temporal development of the oral health (sensor) data and the condition class data, which may then lead to updated information to be fed back to the user.
  • the comparison process may result in comparison data and/or in position-resolved or location-resolved comparison data.
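A location-resolved comparison of current against historic oral health data might be sketched as follows; the location keys and the tolerance value are illustrative assumptions.

```python
def comparison_data(current: dict[str, float],
                    historic: dict[str, float]) -> dict[str, str]:
    """Compare newly determined oral health data with historic data in a
    location-resolved manner and label the trend per location.

    Lower feature values are assumed to mean a better oral health
    status (e.g. a smaller normalized plaque area); the 0.02 tolerance
    suppresses jitter between scans and is an invented value.
    """
    tolerance = 0.02
    trend = {}
    for location, value in current.items():
        previous = historic.get(location)
        if previous is None:
            trend[location] = "new"
        elif value < previous - tolerance:
            trend[location] = "improved"
        elif value > previous + tolerance:
            trend[location] = "worsened"
        else:
            trend[location] = "unchanged"
    return trend
```

The resulting per-location labels are one possible form of the position-resolved or location-resolved comparison data that can then be visualized as trend feedback.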
• the processor may comprise a memory for storing and later accessing previously and currently acquired oral health sensor data and/or position sensor data and any data created by processing such data, e.g., oral health data and/or condition class data and/or position-resolved or location-resolved oral health sensor data and/or position-resolved or location-resolved condition class data, and the memory may further store comparison data and/or position-resolved or location-resolved comparison data.
  • Stored data relating to a previous scanning procedure is also referred to as historic data.
  • further data may be stored in the memory, e.g., historic scanning procedure progress data or historic oral care activity data relating to a previous oral care activity procedure performed by an oral care device, which oral care activity data may have been transmitted to the processor and be stored in the memory.
  • the historic data stored in the memory may also be used by the processor to adapt a next scanning procedure, e.g., to adapt at least one scanning procedure parameter and/or to a scanning procedure guidance, which comprises at least one feedback to be provided to the user prior to or during a next scanning procedure.
  • the oral scanner system may comprise a display as a feedback element of a feedback unit, preferably a display allowing visual depiction of oral health data and/or condition class data and/or oral scanning progress data.
  • the display may be any type of display such as an LCD, LED, OLED (PMOLED or AMOLED) etc. display.
  • the display may be a monochromatic or color display.
  • the display may have any suitable resolution such as a 96 times 48 resolution for a display implemented on the oral scanner or may comprise custom-made illuminable areas.
  • a display of a user device such as a mobile phone, tablet computer, laptop, smart watch etc. may be used; the respective technologies and resolutions of the displays of such user devices are to be considered.
  • an App or software running on such a device may provide the relevant programming for the general-purpose processor of the user device to function at least as one processor sub-unit or as the processor in accordance with the present disclosure.
  • the respective App or software may also implement any display control needed to visualize the information as discussed herein.
  • the use of a display shall not exclude that the oral health (sensor) data and the scanning procedure progress data etc. are additionally or alternatively fed back to the user by means of other feedback elements of a feedback unit such as a plurality of individual visual feedback elements and/or an audio feedback element and/or a haptic feedback unit as was already described.
  • the scanning procedure progress data can be fed back by using four visual feedback elements that start at a first color, e.g., dark green, and that are controlled to gradually show a brighter green until the scanning procedure is considered as complete for a given position or location and the light indicator might then, e.g., show a white signal.
  • An RGB LED per light feedback element could be used for this.
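The gradual color change described above can be sketched as a simple linear interpolation per RGB feedback element; the concrete RGB values for dark green and white are illustrative assumptions:

```python
# Sketch: map per-position scanning progress (0.0..1.0) to an RGB triple,
# starting at dark green and brightening towards white at completion.
# The concrete start/end RGB values are illustrative assumptions.

def progress_to_rgb(progress: float) -> tuple:
    """Map scanning progress (0.0..1.0) to an RGB triple for one feedback LED."""
    start = (0, 64, 0)       # dark green at the beginning of scanning
    end = (255, 255, 255)    # white once the position/location is complete
    p = max(0.0, min(1.0, progress))  # clamp to the valid range
    return tuple(round(s + p * (e - s)) for s, e in zip(start, end))
```

The same interpolation could run from white towards red to communicate a plaque amount, as described for the live communication of oral health data.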
  • the live communication of the oral health data could use four visual feedback elements as well and start at white to indicate no plaque and be gradually changed on a scale towards red to communicate the amount of plaque detected at the respective position or location.
  • the oral health data may be fed back to the user only at the end of the scanning procedure to indicate the levels of plaque identified in the scanning procedure.
  • a classification of the oral health data relating to plaque may then be indicated with a flashing light for a condition class “severe” .
  • a skilled person will understand how to modify the number of visual feedback elements, colors used and other means of feedback such as flashing, intensity variations etc.
  • the display comprises a display controller that converts oral health (sensor) data, preferably position-resolved or location-resolved oral health (sensor) data and/or condition class data and/or scanning procedure progress data into a visualization that is shown on the display, where the visualization is called the feedback screen.
  • the feedback screen may comprise at least one element of a graphical user interface.
  • focus is put on a feedback screen that comprises a visualization of at least a portion of the oral cavity, which visualization may be a two-dimensional visualization or a 3D type of visualization, where the latter means a visualization on the two-dimensional display that provides a three-dimensional impression.
  • the visualization of the at least portion of the oral cavity may comprise a visualization of the dentition, i.e., a visualization of the teeth of the dentition, which may be an abstract visualization or a more realistic visualization.
  • the visualization may be based on a generic model of a dentition or may take into account individual data from a user such as missing teeth or the like.
  • An abstract visualization of the complete dentition may comprise a circle or an annulus, where the top of the circle or annulus visualized on the display may be understood to represent the upper front teeth and the bottom of the circle or annulus may represent the lower front teeth, while the sides then represent the left and right molars, respectively.
  • a plurality of segments of a circle or an annulus may be visualized, e.g., an upper about 180-degree segment and a lower about 180-degree segment may indicate the maxilla and the mandible, respectively.
  • four about 90-degree segments may be used to display quadrants of the dentition, which is known to the skilled person from, e.g., the visualization on the Oral-B SmartGuide.
  • six segments may be used. It may also be contemplated to visualize each tooth of a generic or individualized dentition by a single segment or to use any other kind of segmentation that would seem appropriate to a skilled person.
  • At least one of the segments may be separated into at least two areas that may represent inner and outer tooth surfaces, preferably three areas that represent inner and outer tooth surfaces (such as buccal and lingual surfaces) and the biting or occlusal surface, which may be particularly sensible for molars and wisdom teeth.
  • While the segments were described as portions of a circle or an annulus, it is also contemplated that segments may be visualized in a different manner.
  • each tooth may be represented by a circle or a segment representing a plurality of teeth may be visualized as a plurality of overlapping circles, where the number of circles may coincide with the number of teeth that typically are represented by this segment, even though this is not to be understood as limiting.
  • the visualization of the dentition may comprise information as is used in accordance with ISO 3950:2016.
  • a more realistic depiction of the dentition may be chosen, e.g., up to 32 teeth for the permanent dentition of a grown-up user or up to 24 teeth for the primary dentition of a child.
  • the visualization may be individualized, e.g., a user may be able to input personal tooth characteristics such as missing teeth, misaligned teeth, fillings, inlays, crowns, artificial teeth, braces etc. that may be taken into account in the visualization.
  • the user may also be allowed to provide information about the oral health condition of at least one surface of a tooth, at least one tooth, a group of teeth or the complete dentition and/or about the gums.
  • a user may provide input about tooth discoloration or braces or cavities etc., where the oral scanner and/or the separate device may provide an interface for inputting information.
  • the oral scanner may be structured and/or arranged to perform a scanning procedure in which relevant information of the oral cavity is acquired to individualize a visualization at the at least portion of the oral cavity in an automated manner.
  • While the mentioned interface may be realized as a graphical user interface, this shall not exclude that the user can additionally or alternatively provide input by a voice recognition interface and/or a keyboard etc.
  • the interface may also allow the user to input personalization information, e.g., a name, an email address or the like and/or may allow dedicated access by a dentist to any stored data, where the latter may preferably be allowed by means of a remote access, e.g., from a computer at a dentist's office.
  • the visualization of at least a portion of the oral cavity further comprises the tongue, preferably various areas of the tongue, the inner cheeks, the lips, the uvula, the pharynx, the palate etc.
  • at least one of the previously mentioned portions and at least one portion of the dentition is visualized, such as the tongue and the complete dentition.
  • This abstract or more realistic visualization of the at least portion of the oral cavity provides a map onto which further data such as oral health data or scanning procedure progress data may be visualized in a manner that the user can relate the additional information to a location or position within the oral cavity.
  • the mentioned visualization may be used in manifold feedback applications.
  • the visualization may be used to provide feedback about the scanning procedure progress in real-time, i.e., in a live manner, which means that the position or location at which the oral scanner is currently performing a scanning procedure and the respective visualized segment or segments relating to this position or location may then be amended so that the scanning procedure progress can be understood by the user.
  • the visualized segment at which a scanning procedure is performed may be additionally visually highlighted, e.g., by a halo or similar visual measures to allow a user to immediately identify where the oral scanner performs scanning.
  • An example where the coloring of the respective segments starts at a first color that is gradually changed to a second color was already discussed (white and black are here understood to be colors) .
  • start and end colors may be chosen to be different for different segments. It is not necessary to have a gradual change.
  • a stepwise change or a single step from the start color to the end color is also envisioned.
  • segments may comprise a start pattern and an end pattern to visualize the scanning progress.
  • the oral scanner system may comprise an oral care device such as an electric toothbrush, an electric flossing device, an electric irrigation device etc.
  • the oral care device may preferably be equipped with its own oral care device position sensor (e.g., an IMU sensor) so that its position or location in the oral cavity where an oral care activity procedure such as tooth brushing or flossing, or irrigation is performed can be determined.
  • the position or location where the oral care device performs an oral care activity procedure may at least in part be determined by using the same position detector that serves to determine the position or location of the oral scanner (e.g., by the same external camera) , i.e., the position sensor of the oral scanner may be a shared position sensor.
  • the oral care device may comprise a communicator such as a receiver or a transceiver for at least receiving control data from the processor, which control data is specifically used to select one from at least two different operational settings of the oral care device, preferably wherein the control data is used to select one from at least two different operational settings in a position or location dependent manner.
  • Such an operational setting may relate to a recommended time for performing an oral care activity procedure either in general or at a particular position or location or may relate to a recommended minimum and/or maximum pressure or force value to be applied by an oral care head either in general or at a particular position or location or may relate to feedback to be provided to the user either in general or when a particular position or location is treated or may relate to an operational mode to be used in general or at a particular position or location and where the oral care device may then be arranged to automatically switch into this mode due to the control data that was received.
  • An operational mode may preferably be a motion mode at which an oral care head of the oral care device is driven and may include at least one parameter from a list including velocity, frequency, or amplitude.
  • an oral scanner system comprising an oral scanner having an oral health sensor and a processor.
  • the oral scanner is structured and/or arranged to perform a scanning procedure of at least a portion of the oral cavity, such as a portion of the dentition or of the complete dentition and/or of more or other portions of the oral cavity and to acquire oral health sensor data relating to at least one oral health condition by means of the oral health sensor.
  • the processor is arranged to receive the oral health sensor data and to either process the oral health sensor data to determine oral health data or to directly utilize the oral health sensor data.
  • the processor is structured and/or arranged to, specifically after a completion of the scanning procedure or at least a completion of the scanning procedure for a certain position or location, classify the oral health sensor data and/or the oral health data with respect to at least two condition classes relating to the at least one oral health condition and to determine one condition class to which the oral health sensor data and/or the oral health data belongs. Reference is also made to the previous discussion of classification.
  • the oral scanner system may comprise a feedback unit to provide feedback about the determined condition class.
  • the oral health sensor data is image data and the oral health data determined by the processor may be the number of teeth having a discoloration and a color of the discolored teeth, which oral health data may be determined by image processing.
  • the condition class may then relate to the severity of the discoloration, which may depend on the difference between a color representing no discoloration and the color of a discolored tooth.
  • Where the oral health sensor data are image data, the image data may be directly inputted into a classifier, i.e., a classification algorithm, to determine a condition class from at least two condition classes relating to an oral health condition.
  • the oral health condition may again be tooth discoloration and the severity may be directly determined from the image data.
  • the classifier may have been trained with images that were labelled relating to the degree or severity of tooth discoloration, e.g., a label given by a cosmetics expert or the like.
  • the oral scanner system may further comprise a position sensor that is structured and/or arranged to acquire and output position sensor data relating to the position or location in the oral cavity at which the oral scanner performs the scanning procedure at the present time instant or performed the scanning procedure at a given time instant, where here time instant includes a time period needed to acquire the oral health sensor data and the respective position data.
  • the center time may be used as time instant.
  • the at least portion of the oral cavity may thus be divided into at least two positions or locations as was already discussed.
  • the processor is structured and/or arranged to determine the position or location at which the oral scanner is or was performing the scanning procedure and to determine position-resolved or location-resolved oral health sensor data and/or position-resolved or location-resolved oral health data for each of the at least two positions or locations, where the respective oral health sensor data and/or oral health data is assigned to the determined position or location in temporal alignment, i.e., in the assignment procedure it is ensured that the oral health sensor data and/or the therefrom derived oral health data were acquired at the same time instant or at a time instant as close together as possible at which also the position sensor data was acquired.
  • each oral health sensor datum or set of essentially simultaneously acquired oral health sensor data (e.g., image data from a camera) may thus be assigned to a position sensor datum or set of essentially simultaneously acquired position sensor data (e.g., the three values from a three-axis accelerometer or the image data from a camera).
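The temporal alignment described above can be sketched as a nearest-timestamp assignment; the data layout (timestamped tuples) is an assumption made for illustration:

```python
# Sketch of temporal alignment: assign each oral health sensor datum to the
# position sensor datum whose timestamp is closest. The (timestamp, value)
# tuple layout is an illustrative assumption.

def align(health_data, position_data):
    """health_data: list of (timestamp, datum); position_data: list of
    (timestamp, location). Returns a list of (location, datum) pairs."""
    aligned = []
    for t, datum in health_data:
        # pick the position sample acquired closest in time to this datum
        _, location = min(position_data, key=lambda p: abs(p[0] - t))
        aligned.append((location, datum))
    return aligned
```

In practice the timestamps could be the center times of the respective acquisition periods, as mentioned above.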
  • the classification may then be done with respect to position-resolved or location-resolved oral health sensor data and/or position-resolved or location-resolved oral health data and feedback on the classification result, i.e., the determined condition class or classes, may then be fed back for at least one position or location, preferably for at least two positions or locations and further preferably for all positions or locations.
  • Where reference is made to a completion of a scanning procedure, this shall refer to a completion decided by the oral scanner system or by the user, i.e., when the user has stopped scanning. It is then contemplated that a position or location that was not sufficiently scanned may be respectively indicated as having a limited reliability or a low confidence level. This may be part of the feedback provided by the feedback unit. While it shall not be excluded that a classification may also be performed prior to a completion of the scanning procedure, the classification result may be considered to have the highest reliability or confidence level after completion of the scanning procedure.
  • Classification of the oral health sensor data and/or oral health data may be based on a simple comparison with a threshold value.
  • E.g., in a case where the normalized area having plaque is assessed, one condition class may be assigned when this normalized plaque area is below 0.05 (where 1.0 would be the complete tooth surface) and another condition class if the normalized area of plaque is 0.05 or above.
  • the same threshold value may be applied for all positions or locations, i.e., 0.05 in the previous example.
  • the first threshold may be 0.05 in the plaque example and the second threshold 0.15, where then the classification would be done with respect to three condition classes.
  • different threshold values may be used at the different positions or locations.
  • a normalized plaque surface of the buccal or lingual surfaces of the molars may be classified by using 0.1 as threshold value as these tooth surfaces typically have a higher plaque level.
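The position-dependent thresholds above (a default of 0.05 and 0.1 for molar buccal and lingual surfaces) can be sketched as a simple lookup with a default; the location names are illustrative assumptions:

```python
# Sketch of position-dependent two-class plaque classification.
# The default threshold 0.05 and the molar override 0.1 follow the example
# values above; location names are illustrative assumptions.

DEFAULT_THRESHOLD = 0.05
LOCATION_THRESHOLDS = {
    "molar-buccal": 0.1,   # molar surfaces typically carry more plaque
    "molar-lingual": 0.1,
}

def classify(location: str, plaque_area: float) -> str:
    """Classify a normalized plaque area using the threshold for the location."""
    threshold = LOCATION_THRESHOLDS.get(location, DEFAULT_THRESHOLD)
    return "concern" if plaque_area >= threshold else "no-concern"
```

The same table-driven structure would accommodate per-location classifiers instead of mere thresholds.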
  • different classifiers may be used at the different positions or locations; specifically, classifiers that were trained for a particular position or location may then be used for this position or location.
  • the classification may be different for at least two of the different positions or locations in that a threshold value or a parameter used in the classification is different.
  • the threshold value or the threshold values may be adapted due to a temporal evolvement of the oral health (sensor) data and/or the respective condition class.
  • the respective threshold value may be reduced, e.g., to 0.04 or 0.025 etc., to make the risk associated with such a constant plaque level clearer to the user.
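The adaptive threshold behavior described above can be sketched as follows; the reduced threshold of 0.04 follows the example value, while the constancy tolerance and the window of three scans are illustrative assumptions:

```python
# Sketch: tighten a location's threshold when the plaque level has stayed
# essentially constant across recent scans, making a persistent risk clearer
# to the user. The tolerance and the three-scan window are assumptions; the
# reduced threshold 0.04 follows the example above.

def adapt_threshold(history, threshold, tolerance=0.01, reduced=0.04):
    """history: normalized plaque areas from previous scans, newest last.
    Returns the (possibly reduced) threshold for the next classification."""
    if len(history) < 3:
        return threshold          # not enough historic data to adapt
    recent = history[-3:]
    if max(recent) - min(recent) <= tolerance:
        return min(threshold, reduced)  # persistent level: stricter boundary
    return threshold
```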
  • the processor may be coupled or connected with a memory to store oral health (sensor) data and/or classification data and to access such historic data in a later scanning procedure to analyze the temporal evolvement.
  • a classifier for classifying the oral health sensor data and/or the oral health data may implement at least one classification algorithm from a list comprising linear classifiers, support vector machines, quadratic classifiers, kernel estimation, boosting, decision trees, neural networks, transformers, genetic programming and learning vector quantization.
  • Inertial measurement units (IMUs) may serve as position sensors, preferably those realized as a MEMS sensor.
  • an oral health sensor for acquiring oral health sensor data may simultaneously also serve as position sensor.
  • the image data outputted by the camera may, e.g., be classified by a classifier algorithm to determine whether the image taken at a given time instant belongs to a certain position or location.
  • data from an IMU sensor may be in parallel classified and the results may be fused to determine the position or location or IMU data and image data or feature (s) derived from IMU data and/or image data may be inputted into a classifier algorithm.
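The fusion of image-based and IMU-based location classification mentioned above can be sketched as a late fusion of per-location probabilities; equal weighting and the probability-dictionary interface are illustrative assumptions:

```python
# Sketch of late fusion: combine per-location probabilities from an image
# classifier and an IMU classifier by averaging, then pick the most likely
# location. Equal weights and the dict interface are assumptions.

def fuse_locations(image_probs: dict, imu_probs: dict) -> str:
    """Each argument maps location name -> probability from one classifier."""
    locations = set(image_probs) | set(imu_probs)
    fused = {loc: 0.5 * image_probs.get(loc, 0.0)
                  + 0.5 * imu_probs.get(loc, 0.0)
             for loc in locations}
    return max(fused, key=fused.get)  # most likely location after fusion
```

Alternatively, as noted above, raw IMU data and image features may be fed jointly into a single classifier instead of fusing two classifier outputs.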
  • the oral health sensor may comprise an optical sensor such as an M times N array of light sensitive sensor elements and may be realized as a camera for taking images. While in some instances the oral health sensor data may already provide a direct insight into the oral health condition (e.g., reference is made to the discussion of a malodor sensor above), it is contemplated that the processor may be structured and/or arranged to process the oral health sensor data to determine oral health data that are a direct measure of the oral condition. E.g., in a case where the oral health sensor is a camera, the oral health sensor data is image data and the processor needs to process the image data to determine the oral health data, which may relate to plaque visible in the image or caries lesions or missing teeth or discoloration etc.
  • the processor may further be arranged to classify the oral health sensor data or the oral health data with respect to at least two condition classes.
  • the oral health data and the classification results may in particular be determined for at least two of the at least two positions or locations.
  • the processor may be structured and/or arranged to compare the currently determined condition class or the position-resolved or location-resolved condition classes with at least one historic condition class or historic position-resolved or location-resolved condition classes, respectively, that were determined in the course of at least one previous scanning procedure and which condition classes are stored as historic condition class data in the memory.
  • the feedback unit may be structured and/or arranged to provide feedback about the condition classification data and preferably about the oral health (sensor) data, either during the scanning procedure and/or after a completion of the scanning procedure.
  • the feedback unit may comprise at least one feedback element for a visual, audible and/or haptic or tactile feedback relating to the oral health sensor data and/or the oral health data and/or the condition classification data, specifically in case such feedback is provided as position-resolved or location-resolved feedback.
  • the present focus is on the feedback provided about the classification of the oral health condition, preferably in a position-resolved or location-resolved manner, to continuously guide the user to achieve an optimum use of the oral scanner system.
  • the feedback unit may comprise at least two visual feedback elements for feedback of position-resolved or location-resolved feedback.
  • the feedback unit may in particular comprise a display, where it shall be understood that a display may be used to define a plurality of visual feedback elements and reference is made to the respective discussion in a previous paragraph.
  • the display may be used to show elements of a graphical user interface.
  • the feedback unit may be provided by a separate device such as a computer, a notebook, a laptop, a tablet, a smartphone, or a smart watch.
  • the processor may at least partially be provided by the separate device.
  • separate units or devices as discussed herein may communicate in a wireless manner.
  • each of the units or devices may comprise a communicator for establishing at least a unidirectional or a bi-directional or multi-directional wireless communication.
  • the feedback unit may provide an abstract or more realistic visualization of the at least portion of the oral cavity that shall be scanned, e.g., a depiction of the dentition, which is taken as an example.
  • the visualization of the dentition may be overlaid with a visualization of the oral health sensor data, the oral health data and/or the condition classification data.
  • the term “overlaid” should be understood to mean that a two-dimensional image may be displayed that is based on the depiction of the dentition and may comprise further information that is additionally depicted, e.g., colorations or patterns of at least portions of the depiction of the dentition.
  • An oral scanner system may, as has already been discussed, further comprise an oral care device such as an electric toothbrush, which oral care device may comprise a communicator.
  • the processor may be structured and/or arranged to determine, after a completion of the scanning procedure, control data based on the oral health sensor data acquired during the scanning procedure, and preferably to send the control data to the oral care device, and wherein the oral care device may be structured and/or arranged to select at least one from at least two operational settings of the oral care device in dependence on the control data.
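The derivation of control data and the selection of an operational setting can be sketched as follows; the condition class names and operational mode names are illustrative assumptions, not settings prescribed by this disclosure:

```python
# Sketch: derive position-dependent control data from condition classes and
# let the oral care device map it to one of at least two operational settings.
# Class names and mode names are illustrative assumptions.

CLASS_TO_MODE = {
    "red": "gentle-extended",   # e.g., longer recommended time, lower pressure
    "green": "standard",
}

def control_data_for(condition_classes: dict) -> dict:
    """condition_classes: location -> condition class determined after a scan.
    Returns control data: location -> operational setting for the device."""
    return {loc: CLASS_TO_MODE.get(cls, "standard")
            for loc, cls in condition_classes.items()}
```

The resulting dictionary stands in for the control data that would be transmitted to the oral care device's communicator.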
  • Fig. 1 is a schematic depiction of an example oral scanner system 1 in accordance with the present disclosure.
  • the oral scanner system 1 comprises an example oral scanner 100 and a processor 200, where the processor 200 is in this example disposed at or inside of the oral scanner 100.
  • the oral scanner 100 comprises a handle portion 101 and a head portion 102.
  • An oral health sensor 110 is disposed in or at the oral scanner 100.
  • two or more different oral health sensors may be used.
  • at least a measurement inlet such as a light inlet cooperating with the oral health sensor 110 is provided at the head portion 102 so that an acquisition of oral health data based on light measurements by the oral health sensor 110 is enabled at the head portion 102.
  • the head portion 102 here comprises a flat transparent window 1021 surrounded by a frame structure 1022 that may be arranged and/or structured to receive a preferably detachable attachment (see Fig. 2) .
  • the head portion 102 is dimensioned such that it can be conveniently introduced into the oral cavity of a human or an animal.
  • the handle portion 101 is dimensioned such that it can be conveniently grasped by a hand of a human user.
  • the handle portion 101 and the head portion 102 may be separable from each other.
  • the handle portion 101 can be equipped with different replaceable head portions in addition to an oral scanner head portion such as a brush head portion or the like.
  • the handle portion 101 may comprise at least one user operable input element 103 such as an on/off button and/or a selector button or switch.
  • the oral scanner 100 has an outer housing 104 that may preferably be hollow to accommodate various internal components such as a preferably rechargeable energy source and a related charging circuit for preferably wireless charging of the energy source, a circuit board comprising various electronic components for the control of the oral scanner etc.
  • the oral scanner 100 is structured and/or arranged for performing a scanning procedure of at least a portion of the oral cavity of a subject, i.e., while the user is holding and moving the oral scanner 100, the oral scanner acquires oral health sensor data and determines the progress of scanning and preferably analyses the acquired oral health sensor data with respect to the at least one oral health condition.
  • the processor 200 may be disposed on said circuit board.
  • the processor 200 is coupled or connected with the oral health sensor 110 for receiving signals from the oral health sensor 110 by which the oral health sensor data are provided at the processor 200.
  • the processor 200 may be structured and/or arranged for processing the oral health sensor data to derive or determine oral health data that relates to at least one oral health condition, such as plaque.
  • the oral health sensor may output oral health sensor data that are a direct measure of the relating oral health condition so that only a limited processing of the oral health sensor data may be required, e.g., some reduction to an integer number or a computation of a normalized value etc.
  • the processor 200 may also be structured and/or arranged to classify the oral health (sensor) data into at least two condition classes relating to the at least one oral health condition, e.g., into a “no oral health concern” class (or “green” class) and into an “oral health concern” class (or “red” class) , which may be done based on a comparison with at least one threshold value.
  • the oral scanner system 1 may additionally comprise at least one position sensor that is coupled or connected with the processor 200 so that the processor 200 receives in operation signals from the position sensor that deliver position sensor data from which the processor 200 can determine a position or location at which the oral scanner is currently performing a scanning procedure or has been performing a scanning procedure at a given time instant in the oral cavity.
  • Time data relating to an absolute time at which the data was acquired may be part of the position sensor data and also of the previously mentioned oral health sensor data.
  • the determination of a position or location allows that the processor 200 can compute oral health data relating to at least one oral health condition and/or classify the oral health sensor data and/or the oral health data into at least two oral health condition classes in a position or location resolved manner.
  • the oral health sensor data and the position sensor data acquired at essentially the same time instant may be delivered to the processor 200 together due to the design of the oral scanner system or the processor may be structured and/or arranged to assign oral health sensor data and position data having the same time information or best fitting time information to each other.
  • the term “position sensor” shall include embodiments where two distinct position sensors are used that together realize the “position sensor”, e.g., an IMU provided at the oral scanner and a separate camera.
  • the oral scanner system 1 may comprise a feedback unit 120 to provide user perceptible feedback.
  • the oral scanner 100 may comprise a visual feedback unit 121 being part of the feedback unit 120 for visually providing feedback.
  • the visual feedback unit 121 comprises four quarter-annulus light areas 1211, 1212, 1213, 1214 that are arranged to form an annulus that may be understood to represent the four quadrants of the dentition.
  • user perceptible feedback can be provided, e.g., live during a scanning procedure so that the user can understand the progress of the scanning procedure in a position or location resolved manner.
  • the four light areas 1211, 1212, 1213, 1214 may be used to indicate the severeness of an oral health condition during or at the end of the scanning procedure in a position or location resolved manner, e.g., by illuminating the respective light area in a particular color and/or by applying an intensity variation pattern.
  • the oral scanner 100 may comprise two or three or five or six or sixteen or thirty-two etc. light areas and/or the oral scanner system 1 may comprise a display to visualize user-perceptible feedback in an even more versatile manner. Reference is made to the previous paragraphs relating to visualization of feedback.
  • the oral scanner 100 may, additionally or alternatively, comprise one or several other feedback elements 122 being part of the feedback unit 120 such as a light ring at the bottom of the oral scanner 100 to communicate that the oral scanner 100 is switched on or that the energy storage requires charging etc., one or several haptic or tactile feedback elements and/or one or several audible feedback elements.
  • the processor 200 may be coupled or connected with a memory for storing oral health sensor data and/or oral health data and/or scanning progress data and/or condition classification data and/or oral care activity data, where this stored data may be stored in a position-resolved or location-resolved manner and specifically where current and historic stored data may be present, where here “historic” relates to previous scanning procedures or oral care activities.
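A memory holding current and historic data in a location-resolved manner, as described above, might be organized as follows; the class and method names and the per-location list structure are illustrative assumptions, not details from the disclosure.

```python
from collections import defaultdict

class ScanMemory:
    """Minimal sketch of a location-resolved memory where, per location,
    the newest entry is the "current" data and older entries are the
    "historic" data from previous scanning procedures."""

    def __init__(self):
        # location -> list of (scan_id, oral_health_value), oldest first
        self._store = defaultdict(list)

    def record(self, location, scan_id, value):
        self._store[location].append((scan_id, value))

    def current(self, location):
        """Most recent value for the location, or None if never scanned."""
        entries = self._store[location]
        return entries[-1][1] if entries else None

    def historic(self, location):
        """Values from previous scanning procedures (excluding current)."""
        return [v for _, v in self._store[location][:-1]]
```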
  • Oral care activity data relates to an oral care activity procedure performed with an oral care device and is data that was sent to the processor.
  • Fig. 2 is a schematic depiction of another example oral scanner system 1A in accordance with the present disclosure.
  • the oral scanner system 1A here comprises an example oral scanner 100A and an example separate device 300A that comprises a processor 200A and a display 310A as part of a feedback unit for visualizing user-perceptible feedback (reference is again made to previous paragraphs providing details on the visualization and to the disclosure further below with reference to Figs. 5 to 7) .
  • the oral scanner 100A may comprise a communicator 140A and the separate device 300A may comprise a communicator 340A so that the oral scanner 100A and the separate device 300A can communicate, i.e., can exchange signals delivering data, in a wireless manner, e.g., via a Bluetooth protocol or an IEEE 802.11 protocol etc.
  • the wireless communication possibility is here and in the following figures indicated by an icon comprising a small circle and three concentric circular segments. This shall not exclude a permanent or temporary additional or alternate wired connection for the exchange of signals or a communication via a further device, e.g., a charger or a router or a cloud computing device etc.
  • the separate device 300A is here schematically indicated to be a mobile phone, even though this shall not be understood as limiting. Reference is made to the possibilities to realize a separate device described in a previous paragraph.
  • an oral health sensor 110A is provided in or at the oral scanner 100A for acquisition of oral health sensor data at a head section 102A of the oral scanner 100A.
  • the oral health sensor 110A may comprise a sensor receiver 111A, e.g., an optical sensor such as a camera, and a sensor emitter 112A, such as a light emitter.
  • a preferably detachable attachment 105A, which may preferably be realized as a distance attachment, is here attached to the head portion 102A.
  • the head portion 102A may comprise an outlet that communicates with the sensor emitter 112A so that the emitted medium can exit the head portion 102A at an intended location and the sensor emitter 112A itself may be disposed elsewhere in the oral scanner 100A.
  • an inlet may be provided at the head portion 102A, which inlet may communicate with the sensor receiver 111A so that a medium to be measured can enter the head portion 102A at the intended location and the sensor receiver 111A may be disposed somewhere else in the oral scanner 100A.
  • the feedback unit may at least partly be provided at the oral scanner and/or at a separate device, the intention of the feedback discussed herein being to allow the user to respond to the feedback and to optimize the use of the oral scanner system.
  • the use of the oral scanner system hereby focuses on the one hand on the use of the oral scanner system during a single scanning procedure and on the other hand on a long-term usage of the oral scanner system over various instances of procedures to be performed with the components of the oral scanner system, e.g., comprising the oral scanner and optionally an oral care device.
  • Fig. 3 is a schematic depiction of an example oral scanner system 1B in accordance with the present disclosure that comprises an oral scanner 100B, a separate device 300B comprising a display 310B as part of a feedback unit and a processor 200B, and a position sensor 400B, 410B comprising a first and a second position sensor 400B and 410B. The oral scanner system 1B is structured and/or arranged to utilize position sensor data outputted by the position sensor 400B, 410B to determine a position or location in an oral cavity 500B at which the oral scanner 100B is currently performing a scanning procedure or has been performing a scanning procedure at a given time instant, where the time instant may be derivable from a time value outputted by the position sensor 400B, 410B together with the relating position sensor data, or a clock may be used for absolute time values.
  • the position sensor 400B, 410B in this example comprises two position sensors, one disposed in or at the oral scanner 100B and one being separate from the oral scanner 100B.
  • the oral cavity 500B shown in Fig. 3 comprises, without wanting to be complete, a dentition 510B, gums 520B, a tongue 530B, a uvula 540B, lips 550B, inner cheeks 560B, and a palate 570B.
  • a dentition 510B is further discussed even though all other positions or locations in the oral cavity 500B may be considered as well.
  • the dentition 510B is virtually separated into four quadrants 511B, 512B, 513B, 514B that are considered different locations in the oral cavity 500B at which the oral scanner 100B may perform a scanning procedure.
  • the first position sensor 400B is here disposed at or in the oral scanner 100B and may be realized as an accelerometer and/or a gyroscope and/or a magnetometer (generally speaking, as an IMU) .
  • Position sensor data and oral health sensor data may be wirelessly transmitted to and received by the processor 200B via communicators as has been already described and the processor 200B may be structured and/or arranged to determine a position or location at which the oral scanner 100B currently performs a scanning procedure based on the position sensor data or where the oral scanner 100B has been performing a scanning procedure at a given time instant based on the position sensor data that may include timer data.
  • the processor 200B may output one of the four dentition quadrants 511B, 512B, 513B, 514B as the scanning position or location.
  • the processor 200B may preferably be structured and/or arranged to also output that no scanning occurs in any of the positions or locations that were defined.
  • the processor may output that the scanning procedure occurs at none of the used positions or locations or the processor 200B may explicitly indicate that the oral scanner 100B is outside of the used positions or locations.
  • the processor 200B may be further structured and/or arranged to compute oral health data from the oral health sensor data in a position-resolved or location-resolved manner, i.e., by assigning the oral health sensor data and/or the therefrom derived oral health data to the determined position or location. Reference is made to the previous paragraphs disclosing details of the position or location determination and how oral health data is assigned to the position or location.
  • the processor 200B determines an orientation of the oral scanner 100B with respect to Earth’s gravity field and determines a position or location by sorting the orientation values into pre-determined position or location buckets as is known in the art.
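The sorting of orientation values into pre-determined position or location buckets mentioned above can be sketched as follows; the pitch/roll convention, the sign conventions, and the quadrant labels are illustrative assumptions, not details from the disclosure.

```python
def quadrant_from_orientation(pitch_deg, roll_deg):
    """Sort an orientation estimate (derived from IMU data relative to
    Earth's gravity field) into one of four dentition-quadrant buckets.

    Assumed convention: the scanner head pointing upward (pitch > 0)
    indicates the upper jaw; the sign of the roll angle indicates the
    left or right side of the dentition.
    """
    jaw = "upper" if pitch_deg > 0 else "lower"
    side = "left" if roll_deg < 0 else "right"
    buckets = {
        ("upper", "right"): "Q1",
        ("upper", "left"): "Q2",
        ("lower", "left"): "Q3",
        ("lower", "right"): "Q4",
    }
    return buckets[(jaw, side)]
```

A real implementation would typically smooth the IMU signal and fuse it with further position sensor data (e.g., the camera images mentioned below) before bucketing, since raw orientation alone cannot distinguish all oral cavity locations.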
  • the second position sensor 410B may be utilized which in this example is a separate camera that takes images from the outside of the oral cavity 500B, where images are understood to be position sensor data delivered by the camera 410B. Either based on the pictures alone and/or based on data fusion with the position sensor data from the first position sensor 400B a position or location in the oral cavity 500B may be determined by the processor 200B, where here the position or location relates to one of the indicated dentition quadrants 511B, 512B, 513B, 514B.
  • that an external camera is indicated shall not exclude that alternatively or additionally a camera is used as position sensor that is disposed at a head portion or at a handle portion of the oral scanner 100B so that images from inside of the oral cavity 500B or images from the face of the user can be taken, respectively, to support the determination of the position or location.
  • a camera serving as oral health sensor may additionally be utilized as position sensor; see, e.g., the reference made to EP 2189198 B1 in a previous paragraph.
  • a scanning procedure performed with an oral health sensor comprising an optical sensor such as a camera is called an optical scanning procedure.
  • Fig. 4 is a schematic depiction of an example oral scanner system 1C in accordance with the present disclosure specifically comprising an oral care device 700C, even though several aspects of the oral scanner system 1C are independent from the presence of the oral care device 700C.
  • the oral scanner system 1C may comprise or interact with an oral scanner 100C, a separate device 300C comprising a display 310C, the mentioned oral care device 700C that here is exemplified as an electric toothbrush, a charger 710C, a base station 720C comprising a display 721C and a charger 722C, a router 730C, a computer 740C and a cloud server or cloud computing device 750C.
  • the various components of the oral scanner system 1C may preferably all be structured and/or arranged for wireless communication as is indicated with the previously mentioned icon. It shall be understood that the components of the oral scanner system 1C shown here are an optional assembly. E.g., the oral scanner system 1C may comprise only one charger or no charger at all or may indeed comprise two chargers, one for the oral scanner 100C and one for the oral care device 700C, and potentially a further charger for the separate device 300C.
  • a processor of the oral scanner system 1C may be realized as a distributed processor and a first processor sub-unit may be disposed in the oral scanner 100C and a second processor sub-unit may be provided by the cloud computing device 750C or a first processor sub-unit may be provided by the separate device 300C and a second processor sub-unit may be provided by the computer 740C.
  • the oral care device 700C may be incorporated into the oral scanner system 1C, and at least one operational setting of the oral care device 700C may be selected based on control data determined by the processor, and/or the oral care device 700C may be structured and/or arranged to send oral care activity data relating to at least one oral care activity performed with the oral care device 700C to the processor, where it may be used to adapt a next scanning procedure.
  • Data from one component may be sent directly to another component, e.g., from the oral care device 700C to the oral scanner 100C, or may be sent indirectly, e.g., from the oral care device 700C to the cloud server 750C, where it may be stored in a memory, and then, e.g., on demand, from the cloud server 750C to the processor which may be located in or at the separate device 300C and/or in or at the oral scanner 100C.
  • the mentioned memory may be a memory located in any of the mentioned components or may be a distributed memory.
  • Fig. 5 is a depiction of an example feedback screen 600D as may be visualized on a display of an oral scanner system.
  • the term feedback screen here refers to a visualization of feedback to a user by means of a display using a particular feedback concept within a continuous guidance provided to the user by the oral scanner system.
  • Feedback screens are preferably used to assist the user in performing the task of using the oral scanner system by means of a continued or guided human-machine interaction process, which shall not exclude that a feedback screen in addition also visualizes information such as the current time etc.
  • the feedback screen 600D comprises a first portion 610D and a second portion 620D.
  • in the first portion 610D, a live image or conserved image 611D from a camera on a head portion of an oral scanner of the oral scanner system is shown.
  • the camera may be comprised by an oral health sensor.
  • the live image may comprise unprocessed or processed image data relating to an oral health condition, e.g., to plaque image data visible as red fluorescence light.
  • a processor may be structured and/or arranged to analyze the image data and may determine a borderline within the image or an image portion in which the relevant oral health sensor data is located, and a respective indication 612D may be overlaid onto the live image 611D and be visualized as part of the live or conserved image.
  • the indication 612D shown in Fig. 5 is overlaid onto the visualized image data 611D and shall provide a visible reference to the area of the tooth, visible in the image, that is covered by plaque.
  • the second portion 620D of the feedback screen 600D comprises an abstract visualization of a human dentition 621D.
  • the abstract visualization of the human dentition 621D comprises six segments 622D, 623D, 624D, 625D, 626D and 627D generally arranged with a distance between two neighboring segments in an oval-like arrangement.
  • Each of the segments 622D, 623D, 624D, 625D, 626D and 627D comprises a plurality of overlapping circles or bubbles, which is understood to be a non-limiting example.
  • the top three segments 622D, 623D, 624D shall indicate the teeth of the maxilla and the lower three segments 625D, 626D, 627D shall indicate the teeth of the mandible.
  • the top segment 623D and the bottom segment 626D shall represent locations in the dentition relating to the upper and the lower front teeth, respectively.
  • the left-hand segments 622D and 627D shall represent locations in the dentition relating to the upper and lower left molars, respectively.
  • the right-hand segments 624D and 625D shall represent locations in the dentition relating to the upper and lower right molars, respectively.
  • a segment may be visually separated into two or three or even more subdivisions that then may relate to different positions or locations of the dentition. These subdivisions may be used to visually distinguish, e.g., different teeth or groups of teeth relating to the segment or different tooth surfaces or groups of tooth surfaces relating to the segment.
  • Segment 622D (and also segment 625D) is separated into three areas 6221D, 6222D, 6223D, where the side areas 6221D and 6223D shall represent the buccal and lingual surfaces of the teeth of the segment 622D, respectively, and the center area 6222D shall represent the occlusal or biting surface of the teeth.
  • a portion of a feedback screen may be used to provide live or summary feedback to the user.
  • Fig. 5 shows a feedback screen as may be seen by a user during a live scanning procedure.
  • the segments 622D, 623D, 624D, 625D, 626D and 627D may be used to indicate a position-resolved or location-resolved scanning procedure progress and/or a severity of an oral health condition, e.g., the total or normalized tooth area within a segment on which plaque or the like was determined.
  • the segments or the subdivisions of the segments shown on the feedback screen relate to positions or locations in the oral cavity.
  • the scanning procedure progress may be visualized by first showing all segments and all segment subdivisions if such are used in a base color (e.g., dark blue) or a start pattern or the like and to then gradually or step-wise change the color or pattern or the like towards a different color or pattern, e.g., towards lighter blue and finally white to indicate a scanning procedure progress for the respective segment, i.e., for the respective position or location.
  • a shading of different strength is used instead of color.
  • the severity of the detected oral health condition is determined based on the position or location resolved oral health (sensor) data and may be visualized by adding a pattern of different strength into the color.
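The gradual or step-wise change of a segment's fill color from a dark base color toward white with increasing scanning progress, as described above, can be sketched as a linear blend; the concrete RGB values and the clamping behavior are illustrative assumptions, not values from the disclosure.

```python
def progress_color(progress):
    """Blend a segment's fill color from a dark-blue base (progress 0.0)
    toward white (progress 1.0) to indicate scanning progress for the
    respective position or location."""
    base = (0, 0, 139)      # dark blue, shown before scanning starts
    done = (255, 255, 255)  # white, shown when the segment is complete
    p = min(max(progress, 0.0), 1.0)  # clamp progress to [0, 1]
    return tuple(round(b + (d - b) * p) for b, d in zip(base, done))
```

A step-wise variant could quantize `p` to a few levels before blending; a shading-based variant (as used in the grayscale figures) would interpolate a single gray value instead of three RGB channels.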
  • Fig. 6 is a depiction of an example feedback screen 600E as may be visualized on a display of an oral scanner system.
  • the feedback screen 600E comprises essentially the same abstract visualization of the dentition 621E as was explained with respect to Fig. 5 and reference is made to the respective description. Segments 622E, 623E, 624E, 625E, 626E, and 627E are shown.
  • Feedback screen 600E can be understood as a summary screen on which the severeness of the detected oral health condition, e.g., plaque, is indicated by different colors or patterns or the like in a position or location resolved manner (in Fig. 6 a shading of different strength is used) .
  • Additional patterns or structures may be applied to indicate additional feedback, e.g., the presence of another oral health condition such as tartar (i.e., old plaque) , where the strength of the pattern or the number of additional structures may be indicative of the severeness of the additional oral health condition.
  • in Fig. 6, additional dots are shown as an example of such additional structures.
  • feedback screen 600E comprises a visualization of a temporal change relating to the severeness of at least one oral health condition, e.g., plaque.
  • Such visualized feedback may indicate in an appropriate manner the severeness of the oral health condition as determined in the recent scanning procedure and a change indicator that provides feedback on the change of the severeness in comparison to at least one previous scanning procedure.
  • Fig. 6 shows an example of such a change indicator, namely a bar indicator with a temporal change arrow. The shown bar indicator comprises a bar indicating the oral health condition, where here the bottom relates to no issue and the top relates to a condition of concern, where a first number, here 75, indicates a normalized oral health condition rating and a second number, here 8, indicates the temporal change vs. the previous, i.e., historic, scanning procedure.
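The two numbers shown on the bar indicator, the normalized rating and its temporal change versus the previous scanning procedure, can be computed as sketched below; the function name and the simple difference are illustrative assumptions, not details from the disclosure.

```python
def summary_indicator(current_rating, previous_rating):
    """Return the pair shown on the bar indicator: the current
    normalized oral health condition rating and its change relative to
    the previous (historic) scanning procedure."""
    return current_rating, current_rating - previous_rating
```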
  • a reference guide 630E may be visualized that allows mapping the colors or signs or patterns etc. to the severity of an oral health condition, where the severity as indicated in the reference guide 630E may coincide with condition classes into which the oral health data was classified, where in the shown example three condition classes are used, namely "low", "medium" and "high".
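The classification of a normalized rating into the condition classes "low", "medium" and "high" used in the reference guide can be sketched with simple thresholds; the threshold values are illustrative assumptions, not values from the disclosure.

```python
def condition_class(rating, thresholds=(33, 66)):
    """Map a normalized oral health condition rating (0-100) onto one of
    three condition classes. Ratings at or below the first threshold are
    "low", at or below the second "medium", and above it "high"."""
    low_max, medium_max = thresholds
    if rating <= low_max:
        return "low"
    if rating <= medium_max:
        return "medium"
    return "high"
```

In practice the thresholds could differ per oral health condition (e.g., plaque vs. gum inflammation), which is why they are kept as a parameter here.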
  • Fig. 7 is a depiction of an example separate device 300F that is part of an oral scanner system and that comprises a display 310F on which an example feedback screen 600F is visualized.
  • an abstract visualization 621F of the dentition is utilized as in Figs. 5 and 6.
  • locations 640F relating to the gums are indicated where an oral health condition of a certain severeness was detected (i.e., where the analysis of the oral health sensor data led to an oral health condition above a threshold value) , e.g., where an inflammation of the gums was detected based on an analysis of, e.g., image data created by a camera as sensor receiver of an oral health sensor.
  • Feedback screen 600F provides one example of a visualization to provide feedback about various oral health conditions classified into different condition classes.
  • a reference guide 630F may be visualized that allows mapping the colors or signs or patterns etc. to the type of oral health condition and its classification.
  • Visualized indicia 640F, 641F may be overlaid onto the abstract visualization 621F of the dentition to provide feedback about further oral health conditions, e.g., cavities or the like.
  • the size of such an indicium 641F may relate to a severeness and thus to a condition class.


Abstract

The present disclosure relates to an oral scanner system having an oral scanner with at least one oral health sensor structured and/or arranged to output oral health sensor data relating to at least one oral health condition, the oral scanner being structured and/or arranged to perform a scanning procedure of at least a portion of an oral cavity of a subject using the oral health sensor to acquire the oral health sensor data; a processor structured and/or arranged to receive the oral health sensor data and to process the oral health sensor data to determine oral health data relating to the at least one oral health condition and, after completion of the scanning procedure, to classify the oral health data with respect to at least two condition classes relating to the at least one oral health condition and to determine a condition class from the at least two condition classes to which the oral health data belongs, or to classify, preferably after completion of the scanning procedure, the oral health sensor data with respect to at least two condition classes relating to the at least one oral health condition and to determine a condition class from the at least two condition classes to which the oral health sensor data belongs; and a feedback unit structured and/or arranged to provide feedback regarding the determined condition class.
PCT/CN2022/103580 2022-07-04 2022-07-04 Oral scanner system WO2024007098A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
PCT/CN2022/103580 WO2024007098A1 (fr) 2022-07-04 2022-07-04 Oral scanner system
PCT/CN2023/102649 WO2024007884A1 (fr) 2022-07-04 2023-06-27 Oral scanner system
PCT/CN2023/102651 WO2024007885A1 (fr) 2022-07-04 2023-06-27 Oral scanner system
PCT/CN2023/102655 WO2024007887A1 (fr) 2022-07-04 2023-06-27 Oral scanner system
PCT/CN2023/102656 WO2024007888A1 (fr) 2022-07-04 2023-06-27 Oral scanner system
PCT/CN2023/102654 WO2024007886A1 (fr) 2022-07-04 2023-06-27 Oral scanner system
PCT/CN2023/102657 WO2024007889A1 (fr) 2022-07-04 2023-06-27 Oral scanner system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/103580 WO2024007098A1 (fr) 2022-07-04 2022-07-04 Oral scanner system

Publications (1)

Publication Number Publication Date
WO2024007098A1 true WO2024007098A1 (fr) 2024-01-11

Family

ID=82655243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/103580 WO2024007098A1 (fr) 2022-07-04 2022-07-04 Système de scanner buccal

Country Status (1)

Country Link
WO (1) WO2024007098A1 (fr)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2189198A1 (fr) 2008-11-20 2010-05-26 Braun Gmbh Appareil de soin du corps à usage personnel
US20100170052A1 (en) 2008-11-20 2010-07-08 Marc Philip Ortins Personal Hygiene Devices, Systems and Methods
EP2189198B1 (fr) 2008-11-20 2017-06-21 Braun GmbH Appareil de soin du corps à usage personnel
EP3141151A1 (fr) 2015-09-08 2017-03-15 Braun GmbH Détermination d'une partie de corps en cours de traitement d'un utilisateur
US20200188068A1 (en) * 2016-07-27 2020-06-18 Align Technology, Inc. Methods and apparatuses for forming a three-dimensional volumetric model of a subject's teeth
US20190200746A1 (en) * 2017-12-28 2019-07-04 Colgate-Palmolive Company Oral Hygiene Systems and Methods
EP3528172A2 (fr) 2018-02-19 2019-08-21 Braun GmbH Système de classification de l'utilisation d'un dispositif de consommateur portable
US20210393026A1 (en) * 2020-06-22 2021-12-23 Colgate-Palmolive Company Oral Care System and Method for Promoting Oral Hygiene

Similar Documents

Publication Publication Date Title
EP3713446B1 (fr) Dispositif portatif de suivi dentaire
CN108965653B (zh) 一种口腔内窥器
JP2022130677A (ja) 統合されたカメラを有する歯科ミラーおよびそのアプリケーション
US10231810B2 (en) Dental irradiation device and system
US8520925B2 (en) Device for taking three-dimensional and temporal optical imprints in color
US20190340760A1 (en) Systems and methods for monitoring oral health
US20220240786A1 (en) System and Devices for Multispectral 3D Imaging and Diagnostics of Tissues, and Methods Thereof
KR20190105333A (ko) 영상분석을 이용한 구강 상태 원격 모니터링 시스템 및 그 방법
KR20150080258A (ko) 칫솔질 지도 기능을 구비한 구강 위생 장치 및 이를 포함한 칫솔질 지도 시스템
KR102573669B1 (ko) 조기 우식 검출을 위한 방법 및 장치
WO2024007098A1 (fr) Système de scanner buccal
WO2024007091A1 (fr) Système de scanner buccal
US20230354993A1 (en) Oral Care System and Method for Promoting Oral Hygiene
WO2024007100A1 (fr) Système de scanner buccal
WO2024007117A1 (fr) Système de scanner buccal
WO2024007888A1 (fr) Système de scanner buccal
WO2024007106A1 (fr) Système de scanner buccal
WO2024007113A1 (fr) Système de scanner buccal
KR20230147839A (ko) 개인 맞춤형 구강 관리 상품 추천 시스템 및 방법
JP7219769B2 (ja) 限局性口腔炎症の測定中の改良された運動ロバスト性のための方法及びシステム
CN110446455A (zh) 用以使用口腔护理装置测量局部炎症的方法和系统
US12121139B2 (en) Oral care system and method for promoting oral hygiene
KR102499368B1 (ko) 구강 내 치아의 선택적 모니터링을 통한 치아 관리 시스템
CN202875522U (zh) 一种光学彩色三维图像成像装置
US20240285379A1 (en) Gradual surface quality feedback during intraoral scanning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22744382

Country of ref document: EP

Kind code of ref document: A1