EP3549061A1 - Standardized oral health assessment and scoring using digital imaging - Google Patents

Standardized oral health assessment and scoring using digital imaging

Info

Publication number
EP3549061A1
Authority
EP
European Patent Office
Prior art keywords
color
frames
image
quadrant
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP17876074.0A
Other languages
English (en)
French (fr)
Other versions
EP3549061A4 (de)
Inventor
Cindy L. Munro
Paula Louise Cairns
Xusheng Chen
Gwendolyn J. Good
Kevin Edward Kip
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of South Florida
Original Assignee
University of South Florida
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/450,925 (external priority, published as US10405754B2)
Application filed by University of South Florida filed Critical University of South Florida
Publication of EP3549061A1
Publication of EP3549061A4
Current legal status: Withdrawn

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/24 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/17 Image acquisition using hand-held instruments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/255 Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30036 Dental; Teeth
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images

Definitions

  • This invention relates, generally, to oral health assessments. More specifically, it relates to use of digital imaging to standardize, assess, and score oral health in a subject.
  • The present invention may address one or more of the problems and deficiencies of the prior art discussed above. However, it is contemplated that the invention may prove useful in addressing other problems and deficiencies in a number of technical areas. Therefore, the claimed invention should not necessarily be construed as limited to addressing any of the particular problems or deficiencies discussed herein.
  • The current invention is a method of assessing oral health in a patient or subject.
  • The method includes capturing or recording frames of substantially all buccal, occlusal, and lingual surfaces of the subject's set of teeth using a suitable intraoral camera.
  • The frames are imported into an image processing software program that is implemented on a computing device.
  • The frames are processed by the software program to generate images of the teeth to be analyzed.
  • The color of at least a plurality of the frames/images is analyzed and classified to determine the presence of yellow color, wherein yellow color indicates the presence of plaque. Results of the color analysis are scored to objectively and quantitatively assess the subject's oral health.
  • A dental barrier can be positioned over a lens of the intraoral camera, and a camera tip of the intraoral camera is positioned over the dental barrier.
  • The frames captured by the intraoral camera may be either photographs taken by the camera or video recordings taken by the camera.
  • The subject's teeth can be divided into a plurality of sections, including an upper right section, a lower right section, an upper middle section, a lower middle section, an upper left section, and a lower left section.
  • Processing the frames can be performed by cropping each image such that the image includes the targeted tooth, and the resolution of the image can be lowered to about 50% or less.
  • The color of the image is analyzed and classified by classifying the color of each pixel of the image. Further, each pixel is classified using RGB color code combinations, wherein a three-dimensional point (x, y, z) defines the color of each pixel. Still further, the step of analyzing and classifying color is performed by dividing each color dimension of the RGB color code combinations into four (4) categories: (0, 64), (64, 128), (128, 192), and (192, 255). A middle point is selected in each category to be representative of the corresponding category. All categories are then scored to determine whether yellow color is present in each pixel.
  • Scoring the results of the color analysis can include calculating a percentage of yellow color in an image by dividing the number of yellow pixels by the total number of pixels in the image.
  • The frames may be video recordings.
  • The subject's teeth can be divided into a plurality of quadrants, including an upper right quadrant, a lower right quadrant, an upper left quadrant, and a lower left quadrant.
  • Video should be captured and recorded in at least one upper quadrant and at least one lower quadrant.
  • The following order can be used to take video in each quadrant: capturing and recording video of the buccal surfaces in the quadrant, followed by capturing and recording video of the occlusal surfaces in the quadrant, followed by capturing and recording video of the lingual surfaces in the quadrant, followed by repeating the foregoing steps in another quadrant.
  • Processing the frames includes extracting single still frame digital images of the tooth surfaces from the video recording.
  • Each image is cropped, and the resolution of the image can be lowered to about 50% or less.
  • The color of the image is analyzed and classified by classifying the color of each pixel of the image. Further, each pixel is classified using RGB color code combinations, wherein a three-dimensional point (x, y, z) defines the color of each pixel. Still further, the step of analyzing and classifying color is performed by dividing each color dimension of the RGB color code combinations into four (4) categories: (0, 64), (64, 128), (128, 192), and (192, 255). A middle point is selected in each category to be representative of the corresponding category.
  • Scoring the results of the color analysis can include calculating a percentage of yellow color in an image by dividing the number of yellow pixels by the total number of pixels in the image.
  • A plurality of frames can be randomly selected from all frames prior to processing the frames on the software program. This random selection can be performed with or without criteria for the random selection.
  • The current invention is a method of assessing oral health in a patient or subject, comprising any one or more, or even all, of the foregoing steps.
  • FIG. 1 is a flowchart depicting the steps of oral health assessment, according to an embodiment of the current invention.
  • FIG. 2 is a flowchart depicting the steps of capturing data, according to an embodiment of the current invention.
  • FIG. 3 is a flowchart depicting the steps of scoring data, according to an embodiment of the current invention.
  • FIG. 4A is an image of a tooth using the "intra-oral" setting of the intraoral camera.
  • FIG. 4B is a schematic of "perio mode" of the ACTEON SOPROCARE intraoral camera.
  • FIG. 5 depicts the R code for obtaining the final plaque percentage for the tooth.
  • FIG. 6 is a chart depicting color samples that can be classified as 'yellow' color (plaque) or non-yellow color (normal).
  • FIG. 7 depicts the R code for randomly selecting fifty (50) numbers/frames from a video recording.
  • The current invention uses digital imaging technology to objectively capture clinical data on oral health, and then provides a standardized scoring methodology for quantifying oral health.
  • The methodology is applicable to persons in clinical settings (including hospitalized patients) as well as the general population. Generally, it involves two (2) stages (see FIG. 1): the process of capturing clinical data by use of digital imaging technology, and the process of standardized scoring of oral health data.
  • The process of capturing clinical data on oral health is based on the use of a conventional intraoral camera (e.g., ACTEON SOPROCARE Diagnostic/Clinical Intraoral Cameras) that has the capacity to capture digital images of all tooth surfaces in white light within and outside of dental laboratory settings. Any suitable intraoral camera is contemplated herein. Data from the digital images are used to enhance the detection of plaque on tooth surfaces, which can be difficult to directly observe and score by a dental hygienist. Imaging software (e.g., ACTEON SOPRO Imaging Software) is used in conjunction with the camera to visualize, capture, and store each subject's digital image recording.
  • The process of digital imaging divides the subject's full set of teeth into four (4) quadrants. These quadrants include an upper right quadrant, lower right quadrant, upper left quadrant, and lower left quadrant.
  • Recording the buccal, occlusal, and lingual surfaces of each quadrant in video mode is significantly more effective and efficient than taking multiple still frame images at the bedside.
  • Each tooth's digital plaque data is collected in a standardized manner, and with the ability to select optimal frames for analysis.
  • Dental plaque burden is conventionally scored by visual examination using the University of Mississippi Oral Hygiene Index (UM-OHI). Using visual examination, for the ten (10) sections of each tooth, plaque is scored as present (value of 1) versus absent (value of 0). Thus, the maximum plaque score per tooth is 10. The mean plaque score for the subject is calculated by dividing the total score by the number of teeth (an illustrative calculation of this index is sketched at the end of this section).
  • Each tooth's data is extracted into a software program, such as but not limited to R, with a minimum of about 10,000-40,000 color-derived pixels per tooth.
  • The color classification of each pixel is determined by the software program using an algorithm that makes use of red/green/blue (RGB) color code combinations. These classifications are then calculated quantitatively within the software program, and a separate algorithm automatically generates a range of oral health scoring techniques. These include, but are not limited to: (i) magnitude (and ratio) of dental plaque per tooth and across all teeth; (ii) estimated age of dental plaque per tooth and across all teeth; and (iii) ratio of plaque burden to plaque age per tooth and across all teeth.
  • The process of oral health scoring is set up such that after appropriate selection of digital images has been achieved with use of the intraoral camera, and after these data have been imported into the computer coding language/program (e.g., R), only a few keystrokes are required to compile and execute the algorithmic code, thereby resulting in standardized and near real-time scoring of oral health.
  • A conventional intraoral camera, such as the ACTEON SOPROCARE used herein, illuminates dental tissue with a wavelength of light between 440 nm and 680 nm. Exposed tissue absorbs the energy and reflects it in fluorescent form.
  • The handheld intraoral camera can be connected to a computing device wirelessly or by way of a video cable. If a wired connection is used, the video cable is connected to both the intraoral camera and the computing device.
  • The dental camera is powered directly through the computer's USB port.
  • The camera operates on a continuous low-voltage 5 V supply (0.5 A).
  • Imaging software, such as the SOPRO V2.3 software used herein, is required to visualize, capture, and store video and digital images taken by the intraoral camera.
  • A procedure file was created for each subject in order to record and store the digital images.
  • The computing device was placed near the subject's head during the procedure in order to use the monitor as the display screen to visually guide the intraoral camera over each tooth surface.
  • The camera focus ring was set to intraoral mode for video capture and/or camera digital image capture.
  • The mode on the intraoral camera was then set to the appropriate setting.
  • The "intra-oral" (1-5 teeth) setting captures an image that is 5 mm to 30 mm from the camera. This setting was used for both video and camera digital image capture. See FIG. 4A.
  • The ACTEON SOPROCARE camera has a "perio mode", which is a fluorescent mode that is associated with chromatic amplification to highlight dental plaque using ultraviolet light. This "perio mode" revealed both old and new plaque in various stages. New plaque was interpreted as a white color, while older plaque was interpreted as yellow or orange colors depending on its mineralization. See FIG. 4B. Perio mode was used herein to capture video and camera digital images.
  • A disposable dental barrier can be placed over the camera lens, followed by the optional placement of a camera tip over the dental barrier.
  • The camera tip enables displacement of ambient lighting.
  • The intraoral camera can capture adequate digital images of dental plaque without using a camera tip.
  • Mouth props may be used to assist subjects with or without an endotracheal tube to keep the mouth open wide enough for movement of the intraoral camera during the procedure.
  • The mouth prop would be placed on the side of the mouth opposite the side being recorded.
  • The intraoral camera can then initiate video or camera digital image recordings.
  • The subject's full set of teeth were visualized in four quadrants. These quadrants included an upper right quadrant, lower right quadrant, upper left quadrant, and lower left quadrant.
  • The speed of video digital imaging decreases the amount of subject burden. Pausing 1 to 2 seconds over each tooth surface will enhance the quality of still frame images to be produced from the video at a later time.
  • These video recordings can be obtained in any suitable way.
  • The following is an exemplary step-by-step methodology for taking these recordings.
  • The technician or other member of the medical team (herein the "operator") can lift the subject's upper lip with a free hand to expose the full buccal surface of the central and lateral incisor areas.
  • The camera is held steady over the subject's first available upper quadrant front tooth for 1-2 seconds and over each buccal tooth surface thereafter, moving the camera over the central and lateral incisor area.
  • The camera does not need to stop or be held steady for 1-2 seconds over each tooth surface; rather, the camera can simply take a continuous video along the rows of teeth, and frames can be extracted from that video, as will become clearer as this specification continues.
  • The operator's free hand can be used to guide the camera distally over the cuspid and molar areas, until the buccal surface of the subject's last tooth in the back of the mouth and in the upper quadrant is recorded.
  • The camera lens is angled to capture the full biting surface of this same last back tooth, pausing 1-2 seconds over the biting surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the molar and cuspid areas, thus guiding the camera towards the lateral and central incisor areas until the biting surface of the subject's first front tooth in the upper quadrant is recorded.
  • The camera lens is then angled to record the lingual surface of this same first front tooth, pausing 1-2 seconds over the lingual surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the central and lateral incisor area, thus guiding the camera distally over the cuspid and molar areas until the lingual surface of the subject's last back tooth in this upper quadrant is recorded.
  • On the same side of the mouth, the camera is moved down to record the buccal surface of the subject's last back tooth in the lower quadrant, pausing 1-2 seconds over the buccal surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the molar and cuspid areas in the lower quadrant.
  • The operator can then use a free hand to move the lower lip down to expose and record the full buccal surface of each tooth in the lateral and central incisor area, until the subject's first front tooth in the lower quadrant is recorded.
  • The camera lens is angled to capture the full biting surface of this same front tooth, pausing 1-2 seconds over the biting surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the central and lateral incisor areas, thus guiding the camera distally over the cuspid and molar areas until the biting surface of the last tooth in the back of the subject's mouth in the lower quadrant is recorded.
  • The camera lens is then angled to begin recording the lingual surface of this same last back tooth, pausing 1-2 seconds over the lingual surface of each tooth in the molar and cuspid areas (or alternatively the camera does not need to be paused over the tooth surface), and guiding the camera over the lateral and central incisor areas until the subject's first front tooth in this lower quadrant is recorded.
  • If a mouth prop is being used, it is moved to the opposite side of the mouth to continue recording.
  • The foregoing sequence of video recording is then repeated on the opposite side of the subject's mouth. Once the recording of every tooth surface is complete, recording can be stopped. If needed, the mouth prop can be removed and placed inside a sealable biohazard bag for transport to the lab to be cleaned and sterilized.
  • Single frame images can be obtained in any suitable way.
  • The following is an exemplary step-by-step methodology for taking these digital images.
  • The subject's full set of teeth can be visualized in six sections. These sections include an upper right, lower right, upper middle, lower middle, upper left, and lower left section.
  • The operator's free hand can be used to lift the subject's upper lip to expose the full buccal surface of the central and lateral incisors.
  • Using camera mode along with the intra-oral setting, a digital image of up to 6 teeth is captured in each section on the buccal side and again on the lingual side, for a total of 12 digital images that represent all tooth surfaces in that particular quadrant.
  • The subject's file is saved and closed, to be prepared and analyzed at a later time.
  • Single still frame digital images of each tooth surface can be produced from the video recording with digital imaging software. Secure storage of these video files and single image files makes it possible to maintain archival data that can be subjected to additional analysis and reliability determinations.
  • The image of the tooth is cropped so that the image includes only the targeted tooth, without losing the integrity of the tooth.
  • Imaging software is then used to lower the resolution.
  • The image is resized to 50% or less (or the horizontal dimension can be reset to 100 pixels, with the vertical dimension adjusted automatically); an illustrative preprocessing sketch appears at the end of this section.
  • The R program is then run to obtain the plaque percentage for the tooth (see FIG. 5 for the R code to obtain the final plaque percentage for the tooth). More specifically, as it pertains to the step of the R program obtaining the plaque percentage, to digitize color, most software programs use three dimensions to record the color, namely RGB.
  • The software uses two digits for each color dimension, and each digit uses a hexadecimal system to count the numbers. Therefore, there are 256 possible values for each dimension of the color (a brief illustration of this encoding appears at the end of this section). Since an objective herein was to calculate the percentage of plaque on a tooth, and typical plaque presents as a yellow color, the calculated score was used to judge whether each pixel should be classified as yellow or not.
  • The determination of whether a pixel should be considered yellow can be achieved and implemented in any suitable manner.
  • The combinations of the three colors (RGB) that can create 'yellow' were determined quantitatively.
  • Each color dimension was divided into four categories: (0, 64), (64, 128), (128, 192), (192, 255).
  • Four categories were chosen because that was considered an acceptable compromise between accuracy and computational difficulty.
  • The end result was 64 color categories.
  • The middle point of each range (32, 96, 160, 224) was chosen to be the representative of that range; the color for that specific combination was used to represent the color for that category.
  • For yellow color, the value of the red dimension was found to be between about 0.75 times and about 2.5 times the value of the green dimension.
  • The values of both the green dimension and the red dimension were found to be at least about 1.2 times the value of the blue dimension.
  • The percentage of yellow could then be calculated by dividing the number of yellow pixels by the total number of pixels in the picture (an illustrative implementation of this color classification is sketched at the end of this section).
  • FIG. 6 depicts color samples that can be classified as 'yellow' color (plaque) or non-yellow color (normal).
  • Each recording is broken down into individual frames.
  • A random number generator is then used to randomly select a predetermined number of frames (e.g., 50) from one (1) to x, where x is the total number of images/frames within that recording.
  • The R code for this can be seen in FIG. 7 (an illustrative frame-selection sketch also appears at the end of this section).
  • Frames can be randomly selected with or without specified criteria for the selection of frames. These criteria (e.g., a predetermined number of frames from each quadrant, a certain quality of the frame to minimize noise, etc.) can be inputted manually or can be learned automatically by the software algorithm, for example via artificial intelligence. If any criteria are present, the imaging software can automatically select frames that are relevant and/or discard frames that are not relevant. In any case, upon selection of the frames from the video recording, the corresponding images from the individual frames are selected, and the yellow percentage of the selected images is then calculated, as previously discussed.
  • Any suitable methodology for randomly selecting frames/images from the video recording is contemplated herein. For illustration purposes, differing methodologies were tested herein for this random selection.
  • The 50 images can be randomly selected, but only the portion of the image related to the teeth for analysis is selected and cropped, as previously discussed.
  • Another methodology is selecting images from the middle 60% of the complete video recording, thus truncating the first 20% and the last 20% of the video recording.
  • Yet another methodology is simply randomly selecting 50 images from the frames of the video recording.
  • Each tooth can also be selected, cropped, and analyzed separately, with the average of the per-tooth scores subsequently taken.
  • Table 1. Dental plaque percentage comparison using different methods.
  • The present invention may be embodied on various computing platforms that perform actions responsive to software-based instructions, and most particularly on touchscreen portable devices.
  • The following provides an antecedent basis for the information technology that may be utilized to enable the invention.
  • The computer readable medium described in the claims below may be a computer readable signal medium or a computer readable storage medium.
  • A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • A computer readable storage medium may be any non-transitory, tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire-line, optical fiber cable, radio frequency, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C#, C++, Visual Basic or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • An "end-user" is an operator of the software, as opposed to a developer or author who modifies the underlying source code of the software.
  • Authentication means identifying the particular user, while authorization defines what procedures and functions that user is permitted to execute.
  • Oral health: This term is used herein to refer to the well-being of one's mouth, specifically based herein on the plaque that may be present in the person's mouth. More specifically, the manner in which plaque affects oral health can be the magnitude and ratio of dental plaque per tooth and across all teeth, the estimated age of dental plaque per tooth and across all teeth, and the ratio of plaque burden to plaque age per tooth and across all teeth.
  • Random selection: This term is used herein to refer to a relatively unpredictably chosen array of frames/images from a video recording.
  • The term "relatively" is used because it is contemplated herein that this random selection can be performed with or without a predetermined set of criteria for the selection. For example, if 1,000 frames are present in a video recording, a set of criteria may eliminate 200 of those frames, and then 50 frames can be "randomly selected" from the remaining 800 frames. Alternatively, the 50 frames can be "randomly selected" from the 1,000 frames with no criteria present. Both circumstances are contemplated herein.
  • Single still frame digital image: This term is used herein to refer to a visual representation of a tooth extracted at a specific time during a video recording.
  • Substantially all: This term is used herein to refer to a representative number of tooth surfaces that, when analyzed, can be used to characterize the amount of plaque across all teeth or on each tooth. This number can be all teeth in the subject's mouth, or it can be an amount less than all of the teeth. For example, some teeth may be inaccessible by an intraoral camera due to the anatomy of a particular subject's mouth, so only the accessible surfaces are recorded. These circumstances are still considered herein as "substantially all".
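  • The following is a minimal illustrative R sketch of the conventional UM-OHI mean plaque score referenced above, assuming the ten (10) sections of each tooth have already been scored by visual examination as 1 (plaque present) or 0 (plaque absent); the function name and the example matrix are hypothetical and not part of the original disclosure.
```r
# Illustrative sketch of the UM-OHI calculation described above.
# 'sections' is assumed to be a matrix with one row per tooth and ten columns,
# each entry 1 (plaque present) or 0 (plaque absent) for that tooth section.
um_ohi_mean_score <- function(sections) {
  stopifnot(is.matrix(sections), ncol(sections) == 10, all(sections %in% c(0, 1)))
  per_tooth <- rowSums(sections)    # plaque score of 0-10 for each tooth
  sum(per_tooth) / nrow(sections)   # mean plaque score = total score / number of teeth
}

# Hypothetical example: three teeth scored by visual examination
example_teeth <- rbind(c(1, 1, 0, 0, 1, 0, 0, 0, 1, 0),
                       c(0, 0, 0, 0, 0, 0, 0, 0, 0, 0),
                       c(1, 1, 1, 1, 0, 0, 1, 1, 0, 0))
um_ohi_mean_score(example_teeth)    # (4 + 0 + 6) / 3 = 3.33
```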
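  • As an illustrative sketch of the cropping and resolution-lowering steps described above, the following R code uses the 'magick' package; the package choice, file name, and crop geometry are assumptions rather than the specific imaging software used herein.
```r
# Illustrative preprocessing sketch: crop a still frame to the targeted tooth,
# then lower the resolution to 50% or less. File name and geometry are hypothetical.
library(magick)

frame <- image_read("frame_0001.png")                      # single still frame from the recording
tooth <- image_crop(frame, geometry = "300x300+120+80")    # keep only the targeted tooth
small <- image_scale(tooth, "50%")                         # resize to 50% of the cropped image
image_write(small, "tooth_0001_small.png")                 # store the image to be scored
```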
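  • The two-hexadecimal-digit RGB encoding mentioned above (256 possible values per color dimension) can be illustrated with base R; the example color is arbitrary.
```r
# "#FFCC66" decodes to red = FF = 255, green = CC = 204, blue = 66 = 102,
# i.e. each of the three color dimensions takes one of 256 possible values (00-FF).
col2rgb("#FFCC66")
#       [,1]
# red    255
# green  204
# blue   102
```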
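  • The following is a minimal R sketch of the pixel color classification and plaque-percentage calculation described above. FIG. 5 shows the actual code used herein, which is not reproduced in this text; the 'png' package and the file name are assumptions (JPEG frames could instead be read with jpeg::readJPEG), and the thresholds follow the yellow-color rules stated above.
```r
# Illustrative sketch: classify each pixel of a cropped, resized tooth image as
# 'yellow' (plaque) or not, and return the fraction of yellow pixels.
library(png)   # readPNG() returns an array of height x width x channels, values in [0, 1]

plaque_percentage <- function(path) {
  img <- readPNG(path)                       # assumes a color (RGB or RGBA) image
  r <- as.vector(img[, , 1]) * 255
  g <- as.vector(img[, , 2]) * 255
  b <- as.vector(img[, , 3]) * 255

  # Reduce each color dimension to four categories and use the category midpoints
  # (32, 96, 160, 224) as representative values, giving 4 x 4 x 4 = 64 colors.
  midpoints <- c(32, 96, 160, 224)
  to_mid <- function(x) midpoints[pmin(floor(x / 64) + 1, 4)]
  R <- to_mid(r); G <- to_mid(g); B <- to_mid(b)

  # A pixel is treated as 'yellow' when red lies between about 0.75 and 2.5 times
  # green, and both red and green are at least about 1.2 times blue.
  yellow <- (R >= 0.75 * G) & (R <= 2.5 * G) & (R >= 1.2 * B) & (G >= 1.2 * B)

  sum(yellow) / length(yellow)               # fraction of yellow (plaque) pixels in the image
}

plaque_percentage("tooth_0001_small.png")    # hypothetical file from the preprocessing step
```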
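  • The following is an illustrative R sketch of randomly selecting fifty (50) frames from a video recording, together with the middle-60% variant described above. FIG. 7 shows the actual selection code used herein, which is not reproduced in this text; frame extraction via the 'av' package, the file names, and the random seed are assumptions.
```r
# Illustrative sketch: extract individual frames from a recording and randomly
# select 50 of them. The 'av' package and file names are assumed; any tool that
# writes one image per frame would serve equally well.
library(av)

av_video_images("quadrant_video.mp4", destdir = "frames", format = "png")
frame_files <- list.files("frames", pattern = "\\.png$", full.names = TRUE)
x <- length(frame_files)             # total number of frames (assumed to be well over 50)

set.seed(1)                          # optional: makes the random selection reproducible
selected <- sample(1:x, size = 50)   # 50 random frame numbers between 1 and x

# Variant described above: restrict the draw to the middle 60% of the recording,
# i.e. truncate the first 20% and the last 20% of the frames before sampling.
middle_indices <- seq(ceiling(0.2 * x) + 1, floor(0.8 * x))
selected_middle <- sample(middle_indices, size = 50)

selected_paths <- frame_files[selected]   # these images are then cropped, resized, and scored
```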

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Quality & Reliability (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
EP17876074.0A 2016-12-01 2017-03-08 Standardized oral health assessment and scoring using digital imaging Withdrawn EP3549061A4 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201615366741A 2016-12-01 2016-12-01
US15/450,925 US10405754B2 (en) 2015-12-01 2017-03-06 Standardized oral health assessment and scoring using digital imaging
PCT/US2017/021367 WO2018101977A1 (en) 2016-12-01 2017-03-08 Standardized oral health assessment and scoring using digital imaging

Publications (2)

Publication Number Publication Date
EP3549061A1 (de) 2019-10-09
EP3549061A4 EP3549061A4 (de) 2020-07-08

Family

ID=62242234

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17876074.0A 2016-12-01 2017-03-08 Standardized oral health assessment and scoring using digital imaging Withdrawn EP3549061A4 (de)

Country Status (2)

Country Link
EP (1) EP3549061A4 (de)
WO (1) WO2018101977A1 (de)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7596179B2 (en) * 2002-02-27 2009-09-29 Hewlett-Packard Development Company, L.P. Reducing the resolution of media data
US7324661B2 (en) * 2004-04-30 2008-01-29 Colgate-Palmolive Company Computer-implemented system and method for automated and highly accurate plaque analysis, reporting, and visualization
US7813591B2 (en) * 2006-01-20 2010-10-12 3M Innovative Properties Company Visual feedback of 3D scan parameters
US20100183523A1 (en) * 2009-01-22 2010-07-22 Wagner Richard E Dental composition and method
US20100279248A1 (en) * 2009-03-05 2010-11-04 Mourad Pierre D Device and method for predicting the likelihood of caries development
CA2791624A1 (en) * 2010-02-26 2011-09-01 Myskin, Inc. Analytic methods of tissue evaluation
US20110216409A1 (en) * 2010-03-04 2011-09-08 Stutes Richard Dale Optical barrier device
CA2799266A1 (en) * 2010-05-13 2011-11-17 Stephen Abrams Method of processing and displaying oral health diagnostic data
JP5796408B2 (ja) * 2011-08-24 2015-10-21 Omron Healthcare Co., Ltd. Oral care device
US20140118427A1 (en) * 2012-10-30 2014-05-01 Pixtronix, Inc. Display apparatus employing frame specific composite contributing colors
US10304365B2 (en) * 2014-05-16 2019-05-28 Nec Display Solutions, Ltd. Image correction device, display device, and image correction method

Also Published As

Publication number Publication date
WO2018101977A1 (en) 2018-06-07
EP3549061A4 (de) 2020-07-08

Similar Documents

Publication Publication Date Title
US10405754B2 (en) Standardized oral health assessment and scoring using digital imaging
US8866894B2 (en) Method for real-time visualization of caries condition
DK3050534T3 (en) TRACKING AND PREDICTING DENTAL CHANGES
KR102267197B1 (ko) Method and apparatus for recording and displaying dental treatment data on a digital dental image
Kidd et al. Occlusal caries diagnosis: a changing challenge for clinicians and epidemiologists
DK2668904T3 (en) Spectral filter for an intraoral imaging system
US8073212B2 (en) Methods and products for analyzing gingival tissues
US6821116B2 (en) System for scanning oral environment
Jablonski-Momeni et al. Clinical performance of the near-infrared imaging system VistaCam iX Proxi for detection of approximal enamel lesions
JP6830082B2 (ja) Dental analysis system and dental analysis X-ray system
Liu et al. A pilot study of a deep learning approach to detect marginal bone loss around implants
Pentapati et al. Clinical applications of intraoral camera to increase patient compliance-current perspectives
CN101911117A (zh) 用于分析硬组织的方法和系统
KR102428636B1 (ko) 딥러닝 알고리즘을 이용한 치아 검진 방법
Liu et al. Red fluorescence imaging for dental plaque detection and quantification: pilot study
EP3549061A1 (de) Standardisierte beurteilung und bewertung der mundgesundheit mittels digitaler bildgebung
Kasai et al. Dental plaque assessment lifelogging system using commercial camera for oral healthcare
Zahid et al. Validity and Reliability of Polarized vs Non-Polarized Digital Images for Measuring Gingival Melanin Pigmentation
Prasanth et al. In vivo inflammation mapping of periodontal disease based on diffuse reflectance spectral imaging: a clinical study
Farooq Diagnosis of Dental Caries-Old and the New
Mauriello et al. Dental Digital Radiographic Imaging.
Guo et al. Establishment and evaluation of a 3D quantitative analysis method for dental plaque based on an intraoral scanner technique.
Arjunan Accuracy of diagnosing proximal caries using intra-oral bitewing radiographs and near infra-red imaging (NIRI) technology in iTero element 5D scanners: an in vivo study
Menon et al. YOLO V5 Deep Learning Model for Dental Problem Detection
Agius Comparing the diagnostic performance of a fluorescence-based caries detection tool of an intra-oral scanner to the gold-standard visual and radiographic examination

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190701

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20200609

RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/32 20060101ALI20200604BHEP

Ipc: A61B 1/24 20060101ALI20200604BHEP

Ipc: A61B 1/00 20060101ALI20200604BHEP

Ipc: G06K 9/00 20060101AFI20200604BHEP

Ipc: G06K 9/46 20060101ALI20200604BHEP

Ipc: G06T 7/90 20170101ALI20200604BHEP

Ipc: G06T 7/00 20170101ALI20200604BHEP

Ipc: G06K 9/22 20060101ALI20200604BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

17Q First examination report despatched

Effective date: 20210617

18W Application withdrawn

Effective date: 20210714