US20170172418A1 - Standardized oral health assessment and scoring using digital imaging - Google Patents


Info

Publication number
US20170172418A1
Authority
US
United States
Prior art keywords
color
frames
image
quadrant
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/450,925
Other versions
US10405754B2
Inventor
Cindy L. Munro
Paula Louise Cairns
Xusheng Chen
Gwendolyn J. Good
Kevin Edward Kip
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of South Florida
Original Assignee
University of South Florida
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of South Florida filed Critical University of South Florida
Priority to US15/450,925 (US10405754B2)
Priority to EP17876074.0A (EP3549061A4)
Priority to PCT/US2017/021367 (WO2018101977A1)
Assigned to NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF HEALTH AND HUMAN SERVICES (DHHS), U.S. GOVERNMENT: CONFIRMATORY LICENSE (SEE DOCUMENT FOR DETAILS). Assignors: UNIVERSITY OF SOUTH FLORIDA
Publication of US20170172418A1
Assigned to UNIVERSITY OF SOUTH FLORIDA: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GOOD, GWENDOLYN J., CAIRNS, PAULA LOUISE, CHEN, XUSHENG, KIP, KEVIN EDWARD, MUNRO, CINDY L.
Application granted
Publication of US10405754B2
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0088 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/24 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/1032 Determining colour for diagnostic purposes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C5/00 Filling or capping teeth
    • A61C5/80 Dental aids fixed to teeth during treatment, e.g. tooth clamps
    • A61C5/82 Dams; Holders or clamps therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C9/00 Impression cups, i.e. impression trays; Impression methods
    • A61C9/004 Means or methods for taking digitized impressions
    • A61C9/0046 Data acquisition means or methods
    • A61C9/0053 Optical means or methods, e.g. scanning the teeth by a laser or light beam
    • G06F19/3406
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00 Medical imaging apparatus involving image processing or analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30036 Dental; Teeth

Definitions

  • This invention relates, generally, to oral health assessments. More specifically, it relates to use of digital imaging to standardize, assess, and score oral health in a subject.
  • the present invention may address one or more of the problems and deficiencies of the prior art discussed above. However, it is contemplated that the invention may prove useful in addressing other problems and deficiencies in a number of technical areas. Therefore, the claimed invention should not necessarily be construed as limited to addressing any of the particular problems or deficiencies discussed herein.
  • the current invention is a method of assessing oral health in a patient or subject.
  • the method includes capturing or recording the frames of substantially all buccal, occlusal, and lingual surfaces in the subject's set of teeth, using a suitable intraoral camera.
  • the frames are imported into an image processing software program that is implemented on a computing device.
  • the frames are processed on the software program to generate images of the teeth to be analyzed.
  • the color of at least a plurality of the frames/images is analyzed and classified to determine the presence of yellow color, wherein yellow color indicates the presence of plaque. Results of the color analysis are scored to objectively and quantitatively assess the subject's oral health.
  • a dental barrier can be positioned over a lens of the intraoral camera, and a camera tip of the intraoral camera is positioned over the dental barrier.
  • the frames captured by the intraoral camera may be either photographs taken by the camera or video recordings taken by the camera.
  • the subject's teeth can be divided into a plurality of sections, including an upper right section, a lower right section, an upper middle section, a lower middle section, an upper left section, and a lower left section.
  • processing the frames can be performed by cropping each image such that the image includes the targeted tooth, and the resolution of the image can be lowered to a percentage of about 50 or less.
  • the color of the image is analyzed and classified by classifying the color of each pixel of the image. Further, each pixel is classified using RGB color code combinations, wherein a three-dimensional point (x, y, z) defines the color of each pixel. Still further, the step of analyzing and classifying color is further performed by dividing each color dimension of the RGB color code combinations into four (4) categories: (0, 64), (64, 128), (128, 192), and (192, 255). A middle point is selected in each category to be representative of the corresponding category. All categories are then scored to determine when yellow color is present on each pixel.
  • scoring the results of the color analysis can include calculating a percentage of yellow color in an image by dividing the number of yellow pixels by the total number of pixels in the image.
  • the frames may be video recordings.
  • the subject's teeth can be divided into a plurality of quadrants, including an upper right quadrant, a lower right quadrant, an upper left quadrant, and a lower left quadrant.
  • Video should be captured and recorded in at least one upper quadrant and at least one lower quadrant.
  • the following order can be used to take video in each quadrant: capturing and recording video of the buccal surfaces in the quadrant, followed by capturing and recording video of the occlusal surfaces in the quadrant, followed by capturing and recording video of the lingual surfaces in the quadrant, and followed by repeating the foregoing steps in another quadrant.
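  • The quadrant-by-quadrant capture order above lends itself to a simple sketch. The following Python illustration is not part of the patented method; the quadrant and surface names come from the specification, while the function itself is a hypothetical aid for understanding the sequence:

```python
# Sketch of the per-quadrant recording order described above: all three
# surfaces of one quadrant are recorded before moving to the next quadrant.
QUADRANTS = ["upper right", "upper left", "lower right", "lower left"]
SURFACES = ["buccal", "occlusal", "lingual"]

def recording_sequence(quadrants=QUADRANTS, surfaces=SURFACES):
    """Yield (quadrant, surface) capture steps in the prescribed order."""
    for quadrant in quadrants:
        for surface in surfaces:
            yield (quadrant, surface)

steps = list(recording_sequence())
# The first three steps cover one full quadrant before the next begins.
```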
  • processing the frames includes extracting single still frame digital images of the tooth surfaces from the video recording.
  • each image is cropped, and a resolution of the image can be lowered to a percentage of about 50 or less.
  • the color of the image is analyzed and classified by classifying the color of each pixel of the image. Further, each pixel is classified using RGB color code combinations, wherein a three-dimensional point (x, y, z) defines the color of each pixel. Still further, the step of analyzing and classifying color is further performed by dividing each color dimension of the RGB color code combinations into four (4) categories: (0, 64), (64, 128), (128, 192), and (192, 255). A middle point is selected in each category to be representative of the corresponding category.
  • scoring the results of the color analysis can include calculating a percentage of yellow color in an image by dividing the number of yellow pixels by the total number of pixels in the image.
  • a plurality of frames from all frames can be randomly selected prior to processing the frames on the software program. This random selection can be performed with or without criteria for the random selection.
  • the current invention is a method of assessing oral health in a patient or subject, comprising any one or more—or even all—of the foregoing steps.
  • FIG. 1 is a flowchart depicting the steps of oral health assessment, according to an embodiment of the current invention.
  • FIG. 2 is a flowchart depicting the steps of capturing data, according to an embodiment of the current invention.
  • FIG. 3 is a flowchart depicting the steps of scoring data, according to an embodiment of the current invention.
  • FIG. 4A is an image of a tooth using the “intra-oral” setting of the intraoral camera.
  • FIG. 4B is a schematic of “perio mode” of the ACTEON SOPROCARE intraoral camera.
  • FIG. 5 depicts the R code for obtaining the final plaque percentage for the tooth.
  • FIG. 6 is a chart depicting color samples that can be classified as ‘yellow’ color (plaque) or non-yellow color (normal).
  • FIG. 7 depicts the R code for randomly selecting fifty (50) numbers/frames from a video recording.
  • the current invention uses digital imaging technology to objectively capture clinical data on oral health, and then provides standardized scoring methodology for quantifying oral health.
  • the methodology is applicable to persons in clinical settings (including hospitalized patients) as well as the general population. Generally, it involves two (2) stages (see FIG. 1 ): the process of capturing clinical data by use of digital imaging technology, and the process of standardized scoring of oral health data.
  • the process of capturing clinical data on oral health is based on the use of a conventional intraoral camera (e.g., ACTEON SOPROCARE Diagnostic/Clinical Intraoral Cameras) that has the capacity to capture digital images of all tooth surfaces in white light within and outside of dental laboratory settings. Any suitable intraoral camera is contemplated herein. Data from the digital images are used to enhance the detection of plaque on tooth surfaces, which are difficult to directly observe and score by a dental hygienist. Imaging software (e.g., ACTEON SOPRO Imaging Software) is used in conjunction with the camera to visualize, capture, and store each subject's digital image recording.
  • the process of digital imaging divides the subject's full set of teeth into four (4) quadrants. These quadrants include an upper right quadrant, lower right quadrant, upper left quadrant, and lower left quadrant.
  • the recording of buccal, occlusal, and lingual surfaces of each quadrant in video-mode is significantly more effective and efficient than taking multiple still frame images at the bedside.
  • Each tooth's digital plaque data is collected in a standardized manner, and with the ability to select optimal frames for analysis.
  • dental plaque burden is conventionally scored by visual examination by use of the University of Mississippi Oral Hygiene Index (UM-OHI). Using visual examination, for the ten (10) sections of each tooth, plaque is scored as present (value of 1) versus absent (value of 0). Thus, the maximum plaque score per tooth is 10. The mean plaque score for the subject is calculated by dividing the total score by number of teeth.
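  • The conventional UM-OHI calculation can be sketched as follows; the section scores in the example are hypothetical, but the scoring rule (10 binary sections per tooth, mean over all teeth) is as described above:

```python
def um_ohi_mean_score(teeth_sections):
    """Compute the mean UM-OHI plaque score.

    `teeth_sections` holds one entry per tooth; each entry is a list of
    10 binary values (1 = plaque present in that section, 0 = absent),
    so the maximum score per tooth is 10. The mean score is the total
    score divided by the number of teeth.
    """
    tooth_scores = [sum(sections) for sections in teeth_sections]
    return sum(tooth_scores) / len(teeth_sections)

# Two hypothetical teeth: plaque in 3 of 10 sections, then 5 of 10.
example = [[1, 1, 1, 0, 0, 0, 0, 0, 0, 0],
           [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]]
# mean = (3 + 5) / 2 = 4.0
```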
  • each tooth's data is extracted into a software program, such as but not limited to R, and with a minimum of about 10,000-40,000 color-derived pixels per tooth.
  • the color classification of each pixel is determined by the software program using an algorithm that makes use of red/green/blue (RGB) color code combinations. These classifications are then calculated quantitatively within the software program, and a separate algorithm automatically generates a range of oral health scoring techniques. These include, but are not limited to: (i) magnitude (and ratio) of dental plaque per tooth and across all teeth; (ii) estimated age of dental plaque per tooth and across all teeth; and (iii) ratio of plaque burden to plaque age per tooth and across all teeth.
  • the process of oral health scoring is set up such that after appropriate selection of digital images has been achieved with use of the intraoral camera, and after these data have been imported into the computer coding language/program (e.g., R), only a few key strokes are required to compile and execute the algorithmic code, thereby resulting in standardized and near real-time scoring of oral health.
  • a conventional intraoral camera, such as the SOPROCARE by ACTEON used herein, illuminates dental tissue with a wavelength of light between 440 nm and 680 nm. Exposed tissue absorbs the energy and reflects it in fluorescent form.
  • the handheld intraoral camera can be connected to a computing device wirelessly or by way of a video cable. If a wired connection is used, the video cable is connected to both the intraoral camera and the computing device.
  • the dental camera electrical supply is directly powered through the computer USB port.
  • the voltage powering the camera is of continuous 5 V low voltage type (0.5 A).
  • Imaging software such as SOPRO V2.3 used herein, is required to visualize, capture, and store video and digital images taken by the intraoral camera.
  • a procedure file was created for each subject in order to record and store the digital images.
  • the computing device was placed near the subject's head during the procedure in order to use the monitor as the display screen to visually guide the intraoral camera over each tooth surface.
  • the camera focus ring was set to intraoral mode for video capture and/or camera digital image capture.
  • the mode on the intraoral camera was then set to the appropriate setting.
  • the “intra-oral” (1-5 teeth) setting captures an image that is 5 mm to 30 mm from the camera. This setting was used for both video and camera digital image capture. See FIG. 4A .
  • the ACTEON SOPROCARE camera has a “perio mode”, which is a fluorescent mode that is associated with chromatic amplification to highlight dental plaque using ultraviolet light. This “perio mode” revealed both old and new plaque in various stages. New plaque was interpreted as a white color, while older plaque was interpreted as yellow or orange colors depending on its mineralization. See FIG. 4B . Perio mode was used herein to capture video and camera digital images.
  • a disposable dental barrier can be placed over the camera lens, followed by the optional placement of a camera tip over the dental barrier.
  • the camera tip enables displacement of ambient lighting.
  • the intraoral camera can capture adequate digital images of dental plaque without using a camera tip.
  • Mouth props may be used to assist subjects with or without an endotracheal tube to keep the mouth open wide enough for movement of the intraoral camera during the procedure.
  • the mouth prop would be placed on the opposite side of the mouth being recorded.
  • the intraoral camera can initiate video or camera digital image recordings.
  • the subject's full set of teeth were visualized in four quadrants. These quadrants included an upper right quadrant, lower right quadrant, upper left quadrant, and lower left quadrant.
  • the speed of video digital imaging decreases the amount of subject burden. Pausing 1 to 2 seconds over each tooth surface will enhance the quality of still frame images to be produced from the video at a later time.
  • These video recordings can be obtained in any suitable way.
  • the following is an exemplary step-by-step methodology for taking these recordings.
  • the technician or other member of the medical team (herein the “operator”) can lift the subject's upper lip with a free hand to expose the full buccal surface of the central and lateral incisor areas.
  • the camera is held steady over the subject's first available upper quadrant front tooth for 1-2 seconds and over each buccal tooth surface thereafter, moving the camera over the central and lateral incisor area.
  • the camera does not need to stop or be held steady for 1-2 seconds over each tooth surface; rather, the camera can simply take a continuous video along the rows of teeth, and frames can be extracted from that video, as will become clearer as this specification continues.
  • the operator's free hand can be used to guide the camera distally over the cuspid and molar areas, until the buccal surface of the subject's last tooth in the back of the mouth and in the upper quadrant is recorded.
  • the camera lens is angled to capture the full biting surface of this same last back tooth, pausing 1-2 seconds over the biting surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the molar and cuspid areas, thus guiding the camera towards the lateral and central incisor areas until the biting surface of the subject's first front tooth in the upper quadrant is recorded.
  • the camera lens is then angled to record the lingual surface of this same first front tooth, pausing 1-2 seconds over the lingual surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the central and lateral incisor area, thus guiding the camera distally over the cuspid and molar areas until the lingual surface of the subject's last back tooth in this upper quadrant is recorded.
  • the camera is moved down to record the buccal surface of the subject's last back tooth in the lower quadrant, pausing 1-2 seconds over the buccal surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the molar and cuspid areas in the lower quadrant.
  • the operator can then use a free hand to move the lower lip down to expose and record the full buccal surface of each tooth in the lateral and central incisor area, until the subject's first front tooth in the lower quadrant is recorded.
  • the camera lens is angled to capture the full biting surface of this same front tooth, pausing 1-2 seconds over the biting surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the central and lateral incisor areas, thus guiding the camera distally over the cuspid and molar areas until the biting surface of the last tooth in the back of the subject's mouth in the lower quadrant is recorded.
  • the camera lens is then angled to begin recording the lingual surface of this same last back tooth, pausing 1-2 seconds over the lingual surface of each tooth in the molar and cuspid areas (or alternatively the camera does not need to be paused over the tooth surface), and guiding the camera over the lateral and central incisor areas until the subject's first front tooth in this lower quadrant is recorded.
  • if a mouth prop is used, it is moved to the opposite side of the mouth to continue recording.
  • the foregoing sequence of video recording is then repeated on the opposite side of the subject's mouth. Once the recording of every tooth surface is complete, recording can be stopped. If needed, the mouth prop can be removed and placed inside a sealable biohazard bag for transport to the lab to be cleaned and sterilized.
  • single frame images can be obtained in any suitable way.
  • the following is an exemplary step-by-step methodology for taking these digital images.
  • the subject's full set of teeth can be visualized in six sections. These sections include an upper right, lower right, upper middle, lower middle, upper left, and lower left.
  • the operator's free hand can be used to lift the subject's upper lip to expose the full buccal surface of the central and lateral incisors.
  • using camera mode along with the intra-oral setting, a digital image of up to 6 teeth is captured in each section on the buccal side and again on the lingual side, for a total of 12 digital images that represent all tooth surfaces.
  • the subject's file is saved and closed, to be prepared and analyzed at a later time.
  • single still frame digital images of each tooth surface can be produced from the video recording with digital imaging software. Secure storage of these video files and single image files make it possible to maintain archival data that can be better subjected to additional analysis and reliability determinations.
  • the image of the tooth is cropped so that it includes only the targeted tooth, without losing the integrity of the tooth.
  • Imaging software is then used to lower the resolution.
  • the image is resized to 50% or less (or the horizontal dimension is set to 100 pixels, with the vertical dimension adjusted automatically to preserve the proportions).
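  • The resolution-lowering step is performed in imaging software, but the idea can be sketched in a few lines of Python. This is an illustrative nearest-neighbour downsample over a nested-list "image", not the software's actual resampling algorithm:

```python
def downsample(pixels, percent=50):
    """Reduce resolution to roughly `percent` of the original by keeping
    every Nth pixel in each direction (nearest-neighbour sampling).
    `pixels` is a row-major list of rows of (R, G, B) tuples.
    """
    step = max(1, round(100 / percent))
    return [row[::step] for row in pixels[::step]]

# A hypothetical 4x4 image reduced to 50% becomes 2x2.
img = [[(r, c, 0) for c in range(4)] for r in range(4)]
small = downsample(img, 50)
```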
  • the R program is then run to obtain the plaque percentage for the tooth (see FIG. 5 for the R code to obtain the final plaque percentage for the tooth).
  • the R program is used to digitize the color.
  • most software programs use three dimensions to record the color, namely RGB (red, green, blue).
  • the software programs score the value of a specific pixel on each dimension, and a three-dimensional point (x, y, z) uniquely defines the color of the specific pixel.
  • the software uses two digits for each color dimension, and each digit uses a hexadecimal system to count the numbers. Therefore, there are 256 possible values to score each dimension of the color. Since an objective herein was to calculate the percentage of plaque on a tooth, and plaque typically presents as a yellow color, the calculated score was used to judge whether each pixel should be classified as yellow or not.
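  • The two-hexadecimal-digits-per-dimension encoding can be made concrete with a short sketch. The parsing function below is an illustration (not part of the patented method) of how a six-digit hex color code maps to the three 0–255 dimensions just described:

```python
def parse_hex_color(code):
    """Split a six-hex-digit color code into its (R, G, B) components.

    Each dimension uses two hexadecimal digits, so each dimension takes
    one of 256 possible values (0-255).
    """
    code = code.lstrip("#")
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

# A saturated yellow: full red, full green, no blue.
rgb = parse_hex_color("#FFFF00")
```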
  • the determination of whether a pixel should be considered yellow can be achieved and implemented in any suitable manner.
  • the combinations of the three colors (RGB) that can create ‘yellow’ were determined quantitatively.
  • Each color dimension was divided into four categories: (0, 64), (64, 128), (128, 192), (192, 255).
  • Four categories were chosen because that was considered an acceptable compromise for accuracy and computational difficulty.
  • the end result was 64 categories.
  • the middle point of each range was chosen—32, 96, 160, 224—to be the representative of that range; the color for that specific combination was used to represent the color for that category.
  • the percentage of yellow could be calculated by using the number of yellow pixels divided by the total number of pixels in the picture.
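  • The quantization and percentage steps above can be sketched as follows. The midpoints and the 64-category scheme come from the specification; the `is_yellow` rule, however, is only a stand-in heuristic (high red and green, low blue) for the empirically determined yellow combinations, which are not reproduced here:

```python
# Representatives of (0,64), (64,128), (128,192), (192,255), per the text.
MIDPOINTS = (32, 96, 160, 224)

def quantize(value):
    """Map a 0-255 channel value to the middle point of its category."""
    return MIDPOINTS[min(value // 64, 3)]

def is_yellow(r, g, b):
    """ASSUMED heuristic standing in for the patent's empirically
    determined 'yellow' category combinations: high red and green with
    comparatively low blue after quantization."""
    qr, qg, qb = quantize(r), quantize(g), quantize(b)
    return qr >= 160 and qg >= 160 and qb <= 96

def yellow_percentage(pixels):
    """Percent of pixels classified as yellow (plaque) in a flat list
    of (R, G, B) tuples: yellow pixels divided by total pixels."""
    yellow = sum(1 for p in pixels if is_yellow(*p))
    return 100.0 * yellow / len(pixels)

# Hypothetical 4-pixel image: two yellowish pixels, two non-yellow.
sample = [(230, 200, 40), (180, 170, 60), (90, 90, 90), (30, 200, 230)]
# -> 50.0 percent yellow
```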
  • FIG. 6 depicts color samples that can be classified as ‘yellow’ color (plaque) or non-yellow color (normal).
  • each recording is broken down into individual frames.
  • a random number generator is then used to randomly select a predetermined number of frames (e.g., 50) from one (1) to x, where x is the number of total images/frames within that recording.
  • the R code for this can be seen in FIG. 7 .
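  • The R code of FIG. 7 is not reproduced here, but the random selection it performs can be sketched in Python. The function name and the optional seed are illustrative assumptions:

```python
import random

def select_frames(total_frames, count=50, seed=None):
    """Randomly select `count` distinct frame numbers from 1..total_frames,
    mirroring the random-number-generator step described above."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, total_frames + 1), count))

# E.g., pick 50 of 1200 frames from a recording.
chosen = select_frames(total_frames=1200, count=50, seed=0)
```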
  • frames can be randomly selected with or without specified criteria for the selection of frames. These criteria (e.g., a predetermined number of frames from each quadrant, a certain quality of the frame to minimize noise, etc.) can be inputted manually or can be learned automatically by the software algorithm, for example via artificial intelligence. If any criteria are present, the imaging software can automatically select frames that are relevant and/or discard frames that are not relevant. In any case, upon selection of the frames from the video recording, the corresponding images from the individual frames are selected, and the yellow percentage of the selected images is then calculated, as previously discussed.
  • any suitable methodology for randomly selecting frames/images from the video recording is contemplated herein. For illustration purposes, differing methodologies were tested herein for this random selection.
  • the 50 images can be randomly selected, with only the portion of each image related to the teeth being selected and cropped for analysis, as previously discussed.
  • Another methodology is selecting images from the middle 60% of the complete video recording, thus truncating the first 20% and the last 20% of the video recording.
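  • The middle-60% truncation reduces to a simple index computation; a minimal sketch (the function name is illustrative):

```python
def middle_60_percent(total_frames):
    """Return the (start, end) frame numbers remaining after truncating
    the first 20% and the last 20% of the recording, keeping the middle
    60% for frame selection."""
    start = int(total_frames * 0.20) + 1
    end = int(total_frames * 0.80)
    return start, end

# For a 1000-frame recording, frames 201 through 800 remain.
```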
  • Yet another methodology is simply randomly selecting 50 images from the frames of the video recording.
  • each tooth can be selected, cropped, and analyzed separately, with the average of the per-tooth scores subsequently taken.
  • the present invention may be embodied on various computing platforms that perform actions responsive to software-based instructions and most particularly on touchscreen portable devices.
  • the following provides an antecedent basis for the information technology that may be utilized to enable the invention.
  • the computer readable medium described in the claims below may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any non-transitory, tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire-line, optical fiber cable, radio frequency, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C#, C++, Visual Basic or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • an “end-user” is an operator of the software as opposed to a developer or author who modifies the underlying source code of the software.
  • authentication means identifying the particular user while authorization defines what procedures and functions that user is permitted to execute.
  • Frame: This term is used herein to refer to a fraction/division of time on a multimedia (e.g., video) timeline.
  • Oral health: This term is used herein to refer to the well-being of one's mouth, specifically based herein on the plaque that may be present in the person's mouth. More specifically, the manner in which plaque affects oral health can be the magnitude and ratio of dental plaque per tooth and across all teeth, the estimated age of dental plaque per tooth and across all teeth, and the ratio of plaque burden to plaque age per tooth and across all teeth.
  • Random selection: This term is used herein to refer to a relatively unpredictably chosen array of frames/images from a video recording.
  • the term “relatively” is used because it is contemplated herein that this random selection can be performed with or without a predetermined set of criteria for the selection. For example, if 1,000 frames are present in a video recording, a set of criteria may eliminate 200 of those frames, and then 50 frames can be “randomly selected” from the remaining 800 frames. Alternatively, the 50 frames can be “randomly selected” from the 1,000 frames with no criteria present. Both circumstances are contemplated herein.
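Both circumstances in this definition can be sketched with a short Python illustration (the exclusion predicate is hypothetical, standing in for any predetermined set of criteria; the patent itself uses R):

```python
import random

def relatively_random_select(frame_indices, n=50, exclude=None):
    # "Relatively" random selection: optionally drop frames that fail a
    # predetermined set of criteria (exclude is a hypothetical predicate,
    # e.g. a blur or framing check), then sample n of the remainder.
    eligible = [i for i in frame_indices
                if exclude is None or not exclude(i)]
    return random.sample(eligible, n)

# Example from the definition: 1,000 frames, criteria eliminate 200,
# and 50 frames are "randomly selected" from the remaining 800.
eliminated = set(random.sample(range(1000), 200))
picked = relatively_random_select(range(1000),
                                  exclude=lambda i: i in eliminated)
```

Calling `relatively_random_select(range(1000))` with no predicate covers the criteria-free case.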
  • Single still frame digital image: This term is used herein to refer to a visual representation of a tooth extracted at a specific time during a video recording.
  • Substantially all: This term is used herein to refer to a representative number of tooth surfaces that, when analyzed, can be used to characterize the amount of plaque across all teeth or on each tooth. This number can be all teeth in the subject's mouth, or it can be an amount less than all of the teeth. For example, some teeth may be inaccessible by an intraoral camera due to the anatomy of a particular subject's mouth, so only the accessible surfaces are recorded. These circumstances are still considered herein as "substantially all".

Abstract

System and methodology objectively capture clinical data on oral health and provide standardized scoring for quantifying oral health. The process of capturing clinical data is based on use of an intraoral camera with imaging software that has the capacity to capture digital images of all tooth surfaces. Digital plaque data are collected in a standardized manner per tooth, and with the ability to select optimal frames for analysis. Data are extracted per tooth into a software program. Color classification of each pixel is determined by the software program using an algorithm that makes use of red/green/blue color code combinations. These classifications are then used quantitatively within the software program, and separate algorithms automatically generate a range of oral health scoring techniques.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This nonprovisional application is a continuation-in-part of U.S. Nonprovisional patent application Ser. No. 15/366,741, entitled “Standardized Oral Health Assessment and Scoring Using Digital Imaging”, filed Dec. 1, 2016, which claims priority to U.S. Provisional Patent Application No. 62/261,631, entitled “Standardized Oral Health Assessment and Scoring Using Digital Imaging”, filed Dec. 1, 2015, all of which are incorporated herein by reference in their entireties.
  • FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • This invention was made with government support under Grant Number R01 NR007652 awarded by the National Institutes of Health. The government has certain rights in the invention.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • This invention relates, generally, to oral health assessments. More specifically, it relates to use of digital imaging to standardize, assess, and score oral health in a subject.
  • 2. Brief Description of the Prior Art
  • Despite its prominent position in bedside care, there is little evidence to judge the benefits or associated risks of nurse-administered tooth brushing for mechanically ventilated adults, and the optimal frequency of tooth brushing in the critically ill has never been experimentally determined. Traditional methods for scoring oral health, including both tooth (e.g., plaque burden) and gum (e.g., inflammation) health, have relied upon visual examination by skilled professionals, including dental hygienists. Given the relatively subjective nature of this process, measurement of oral health is suboptimal for a number of reasons including, but not limited to: time burden, lack of reliability within and between assessors, lack of universal standardized scoring algorithms, and computational complexity in attempting to score oral health on multiple dimensions, such as simultaneous assessment of age and extent of plaque burden.
  • Attempts have been made at evaluating and quantifying plaque and oral health. Examples include U.S. Pat. No. 8,110,178; U.S. patent application Ser. No. 12/832,652; U.S. patent application Ser. No. 11/662,346; Lupita Jocelin Reyes Silveyra, Investigations on Automated Methods for Dental Plaque Detection, A thesis submitted to The University of Birmingham for the degree of Doctor of Philosophy, School of Dentistry College of Medical and Dental Sciences, The University of Birmingham, September 2011; Pretty I A, et al., Quantification of dental plaque in the research environment, Journal of dentistry (2005), 33, (3), 193-207, ISSN:0300-5712; Michael G McGrady, et al., Evaluating the use of fluorescent imaging for the quantification of dental fluorosis, BMC Oral Health (2012), 12, 47; and Rosa G M, et al., New portable system for dental plaque measurement using a digital single-lens reflex camera and image analysis: Study of reliability and validation. Journal of Indian Society of Periodontology. 2015; 19(3):279-284. doi:10.4103/0972-124X.152415. However, none provide a standardized and objective system for assessing oral health.
  • Accordingly, what is needed for both clinicians and researchers is a reliable, user-friendly, and fully objective and standardized methodology and system for quantifying and scoring oral health. However, in view of the art considered as a whole at the time the present invention was made, it was not obvious to those of ordinary skill in the field of this invention how the shortcomings of the prior art could be overcome.
  • While certain aspects of conventional technologies have been discussed to facilitate disclosure of the invention, Applicants in no way disclaim these technical aspects, and it is contemplated that the claimed invention may encompass one or more of the conventional technical aspects discussed herein.
  • The present invention may address one or more of the problems and deficiencies of the prior art discussed above. However, it is contemplated that the invention may prove useful in addressing other problems and deficiencies in a number of technical areas. Therefore, the claimed invention should not necessarily be construed as limited to addressing any of the particular problems or deficiencies discussed herein.
  • In this specification, where a document, act or item of knowledge is referred to or discussed, this reference or discussion is not an admission that the document, act or item of knowledge or any combination thereof was at the priority date, publicly available, known to the public, part of common general knowledge, or otherwise constitutes prior art under the applicable statutory provisions; or is known to be relevant to an attempt to solve any problem with which this specification is concerned.
  • BRIEF SUMMARY OF THE INVENTION
  • The long-standing but heretofore unfulfilled need for an improved method of assessing oral health is now met by a new, useful, and nonobvious invention.
  • In an embodiment, the current invention is a method of assessing oral health in a patient or subject. The method includes capturing or recording the frames of substantially all buccal, occlusal, and lingual surfaces in the subject's set of teeth, using a suitable intraoral camera. The frames are imported into an image processing software program that is implemented on a computing device. The frames are processed on the software program to generate images of the teeth to be analyzed. The color of at least a plurality of the frames/images is analyzed and classified to determine the presence of yellow color, wherein yellow color indicates the presence of plaque. Results of the color analysis are scored to objectively and quantitatively assess the subject's oral health.
  • Optionally, a dental barrier can be positioned over a lens of the intraoral camera, and a camera tip of the intraoral camera is positioned over the dental barrier.
  • The frames captured by the intraoral camera may be either photographs taken by the camera or video recordings taken by the camera. When the frames are photographs, the subject's teeth can be divided into a plurality of sections, including an upper right section, a lower right section, an upper middle section, a lower middle section, an upper left section, and a lower left section. Further, when the frames are photographs, processing the frames can be performed by cropping each image such that the image includes the targeted tooth, and the resolution of the image can be lowered to about 50% or less.
  • In other embodiments when the frames are photographs, the color of the image is analyzed and classified by classifying the color of each pixel of the image. Further, each pixel is classified using RGB color code combinations, wherein a three-dimensional point (x, y, z) defines the color of each pixel. Still further, the step of analyzing and classifying color is further performed by dividing each color dimension of the RGB color code combinations into four (4) categories: (0, 64), (64, 128), (128, 192), and (192, 255). A middle point is selected in each category to be representative of the corresponding category. All categories are then scored to determine when yellow color is present on each pixel. Under this scoring, it was found that yellow color is present on a pixel when the value of the red dimension is between about 0.75 times and about 2.5 times the value of the green dimension, and when the values of the red and green dimensions are each at least about 1.2 times the value of the blue dimension. Optionally, scoring the results of the color analysis can include calculating a percentage of yellow color in an image by dividing the number of yellow pixels by the total number of pixels in the image.
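The thresholds recited above can be expressed as a simple per-pixel classifier. The following Python sketch restates the rule for illustration only (the patent's implementation is R code shown in FIG. 5; the function names here are hypothetical):

```python
def is_yellow(r, g, b):
    # A pixel is classified as yellow (plaque) when red is between about
    # 0.75x and 2.5x green, and both red and green are at least about
    # 1.2x blue.
    return (0.75 * g <= r <= 2.5 * g) and r >= 1.2 * b and g >= 1.2 * b

def plaque_percentage(pixels):
    # Percentage of yellow pixels over all pixels in a cropped tooth
    # image, where pixels is a list of (r, g, b) tuples.
    yellow = sum(1 for p in pixels if is_yellow(*p))
    return 100.0 * yellow / len(pixels)
```

For example, a yellowish pixel such as (200, 180, 90) satisfies both conditions, while a white pixel (250, 250, 250) fails the blue condition because 250 is less than 1.2 times 250.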
  • As noted previously, the frames may be video recordings. In this case, the subject's teeth can be divided into a plurality of quadrants, including an upper right quadrant, a lower right quadrant, an upper left quadrant, and a lower left quadrant. Video should be captured and recorded in at least one upper quadrant and at least one lower quadrant. Optionally, the following order can be used to take video in each quadrant: capturing and recording video of the buccal surfaces in the quadrant, followed by capturing and recording video of the occlusal surfaces in the quadrant, followed by capturing and recording video of the lingual surfaces in the quadrant, and followed by repeating the foregoing steps in another quadrant.
  • In other embodiments when video recordings are used, processing the frames includes extracting single still frame digital images of the tooth surfaces from the video recording. Optionally, each image is cropped, and the resolution of the image can be lowered to about 50% or less. The color of the image is analyzed and classified by classifying the color of each pixel of the image. Further, each pixel is classified using RGB color code combinations, wherein a three-dimensional point (x, y, z) defines the color of each pixel. Still further, the step of analyzing and classifying color is further performed by dividing each color dimension of the RGB color code combinations into four (4) categories: (0, 64), (64, 128), (128, 192), and (192, 255). A middle point is selected in each category to be representative of the corresponding category. All categories are then scored to determine when yellow color is present on each pixel. Under this scoring, it was found that yellow color is present on a pixel when the value of the red dimension is between about 0.75 times and about 2.5 times the value of the green dimension, and when the values of the red and green dimensions are each at least about 1.2 times the value of the blue dimension. Optionally, scoring the results of the color analysis can include calculating a percentage of yellow color in an image by dividing the number of yellow pixels by the total number of pixels in the image.
  • Optionally, when video recordings are taken, a plurality of frames from all frames can be randomly selected prior to processing the frames on the software program. This random selection can be performed with or without criteria for the random selection.
  • In a separate embodiment, the current invention is a method of assessing oral health in a patient or subject, comprising any one or more—or even all—of the foregoing steps.
  • These and other important objects, advantages, and features of the invention will become clear as this disclosure proceeds.
  • The invention accordingly comprises the features of construction, combination of elements, and arrangement of parts that will be exemplified in the disclosure set forth hereinafter and the scope of the invention will be indicated in the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
  • For a fuller understanding of the invention, reference should be made to the following detailed description, taken in connection with the accompanying drawings, in which:
  • FIG. 1 is a flowchart depicting the steps of oral health assessment, according to an embodiment of the current invention.
  • FIG. 2 is a flowchart depicting the steps of capturing data, according to an embodiment of the current invention.
  • FIG. 3 is a flowchart depicting the steps of scoring data, according to an embodiment of the current invention.
  • FIG. 4A is an image of a tooth using the “intra-oral” setting of the intraoral camera.
  • FIG. 4B is a schematic of “perio mode” of the ACTEON SOPROCARE intraoral camera.
  • FIG. 5 depicts the R code for obtaining the final plaque percentage for the tooth.
  • FIG. 6 is a chart depicting color samples that can be classified as ‘yellow’ color (plaque) or non-yellow color (normal).
  • FIG. 7 depicts the R code for randomly selecting fifty (50) numbers/frames from a video recording.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • In the following detailed description of the preferred embodiments, reference is made to the accompanying drawings, which form a part thereof, and within which are shown by way of illustration specific embodiments by which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the invention.
  • As used in this specification and the appended claims, the singular forms “a”, “an”, and “the” include plural referents unless the content clearly dictates otherwise. As used in this specification and the appended claims, the term “or” is generally employed in its sense including “and/or” unless the context clearly dictates otherwise.
  • In an embodiment, the current invention uses digital imaging technology to objectively capture clinical data on oral health, and then provides standardized scoring methodology for quantifying oral health. The methodology is applicable to persons in clinical settings (including hospitalized patients) as well as the general population. Generally, it involves two (2) stages (see FIG. 1): the process of capturing clinical data by use of digital imaging technology, and the process of standardized scoring of oral health data.
  • Capturing Clinical Data (FIG. 2)
  • The process of capturing clinical data on oral health is based on the use of a conventional intraoral camera (e.g., ACTEON SOPROCARE Diagnostic/Clinical Intraoral Cameras) that has the capacity to capture digital images of all tooth surfaces in white light within and outside of dental laboratory settings. Any suitable intraoral camera is contemplated herein. Data from the digital images are used to enhance the detection of plaque on tooth surfaces, which are difficult to directly observe and score by a dental hygienist. Imaging software (e.g., ACTEON SOPRO Imaging Software) is used in conjunction with the camera to visualize, capture, and store each subject's digital image recording.
  • The process of digital imaging divides the subject's full set of teeth into four (4) quadrants. These quadrants include an upper right quadrant, lower right quadrant, upper left quadrant, and lower left quadrant. The recording of buccal, occlusal, and lingual surfaces of each quadrant in video-mode is significantly more effective and efficient than taking multiple still frame images at the bedside. Each tooth's digital plaque data is collected in a standardized manner, and with the ability to select optimal frames for analysis.
  • Scoring Clinical Data (FIG. 3)
  • When not utilizing intraoral cameras, dental plaque burden is conventionally scored by visual examination using the University of Mississippi Oral Hygiene Index (UM-OHI). Using visual examination, for the ten (10) sections of each tooth, plaque is scored as present (value of 1) versus absent (value of 0). Thus, the maximum plaque score per tooth is 10. The mean plaque score for the subject is calculated by dividing the total score by the number of teeth. By way of contrast, with the intraoral camera and imaging software, according to certain embodiments of the current invention, each tooth's data is extracted into a software program, such as but not limited to R, with a minimum of about 10,000 to 40,000 color-derived pixels per tooth.
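The conventional UM-OHI arithmetic described above is simple enough to sketch directly. This is an illustrative Python rendering of the visual-examination score, not part of the patented pipeline:

```python
def um_ohi_mean(per_tooth_sections):
    # Each tooth is scored on ten sections as plaque present (1) or
    # absent (0), so the maximum plaque score per tooth is 10. The
    # subject's mean score is the total score divided by the number
    # of teeth.
    for sections in per_tooth_sections:
        if len(sections) != 10:
            raise ValueError("each tooth is scored on ten sections")
    total = sum(sum(sections) for sections in per_tooth_sections)
    return total / len(per_tooth_sections)
```

For a two-tooth example with per-tooth scores of 4 and 6, the mean plaque score is 5.0.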
  • The color classification of each pixel is determined by the software program using an algorithm that makes use of red/green/blue (RGB) color code combinations. These classifications are then calculated quantitatively within the software program, and a separate algorithm automatically generates a range of oral health scoring techniques. These include, but are not limited to: (i) magnitude (and ratio) of dental plaque per tooth and across all teeth; (ii) estimated age of dental plaque per tooth and across all teeth; and (iii) ratio of plaque burden to plaque age per tooth and across all teeth. The process of oral health scoring is set up such that after appropriate selection of digital images has been achieved with use of the intraoral camera, and after these data have been imported into the computer coding language/program (e.g., R), only a few key strokes are required to compile and execute the algorithmic code, thereby resulting in standardized and near real-time scoring of oral health.
  • Example Capturing Data
  • A conventional intraoral camera, such as SOPROCARE by ACTEON used herein, illuminates dental tissue with a wavelength of light between 440 nm and 680 nm. Exposed tissue absorbs the energy and reflects it in fluorescent form. The handheld intraoral camera can be connected to a computing device wirelessly or by way of a video cable. If a wired connection is used, the video cable is connected to both the intraoral camera and the computing device. The camera is powered directly through the computer's USB port, using a continuous low-voltage 5 V (0.5 A) supply.
  • Imaging software, such as SOPRO V2.3 used herein, is required to visualize, capture, and store video and digital images taken by the intraoral camera. Upon initiating the imaging software, a procedure file was created for each subject in order to record and store the digital images. The computing device was placed near the subject's head during the procedure in order to use the monitor as the display screen to visually guide the intraoral camera over each tooth surface. The camera focus ring was set to intraoral mode for video capture and/or camera digital image capture.
  • The mode on the intraoral camera was then set to the appropriate setting. On the ACTEON SOPROCARE camera for example, there is a rotating focus ring used to focus from "0" to infinity. The "intra-oral" (1-5 teeth) setting captures an image that is 5 mm to 30 mm from the camera. This setting was used for both video and camera digital image capture. See FIG. 4A. Additionally, the ACTEON SOPROCARE camera has a "perio mode", which is a fluorescent mode that is associated with chromatic amplification to highlight dental plaque using ultraviolet light. This "perio mode" revealed both old and new plaque in various stages. New plaque was interpreted as a white color, while older plaque was interpreted as yellow or orange colors depending on its mineralization. See FIG. 4B. Perio mode was used herein to capture video and camera digital images.
  • A disposable dental barrier can be placed over the camera lens, followed by the optional placement of a camera tip over the dental barrier. The camera tip enables displacement of ambient lighting. In the event of an anatomically small mouth or an oral cavity that is minimized due to facial and tongue swelling, the intraoral camera can capture adequate digital images of dental plaque without using a camera tip.
  • Mouth props may be used to assist subjects with or without an endotracheal tube to keep the mouth open wide enough for movement of the intraoral camera during the procedure. The mouth prop would be placed on the opposite side of the mouth being recorded.
  • At this point, the intraoral camera can initiate video or camera digital image recordings. For video recordings, the subject's full set of teeth were visualized in four quadrants. These quadrants included an upper right quadrant, lower right quadrant, upper left quadrant, and lower left quadrant. The speed of video digital imaging decreases the amount of subject burden. Pausing 1 to 2 seconds over each tooth surface will enhance the quality of still frame images to be produced from the video at a later time. In preparation for unforeseen events that make it impossible to complete digital imaging of all four quadrants, a half set of digital images obtained from one upper quadrant and one lower quadrant is more representative of oral health than a half set obtained from either both upper quadrants or both lower quadrants. It is recommended that tooth surfaces closest to the endotracheal tube be recorded last, in case the subject is susceptible to coughing or gagging with incidental movement of the endotracheal tube. Placing the intraoral camera close to the mouth at the initiation of image recording and again at the conclusion of image recording can assist in protecting the subject's identity by avoiding incidental recording of a camera-facing headshot.
  • These video recordings can be obtained in any suitable way. The following is an exemplary step-by-step methodology for taking these recordings. First, the technician or other member of the medical team (herein the “operator”) can lift the subject's upper lip with a free hand to expose the full buccal surface of the central and lateral incisor areas. The camera is held steady over the subject's first available upper quadrant front tooth for 1-2 seconds and over each buccal tooth surface thereafter, moving the camera over the central and lateral incisor area. Alternatively, the camera does not need to stop or be held steady for 1-2 seconds over each tooth surface; rather, the camera can simply take a continuous video along the rows of teeth, and frames can be extracted from that video, as will become clearer as this specification continues.
  • The operator's free hand can be used to guide the camera distally over the cuspid and molar areas, until the buccal surface of the subject's last tooth in the back of the mouth and in the upper quadrant is recorded. The camera lens is angled to capture the full biting surface of this same last back tooth, pausing 1-2 seconds over the biting surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the molar and cuspid areas, thus guiding the camera towards the lateral and central incisor areas until the biting surface of the subject's first front tooth in the upper quadrant is recorded.
  • The camera lens is then angled to record the lingual surface of this same first front tooth, pausing 1-2 seconds over the lingual surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the central and lateral incisor area, thus guiding the camera distally over the cuspid and molar areas until the lingual surface of the subject's last back tooth in this upper quadrant is recorded. On the same side of the mouth, the camera is moved down to record the buccal surface of the subject's last back tooth in the lower quadrant, pausing 1-2 seconds over the buccal surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the molar and cuspid areas in the lower quadrant.
  • The operator can then use a free hand to move the lower lip down to expose and record the full buccal surface of each tooth in the lateral and central incisor area, until the subject's first front tooth in the lower quadrant is recorded. The camera lens is angled to capture the full biting surface of this same front tooth, pausing 1-2 seconds over the biting surface of each tooth (or alternatively the camera does not need to be paused over the tooth surface), and moving the camera over the central and lateral incisor areas, thus guiding the camera distally over the cuspid and molar areas until the biting surface of the last tooth in the back of the subject's mouth in the lower quadrant is recorded. The camera lens is then angled to begin recording the lingual surface of this same last back tooth, pausing 1-2 seconds over the lingual surface of each tooth in the molar and cuspid areas (or alternatively the camera does not need to be paused over the tooth surface), and guiding the camera over the lateral and central incisor areas until the subject's first front tooth in this lower quadrant is recorded.
  • If a mouth prop is being used, move it to the opposite side to continue recording. The foregoing sequence of video recording is then repeated on the opposite side of the subject's mouth. Once the recording of every tooth surface is complete, recording can be stopped. If needed, the mouth prop can be removed and placed inside a sealable biohazard bag for transport to the lab to be cleaned and sterilized.
  • Similar to video recordings, single frame images can be obtained in any suitable way. The following is an exemplary step-by-step methodology for taking these digital images. The subject's full set of teeth can be visualized in six sections. These sections include an upper right, lower right, upper middle, lower middle, upper left, and lower left. The operator's free hand can be used to lift the subject's upper lip to expose the full buccal surface of the central and lateral incisors. In camera mode, along with the intra-oral setting, a digital image of up to 6 teeth is captured in each section on the buccal side and again on the lingual side, for a total of 12 digital images that represent all tooth surfaces.
  • At the completion of video or camera recordings of all tooth surfaces, the subject's file is saved and closed, to be prepared and analyzed at a later time. In preparation for analysis, single still frame digital images of each tooth surface can be produced from the video recording with digital imaging software. Secure storage of these video files and single image files makes it possible to maintain archival data that can be subjected to additional analysis and reliability determinations.
  • Analyzing Data
  • Generally, to calculate the percentage of plaque in a given image, the image is cropped so that it includes only the targeted tooth, without losing the integrity of the tooth. Imaging software is then used to lower the resolution. The image is resized to 50% or less (or the horizontal dimension is reset to 100 pixels, and the vertical dimension is adjusted automatically). The R program is then run to obtain the plaque percentage for the tooth (see FIG. 5 for the R code to obtain the final plaque percentage for the tooth).
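The resolution-lowering step is performed with imaging software; purely to illustrate the effect of a 50% resize, a nearest-neighbour downsample over a pixel grid might look like the following (a hypothetical helper, not the patent's tooling):

```python
def downsample_half(pixels):
    # Keep every other row and every other column, halving both
    # dimensions of a 2-D grid of pixel values.
    return [row[::2] for row in pixels[::2]]

# A 4x4 grid of (row, column) labels becomes a 2x2 grid.
grid = [[(r, c) for c in range(4)] for r in range(4)]
small = downsample_half(grid)
```

This quarters the pixel count while preserving the overall color distribution on which the yellow-pixel percentage is computed.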
  • More specifically, as it pertains to the step of the R program obtaining the plaque percentage: to digitize color, most software programs record it in three dimensions, namely red, green, and blue (RGB). The software scores the value of a specific pixel on each dimension, so a three-dimensional point (x, y, z) uniquely defines the color of that pixel. The software uses two hexadecimal digits for each color dimension, so there are 256 possible values (0 to 255) on each dimension. Since an objective herein was to calculate the percentage of plaque on a tooth, and plaque typically presents as a yellow color, the calculated score was used to judge whether each pixel should be classified as yellow.
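The two-hexadecimal-digits-per-dimension encoding can be made concrete with a short decoding sketch; the helper name and the example color string are assumptions for illustration, not values from the patent.

```python
def hex_to_rgb(code):
    """Decode a six-hex-digit color string into its (R, G, B) point.

    Each dimension occupies two hexadecimal digits, so each ranges over
    16 * 16 = 256 possible values (0-255).
    """
    code = code.lstrip("#")
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

rgb = hex_to_rgb("#E6C832")  # an arbitrary yellowish example -> (230, 200, 50)
```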
  • The determination of whether a pixel should be considered yellow can be achieved and implemented in any suitable manner. For example, as described herein for illustrative purposes, the combinations of the three colors (RGB) that can create 'yellow' were determined quantitatively. Each color dimension was divided into four categories: (0, 64), (64, 128), (128, 192), and (192, 255). Four categories were chosen as an acceptable compromise between accuracy and computational difficulty; with four categories per dimension, the end result was 64 categories in total. Next, the middle point of each range (32, 96, 160, 224) was chosen as the representative of that range, and the color at that point was used to represent the color for that category. For example, for the category (0, 64) in red, (0, 64) in blue, and (0, 64) in green, the color of the point (32 in red, 32 in blue, and 32 in green) was used to represent that category. After all 64 categories were scored, it was found that the categories identified as yellow shared several common properties. These common properties are listed as follows:
      • 1. The value of the red dimension was found to be between about 0.75 times the value of the green dimension and about 2.5 times the value of the green dimension.
      • 2. The values of both the green dimension and the red dimension were found to be at least about 1.2 times the value of the blue dimension.
  • Accordingly, if the values of a pixel met the conditions above, the pixel was classified as yellow; if not, it was not classified as yellow. After results were obtained for each pixel, the percentage of yellow could be calculated by dividing the number of yellow pixels by the total number of pixels in the picture. FIG. 6 depicts color samples that can be classified as 'yellow' color (plaque) or non-yellow color (normal).
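The two conditions above translate directly into a per-pixel classifier. The patent's implementation is the R code of FIG. 5; the Python sketch below (function names assumed) only restates the same thresholds and the yellow-percentage calculation.

```python
def is_yellow(r, g, b):
    """Conditions 1 and 2 above: R within [0.75*G, 2.5*G], and both
    R and G at least 1.2 times B."""
    return (0.75 * g <= r <= 2.5 * g) and (g >= 1.2 * b) and (r >= 1.2 * b)

def yellow_percentage(pixels):
    """Fraction of pixels classified as yellow (plaque)."""
    yellow = sum(1 for (r, g, b) in pixels if is_yellow(r, g, b))
    return yellow / len(pixels)

# Two illustrative pixels: a plaque-like yellow and a neutral gray.
pixels = [(200, 160, 100), (100, 100, 100)]
score = yellow_percentage(pixels)  # 0.5: one of the two pixels is yellow
```

For the gray pixel (100, 100, 100), condition 2 fails (100 is less than 1.2 × 100 = 120), so it is correctly rejected even though condition 1 holds.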
  • Automated Selection
  • As it pertains to video recordings, each recording is broken down into individual frames. A random number generator is then used to randomly select a predetermined number of frames (e.g., 50) from one (1) to x, where x is the total number of images/frames within that recording. The R code for this can be seen in FIG. 7. Generally, frames can be randomly selected with or without specified criteria for the selection of frames. These criteria (e.g., a predetermined number of frames from each quadrant, a certain quality of frame to minimize noise, etc.) can be inputted manually or can be learned automatically by the software algorithm, for example via artificial intelligence. If any criteria are present, the imaging software can automatically select frames that are relevant and/or discard frames that are not relevant. In any case, upon selection of the frames from the video recording, the corresponding images from the individual frames are selected, and the yellow percentage of the selected images is then calculated, as previously discussed.
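The random draw of frame indices is performed by the R code of FIG. 7; a hedged Python equivalent of the same idea (drawing 50 distinct indices from 1 to x) might look like the following, with the function name and seed parameter assumed for illustration.

```python
import random

def select_frames(total_frames, k=50, seed=None):
    """Randomly select k distinct frame indices from 1..total_frames."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, total_frames + 1), k))

# e.g., 50 frames drawn from a 1000-frame recording
chosen = select_frames(total_frames=1000, k=50, seed=0)
```

Fixing a seed makes a selection reproducible for reliability checks; omitting it gives a fresh random draw each run.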
  • Any suitable methodology for randomly selecting frames/images from the video recording is contemplated herein. For illustration purposes, differing methodologies were tested herein for this random selection. First, the 50 images can be randomly selected, with only the portion of each image related to the teeth under analysis selected and cropped, as previously discussed. Another methodology is selecting images from the middle 60% of the complete video recording, thus truncating the first 20% and the last 20% of the recording. Yet another methodology is simply randomly selecting 50 images from the frames of the video recording. Finally, each tooth can be selected, cropped, and analyzed separately, with the average of the per-tooth scores subsequently taken.
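The truncation variant can be sketched as a simple index computation; this Python fragment (names assumed, 0-indexed frames) only illustrates how dropping the first and last 20% leaves the middle 60% of a recording.

```python
def middle_60_percent(n_frames):
    """0-indexed frame indices remaining after truncating the first 20%
    and the last 20% of a recording."""
    cut = int(n_frames * 0.2)       # frames dropped at each end
    return list(range(cut, n_frames - cut))

window = middle_60_percent(1000)    # indices 200 through 799
```

Random selection would then draw its 50 frames from `window` instead of from the full recording.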
  • TABLE 1
    Dental plaque percentage comparison using different methods.

    Subject ID and     Random Selection   Random Selection   Random      Individual
    Intervention Day   with Selecting     with Truncation    Selection   Evaluation Results
                       and Cropping                          Results     (Benchmark)
    1002-P3            0.0599             0.0981             0.0948      0.1571
    1008-P5            0.0472             0.0465             0.1044      0.1163
    1009-P5            0.0365             0.0500             0.0444      0.0481
    1011-P5            0.4047             0.3899             0.3075      0.3611
    1027-P3            0.0438             0.0220             0.0403      0.1127
    1034-P3            0.1715             0.1364             0.1380      0.1757
    1042-P5            0.3054             0.2293             0.2293      0.4311
    1045-P3            0.0768             0.0861             0.09609     0.0424
    1047-P3            0.0560             0.0375             0.08125     0.0531
    1050-P3            0.1042             0.0500             0.03780     0.2149
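One way to read Table 1 is to compare each selection method against the individual-evaluation benchmark column. The sketch below transcribes the table's values and computes a mean absolute deviation per method; the comparison metric is an assumption for illustration, not an analysis performed in the patent.

```python
# Rows transcribed from Table 1:
# (subject, selecting+cropping, truncation, plain random, benchmark)
rows = [
    ("1002-P3", 0.0599, 0.0981, 0.0948, 0.1571),
    ("1008-P5", 0.0472, 0.0465, 0.1044, 0.1163),
    ("1009-P5", 0.0365, 0.0500, 0.0444, 0.0481),
    ("1011-P5", 0.4047, 0.3899, 0.3075, 0.3611),
    ("1027-P3", 0.0438, 0.0220, 0.0403, 0.1127),
    ("1034-P3", 0.1715, 0.1364, 0.1380, 0.1757),
    ("1042-P5", 0.3054, 0.2293, 0.2293, 0.4311),
    ("1045-P3", 0.0768, 0.0861, 0.09609, 0.0424),
    ("1047-P3", 0.0560, 0.0375, 0.08125, 0.0531),
    ("1050-P3", 0.1042, 0.0500, 0.03780, 0.2149),
]

methods = ["selecting+cropping", "truncation", "plain random"]

# Mean absolute deviation of each method from the benchmark (column 4).
mad = {}
for j, name in enumerate(methods, start=1):
    mad[name] = sum(abs(row[j] - row[4]) for row in rows) / len(rows)
```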
  • Hardware and Software Infrastructure Examples
  • The present invention may be embodied on various computing platforms that perform actions responsive to software-based instructions and most particularly on touchscreen portable devices. The following provides an antecedent basis for the information technology that may be utilized to enable the invention.
  • The computer readable medium described in the claims below may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any non-transitory, tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire-line, optical fiber cable, radio frequency, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C#, C++, Visual Basic or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • It should be noted that when referenced, an “end-user” is an operator of the software as opposed to a developer or author who modifies the underlying source code of the software. For security purposes, authentication means identifying the particular user while authorization defines what procedures and functions that user is permitted to execute.
  • GLOSSARY OF CLAIM TERMS
  • About: This term is used herein to refer to approximately or nearly and, in the context of a numerical value or range set forth, means ±15% of the numerical value. In an embodiment, the term “about” can include traditional rounding according to significant figures of the numerical value. In addition, the phrase “about ‘x’ to ‘y’” includes “about ‘x’ to about ‘y’”.
  • Frame: This term is used herein to refer to a fraction/division of time on a multimedia (e.g., video) timeline.
  • Oral health: This term is used herein to refer to the well-being of one's mouth, specifically based herein on the plaque that may be present in the person's mouth. More specifically, the manner in which plaque affects oral health can be the magnitude and ratio of dental plaque per tooth and across all teeth, the estimated age of dental plaque per tooth and across all teeth, and the ratio of plaque burden to plaque age per tooth and across all teeth.
  • Random selection: This term is used herein to refer to a relatively unpredictably chosen array of frames/images from a video recording. The term “relatively” is used because it is contemplated herein that this random selection can be performed with or without a predetermined set of criteria for the selection. For example, if 1,000 frames are present in a video recording, a set of criteria may eliminate 200 of those frames, and then 50 frames can be “randomly selected” from the remaining 800 frames. Alternatively, the 50 frames can be “randomly selected” from the 1,000 frames with no criteria present. Both circumstances are contemplated herein.
  • Single still frame digital image: This term is used herein to refer to a visual representation of a tooth extracted at a specific time during a video recording.
  • Substantially all: This term is used herein to refer to a representative number of tooth surfaces that, when analyzed, can be used to characterize the amount of plaque across all teeth or on each tooth. This number can be all teeth in the subject's mouth, or it can be an amount less than all of the teeth. For example, some teeth may be inaccessible by an intraoral camera due to the anatomy of a particular subject's mouth, so only the accessible surfaces are recorded. These circumstances are still considered herein as “substantially all”.
  • The advantages set forth above, and those made apparent from the foregoing description, are efficiently attained. Since certain changes may be made in the above construction without departing from the scope of the invention, it is intended that all matters contained in the foregoing description or shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
  • It is also to be understood that the following claims are intended to cover all of the generic and specific features of the invention herein described, and all statements of the scope of the invention that, as a matter of language, might be said to fall there between.

Claims (25)

What is claimed is:
1. A method of assessing oral health in a patient or subject, comprising the steps of:
providing an intraoral camera that can capture or record frames within a mouth of said subject;
using the intraoral camera, capturing or recording the frames of substantially all buccal, occlusal, and lingual surfaces in the subject's set of teeth;
importing the frames into an image processing software program implemented on a computing device;
processing the frames on the image processing software program to generate images of the teeth to be analyzed;
analyzing and classifying color of at least a plurality of the frames to determine presence of yellow color, wherein the yellow color indicates presence of plaque;
scoring results of the color analysis to objectively and quantitatively assess the oral health of the subject.
2. A method as in claim 1, further comprising the steps of positioning a dental barrier over a lens of the intraoral camera and positioning a camera tip of the intraoral camera over the dental barrier.
3. A method as in claim 1, wherein the captured frames are photographs taken by the intraoral camera.
4. A method as in claim 3, further comprising the step of dividing the subject's set of teeth into a plurality of sections, wherein the sections include an upper right section, a lower right section, an upper middle section, a lower middle section, an upper left section, and a lower left section.
5. A method as in claim 3, wherein the step of processing the frames is performed by cropping each image such that the image includes a targeted tooth in the image.
6. A method as in claim 5, wherein the step of processing the frames is further performed by lowering a resolution of the image to a percentage of about 50 or less.
7. A method as in claim 5, wherein the step of analyzing and classifying the color is performed by classifying the color of each pixel of the image.
8. A method as in claim 7, wherein the color of the each pixel is classified using red-green-blue color code combinations, wherein a three-dimensional point (x, y, z) defines the color of the each pixel.
9. A method as in claim 8, wherein the step of analyzing and classifying the color is further performed by:
dividing each color dimension of the red-green-blue color code combinations into four categories, wherein the four categories are (0, 64), (64, 128), (128, 192), and (192, 255);
selecting a middle point in each category to be representative of the corresponding category; and
scoring all categories to determine when the yellow color is present on the each pixel.
10. A method as in claim 9, wherein the yellow color is present on the each pixel when a value of a red dimension is between about 0.75 times of a value of a green dimension and about 2.5 times of the value of the green dimension, and when the values of the green and red dimensions are at least about 1.2 times a value of a blue dimension.
11. A method as in claim 7, wherein the step of scoring the results of the color analysis includes calculating a percentage of the yellow color by dividing a number of yellow pixels by a total number of pixels in the image.
12. A method as in claim 1, wherein the captured frames are video recordings.
13. A method as in claim 12, further comprising the step of dividing the subject's set of teeth into a plurality of quadrants, wherein the quadrants include an upper right quadrant, a lower right quadrant, an upper left quadrant, and a lower left quadrant.
14. A method as in claim 13, wherein the step of capturing or recording the frames includes capturing and recording video of the tooth surfaces in at least one upper quadrant and in at least one lower quadrant.
15. A method as in claim 13, wherein the step of capturing or recording the frames includes capturing and recording video of the buccal surfaces in a quadrant, followed by capturing and recording video of the occlusal surfaces in the quadrant, followed by capturing and recording video of the lingual surfaces in the quadrant, and followed by repeating the foregoing steps in another quadrant.
16. A method as in claim 12, wherein the step of processing the frames on the image processing software program includes extracting single still frame digital images of the tooth surfaces from the video recording.
17. A method as in claim 16, wherein the step of processing the frames is performed by cropping each image such that the image includes a targeted tooth in the image.
18. A method as in claim 17, wherein the step of processing the frames is further performed by lowering a resolution of the image to a percentage of about 50 or less.
19. A method as in claim 17, wherein the step of analyzing and classifying the color is performed by classifying the color of each pixel of the image.
20. A method as in claim 19, wherein the color of the each pixel is classified using red-green-blue color code combinations, wherein a three-dimensional point (x, y, z) defines the color of the each pixel.
21. A method as in claim 20, wherein the step of analyzing and classifying the color is further performed by:
dividing each color dimension of the red-green-blue color code combinations into four categories, wherein the four categories are (0, 64), (64, 128), (128, 192), and (192, 255);
selecting a middle point in each category to be representative of the corresponding category; and
scoring all categories to determine when the yellow color is present on the each pixel.
22. A method as in claim 21, wherein the yellow color is present on the each pixel when a value of a red dimension is between about 0.75 times of a value of a green dimension and about 2.5 times of the value of the green dimension, and when the values of the green and red dimensions are at least about 1.2 times a value of a blue dimension.
23. A method as in claim 19, wherein the step of scoring the results of the color analysis includes calculating a percentage of the yellow color by dividing a number of yellow pixels by a total number of pixels in the image.
24. A method as in claim 16, further comprising the step of randomly selecting a plurality of frames from all of the captured frames prior to processing the frames on the image processing software program, wherein the random selection of the plurality of frames is performed with or without criteria for the random selection.
25. A method of assessing oral health in a patient or subject, comprising the steps of:
providing an intraoral camera that can capture or record frames within a mouth of said subject;
positioning a dental barrier over a lens of the intraoral camera and positioning a camera tip of the intraoral camera over the dental barrier;
dividing the subject's set of teeth into a plurality of quadrants, wherein the quadrants include an upper right quadrant, a lower right quadrant, an upper left quadrant, and a lower left quadrant;
using the intraoral camera, capturing or recording the frames of substantially all buccal, occlusal, and lingual surfaces in the subject's set of teeth,
wherein the captured frames are video recordings,
wherein the step of capturing or recording the frames includes capturing and recording video of the tooth surfaces in at least one upper quadrant and in at least one lower quadrant,
wherein the step of capturing or recording the frames includes capturing and recording video of the buccal surfaces in a quadrant, followed by capturing and recording video of the occlusal surfaces in the quadrant, followed by capturing and recording video of the lingual surfaces in the quadrant, and followed by repeating the foregoing steps in another quadrant;
importing the frames into an image processing software program implemented on a computing device;
randomly selecting a plurality of frames from all of the captured frames, wherein the random selection of the plurality of frames is performed with or without criteria for the random selection;
processing the frames on the image processing software program to generate images of the teeth to be analyzed,
wherein the step of processing the frames on the image processing software program includes extracting single still-framed digital images of the tooth surfaces from the video recording,
wherein the step of processing the frames is performed by cropping each image such that the image includes a targeted tooth in the image and lowering a resolution of the image to a percentage of about 50 or less;
analyzing and classifying color of at least a plurality of the frames to determine presence of yellow color, wherein the yellow color indicates presence of plaque,
wherein the step of analyzing and classifying the color is performed by classifying the color of each pixel of the image, wherein the color of the each pixel is classified using red-green-blue color code combinations, wherein a three-dimensional point (x, y, z) defines the color of the each pixel, wherein the step of analyzing and classifying the color is further performed by:
dividing each color dimension of the red-green-blue color code combinations into four categories, wherein the four categories are (0, 64), (64, 128), (128, 192), and (192, 255),
selecting a middle point in each category to be representative of the corresponding category, and
scoring all categories to determine when the yellow color is present on the each pixel,
wherein the yellow color is present on the each pixel when a value of a red dimension is between about 0.75 times of a value of a green dimension and about 2.5 times of the value of the green dimension, and when the values of the green and red dimensions are at least about 1.2 times a value of a blue dimension;
scoring results of the color analysis to objectively and quantitatively assess the oral health of the subject,
wherein the step of scoring the results of the color analysis includes calculating a percentage of the yellow color by dividing a number of yellow pixels by a total number of pixels in the image.
US15/450,925 2015-12-01 2017-03-06 Standardized oral health assessment and scoring using digital imaging Active 2037-09-23 US10405754B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/450,925 US10405754B2 (en) 2015-12-01 2017-03-06 Standardized oral health assessment and scoring using digital imaging
EP17876074.0A EP3549061A4 (en) 2016-12-01 2017-03-08 Standardized oral health assessment and scoring using digital imaging
PCT/US2017/021367 WO2018101977A1 (en) 2016-12-01 2017-03-08 Standardized oral health assessment and scoring using digital imaging

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562261631P 2015-12-01 2015-12-01
US201615366741A 2016-12-01 2016-12-01
US15/450,925 US10405754B2 (en) 2015-12-01 2017-03-06 Standardized oral health assessment and scoring using digital imaging

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US201615366741A Continuation-In-Part 2015-12-01 2016-12-01

Publications (2)

Publication Number Publication Date
US20170172418A1 true US20170172418A1 (en) 2017-06-22
US10405754B2 US10405754B2 (en) 2019-09-10



Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115668279A (en) 2020-06-04 2023-01-31 宝洁公司 Oral care based digital imaging system and method for determining perceived appeal of facial image portions

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030161401A1 (en) * 2002-02-27 2003-08-28 Bo Shen Reducing the resolution of media data
US20050244794A1 (en) * 2004-04-30 2005-11-03 Kemp James H Computer-implemented system and method for automated and highly accurate plaque analysis, reporting, and visualization
US20100183523A1 (en) * 2009-01-22 2010-07-22 Wagner Richard E Dental composition and method
US20100279248A1 (en) * 2009-03-05 2010-11-04 Mourad Pierre D Device and method for predicting the likelihood of caries development
US20110216409A1 (en) * 2010-03-04 2011-09-08 Stutes Richard Dale Optical barrier device
US20110301441A1 (en) * 2007-01-05 2011-12-08 Myskin, Inc. Analytic methods of tissue evaluation
US20140118427A1 (en) * 2012-10-30 2014-05-01 Pixtronix, Inc. Display apparatus employing frame specific composite contributing colors
US20140199651A1 (en) * 2011-08-24 2014-07-17 Omron Healthcare Co., Ltd. Oral care apparatus applied to the removal of dental plaque

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE602005026676D1 (en) 2004-09-10 2011-04-14 Lion Corp SYSTEM FOR DETECTING TOOTH COVER AND METHOD FOR DETECTING TOOTH COVER
EP2199980A1 (en) 2008-12-09 2010-06-23 Braun GmbH Method and device for measuring the efficacy of plaque removal


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111526778A (en) * 2017-12-28 2020-08-11 阿伊里斯株式会社 Intraoral imaging device, medical device, and program
US11523741B2 (en) 2017-12-28 2022-12-13 Aillis Inc. Intraoral imaging apparatus, medical apparatus, and program
EP3714764A4 (en) * 2017-12-28 2021-06-30 Allis Inc. Oral photographing apparatus, medical apparatus, and program
CN110742707A (en) * 2018-07-23 2020-02-04 广达电脑股份有限公司 Image processing method for marking dental plaque fluorescence reaction area and electronic system thereof
US10779735B2 (en) * 2018-07-23 2020-09-22 Quanta Computer Inc. Image-processing methods for marking plaque fluorescent reaction area and systems therefor
US20200022582A1 (en) * 2018-07-23 2020-01-23 Quanta Computer Inc. Image-processing methods for marking plaque fluorescent reaction area and systems therefor
US11468561B2 (en) 2018-12-21 2022-10-11 The Procter & Gamble Company Apparatus and method for operating a personal grooming appliance or household cleaning appliance
US11494899B2 (en) 2018-12-21 2022-11-08 The Procter & Gamble Company Apparatus and method for operating a personal grooming appliance or household cleaning appliance
US11752650B2 (en) 2018-12-21 2023-09-12 The Procter & Gamble Company Apparatus and method for operating a personal grooming appliance or household cleaning appliance
CN110060250A (en) * 2019-04-24 2019-07-26 北京峰云视觉技术有限公司 A kind of tongue body image processing method, device and electronic equipment
CN110349224A (en) * 2019-06-14 2019-10-18 众安信息技术服务有限公司 A kind of color of teeth value judgment method and system based on deep learning
US20220076694A1 (en) * 2020-09-08 2022-03-10 Lifeline Systems Company Cognitive impairment detected through audio recordings
US20220398731A1 (en) * 2021-06-03 2022-12-15 The Procter & Gamble Company Oral Care Based Digital Imaging Systems And Methods For Determining Perceived Attractiveness Of A Facial Image Portion
US11978207B2 (en) * 2021-06-03 2024-05-07 The Procter & Gamble Company Oral care based digital imaging systems and methods for determining perceived attractiveness of a facial image portion
CN114496254A (en) * 2022-01-25 2022-05-13 首都医科大学附属北京同仁医院 Gingivitis evaluation system construction method, gingivitis evaluation system and gingivitis evaluation method


Similar Documents

Publication Publication Date Title
US10405754B2 (en) Standardized oral health assessment and scoring using digital imaging
US8866894B2 (en) Method for real-time visualization of caries condition
US9770217B2 (en) Dental variation tracking and prediction
KR102267197B1 (en) Method and apparatus for recording and displaying dental care data on a digital dental image
US8073212B2 (en) Methods and products for analyzing gingival tissues
Kidd et al. Occlusal caries diagnosis: a changing challenge for clinicians and epidemiologists
US6821116B2 (en) System for scanning oral environment
Jablonski-Momeni et al. Clinical performance of the near-infrared imaging system VistaCam iX Proxi for detection of approximal enamel lesions
Liu et al. A pilot study of a deep learning approach to detect marginal bone loss around implants
US20110058717A1 (en) Methods and systems for analyzing hard tissues
Pentapati et al. Clinical applications of intraoral camera to increase patient compliance: current perspectives
Liu et al. Red fluorescence imaging for dental plaque detection and quantification: pilot study
Wallis et al. Quantification of canine dental plaque using quantitative light-induced fluorescence
Radha et al. Machine learning techniques for periodontitis and dental caries detection: A Narrative Review
EP3549061A1 (en) Standardized oral health assessment and scoring using digital imaging
Zahid et al. Validity and Reliability of Polarized vs Non-Polarized Digital Images for Measuring Gingival Melanin Pigmentation
KR102428636B1 (en) Teeth examination system using deep learning algorithms
Prasanth et al. In vivo inflammation mapping of periodontal disease based on diffuse reflectance spectral imaging: a clinical study
Mauriello et al. Dental Digital Radiographic Imaging.
US20230386682A1 (en) Systems and methods to chronologically image orthodontic treatment progress
Arjunan Accuracy of diagnosing proximal caries using intra-oral bitewing radiographs and near infra-red imaging (NIRI) technology in iTero element 5D scanners: an in vivo study
Guo et al. Establishment and evaluation of a 3D quantitative analysis method for dental plaque based on an intraoral scanner technique.
Nasruddin et al. Validity and reliability of digital photos as a diagnostic tool for determination of caries
Menon et al. YOLO V5 Deep Learning Model for Dental Problem Detection
Farooq Diagnosis of Dental Caries-Old and the New

Legal Events

Date Code Title Description
AS Assignment

Owner name: NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF HEALTH AND HUMAN SERVICES (DHHS), U.S. GOVERNMENT

Free format text: CONFIRMATORY LICENSE;ASSIGNOR:UNIVERSITY OF SOUTH FLORIDA;REEL/FRAME:042444/0739

Effective date: 20170503

AS Assignment

Owner name: UNIVERSITY OF SOUTH FLORIDA, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MUNRO, CINDY L.;CAIRNS, PAULA LOUISE;CHEN, XUSHENG;AND OTHERS;SIGNING DATES FROM 20170406 TO 20170420;REEL/FRAME:042898/0849

FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: MICROENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO MICRO (ORIGINAL EVENT CODE: MICR); ENTITY STATUS OF PATENT OWNER: MICROENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Free format text: SURCHARGE FOR LATE PAYMENT, SMALL ENTITY (ORIGINAL EVENT CODE: M2554); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 4