WO2004104927A2 - Analysis and display of fluorescence images - Google Patents

Analysis and display of fluorescence images

Info

Publication number
WO2004104927A2
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
image
pixel
points
contour
Prior art date
Application number
PCT/IB2004/001658
Other languages
French (fr)
Other versions
WO2004104927A3 (en)
Inventor
Elbert De Josselin De Jong
Monique H. Van Der Veen
Elbert Waller
Original Assignee
Inspektor Research Systems B.V.
Priority date
Filing date
Publication date
Application filed by Inspektor Research Systems B.V. filed Critical Inspektor Research Systems B.V.
Priority to JP2006530674A priority Critical patent/JP2007502185A/en
Publication of WO2004104927A2 publication Critical patent/WO2004104927A2/en
Publication of WO2004104927A3 publication Critical patent/WO2004104927A3/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0012 - Biomedical image inspection
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 - Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082 - Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0088 - Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10064 - Fluorescence image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20092 - Interactive image processing based on input by user
    • G06T2207/20104 - Interactive definition of region of interest [ROI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30004 - Biomedical image processing
    • G06T2207/30036 - Dental; Teeth

Definitions

  • This process 120 is particularly useful for performing a longitudinal analysis of a patient's condition over time during treatment. For example, a series of images taken before, during, and after treatment often reveals strengths and weaknesses of the treatment in terms of efficacy in an easily observable, yet objective way.
  • the use of R/G ratios instead of simple intensity measurements in these calculations reduces variations resulting from slightly different lighting conditions or camera configurations.
  • One can further improve the data available for longitudinal analysis by combining the teachings herein with those of U.S. Patent No. 6,597,934, cited above.
  • C is the set of pixels in the clean area of the tooth,
  • L is the set of pixels i for which FN(i) > FT1, and s is the amount of surface area of the tooth represented by a single pixel in the image (obtained as part of the image capture process or calculated using known methods). Then the lesion area A = s × (number of pixels in L).
  • This metric was used to evaluate a white spot lesion over a one-year period following orthodontic debracketing.
  • the collected data, shown in Fig. 11, reflects an expected remineralization of the lesion over the monitoring period. Additional methods for evaluating lesions according to the present invention will now be discussed in relation to Figs. 4 - 10.
  • an image is considered by a user, who selects a series of points on the image that define a closed contour (curve C) around damaged tissue, a white spot lesion in this example.
  • a computing system estimates the original intensity values using calculated "reconstructed" intensity values for points within the contour, and compares those reconstructed values with the actual measured values from the image. The comparison is used to assess the calcium loss in the white spot.
  • Other techniques and applications are discussed herein
  • Referring now to Fig. 4, an image of a tooth with a white spot lesion is shown, whereon a user has identified points P1-P9.
  • other interfaces may be used, or automated techniques that are known to those skilled in the image processing arts may be used to define the contour. This description will refer to the region R of the image enclosed by the curve, or contour, C through points P1-P9.
  • Region R is illustrated again in Fig. 5, with x- and y-axes, which may be arbitrarily selected, but provide a fixed frame of reference for the remainder of the analysis in this exemplary embodiment.
  • a linear regression algorithm is applied to the region to determine a slope m that characterizes the primary orientation of the white spot in the image relative to the x-axis.
  • a reconstructed intensity value is determined for each pixel in region R. Since the portions of the tooth along the line segments connecting P1-P9 are presumed to be healthy tissue, those values are retained in the reconstructed image. For those points strictly within region R (that is, within but not on the closed curve C), the values are interpolated as follows. As illustrated in Fig. 6, a line l of slope m' (perpendicular to the regression line, so that m' = -1/m) is projected through each such point P to two points (Pa and Pb) on curve C. A reconstructed intensity value Ir is calculated for point P as the linear interpolation between the intensities at points Pa and Pb, where line l intersects curve C.
  • Fig. 7 is a graph of intensity values (on the vertical axis) versus position along line l (on the horizontal axis), wherein the intensity at point Pa in the image is Ia, the intensity at Pb in the image is Ib, and the intensity at point P in the image is I0.
  • the "reconstructed" intensity at point P is Ir, calculated as the result of linear interpolation between Ia and Ib according to the formula
  • Ir = Ib - (Ib - Ia)(Xb - X)/(Xb - Xa), where X, Xa, and Xb are the x-coordinates of points P, Pa, and Pb, respectively.
  • the loss of fluorescence is then quantified by summing the differences between the reconstructed and measured intensities over the region: Σi∈R (Ir(i) - I0(i)).
  • one or more points along curve C can be ignored in the interpolation, and alternative projections (such as, for example, a line through point P having a slope slightly increased or decreased from m') could be used.
  • This "ignore" function is useful, for example, in situations where curve C passes through damaged tissue. If the points on curve C that are associated with damaged tissue are used for interpolation or projection of reconstructed intensity values, the reconstructed values will be tainted. Ignoring these values along the curve C allows the system to rely only on valid data for the reconstruction calculations.
  • Another alternative approach to calculating a reconstructed intensity Ir for each point P uses the intensities at N selected points Pi in sound tooth areas. Define ri as the distance between point P and point Pi, as shown in Fig. 8; the reconstructed intensity can then be calculated as the distance-weighted average Ir = (Σi Ii/ri^α) / (Σi 1/ri^α), for a predetermined exponent α, which is preferably 2.
  • points Pi are selected in sound tooth areas of the image, where the points do not necessarily form a closed loop, but are preferably dispersed around the tooth image and around the damaged tooth area. Then the intensity at each point P in the damaged area can be calculated using a two-dimensional spline, a Bezier surface, or the distance-based interpolation function discussed above in relation to Fig. 5.
  • reconstruction of the intensities in damaged areas is achieved using additional intersection lines through the given point P with slopes m' + (n · φ) for a predetermined angle φ and n ∈ {-3, -2, -1, 0, 1, 2, 3}. More or fewer multiples are used in various embodiments. As discussed above in relation to Figs. 6 and 7, linear interpolation along each of these lines is performed to find a reconstructed intensity, and those values are then combined (averaged, for example) to arrive at the reconstructed intensity Ir to be used in further analysis.
  • Each of the individual images used in the analyses described herein may be expressed as grayscale images or in terms of RGB triples or YUV triples.
  • the interpolation calculations described above are preferably applied to each component of each pixel independently, though those skilled in the art will appreciate that variations on this approach and cross-over between components may be considered in reconstruction.
  • the images may be captured using any suitable technique known to those skilled in the art, such as those techniques discussed in the Software Repositioning patent.
  • Referring now to Fig. 9, an important aspect of treatment is patient communication.
  • One aspect of the present invention that supports such communication relates to the creation of animated "movies" using individual images captured with fluorescent techniques, where frames are added between those fixed images to smoothly change from each individual image to the next.
  • One method for providing such animations according to the present invention is illustrated in Fig. 9.
  • Row A in this illustration shows two actual images (in columns 1 and 5) with space left (in columns 2-4) for intervening cells in the animation.
  • Row B of Fig. 9 shows the intensity values of each image from row A along line i. (Again, the present analysis may be applied to individual components of RGB or YUV component images.)
  • Row C of Fig. 9 shows intensity values from each image in the animation time sequence at pixel [i,j], which lies on line i.
  • the points shown for times t1 and t5 are from captured images, while the points shown for times t2-t4 are interpolated based on the positions of times t2-t4 relative to t1 and t5 and on the actual values at times t1 and t5.
  • Row D of Fig. 9 illustrates a graph of the reconstructed intensity values I r along line i for each image in the sequence. It may be noted that while the intensity graphs for each image are similar, they are not identical. These variations might be due, for example, to differences in the specific imaging parameters and positions used to capture the actual images.
  • the reconstructed intensity values in row D are calculated independently for each image as discussed above.
  • the graphs shown in row B are then normalized by dividing each data value into the corresponding data value in the reconstructed data in row D, thus yielding the normalized data shown in row E.
  • the normalized values shown in row E are obtained for each pixel in each image, and are combined to yield the images (frames) in row F.
  • the series of images thus obtained yields an animated movie that functions like a weather map to illustrate the change in condition of the tooth, for better or worse.
  • the illustrated sequence of images shows, for example, the remineralization of the white spot seen in the image at row A, column 1.
  • the calculation of intensity, luminance, or individual pixel component color values in cells not corresponding to actual captured images is performed using other curve-fitting techniques.
  • a spline is fitted to the intensity values of corresponding pixels in at least three images as shown in Fig. 10.
  • the frames at times t1, t4, and t6 are actual images, while images for times t2, t3, and t5 are being synthesized.
  • the fitted spline is used to select intensity values for points in the synthesized frames based on the real data captured in the images.
  • linear interpolation is applied, a Bezier curve is fitted to the given data, or other curve-fitting techniques are applied as would occur to those skilled in the art based on the present disclosure.
  • the point on the curve corresponding to the time value for each frame is used to fill the pixel in that frame.
  • the normalized white spot graphic or illustration (as shown in Fig. 9, row F) is shown alone. In other embodiments it is superimposed on the original images, while in still others it is displayed over the interpolated images as well.
  • the intensities shown in row E are displayed in grayscale, while in others they are shown in color that varies based on the magnitude of the normalized intensity of each pixel.
  • function f(i) depends on one or more "optical components" of the pixel, which might include red, green, blue, chrominance, luminance, bandwidth, and/or other components as would occur to one of skill in the art of digital graphic processing.
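The frame-synthesis step described above can be sketched as per-pixel temporal interpolation between captured frames. This is a hedged illustration, not the patented implementation: the function and variable names are assumptions, and real embodiments may fit splines or Bezier curves through three or more captured frames instead of the linear variant shown here.

```python
# Illustrative sketch: synthesize intervening animation frames by linearly
# interpolating each pixel's value between the bracketing captured frames.

def synthesize_frames(captured, times, all_times):
    """Interpolate pixel values between captured frames.

    captured  -- list of frames, each a flat list of pixel intensities
    times     -- capture time of each frame (same order, strictly increasing)
    all_times -- the times (including the captured ones) to emit frames for
    """
    frames = []
    for t in all_times:
        # index of the last captured frame at or before time t
        k = max(j for j, tj in enumerate(times) if tj <= t)
        if times[k] == t or k == len(times) - 1:
            frames.append(list(captured[k]))   # captured frame: copy as-is
            continue
        t0, t1 = times[k], times[k + 1]
        w = (t - t0) / (t1 - t0)               # position between the two frames
        frames.append([(1 - w) * a + w * b
                       for a, b in zip(captured[k], captured[k + 1])])
    return frames
```

For example, two frames captured at t = 1 and t = 5 yield a synthesized frame at t = 3 whose pixels sit halfway between the measured values, producing the smooth "weather map" transition described above.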

Abstract

Systems and methods are described for visualizing, measuring, monitoring, and observing damage to and decalcification of tooth tissue in a lesion based on one or more still images of the tooth, each preferably capturing, through an optical filter, the fluorescent response of the tissue to blue excitation light. The image is analyzed based on one or more functions of optical components of the pixels, preferably comparing a ratio between optical components to one or more thresholds. Other analysis uses interpolation and/or curve fitting to reconstruct the intensities the pixels would have if the tooth were sound. In some embodiments, this reconstruction is based on pixel intensities that the user indicates correspond to sound tooth tissue. In other embodiments, these points are automatically selected. In still other embodiments, images captured over time are analyzed to create a sequence of frames in an animation of the state of the lesion.

Description

ANALYSIS AND DISPLAY OF FLUORESCENCE IMAGES
CROSS-REFERENCE TO RELATED APPLICATIONS This application contains subject matter related to U.S. Patent Application No. 10/209,574, filed July 31, 2002 (the "Inspection" application), a U.S. Patent Application titled "Fluorescence Filter for Tissue Examination and Imaging" filed of even date herewith (the "Fluorescence Filter" application), and U.S. Patent No. 6,597,934 (the "Software Repositioning" patent), and claims priority to U.S. Provisional Application Nos. 60/472,486, filed May 22, 2003, and 60/540,630, filed January 31, 2004. These applications and patent are hereby incorporated by reference in their entireties.
FIELD OF THE INVENTION The present invention relates to quantitative analysis of digital images of fluorescing tissue. More specifically, the present invention relates to methods, systems, and apparatus for analyzing digital images of dental tissue to quantify and/or visualize variations in the state of the dental tissue due to disease or other damage.
BACKGROUND
Various techniques exist for evaluating the soundness of dental tissue, including many subjective techniques (characterizing an amount of plaque mechanically removed by explorer, floss, or pick, white-light visual examination, radiological examination, and the like). Recent developments include point examination techniques such as DIAGNODENT by KaVo America Corporation (of Lake Zurich, Illinois), which is said to measure fluorescence intensity in visually detected lesions. With each of these techniques, longitudinal analysis is difficult at best.
Furthermore, significant subjective components in many of these processes make it difficult to achieve repeatable and/or objective results, and they are not well adapted for producing visual representations of lesion progress. It is, therefore, an object of the invention to provide an improved method for enhancing the available information about plaque, calculus, and carious dental tissue otherwise invisible to the human eye, and to objectify longitudinal monitoring by recording the information at each measurement and providing quantitative information based on the image(s). Another object is to improve visualization and analysis of the whole visible tooth area, not limiting them to just a particular point. Still another object is to enhance information available to patients to motivate them toward better hygiene and earlier treatment.
SUMMARY Accordingly, in one embodiment, the invention provides a method of image analysis, comprising capturing a digital image of tooth tissue, and for each of a plurality of pixels in the image, determining a first component value of the pixel's color and a second component value of the pixel's color, and calculating a first function value for the pixel based on the component values. In some embodiments, the first component is a red color component of the pixel, the second component is a green color component of the pixel, and the function is a ratio of the red component value to the green component value. In other embodiments, the pixel's original color may be replaced by an alternate color depending upon the value of the first function calculated as to that pixel. In some of these embodiments, the modified image is displayed or stored, and may be combined with other modified images to construct an animated sequence.
In some embodiments, the function is calculated over all pixels in the image, while in other embodiments the function is applied only to one or more specified regions.
Another embodiment is a method of quantifying calcium loss due to a white spot lesion on a tooth. An image of the fluorescence of the tooth due, for example, to incident blue light, is captured as a digital image. A plurality of points defining a closed contour around a plurality of pixels are selected, and a reconstructed intensity value is calculated for each pixel within the contour. The sum of the differences between the reconstructed intensity values and actual intensity values for each of the pixels within the contour is calculated and quantifies the loss of fluorescence. In some forms of this embodiment, the actual intensity value for each pixel is a function of a single optical component of the pixel, such as a red component intensity. In other forms, the reconstructed intensity value for each pixel is calculated using linear interpolation, such as interpolating between intensity values of one or more points on the contour. In some implementations of this form, the points on the contour lie on or adjacent to a line through the given pixel, where the line is perpendicular to a regression line that characterizes the region surrounded by the contour. Another embodiment is a system that comprises a processor and a memory, where the memory is encoded with programming instructions executable by the processor to quantitatively evaluate the decalcification of a white spot based on a single image. In some embodiments of this form, the user selects points on the image around the white spot, where each point is assumed to be healthy tissue. "Reconstructed" intensities are calculated for each point within the closed loop, and a result quantity is calculated based on these values and the pixel values in the image.
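The reconstruction-and-comparison idea can be sketched in a few lines of Python. This is a hedged illustration only: the function names and data layout are assumptions, `reconstruct_linear` follows the linear-interpolation formula described for the contour method, `reconstruct_idw` is one plausible form of the distance-based variant (with exponent 2), and neither is claimed to be the patent's implementation.

```python
# Illustrative sketch of reconstructing "sound tissue" intensities and
# summing the shortfall to quantify fluorescence loss.

def reconstruct_linear(xa, ia, xb, ib, x):
    """Reconstructed intensity Ir at coordinate x on the line through
    contour crossings Pa = (xa, Ia) and Pb = (xb, Ib)."""
    return ib - (ib - ia) * (xb - x) / (xb - xa)

def reconstruct_idw(samples, p, alpha=2):
    """Inverse-distance-weighted estimate at pixel p from scattered
    sound-tissue samples given as [((x, y), intensity), ...]."""
    num = den = 0.0
    for (x, y), inten in samples:
        r = ((x - p[0]) ** 2 + (y - p[1]) ** 2) ** 0.5
        w = 1.0 / r ** alpha          # closer samples count more
        num += w * inten
        den += w
    return num / den

def fluorescence_loss(measured, reconstructed):
    """Quantify the lesion: sum of (Ir - I) over pixels inside the contour."""
    return sum(ir - i for i, ir in zip(measured, reconstructed))
```

A pixel midway between contour crossings measured at intensities 10 and 20 reconstructs to 15; summing (Ir - I) over the enclosed region yields a single scalar suitable for longitudinal tracking of remineralization.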
BRIEF DESCRIPTION OF THE DRAWINGS Fig. 1 is a representative image of the side view of a tooth, the image of which is to be analyzed according to the present invention.
Fig. 2 is a flowchart depicting the method of analysis according to one embodiment of the present invention.
Fig. 3 is a hardware and software system for capturing and processing image data according to one embodiment of the present invention.
Fig. 4 is a representative image of a tooth with a white spot lesion for analysis according to a second form of the present invention. Fig. 5 is a two-dimensional graph of selected features from Fig. 4.
Fig. 6 shows certain features from Fig. 4 in the context of calculating a reconstructed intensity value for a particular point in the image.
Fig. 7 is a graph of measured and reconstructed intensity along line I in Fig. 6. Fig. 8 illustrates quantities used for analysis in a third form of the present invention.
Fig. 9 is a series of related images and graphs illustrating a fourth form of the present invention.
Fig. 10 is a graph and series of image cells illustrating a fifth form of the present invention.
Fig. 11 is a graph of quantitative remineralization data over time as measured according to the present invention.
DESCRIPTION For the purpose of promoting an understanding of the principles of the present invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will, nevertheless, be understood that no limitation of the scope of the invention is thereby intended; any alterations and further modifications of the described or illustrated embodiments, and any further applications of the principles of the invention as illustrated therein are contemplated as would normally occur to one skilled in the art to which the invention relates. Fig. 1 represents a digital image of the side of a tooth for analysis according to the present invention. Of course, any portion of a tooth might be captured in a digital image for analysis using any camera suitable for intra-oral imaging. One exemplary image capture device is the combination light, camera, and shield described in the U.S. Patent Application titled "Fluorescence Filter for Tissue Examination and Imaging" (the "Fluorescence Filter" application), which is being filed of even date herewith. Alternative embodiments use other intra-oral cameras. The captured images are preferably limited to the fluorescent response of one or more teeth to light of a known wavelength (preferably between about 390 nm and 450 nm), where the response is preferably optically filtered to remove wavelengths below about 520 nm.
Fig. 1 represents image 100, including a portion of the image 102 that captures the fluorescence of a particular tooth. A carious region 104 extends along the gum and appears red in image 100. In this embodiment, a user has positioned a circle on the image to indicate a clean area 106 of the tooth that appears healthy. In alternative embodiments, the portions of image 100 corresponding to the tooth 102 and/or clean area 106 may be automatically determined by image analysis, as described below.
Fig. 2 describes in a flowchart the process 120, which is applied to image 100 in one embodiment of the present invention. Process 120 begins at start point 121, and the system captures the digital image at step 123. An example of a system for capturing an image at step 123 is illustrated in Fig. 3. System 150 includes a monitor 151 and keyboard 152, which communicate using any suitable means with computer unit 154, such as through a PS/2, USB, or Bluetooth interface. Unit 154 houses storage 153, memory 155, and a processor 157 that controls the capturing and processing functions in this embodiment. This includes, but is not limited to, controlling camera 156 to acquire digital images of tooth 158 or other dental tissue for analysis, preferably according to the techniques discussed in the Software Repositioning, Inspection, and Fluorescence Filter patent and applications.
Returning to Fig. 2, a clean area of the tooth is identified or selected at step 125 by manual or automatic means. For example, a user might accomplish this manually by positioning and sizing a circle on a displayed version of the image using a graphical user interface. In other embodiments, the system selects or proposes a clean area of the image by finding the pixel(s) having the highest (or lowest) value of a particular function over the domain of the tooth image. A circle centered at the point (or the centroid of points) corresponding to that maximum, a circle circumscribing each of those points, or other selection means may be used. When the clean area has been selected or defined, at block 127 the system finds the average of a particular function f(·) over the two-dimensional region that makes up the clean area 106. In this embodiment, function f(·) is a ratio of red intensity R(i) to green intensity G(i) at each pixel i. Thus, f(i) = R(i)/G(i), and the average is

FC = ( Σi∈C f(i) ) / (# of pixels in C)
The color data for each pixel is then analyzed in a loop at pixel subprocess block 129. There, a normalized function FN(i) = f(i)/FC is calculated for the pixel i. It is determined at decision block 133 whether that normalized value is greater than a predetermined threshold; that is, whether FN(i) > FT1. In this sample embodiment, this threshold FT1 is defined as 1.1, but other threshold values FT1 can be used based on automatic adjustment or user preference as would occur to one of ordinary skill in the art. If the threshold is not exceeded, the negative branch of decision block 133 leads to the end of pixel subprocess 129 at point 141. If, instead, the normalized function value FN(i) is greater than the threshold FT1 for the pixel being considered (a positive result at decision block 133), it is determined at decision block 135 whether the normalized value exceeds a second threshold; that is, whether FN(i) > FT2. If not (a negative result), the system changes the color of pixel i to a predetermined color C1 at block 137, then proceeds to process the next pixel via point 141, which is the end of pixel subprocess 129. If the normalized function value FN(i) exceeds the second threshold FT2 (a positive result at decision block 135), the system changes the color of pixel i to a predetermined color C2 at block 139. The system then proceeds to the next pixel via point 141.
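The pixel subprocess described above can be sketched as follows. This is an illustrative reduction to code, not the patented implementation; the data layout (a mapping from pixel index to RGB tuple) and the names `clean_pixels`, `f_t1`, `f_t2`, `c1`, and `c2` are hypothetical.

```python
# Sketch of pixel subprocess 129 (Fig. 2): compute the clean-area average
# FC of f(i) = R(i)/G(i), normalize each pixel, and recolor pixels whose
# normalized ratio exceeds the thresholds FT1 and FT2.

def recolor_lesion_pixels(pixels, clean_pixels, f_t1=1.1, f_t2=1.2,
                          c1=(173, 216, 230), c2=(0, 0, 205)):
    """Return a copy of `pixels` with above-threshold pixels replaced by
    marker colors.

    pixels:       {index: (R, G, B)} for the tooth portion of the image
    clean_pixels: indices belonging to the selected clean area C
    """
    f = lambda rgb: rgb[0] / rgb[1]                     # f(i) = R(i)/G(i)
    f_c = sum(f(pixels[i]) for i in clean_pixels) / len(clean_pixels)
    out = dict(pixels)
    for i, rgb in pixels.items():
        f_n = f(rgb) / f_c                              # normalized FN(i)
        if f_n > f_t2:
            out[i] = c2                                 # strongest marker C2
        elif f_n > f_t1:
            out[i] = c1                                 # marker color C1
    return out
```

The specific marker colors above are placeholders for the light blue and medium blue mentioned later in the description.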
When each pixel in the tooth portion 102 of image 100 has been processed by pixel subprocess 129, the image is output from the system at block 143. In various embodiments, the image can be displayed on a monitor 151 (see Fig. 3), saved to a storage device 153, or added to an animation (as will be discussed below).
In a preferred form of this embodiment, predetermined colors C1 and C2 are selected to stand out from the original image data, such as choosing a light blue color for pixels with normalized R/G ratios higher than FT1 = 1.1, and a medium blue color for pixels having normalized R/G ratios higher than FT2 = 1.2. Of course, other thresholds and color choices will occur to those skilled in the art for use in practicing this invention. Furthermore, more or fewer ratio thresholds and corresponding colors may be used in other alternative forms of this embodiment of the invention. Still further, in other embodiments, pixels having a normalized function value FN(i) less than the lower or lowest threshold FT1 are replaced with a neutral, contrasting color such as gray, black, beige, or white.
This process 120 is particularly useful for performing a longitudinal analysis of a patient's condition over time during treatment. For example, a series of images taken before, during, and after treatment often reveals strengths and weaknesses of the treatment in terms of efficacy in an easily observable, yet objective way. The use of R/G ratios instead of simple intensity measurements in these calculations reduces variations resulting from slightly different lighting conditions or camera configurations. One can further improve the data available for longitudinal analysis by combining the teachings herein with those of U.S. Patent No. 6,597,934, cited above.
When multiple images have been captured of a particular subject, techniques known in the image processing art can be applied to generate an animation from those images. In one form of this embodiment, captured images are simply placed in sequence to yield a time-lapse animation. In other forms, reconstructed images are placed between the captured images to provide a consistent time scale between frames of the animation. Some of these techniques are discussed herein.
Several metrics can be calculated using the pixel-specific and image-wide data described above. For example, assume that C is the set of pixels in the clean area of the tooth, L is the set of pixels i for which FN(i) > FT1, and s is the amount of tooth surface area represented by a single pixel in the image (obtained as part of the image capture process or calculated using known methods). Then the lesion area is A = s · #{pixels in L}. A measurement of fluorescence loss in the lesion is calculated as

ΔF = ({average G(i) over L} − {average G(i) over C}) / {average G(i) over C}.

This value of ΔF describes the lesion depth as a proportion or percentage of fluorescence intensity lost.
Another useful metric is the integrated fluorescence loss, ΔQ = A · ΔF, which describes the total amount of mineral lost from the lesion in area-percentage units (such as mm²·%). This metric was used to evaluate a white spot lesion over a one-year period following orthodontic debracketing. The collected data, shown in Fig. 11, reflects an expected remineralization of the lesion over the monitoring period. Additional methods for evaluating lesions according to the present invention will now be discussed in relation to Figs. 4 - 10. Generally, in using this evaluation technique, an image is considered by a user, who selects a series of points on the image that define a closed contour (curve C) around damaged tissue, a white spot lesion in this example. A computing system estimates the original intensity values using calculated "reconstructed" intensity values for points within the contour, and compares those reconstructed values with the actual measured values from the image. The comparison is used to assess the calcium loss in the white spot. Other techniques and applications are discussed herein.
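The lesion metrics A, ΔF, and ΔQ can be computed directly from the pixel data. A minimal sketch follows, assuming lists of green-channel intensities for the lesion set L and the clean set C; the function name and the argument `s` (surface area per pixel, e.g. in mm²) are illustrative.

```python
# Sketch of the lesion metrics: area A = s * #{pixels in L}, fractional
# fluorescence loss Delta-F, and integrated loss Delta-Q = A * Delta-F.

def lesion_metrics(g_lesion, g_clean, s):
    """g_lesion: G(i) values over lesion pixels L;
    g_clean: G(i) values over clean-area pixels C;
    s: tooth surface area represented by one pixel."""
    area = s * len(g_lesion)                      # lesion area A
    avg_l = sum(g_lesion) / len(g_lesion)         # average G(i) over L
    avg_c = sum(g_clean) / len(g_clean)           # average G(i) over C
    delta_f = (avg_l - avg_c) / avg_c             # proportional fluorescence loss
    delta_q = area * delta_f                      # integrated loss (area * %)
    return area, delta_f, delta_q
```

Note that ΔF comes out negative where the lesion fluoresces less than clean tissue, matching the "loss" interpretation in the text.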
Turning to Fig. 4, an image of a tooth with a white spot lesion is shown, whereon a user has identified points P1–P9. In this embodiment the user clicks a mouse button in a graphical user interface to select each point, then clicks the starting point again to close the loop. In other embodiments, other interfaces may be used, or automated techniques that are known to those skilled in the image processing arts may be used to define the contour. This description will refer to the region R of the image enclosed by the curve, or contour, C through points P1–P9.
Region R is illustrated again in Fig. 5, with x- and y-axes, which may be arbitrarily selected, but provide a fixed frame of reference for the remainder of the analysis in this exemplary embodiment. A linear regression algorithm is applied to the region to determine a slope m that characterizes the primary orientation of the white spot in the image relative to the x-axis. The slope of a line perpendicular to the regression line will be used in the present method, and will be referred to as m′ = −1/m.
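The regression step can be sketched as an ordinary least-squares fit over the pixel coordinates of region R. This is an illustrative computation under that assumption, not necessarily the exact algorithm of the embodiment; a production version would also guard against a vertical or horizontal orientation (m near zero).

```python
# Least-squares slope m of the pixel coordinates in region R, and the
# perpendicular slope m' = -1/m used to project interpolation lines.

def orientation_slopes(points):
    """points: iterable of (x, y) pixel coordinates in region R."""
    pts = list(points)
    n = len(pts)
    mx = sum(x for x, _ in pts) / n               # centroid x
    my = sum(y for _, y in pts) / n               # centroid y
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    m = sxy / sxx                                 # regression slope m
    return m, -1.0 / m                            # (m, m')
```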
Once the slope of interest m′ is determined, a reconstructed intensity value is determined for each pixel in region R. Since the portions of the tooth along the line segments connecting P1–P9 are presumed to be healthy tissue, those values are retained in the reconstructed image. For those points strictly within region R (that is, within but not on the closed curve C), the values are interpolated as follows. As illustrated in Fig. 6, a line l of slope m′ is projected through each such point P to two points (Pa and Pb) on curve C. A reconstructed intensity value Ir is calculated for point P as the linear interpolation between intensities at points Pa and Pb, where line l intersects curve C.
Linear interpolation in this context is illustrated in Fig. 7. Fig. 7 is a graph of intensity values (on the vertical axis) versus position along line l (on the horizontal axis), wherein the intensity at point Pa in the image is Ia, the intensity at Pb in the image is Ib, and the intensity at point P in the image is I0. The "reconstructed" intensity at point P is Ir, calculated as the result of linear interpolation between Ia and Ib according to the formula
Ir = Ib − (Ib − Ia) · (Xb − X)/(Xb − Xa), where X, Xa, and Xb are the x-coordinates of points P, Pa, and Pb, respectively. A useful value that characterizes the damage to the tissue is the fluorescence loss ratio, ΔF = (Ir − I0)/Ir. Where decalcification has occurred, ΔF > 0.
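Restated in code, assuming the interpolation formula and loss ratio above (the function names are illustrative):

```python
# Reconstruction of a healthy intensity Ir at point P by linear
# interpolation between the contour crossings Pa and Pb (Fig. 7),
# followed by the fluorescence loss ratio Delta-F = (Ir - I0)/Ir.

def reconstructed_intensity(x, xa, xb, ia, ib):
    """x, xa, xb: x-coordinates of P, Pa, Pb; ia, ib: intensities at Pa, Pb."""
    return ib - (ib - ia) * (xb - x) / (xb - xa)

def fluorescence_loss_ratio(ir, i0):
    """Positive where decalcification has occurred (I0 < Ir)."""
    return (ir - i0) / ir
```

At x = xa the formula returns Ia and at x = xb it returns Ib, as expected for linear interpolation along line l.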
A useful metric L for fluorescence loss in a lesion is the sum of ΔF over all the pixels within curve C; that is, L = Σ{i∈R} ΔF(i). Other metrics L′ and L″ take the sum of ΔF over only those pixels for which the actual, measured intensity I0 is a certain (subtractive) differential or (multiplicative) factor less than the reconstructed intensity Ir; that is, given R′ = {i : I0 < (Ir − ε)} and R″ = {i : I0 < β·Ir} for some predetermined ε and β, then L′ = Σ{i∈R′} ΔF(i) and L″ = Σ{i∈R″} ΔF(i).
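The three sums can be sketched together, assuming parallel lists of reconstructed and measured intensities for the pixels inside curve C (the function and parameter names are hypothetical):

```python
# The lesion sums L, L', L'': L sums Delta-F over all pixels in R; L' and
# L'' restrict the sum to R' = {i : I0 < Ir - eps} and R'' = {i : I0 < beta*Ir}.

def lesion_sums(ir, i0, eps, beta):
    """ir: reconstructed intensities; i0: measured intensities (parallel lists)."""
    df = [(r - o) / r for r, o in zip(ir, i0)]     # Delta-F per pixel
    l_all = sum(df)                                 # L over all of R
    l_eps = sum(d for d, r, o in zip(df, ir, i0) if o < r - eps)     # L'
    l_beta = sum(d for d, r, o in zip(df, ir, i0) if o < beta * r)   # L''
    return l_all, l_eps, l_beta
```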
Other interpolation and curve-fitting methods for reconstructing or estimating a healthy intensity Ir will occur to those skilled in the art based on this discussion. For example, a two-dimensional smoothing function can be applied throughout region R, so that many values along curve C affect the reconstructed values for the points within the curve.
In some embodiments, one or more points along curve C can be ignored in the interpolation, and alternative points (reached, for example, by a line through point P having a slope slightly increased or decreased from m′) can be used. This "ignore" function is useful, for example, in situations where curve C passes through damaged tissue. If the points on curve C that are associated with damaged tissue are used for interpolation or projection of reconstructed intensity values, the reconstructed values will be tainted. Ignoring these values along curve C allows the system to rely only on valid data for the reconstruction calculations. Another alternative approach to calculating a reconstructed intensity Ir for each point P uses the intensity Ii at each of N selected points Pi in sound tooth areas. Defining ri as the distance between point P and point Pi as shown in Fig. 8, the reconstructed intensity can be calculated as

Ir = (Σ{i=1..N} Ii/ri^α) / (Σ{i=1..N} 1/ri^α)

for N selected points in sound tooth areas and a predetermined exponent α, which is preferably 2.
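This distance-weighted reconstruction is, in effect, an inverse-distance-weighted average. A minimal sketch under that reading follows; the data layout (a list of point/intensity pairs) is an assumption for illustration.

```python
import math

# Distance-weighted reconstruction (Fig. 8): the healthy intensity at P is
# a weighted average of intensities Ii at N selected sound points Pi, with
# weights 1/ri^alpha, where ri is the distance from P to Pi.

def idw_intensity(p, sound_points, alpha=2.0):
    """p: (x, y) of the damaged point P;
    sound_points: list of ((x, y), intensity) in sound tooth areas."""
    num = den = 0.0
    for (x, y), ii in sound_points:
        ri = math.hypot(p[0] - x, p[1] - y)   # distance ri from P to Pi
        w = 1.0 / ri ** alpha
        num += w * ii
        den += w
    return num / den
```

Points nearer to P dominate the average; a production version would handle the degenerate case where P coincides with one of the sound points.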
In yet another embodiment of the present invention, several points Pi are selected in sound tooth areas of the image, where the points do not necessarily form a closed loop, but are preferably dispersed around the tooth image and around the damaged tooth area. Then the intensity at each point P in the damaged area can be calculated using a two-dimensional spline, a Bezier surface, or the distance-based interpolation function discussed above in relation to Fig. 8.
In another alternative embodiment, reconstruction of the intensities in damaged areas is achieved using additional intersection lines through the given point P with slope m′ + (n·Δθ) for a predetermined angle Δθ and n ∈ {−3, −2, −1, 0, 1, 2, 3}. More or fewer multiples are used in various embodiments. As discussed above in relation to Figs. 6 and 7, linear interpolation along each of these lines is performed to find a reconstructed intensity, then those values are combined to arrive at the reconstructed intensity Ir to be used in further analysis.
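The multi-line variant can be sketched as follows. The text does not specify how the per-line values are combined, so a simple mean is assumed here; `interp_along` is a hypothetical callback that performs the Fig. 7 linear interpolation along a line of the given slope.

```python
# Sketch of the multi-line reconstruction: interpolate along lines through
# P with slopes m' + n*dtheta for n in {-3..3}, then average the results.

def multi_line_reconstruction(m_prime, dtheta, interp_along,
                              n_values=range(-3, 4)):
    """interp_along(slope) -> interpolated intensity along that line."""
    values = [interp_along(m_prime + n * dtheta) for n in n_values]
    return sum(values) / len(values)      # combined reconstructed Ir
```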
Each of the individual images used in the analyses described herein may be expressed as grayscale images or in terms of RGB triples or YUV triples. In the case of component expressions, the interpolation calculations described above are preferably applied to each component of each pixel independently, though those skilled in the art will appreciate that variations on this approach and cross-over between components may be considered in reconstruction. Further, the images may be captured using any suitable technique known to those skilled in the art, such as those techniques discussed in the Software Repositioning patent.
An important aspect of treatment is patient communication. One aspect of the present invention that supports such communication relates to the creation of animated "movies" from individual images captured with fluorescent techniques, where frames are added between those fixed images to change smoothly from each individual image to the next. One method for providing such animations according to the present invention is illustrated in Fig. 9. Row A in this illustration shows two actual images (in columns 1 and 5) with space left (in columns 2-4) for intervening cells in the animation. Row B of Fig. 9 shows the intensity values of each image from row A along line i. (Again, the present analysis may be applied to individual components of RGB or YUV component images.) Row C of Fig. 9 shows intensity values from each image in the animation time sequence at pixel [i,j], which lies on line i. The points shown for times t1 and t5 are from actual images, while the points shown for times t2–t4 are interpolated based on the positions of times t2–t4 relative to t1 and t5, and the actual values at times t1 and t5.
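The per-pixel time interpolation of Fig. 9, row C, reduces to ordinary linear interpolation in time. A minimal sketch, with illustrative names:

```python
# Fill-in for a synthesized frame (Fig. 9, row C): the value of a pixel at
# time t is linearly interpolated between its values v1 and v5 in the two
# nearest captured frames at times t1 and t5.

def interpolate_pixel(t, t1, v1, t5, v5):
    return v1 + (v5 - v1) * (t - t1) / (t5 - t1)
```

Applying this to every pixel of every intermediate cell yields the synthesized frames between the captured images.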
Row D of Fig. 9 illustrates a graph of the reconstructed intensity values Ir along line i for each image in the sequence. It may be noted that while the intensity graphs for each image are similar, they are not identical. These variations might be due, for example, to differences in the specific imaging parameters and positions used to capture the actual images. The reconstructed intensity values in row D are calculated independently for each image as discussed above. The graphs shown in row B are then normalized by dividing each data value into the corresponding data value in the reconstructed data in row D, thus yielding the normalized data shown in row E. The normalized values shown in row E are obtained for each pixel in each image, and are combined to yield the images (frames) in row F. The series of images thus obtained yields an animated movie that functions like a weather map to illustrate the change in condition of the tooth, for better or worse. The illustrated sequence of images shows, for example, the remineralization of the white spot seen in the image at row A, column 1.
In various alternative embodiments, the calculation of intensity, luminance, or individual pixel component color values in cells not corresponding to actual captured images is performed using other curve-fitting techniques. For example, in some embodiments a spline is fitted to the intensity values of corresponding pixels in at least three images as shown in Fig. 10. In that illustration, the frames at times t1, t4, and t6 are actual images, while images for times t2, t3, and t5 are being synthesized. The fitted spline is used to select intensity values for points in the synthesized frames based on the real data captured in the images. In other alternative embodiments, linear interpolation is applied, a Bezier curve is fitted to the given data, or other curve-fitting techniques are applied as would occur to those skilled in the art based on the present disclosure. Whatever curve-fitting technique is used, the point on the curve corresponding to the time value for each frame is used to fill the pixel in that frame. In various alternative embodiments of the "weather map" technique, the normalized white spot graphic or illustration (as shown in Fig. 9, row F) is shown alone. In other embodiments it is superimposed on the original images, while in still others it is displayed over the interpolated images as well. In some of these embodiments, the intensities shown in row E are displayed in grayscale, while in others they are shown in color that varies based on the magnitude of the normalized intensity of each pixel.
It is noted that the methods described and suggested herein are preferably implemented by a processor executing programming instructions stored in a computer-readable medium, as illustrated in Fig. 3. In various embodiments, function f(i) depends on one or more "optical components" of the pixel, which might include red, green, blue, chrominance, luminance, bandwidth, and/or other components as would occur to one of skill in the art of digital graphic processing.
While the invention has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only the preferred embodiment has been shown and described and that all changes and modifications that come within the spirit of the invention are desired to be protected. Furthermore, all patents, publications, prior and simultaneous applications, and other documents cited herein are hereby incorporated by reference in their entirety as if each had been individually incorporated by reference and fully set forth.

Claims

What is claimed is:
1. A method of image analysis, comprising: capturing a digital image of tooth tissue; and for each of a plurality of pixels in the digital image: determining a first component value of the pixel's color and a second component value of the pixel's color; and calculating a first function value for the pixel based on the first component value and the second component value.
2. The method of claim 1, wherein the first component value is a red color component of the pixel.
3. The method of claim 2, wherein: the second component value is a green color component of the pixel; and the first function is a ratio of the red color component to the green color component.
4. The method of claim 1, further comprising creating a second image, wherein the creating includes using an alternate color for at least one pixel, and the alternate color is selected based on the first function value for the at least one pixel.
5. The method of claim 4, wherein each of the plurality of pixels has an original color, and further comprising displaying the digital image, substituting the selected alternate color in place of the original color for the plurality of pixels in the digital image.
6. The method of claim 4, wherein each of the plurality of pixels has an original color, and further comprising storing the digital image, substituting the selected alternate color in place of the original color for the plurality of pixels in the digital image.
7. The method of claim 1, wherein the plurality of pixels includes all pixels in the image; and further comprising displaying a subset of the plurality of pixels in an alternative color.
8. A method of quantifying mineral loss due to a lesion on a tooth, comprising: capturing a digital image of the fluorescence of the tooth, the image comprising actual intensity values for a region of pixels; selecting a plurality of points defining a closed contour around a first plurality of pixels; calculating a reconstructed intensity value for each pixel in the first plurality of pixels; and calculating the sum of the differences between the reconstructed intensity values for each of a second plurality of pixels and the actual intensity values for each of the second plurality of pixels.
9. The method of claim 8, wherein the first plurality of pixels is the same as the second plurality of pixels.
10. The method of claim 8, wherein the second plurality of pixels consists of those of the first plurality of pixels for which the actual intensity values are smaller than the reconstructed intensity values minus a predetermined threshold.
11. The method of claim 8, wherein the second plurality of pixels consists of those of the first plurality of pixels for which the actual intensity values are smaller than the reconstructed intensity values by a predetermined multiplicative factor.
12. The method of claim 8, wherein the actual intensity value for each pixel in the first plurality of pixels is a function of a single optical component of the pixel.
13. The method of claim 8, wherein the reconstructed intensity value for each pixel in the first plurality of pixels is calculated using linear interpolation.
14. The method of claim 13, wherein the linear interpolation for each given pixel is based on intensity values of one or more points on the contour.
15. The method of claim 14 wherein the one or more points on the contour lie on or adjacent to a line through the given pixel.
16. The method of claim 15 further comprising performing a linear regression analysis of the region surrounded by the contour to determine the slope m of a regression line; and wherein the line through the given pixel is selected to have a slope of about - 1/m.
17. The method of claim 14 further comprising performing a linear regression analysis of the region surrounded by the contour to determine the slope m of a regression line; and wherein the one or more points on the contour lie on or adjacent to a set of lines lj through the given pixel, and wherein the slope of each line lj is selected to be (−1/m + nθ) for a predetermined slope differential θ and set of multipliers n.
18. The method of claim 8, wherein the reconstructed intensity value for each pixel is calculated as a function of intensity values of two or more points on the contour.
19. The method of claim 18, further comprising: identifying one or more points to be ignored on the contour; and excluding the one or more points to be ignored during the calculation of reconstructed intensity values.
20. The method of claim 18, wherein the function is a function of N selected points P1, P2, ... PN in the image that represent sound tooth tissue, where N > 1; ri, the distance in the image between the pixel and a selected point Pi in a sound tooth area; Ii, the intensity of point Pi; and a predetermined exponent α, the function being Ir = (Σ{i=1..N} Ii/ri^α) / (Σ{i=1..N} 1/ri^α).
21. The method of claim 20, wherein α = 2.
22. A system, comprising a processor and a memory, the memory being encoded with programming instructions executable by the processor to: retrieve a first image of light that is the product of autofluorescence of a tooth having a white spot lesion, wherein the first image comprises pixels each having an original intensity; determine a first plurality of points in the first image that define a contour substantially surrounding the lesion; calculate a reconstructed intensity for each pixel in the first image that lies within the contour; and calculate a first result quantity based on two or more of the reconstructed intensities and two or more of the original intensities of pixels in the first image.
23. The system of claim 22, wherein the programming instructions are further executable by the processor to: retrieve a second image of light that is the product of autofluorescence of the tooth, wherein the second image comprises pixels each having an original intensity, and the second image is captured at a different time than that at which the first image is captured; determine a second plurality of points in the second image that define a contour substantially surrounding the lesion; calculate a reconstructed intensity for each pixel in the second image that lies within the contour; and calculate a second result quantity based on two or more of the reconstructed intensities and two or more of the original intensities of pixels in the second image.
PCT/IB2004/001658 2003-05-22 2004-05-22 Analysis and display of fluorescence images WO2004104927A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006530674A JP2007502185A (en) 2003-05-22 2004-05-22 Analysis and display of fluorescence images

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US47248603P 2003-05-22 2003-05-22
US60/472,486 2003-05-22
US54063004P 2004-01-31 2004-01-31
US60/540,630 2004-01-31

Publications (2)

Publication Number Publication Date
WO2004104927A2 true WO2004104927A2 (en) 2004-12-02
WO2004104927A3 WO2004104927A3 (en) 2005-06-16

Family

ID=33479322

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/IB2004/001658 WO2004104927A2 (en) 2003-05-22 2004-05-22 Analysis and display of fluorescence images
PCT/IB2004/001655 WO2004103171A2 (en) 2003-05-22 2004-05-22 Fluorescence filter for tissue examination and imaging

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/IB2004/001655 WO2004103171A2 (en) 2003-05-22 2004-05-22 Fluorescence filter for tissue examination and imaging

Country Status (6)

Country Link
US (2) US20040254478A1 (en)
EP (1) EP1624797A2 (en)
JP (1) JP2007502185A (en)
AU (1) AU2004241802B2 (en)
CA (1) CA2520195A1 (en)
WO (2) WO2004104927A2 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007127036A1 (en) * 2006-04-21 2007-11-08 Carestream Health, Inc. Optical detection of dental caries
US7845039B2 (en) 2003-09-09 2010-12-07 The Procter & Gamble Company Toothbrush with severable electrical connections
EP2348484A1 (en) * 2009-10-14 2011-07-27 Carestream Health, Inc. Method for extracting a carious lesion area
EP2312527A3 (en) * 2009-06-19 2013-03-06 Carestream Health, Inc. Method for quantifying caries
EP2587452A1 (en) * 2010-12-13 2013-05-01 Carestream Health, Inc. Method for identification of dental caries in polychromatic images
US8687859B2 (en) 2009-10-14 2014-04-01 Carestream Health, Inc. Method for identifying a tooth region
US9235901B2 (en) 2009-10-14 2016-01-12 Carestream Health, Inc. Method for locating an interproximal tooth region
US10572755B2 (en) 2015-09-28 2020-02-25 Olympus Corporation Image analysis apparatus for calculating degree of change in distribution characteristic values, image analysis system, and method for operating image analysis system

Families Citing this family (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10333515B4 (en) * 2003-07-17 2016-11-24 Carl Zeiss Meditec Ag Method and device for identifying tooth-colored tooth filling residues
DE102004024494B4 (en) * 2004-05-16 2019-10-17 Dürr Dental SE Medical camera
US7270543B2 (en) * 2004-06-29 2007-09-18 Therametric Technologies, Inc. Handpiece for caries detection
US20060241501A1 (en) * 2004-09-28 2006-10-26 Zila Pharmaceuticals, Inc. Method and apparatus for detecting abnormal epithelial tissue
US20080255462A1 (en) * 2004-09-28 2008-10-16 Zila Pharmaceuticals, Inc. Light stick
US20090118624A1 (en) * 2004-09-28 2009-05-07 Zila Pharmaceuticals, Inc. Device for oral cavity examination
EP1793727A4 (en) * 2004-09-28 2009-01-07 Zila Pharm Inc Methods for detecting abnormal epithelial tissue
WO2006106509A2 (en) * 2005-04-04 2006-10-12 Hadasit Ltd. Medical imaging method and system
KR100943367B1 (en) * 2005-04-27 2010-02-18 올림푸스 메디칼 시스템즈 가부시키가이샤 Image processing device, image processing method, and recording medium for recording image processing program
WO2007025362A1 (en) * 2005-09-02 2007-03-08 Neptec Imaging system and method
US7596253B2 (en) * 2005-10-31 2009-09-29 Carestream Health, Inc. Method and apparatus for detection of caries
US20100190129A1 (en) * 2006-08-08 2010-07-29 Mony Paz Combination dental hand tool
US7668355B2 (en) * 2006-08-31 2010-02-23 Carestream Health, Inc. Method for detection of caries
US20080062429A1 (en) * 2006-09-12 2008-03-13 Rongguang Liang Low coherence dental oct imaging
US8447087B2 (en) 2006-09-12 2013-05-21 Carestream Health, Inc. Apparatus and method for caries detection
US8270689B2 (en) 2006-09-12 2012-09-18 Carestream Health, Inc. Apparatus for caries detection
EP2074556A2 (en) * 2006-09-28 2009-07-01 Koninklijke Philips Electronics N.V. Content detection of a part of an image
US7702139B2 (en) * 2006-10-13 2010-04-20 Carestream Health, Inc. Apparatus for caries detection
US20080118886A1 (en) * 2006-11-21 2008-05-22 Rongguang Liang Apparatus for dental oct imaging
US8360771B2 (en) * 2006-12-28 2013-01-29 Therametric Technologies, Inc. Handpiece for detection of dental demineralization
US8224045B2 (en) 2007-01-17 2012-07-17 Carestream Health, Inc. System for early detection of dental caries
US20080193894A1 (en) * 2007-02-13 2008-08-14 Neng-Wei Wu Mouth camera device
FR2916883B1 (en) * 2007-05-29 2009-09-04 Galderma Res & Dev METHOD AND DEVICE FOR ACQUIRING AND PROCESSING IMAGES FOR DETECTION OF EVOLUTIVE LESIONS
US20080306361A1 (en) * 2007-06-11 2008-12-11 Joshua Friedman Optical screening device
US20080306470A1 (en) * 2007-06-11 2008-12-11 Joshua Friedman Optical screening device
US20100210951A1 (en) * 2007-06-15 2010-08-19 Mohammed Saidur Rahman Optical System for Imaging of Tissue Lesions
US20110058717A1 (en) * 2008-01-18 2011-03-10 John Michael Dunavent Methods and systems for analyzing hard tissues
US8866894B2 (en) * 2008-01-22 2014-10-21 Carestream Health, Inc. Method for real-time visualization of caries condition
WO2009134783A1 (en) * 2008-05-02 2009-11-05 The Procter & Gamble Company Products and methods for disclosing conditions in the oral cavity
US20100036260A1 (en) 2008-08-07 2010-02-11 Remicalm Llc Oral cancer screening device
CN102292018B (en) * 2009-01-20 2014-12-24 卡尔斯特里姆保健公司 Method and apparatus for detection of caries
JP5498481B2 (en) * 2009-03-24 2014-05-21 オリンパス株式会社 Fluorescence observation apparatus, fluorescence observation system, operation method of fluorescence observation apparatus, and fluorescence image processing method performed by fluorescence observation apparatus
US20110110575A1 (en) * 2009-11-11 2011-05-12 Thiagarajar College Of Engineering Dental caries detector
EP2544583B1 (en) 2010-03-08 2016-03-02 Bruce Adams System, method and article for normalization and enhancement of tissue images
US9642687B2 (en) 2010-06-15 2017-05-09 The Procter & Gamble Company Methods for whitening teeth
US8416984B2 (en) * 2011-01-20 2013-04-09 Carestream Health, Inc. Automatic tooth charting using digital images
WO2012133845A1 (en) * 2011-03-31 2012-10-04 株式会社根本杏林堂 Leakage detection sensor and drug infusion system
JP5926909B2 (en) 2011-09-07 2016-05-25 オリンパス株式会社 Fluorescence observation equipment
MX2014004280A (en) 2011-10-13 2014-05-28 Koninkl Philips Nv Medical probe with multi-fiber lumen.
US9901256B2 (en) 2012-01-20 2018-02-27 University Of Washington Through Its Center For Commercialization Dental demineralization detection, methods and systems
WO2014118786A1 (en) * 2013-02-04 2014-08-07 Orpheus Medical Ltd. Color reduction in images of human body
CN103654730B (en) * 2013-12-19 2016-04-20 北京大学 A kind of fluorescent molecules imaging system based on LED light source and formation method thereof
US10080484B2 (en) 2014-01-31 2018-09-25 University Of Washington Multispectral wide-field endoscopic imaging of fluorescence
US9870613B2 (en) 2014-11-05 2018-01-16 Carestream Health, Inc. Detection of tooth condition using reflectance images with red and green fluorescence
WO2016073569A2 (en) 2014-11-05 2016-05-12 Carestream Health, Inc. Video detection of tooth condition using green and red fluorescence
GB201420273D0 (en) 2014-11-14 2014-12-31 Mars Inc Method for quantifying plaque in pet animals
WO2016099471A1 (en) 2014-12-17 2016-06-23 Carestream Health, Inc. Intra-oral 3-d fluorescence imaging
US9547903B2 (en) 2015-04-16 2017-01-17 Carestream Health, Inc. Method for quantifying caries
WO2016175178A1 (en) 2015-04-27 2016-11-03 オリンパス株式会社 Image analysis device, image analysis system, and operation method for image analysis device
WO2017122431A1 (en) 2016-01-15 2017-07-20 オリンパス株式会社 Image analysis device, image analysis system, and method for actuating image analysis device
PL3442397T3 (en) 2016-04-13 2021-11-08 Inspektor Research Systems B.V. Bi-frequency dental examination
WO2018081637A1 (en) 2016-10-28 2018-05-03 University Of Washington System and method for ranking bacterial activity leading to tooth and gum disease
US10699163B1 (en) 2017-08-18 2020-06-30 Massachusetts Institute Of Technology Methods and apparatus for classification
JP7026337B2 (en) * 2017-08-29 2022-02-28 パナソニックIpマネジメント株式会社 Optical observation device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004093673A1 (en) * 2003-04-10 2004-11-04 Stookey George K Optical detection of dental caries

Family Cites Families (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10760A (en) * 1854-04-11 Street gas-lamp
US3388645A (en) * 1965-12-30 1968-06-18 Polaroid Corp Photographic device
US3424070A (en) * 1966-11-10 1969-01-28 Polaroid Corp Camera apparatus
US3425599A (en) * 1967-03-02 1969-02-04 Int Harvester Co Gravity type fertilizer spreader
US3711700A (en) * 1971-05-10 1973-01-16 Gte Sylvania Inc Disclosing light
GB1470760A (en) * 1973-11-12 1977-04-21 Alphametrics Ltd Ultraviolet camera system for dental photography and mouth pieces therefor
US3969577A (en) * 1974-10-15 1976-07-13 Westinghouse Electric Corporation System for evaluating similar objects
US4085436A (en) * 1976-10-14 1978-04-18 Allen Weiss Ring light converter for electronic flash units
US4080476A (en) * 1976-11-15 1978-03-21 Datascope Corporation Anti-fog coated optical substrates
CH627069A5 (en) * 1978-04-14 1981-12-31 Lpa Les Produits Associes
US4290433A (en) * 1979-08-20 1981-09-22 Alfano Robert R Method and apparatus for detecting the presence of caries in teeth using visible luminescence
SE442817B (en) * 1981-04-01 1986-02-03 Hans Ingmar Bjelkhagen Device for contactlessly achieving a discrimination in the luminescence of a tooth surface
US4479799A (en) * 1981-05-21 1984-10-30 Riker Laboratories, Inc. Hypodermic syringe containing microfibers of an amorphous heparin salt
CH650868A5 (en) * 1981-06-05 1985-08-15 Volpi Ag DEVICE FOR LIGHTING A CAVITY.
US4437161A (en) * 1981-06-29 1984-03-13 Siemens Gammasonics Inc. Medical imaging apparatus
EP0083047B1 (en) * 1981-12-24 1987-02-04 Bayerische Motoren Werke Aktiengesellschaft, Patentabteilung AJ-3 Test method for mechanical working parts, and device for carrying out this method
US4479499A (en) * 1982-01-29 1984-10-30 Alfano Robert R Method and apparatus for detecting the presence of caries in teeth using visible light
US4445858A (en) * 1982-02-19 1984-05-01 American Hospital Supply Corporation Apparatus for photo-curing of dental restorative materials
EP0153439B1 (en) * 1983-06-03 1993-08-04 Fondazione Pro Juventute Don Carlo Gnocchi Modularly expansible system for real time processing of a TV display, useful in particular for the acquisition of coordinates of known shape objects and method using said system in radiography.
JP2615006B2 (en) * 1985-03-26 1997-05-28 Fuji Photo Optical Co., Ltd. Laser beam side fiber
US4615679A (en) * 1985-06-03 1986-10-07 Wyatt Thomas K Light shield for use with light curing apparatus
US4921344A (en) * 1985-06-12 1990-05-01 Duplantis Shannon S Apparatus and method for enhancing the images of intra-oral photography
US4662842A (en) * 1985-08-16 1987-05-05 Croll Theodore P Finger-mounted light filter
US4836206A (en) * 1987-02-25 1989-06-06 The United States Of America As Represented By The Department Of Health And Human Services Method and device for determining viability of intact teeth
US4900253A (en) * 1987-07-15 1990-02-13 Landis Timothy J Dental mirror having ultraviolet filter
JPH03223976A (en) * 1990-01-29 1991-10-02 Ezel Inc Image collating device
US6580086B1 (en) * 1999-08-26 2003-06-17 Masimo Corporation Shielded optical probe and method
US5779634A (en) * 1991-05-10 1998-07-14 Kabushiki Kaisha Toshiba Medical information processing system for supporting diagnosis
DE4200741C2 (en) * 1992-01-14 2000-06-15 Kaltenbach & Voigt Device for the detection of caries on teeth
US6205259B1 (en) * 1992-04-09 2001-03-20 Olympus Optical Co., Ltd. Image processing apparatus
US5359513A (en) * 1992-11-25 1994-10-25 Arch Development Corporation Method and system for detection of interval change in temporally sequential chest images
US5288231A (en) * 1993-03-08 1994-02-22 Pinnacle Products, Inc. Light shield for dental apparatus
US5509800A (en) * 1993-08-20 1996-04-23 Cunningham; Peter J Light-filter for dental use
US5487661A (en) * 1993-10-08 1996-01-30 Dentsply International, Inc. Portable dental camera and system
US5528432A (en) * 1994-02-23 1996-06-18 Ultrak, Inc. Intra-oral optical viewing device
US5590660A (en) * 1994-03-28 1997-01-07 Xillix Technologies Corp. Apparatus and method for imaging diseased tissue using integrated autofluorescence
US5585186A (en) * 1994-12-12 1996-12-17 Minnesota Mining And Manufacturing Company Coating composition having anti-reflective, and anti-fogging properties
US5766006A (en) * 1995-06-26 1998-06-16 Murljacic; Maryann Lehmann Tooth shade analyzer system and methods
US5894620A (en) * 1995-06-28 1999-04-20 U.S. Philips Corporation Electric toothbrush with means for locating dental plaque
US5742700A (en) * 1995-08-10 1998-04-21 Logicon, Inc. Quantitative dental caries detection system and method
DE19541686B4 (en) * 1995-11-08 2009-08-06 Kaltenbach & Voigt Gmbh & Co. Kg Device for detecting caries, plaque or bacterial infestation of teeth
DE29704185U1 (en) * 1997-03-07 1997-04-30 Kaltenbach & Voigt Device for the detection of caries, plaque or bacterial attack on teeth
DE29705934U1 (en) * 1997-04-03 1997-06-05 Kaltenbach & Voigt Diagnostic and treatment device for teeth
US6148092A (en) * 1998-01-08 2000-11-14 Sharp Laboratories Of America, Inc System for detecting skin-tone regions within an image
GB9810471D0 (en) * 1998-05-16 1998-07-15 Helmet Hund Gmbh Toothbrush
DE19825021A1 (en) * 1998-06-04 1999-12-09 Kaltenbach & Voigt Method and device for the detection of caries, plaque, calculus or bacterial infection on teeth
DE19827417B4 (en) * 1998-06-19 2004-10-28 Hahn, Rainer, Dr.Med.Dent. Material for different modification of the optical properties of different cells
US5957687A (en) * 1998-07-21 1999-09-28 Plak-Lite Company Llc Apparatus and method for detecting dental plaque
GB2340618A (en) * 1998-07-22 2000-02-23 Gee Dental mirror
US6227850B1 (en) * 1999-05-13 2001-05-08 Align Technology, Inc. Teeth viewing system
US6547394B2 (en) * 1998-10-20 2003-04-15 Victor J. Doherty Hand-held ophthalmic illuminator
US6532299B1 (en) * 2000-04-28 2003-03-11 Orametrix, Inc. System and method for mapping a surface
US6512994B1 (en) * 1999-11-30 2003-01-28 Orametrix, Inc. Method and apparatus for producing a three-dimensional digital model of an orthodontic patient
US6155823A (en) * 1999-06-18 2000-12-05 Bisco Inc. Snap-on light shield for a dental composite light curing gun
US6345982B1 (en) * 1999-09-01 2002-02-12 Darcy M. Dunaway Dental light controller and concentrator
US6341957B1 (en) * 1999-11-27 2002-01-29 Electro-Optical Sciences Inc. Method of transillumination imaging of teeth
US7234937B2 (en) * 1999-11-30 2007-06-26 Orametrix, Inc. Unified workstation for virtual craniofacial diagnosis, treatment planning and therapeutics
US6402693B1 (en) * 2000-01-13 2002-06-11 Siemens Medical Solutions Usa, Inc. Ultrasonic transducer aligning system to replicate a previously obtained image
JP2001209097A (en) * 2000-01-28 2001-08-03 Masashi Saito Camera system for intra-oral photography
DE10013210A1 (en) * 2000-03-17 2001-09-20 Kaltenbach & Voigt Device for the detection of caries, plaque, bacterial infestation, calculus, tartar and other fluorescent substances on teeth
US6788813B2 (en) * 2000-10-27 2004-09-07 Sony Corporation System and method for effectively performing a white balance operation
US6597934B1 (en) * 2000-11-06 2003-07-22 Inspektor Research Systems B.V. Diagnostic image capture
US6769911B2 (en) * 2001-04-16 2004-08-03 Advanced Research & Technology Institute Luminescence assisted caries excavation
EP1252859A3 (en) * 2001-04-27 2003-12-17 Firma Ivoclar Vivadent AG Dental camera with mouthpiece
DE60228165D1 (en) * 2001-05-16 2008-09-25 Olympus Corp Endoscope with image processing device
FR2825260B1 (en) * 2001-06-01 2004-08-20 Centre Nat Rech Scient METHOD AND DEVICE FOR DETECTION OF DENTAL CARIES
DE10133451B4 (en) * 2001-07-10 2012-01-26 Ferton Holding S.A. Device for detecting caries, plaque, concrements or bacterial infestation of teeth
US7365844B2 (en) * 2002-12-10 2008-04-29 Board Of Regents, The University Of Texas System Vision enhancement system for improved detection of epithelial neoplasia and other conditions
US20040225340A1 (en) * 2003-03-10 2004-11-11 Evans James W. Light/breath/meditation device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004093673A1 (en) * 2003-04-10 2004-11-04 Stookey George K Optical detection of dental caries

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ANGMAR-MANSSON B ET AL: "Quantitative light-induced fluorescence (QLF): a method for assessment of incipient caries lesions", Dento-Maxillo-Facial Radiology, Int. Ass. Dento-Maxillo-Facial Radiology, Göteborg, SE, vol. 30, no. 6, November 2001 (2001-11), pages 298-307, XP009035409, ISSN: 0250-832X *
CHENG H D ET AL: "Color image segmentation: advances and prospects", Pattern Recognition, Elsevier, Kidlington, GB, vol. 34, no. 12, December 2001 (2001-12), pages 2259-2281, XP004508355, ISSN: 0031-3203 *
FISHER M ET AL: "Tooth-caries early diagnosis and mapping by Fourier transform spectral imaging fluorescence", Instrumentation Science and Technology, vol. 30, no. 2, 2002, pages 225-232, XP009035410 *
LUCEY S ET AL: "A suitability metric for mouth tracking through chromatic segmentation", Proceedings 2001 International Conference on Image Processing (ICIP 2001), Thessaloniki, Greece, 7-10 October 2001, IEEE, New York, NY, US, vol. 1 of 3, Conf. 8, 7 October 2001 (2001-10-07), pages 258-261, XP010563332, ISBN: 0-7803-6725-1 *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7845039B2 (en) 2003-09-09 2010-12-07 The Procter & Gamble Company Toothbrush with severable electrical connections
US7577284B2 (en) 2006-04-21 2009-08-18 Carestream Health, Inc. Optical detection of dental caries
US7844091B2 (en) 2006-04-21 2010-11-30 Carestream Health, Inc. Optical detection of dental caries
WO2007127036A1 (en) * 2006-04-21 2007-11-08 Carestream Health, Inc. Optical detection of dental caries
US8768016B2 (en) 2009-06-19 2014-07-01 Carestream Health, Inc. Method for quantifying caries
US9773306B2 (en) 2009-06-19 2017-09-26 Carestream Health, Inc. Method for quantifying caries
EP2312527A3 (en) * 2009-06-19 2013-03-06 Carestream Health, Inc. Method for quantifying caries
CN106898013A (en) * 2009-06-19 2017-06-27 卡尔斯特里姆保健公司 The method of quantifying caries
US8908936B2 (en) 2009-10-14 2014-12-09 Carestream Health, Inc. Method for extracting a carious lesion area
US8687859B2 (en) 2009-10-14 2014-04-01 Carestream Health, Inc. Method for identifying a tooth region
US9020228B2 (en) 2009-10-14 2015-04-28 Carestream Health, Inc. Method for identifying a tooth region
US9235901B2 (en) 2009-10-14 2016-01-12 Carestream Health, Inc. Method for locating an interproximal tooth region
EP2348484A1 (en) * 2009-10-14 2011-07-27 Carestream Health, Inc. Method for extracting a carious lesion area
EP2587452A1 (en) * 2010-12-13 2013-05-01 Carestream Health, Inc. Method for identification of dental caries in polychromatic images
US10572755B2 (en) 2015-09-28 2020-02-25 Olympus Corporation Image analysis apparatus for calculating degree of change in distribution characteristic values, image analysis system, and method for operating image analysis system

Also Published As

Publication number Publication date
EP1624797A2 (en) 2006-02-15
JP2007502185A (en) 2007-02-08
WO2004104927A3 (en) 2005-06-16
US20040254478A1 (en) 2004-12-16
WO2004103171A3 (en) 2005-01-27
US20040240716A1 (en) 2004-12-02
AU2004241802A1 (en) 2004-12-02
AU2004241802B2 (en) 2008-02-14
CA2520195A1 (en) 2004-12-02
WO2004103171A2 (en) 2004-12-02

Similar Documents

Publication Publication Date Title
US20040240716A1 (en) Analysis and display of fluorescence images
JP7427038B2 (en) Intraoral scanner with dental diagnostic function
US9870613B2 (en) Detection of tooth condition using reflectance images with red and green fluorescence
US10888400B2 (en) Methods and apparatuses for forming a three-dimensional volumetric model of a subject's teeth
US9770217B2 (en) Dental variation tracking and prediction
US8866894B2 (en) Method for real-time visualization of caries condition
JP5729924B2 (en) Caries determination method
JP6086573B2 (en) Method and apparatus for characterizing pigmented spots and its application in methods for evaluating the coloring or depigmenting effect of cosmetic, skin or pharmaceutical products
JP5165732B2 (en) Multispectral image processing method, image processing apparatus, and image processing system
JP6478984B2 (en) Intraoral imaging method and system using HDR imaging and removing highlights
JP4599520B2 (en) Multispectral image processing method
KR20110040739A (en) Method for extracting a carious lesion area
WO2016073569A2 (en) Video detection of tooth condition using green and red fluorescence
US9547903B2 (en) Method for quantifying caries
JP6721939B2 (en) Fluorescence image analyzer

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: the EPO has been informed by WIPO that EP was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2006530674

Country of ref document: JP

DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed from 20040101)
122 EP: PCT application non-entry in European phase