CN118160046A - Method and apparatus for determining tooth color values - Google Patents


Info

Publication number
CN118160046A
CN118160046A (application CN202280072234.XA)
Authority
CN
China
Prior art keywords
teeth
color
tooth
image
calibration pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280072234.XA
Other languages
Chinese (zh)
Inventor
胡文超
孙逸雯
王杉
杨荟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Unilever IP Holdings BV
Original Assignee
Unilever IP Holdings BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Unilever IP Holdings BV
Publication of CN118160046A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00: ICT specially adapted for the handling or processing of medical images
    • G16H30/40: ICT specially adapted for the handling or processing of medical images, e.g. editing
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems


Abstract

The present invention relates to a method and a corresponding device for determining color values of one or more teeth. The computer-implemented method includes receiving an image of a tooth and a calibration pattern; identifying one or more teeth from the image using a segmentation model, wherein the segmentation model is a tooth-by-tooth segmentation model configured to detect individual teeth in the image; determining an observed color for each of the one or more teeth from the image; identifying a plurality of colored regions of the calibration pattern from the image; determining an observed color for each colored region of the calibration pattern; comparing the observed color of each colored region of the calibration pattern with the corresponding known color of the corresponding known pattern, thereby determining a correction model; and applying a correction model to the observed color of each of the one or more teeth to determine color values of the one or more teeth.

Description

Method and apparatus for determining tooth color values
Technical Field
The present invention relates to methods and apparatus for determining color values of one or more teeth, and in particular to methods and apparatus for a user to self-evaluate tooth whiteness.
Background
Consumers have a strong desire for healthy, white teeth. While many people are concerned about their tooth whiteness, it is difficult to judge the level of tooth whiteness with the naked eye. Traditionally, consumers have had to visit a dentist to learn their tooth whiteness level. In one tooth whiteness evaluation method, a professional dentist uses a set of dental shade specimens known as the VITA Bleachedguide. The dentist holds individual specimens from the set adjacent to the patient's teeth in order to find the specimen that appears closest in color to the patient's teeth. Advantages of such tooth classification schemes include evaluation of tooth color by a trained professional using a validated tool. Disadvantages include that it requires a standardized environment and that the consumer must visit the dental office in person. In addition, the desire for whiter teeth has led to a trend of increased use of tooth whitening products, ranging from toothpastes to mouthwashes and chewing gums. However, it is not easy to track changes in tooth color after a tooth whitening product has been applied.
Some tooth whitening products provide consumers with printed color calibration cards to measure tooth whiteness. A conventional color calibration card typically contains tooth images of various shades. The user may hold the card next to their mouth so that another person can judge which image looks most similar to the user's teeth, or the user may compare their own teeth with the images in a mirror. Such a color calibration card is not a validated standard tool. Furthermore, the accuracy of the evaluation may be limited by the print quality of the card and by the accuracy of the user's visual assessment.
Aspects of the present invention address the problems encountered in prior art tooth color assessment methods.
Disclosure of Invention
According to a first aspect of the present invention there is provided a computer-implemented method for determining color values of one or more teeth, the method comprising:
receiving an image of a tooth and a calibration pattern;
identifying one or more teeth from the image using a segmentation model, wherein the segmentation model is a tooth-by-tooth segmentation model configured to detect individual teeth in the image;
determining an observed color for each of the one or more teeth from the image;
identifying a plurality of colored regions of the calibration pattern from the image;
determining an observed color for each colored region of the calibration pattern;
comparing the observed color of each colored region of the calibration pattern with the corresponding known color of the corresponding known pattern, thereby determining a correction model; and
applying the correction model to the observed color of each of the one or more teeth to determine color values of the one or more teeth.
The color value may provide an indication of the whiteness level of one or more teeth. The color value of the one or more teeth may be a single value associated with a plurality of teeth. The method may include determining a color value for each of the one or more teeth by applying a correction model to the observed color of each of the one or more teeth.
The observed tooth color or color value may include an indication of the color, shade, or hue of one or more teeth.
The colored regions of the calibration pattern may also be referred to as color regions, patches, or panels. Each region of the calibration pattern may be a homogeneous region of a single color, including a region of a particular hue or shade.
The method may include selecting a center portion of each of the one or more teeth. Each central portion may be surrounded by a peripheral portion. Each central portion may comprise at least 60%, preferably 65% to 95%, more preferably 70% to 90%, even more preferably 75% to 85% of the area of the visible region of the corresponding tooth in the image. The color value of each of the one or more teeth may be associated with a respective center portion.
The method may include determining a chemical treatment based on the color values of one or more teeth. The chemical process may be determined by entering color values in a look-up table of chemical processes. The determined chemical treatment may involve applying a tooth whitening product to the teeth based on the color value. Preferably, the tooth whitening product is applied to the teeth for a period of time. The determined chemical treatment may involve recommending a tooth whitening product for the tooth based on the color value. Preferably, the tooth whitening product is used to produce a change in tooth color value.
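The look-up-table approach described above can be sketched as follows. The score bands and product names in this sketch are invented for illustration; the patent does not specify any particular thresholds or products.

```python
def recommend_treatment(whiteness_score):
    """Return a treatment recommendation for a whiteness score.

    The score ranges and treatment names are illustrative
    assumptions, not values taken from this document.
    """
    # Ordered from highest band to lowest; first match wins.
    lookup = [
        (40.0, "no treatment needed"),
        (25.0, "whitening toothpaste, 2 weeks"),
        (10.0, "whitening strips, 1 week"),
    ]
    for threshold, treatment in lookup:
        if whiteness_score >= threshold:
            return treatment
    return "consult a dentist for professional whitening"
```

A score falling in a higher band short-circuits the search, so only the first matching threshold is applied.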
The method may include storing the color value with an associated date or time stamp.
The image may be received from a camera of the user device. The method may be performed by a processor of the user device, or performed remotely from the user device, for example by a computer server.
According to a second aspect, there is provided a method performed by a user to determine color values of one or more teeth using a user device, wherein the user:
holds the calibration pattern adjacent to the user's mouth;
exposes one or more teeth to a camera of the user device; and
operates the user device to perform the method of the first aspect.
According to a third aspect, a computer-implemented method for training a segmentation model to identify teeth is provided. The segmentation model may be trained to identify individual teeth, for example, on a tooth-by-tooth basis. The method may include providing training data to the segmentation model. The training data may include a plurality of annotated images of teeth, wherein the teeth have been manually identified.
According to a fourth aspect, there is provided a computer readable medium comprising non-transitory computer program code configured to cause a processor to perform any of the methods described above.
According to a fifth aspect, there is provided a mobile computing device comprising:
a processor;
a camera for acquiring an image of teeth and a calibration pattern; and
a computer readable medium according to the fourth aspect.
The mobile computing device may be a portable computing device of a user, such as a laptop computer, tablet computer, or smartphone.
According to a sixth aspect, there is provided a tooth whitening kit comprising:
a tooth whitening product; and
a calibration pattern for use in a method according to the first and/or second aspect described above.
The tooth whitening kit may also provide instructions or a code for accessing a computer program configured to perform the method according to the first and/or second aspects described above, for example a Quick Response (QR) code, a Uniform Resource Locator (URL), or details of a program name in an application store.
According to a seventh aspect of the present invention there is provided a data processing unit configured to perform any of the methods described herein as being computer implementable. The data processing unit may comprise one or more processors and a memory comprising computer program code configured to cause the processor to perform any of the methods described herein.
A computer program may be provided which, when run on a computer, causes the computer to configure any apparatus, including the circuits, units, apparatus or systems disclosed herein, to perform any of the methods disclosed herein. The computer program may be a software implementation. The computer may comprise suitable hardware including one or more processors and memory configured to perform the methods defined by the computer program.
The computer program may be provided on a computer readable medium, which may be a physical computer readable medium such as a disk or memory device, or may be embodied as a transitory signal. Such transient signals may be network downloads, including internet downloads. The computer readable medium may be a computer readable storage medium or a non-transitory computer readable medium.
Those skilled in the art will appreciate that features or parameters described in relation to any one of the aspects described above may be applied to any other aspect unless mutually exclusive. Furthermore, unless mutually exclusive, any feature or parameter described herein may be applied to any aspect and/or combined with any other feature or parameter described herein.
Drawings
FIG. 1 shows a schematic block diagram of a computer system;
FIG. 2 illustrates a flow chart of a method for determining color values of one or more teeth;
fig. 3 shows an image of a user's mouth identifying regions of individual teeth using a tooth segmentation model.
Detailed Description
Except in the examples, or where otherwise explicitly indicated, all numbers in this description indicating amounts of material or conditions of reaction, physical properties of materials and/or use may optionally be understood as modified by the word "about".
It should be noted that any particular upper value may be associated with any particular lower value when specifying any range of values.
For the avoidance of doubt, the word "comprising" is intended to mean "including", but not necessarily "consisting of …". In other words, the listed steps or options need not be exhaustive.
The disclosure herein is considered to cover all embodiments defined by the claims as if the claims were multiply dependent on one another, irrespective of whether the claims as filed have been drafted without such multiple dependencies.
If features are disclosed in relation to a particular aspect of the invention (e.g., a composition of the invention), such disclosure is also considered applicable to any other aspect of the invention (e.g., a method of the invention) where necessary.
FIG. 1 illustrates a schematic block diagram of a computer system 100 that may be used to implement the methods described herein. The system may typically be provided by a user device, such as a laptop computer, tablet computer or smart phone.
The system 100 includes one or more processors 102 in communication with a memory 104. Memory 104 is an example of a non-transitory computer-readable storage medium. The one or more processors 102 are also in communication with one or more input devices 106 and one or more output devices 108. In addition to the one or more input devices 106, the processor communicates with a camera 110 to obtain one or more images. The various components of system 100 may be implemented using general means for computing known in the art, for example, input device 106 may comprise a keyboard or mouse, or a touch screen interface, and output device 108 may comprise a monitor or display, or an audio output device such as a speaker.
Fig. 2 illustrates a method 200 for determining color values of one or more teeth. The method 200 may be implemented by a computing device. Method 200 includes steps that may be performed by a processor locally at a user device or remotely at a server. Preferably, the computer-implemented method is provided by a software application, such as a WeChat applet, a Taobao applet or an application for a mobile device.
The various steps of method 200 may be performed in various orders and not necessarily in the order presented below. In particular, although some of the steps in fig. 2 are shown as being performed in parallel, alternatively, the steps may be performed in series, and vice versa.
The method 200 includes receiving 202 an image of a tooth and a calibration pattern. Preferably, the image comprises at least 8 teeth. One or more teeth are identified 204 from the image using the segmentation model. Preferably, the segmentation model is configured to identify individual teeth and to associate a region of the image with each individual tooth. It has been found that identifying individual teeth, as opposed to groups of teeth, can improve the accuracy of the color measurement. For example, the segmentation model may be implemented using a trained machine learning system.
Fig. 3 shows an image 300 of a user's mouth with areas of individual teeth identified using a tooth segmentation model. In this image 300, each identified tooth has been marked by a uniform masking region 301-312.
Returning to fig. 2, the observed color of each of the one or more teeth is determined 206 from the image. The observed color may be considered the original color of the pixel or pixels associated with a particular tooth. For example, one or more pixels of a central portion of the dental image may be used to avoid shadowing effects towards the edge of the tooth. In one example, segmentation results for individual teeth require removal of at least 10% or 15% of the peripheral area to reduce the effects of edge shadows and improve the accuracy of overall tooth color calculation. In such examples, the central region may occupy up to 85% or 90% of the tooth area, and the peripheral region may be excluded.
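The central-portion selection described above can be sketched as iterative erosion of a tooth mask. The 85% retention target follows the example in the text; the use of SciPy's binary erosion and the square test mask are implementation assumptions.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def central_region(mask, keep_fraction=0.85):
    """Erode a boolean tooth mask until at most `keep_fraction` of the
    original area remains, discarding the shadow-prone peripheral ring."""
    target = mask.sum() * keep_fraction
    central = mask.copy()
    while central.sum() > target:
        eroded = binary_erosion(central)
        if not eroded.any():
            break  # never erode a small mask away entirely
        central = eroded
    return central

# Synthetic 30x30 "tooth" inside a 40x40 image.
mask = np.zeros((40, 40), dtype=bool)
mask[5:35, 5:35] = True
core = central_region(mask)
```

Because erosion removes one boundary layer per iteration, the retained area approaches the target from above without ever leaving the original mask.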
Using the image of the calibration pattern, a plurality of colored regions of the calibration pattern are identified 208 and the observed color of each region is determined 210. The calibration pattern corresponds to a known pattern and includes a plurality of differently colored regions. Each color region on the calibration pattern (and on the known pattern) may be a homogeneous region of a single color, hue, or shade, although the color regions do not necessarily appear homogeneous in the image due to illumination effects.
The correction model is determined 212 by comparing the observed color of each color region of the calibration pattern with the corresponding known color of the corresponding known pattern. The known pattern contains the colors that would be expected to be observed in the calibration pattern if the calibration pattern were imaged by a known device under specific lighting conditions. For example, by comparing the observed red region with the corresponding known red, and making the same comparison for the green and blue regions, a large amount of information about the difference between the observed colors in the image and the actual known colors can be obtained. A difference value for each color region can be obtained, and the difference values can be used to derive a correction model using conventional color filtering algorithms. The correction model may be configured to provide a mapping between observed color and corrected color (thereby accounting for, e.g., ambient lighting conditions or camera settings), which may be applied to other portions of the same image. In this manner, after adjustment using the correction model, color values for the one or more teeth may be determined 214 based on the observed color of each of the one or more teeth. The color value may provide an indication of the whiteness level of a tooth or teeth. For example, the color value may provide an indication of an average (e.g., mean or median) color of all visible teeth. Preferably, the color value provides an indication of the median color of all of the one or more teeth. In some examples, the color value may be a definition of a color in an identified color space, such as CIELAB. In other examples, the color values may provide a whiteness score or whiteness grade, which may be on any scale.
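One simple way to realize such a correction model is an affine least-squares fit from observed to known patch colors. This is an illustrative assumption standing in for the "conventional color filtering algorithms" mentioned above; the patch colors and the dimming-plus-offset distortion below are synthetic.

```python
import numpy as np

def fit_correction(observed, known):
    """Fit an affine map  corrected = observed @ M + b  by least squares,
    using the calibration patches' observed vs. known RGB values.
    `observed` and `known` are (n_patches, 3) arrays."""
    n = observed.shape[0]
    A = np.hstack([observed, np.ones((n, 1))])   # add a bias column
    coeffs, *_ = np.linalg.lstsq(A, known, rcond=None)
    M, b = coeffs[:3], coeffs[3]
    return lambda colors: colors @ M + b

# Known patch colors of a hypothetical card, and a synthetic lighting
# distortion standing in for the camera/illumination.
known = np.array([[200., 40., 40.], [40., 200., 40.],
                  [40., 40., 200.], [180., 180., 180.],
                  [90., 90., 90.], [20., 20., 20.]])
observed = known * 0.8 + 15.0    # dimmer, with an offset

correct = fit_correction(observed, known)
tooth_observed = np.array([[170., 160., 140.]])
tooth_true = correct(tooth_observed)   # undoes the same distortion
```

Because the synthetic distortion is itself affine, the fit recovers it exactly; with real images the fit is only an approximation, which is why richer models (see the root-polynomial example later in the document) may be preferred.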
The user may use the user device to determine the color values of one or more teeth by holding the calibration pattern adjacent to the user's mouth, exposing the one or more teeth to a camera of the user device, and operating the user device to perform the method 200. That is, users can photograph themselves using their own device (e.g., a smartphone) and then process the image with a program to immediately obtain a tooth whiteness result. In everyday settings, tooth colors obtained directly from a photograph are not captured under consistent, standard lighting conditions, and images obtained from different smart devices are not standardized for color recognition. To overcome these limitations, the calibration pattern allows color calibration and white balancing of the tooth regions.
In some examples, the present invention may replace the traditional tooth whiteness assessment method (dentist scoring), for example by detecting tooth whiteness from a photograph taken by a mobile device. Consumers can learn the whiteness of their teeth anytime and anywhere using the invention. By avoiding the need to visit the dentist for tooth color assessment, time can be saved. Consumers can use the method at home to track the tooth whitening effect of products such as whitening strips and whitening emulsions. The score is available immediately.
For example, the above-described methods are configured to allow users to determine their own tooth whiteness. It may therefore be convenient to provide access to the method 200 together with a tooth whitening product, so that the method 200 may be applied to determine the outcome of using the product.
In one example, a tooth whitening kit is provided that includes a tooth whitening product and a calibration pattern for the method 200. The tooth whitening kit may also provide instructions or a code for accessing a computer program configured to perform the method 200. For example, the kit packaging may carry a code such as a QR code, a URL for accessing or acquiring the computer program, or details of a program name in an application store. Code or information for accessing the computer program may also be provided on a portion of the tooth whitening product container, the instructions, or the calibration pattern.
In some examples, the method 200 may further include providing personalized product recommendations based on the user's color values. In such examples, an appropriate chemical treatment is determined from the color values determined for one or more teeth. The appropriate treatment may be determined by entering the color values into a look-up table of chemical treatments. The determined chemical treatment may involve applying a tooth whitening product (e.g., a particular formulation of a tooth whitening agent) to the teeth based on the color value. Preferably, the tooth whitening product is applied to the teeth for a period of time. Alternatively, the determined chemical treatment may involve recommending a tooth whitening product for the teeth based on the color value. Preferably, the tooth whitening product is used to produce a change in tooth color value.
The method can be used to track the progress of the tooth whitening process over time. For example, a user may wish to see how a particular product affects their teeth over a period of use (e.g., days or weeks). To help facilitate such comparison, the method 200 may further include storing the color value with an associated date or time stamp. The aggregate data from the periods of use may be stored in a database and the software may be configured to display the data to the user, for example in tabular or graphical form.
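Storing color values with timestamps for progress tracking might look like the following JSON-lines sketch; the file format and field names are illustrative assumptions, not part of the described method.

```python
import datetime
import json
import os
import tempfile

def log_color_value(path, color_value):
    """Append a color value with an ISO-8601 timestamp to a JSON-lines
    file, so progress can later be shown in tabular or graphical form."""
    entry = {"timestamp": datetime.datetime.now().isoformat(),
             "color_value": color_value}
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")

def load_history(path):
    """Read back all logged entries, oldest first."""
    with open(path) as f:
        return [json.loads(line) for line in f]

# Demo: log two measurements over a (hypothetical) whitening period.
path = os.path.join(tempfile.mkdtemp(), "whiteness_log.jsonl")
log_color_value(path, 31.5)
log_color_value(path, 33.0)
history = load_history(path)
```

An append-only line-per-entry file keeps each measurement independent, which suits the periodic self-assessment use case described in the text.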
A computer-implemented method may also be provided for training a segmentation model to identify teeth, wherein the segmentation model is implemented by a machine learning algorithm. The method comprises providing training data to the segmentation model, the training data comprising a plurality of annotated images of teeth in which the teeth have been manually identified. Preferably, the segmentation model is trained to identify individual teeth, for example on a tooth-by-tooth basis.
The tooth segmentation model of the deep-neural-network-based tooth whiteness detection algorithm can be trained as follows:
a. take photographs of tooth samples;
b. mark each tooth in the sample photographs (i.e., the area of each tooth may be manually identified); and
c. train a deep learning tooth-by-tooth segmentation model based on the sample photographs with tooth markers.
Preferably, the sample photographs of teeth are taken with rich and evenly distributed tooth colors. Preferably, the markers are polygonal markers.
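Converting a polygonal tooth marker into a training mask could be done with a simple ray-casting rasterizer such as the sketch below. Real annotation pipelines would typically use a library routine; this self-contained version, and the square test polygon, are illustrative assumptions.

```python
import numpy as np

def polygon_to_mask(polygon, height, width):
    """Rasterize a polygon annotation (list of (x, y) vertices) into a
    boolean mask using the even-odd (ray casting) rule per pixel."""
    ys, xs = np.mgrid[0:height, 0:width]
    mask = np.zeros((height, width), dtype=bool)
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge span the horizontal line through each pixel row?
        crosses = (y1 <= ys) != (y2 <= ys)
        with np.errstate(divide="ignore", invalid="ignore"):
            # x-coordinate where the edge crosses each pixel's row.
            x_at = x1 + (ys - y1) * (x2 - x1) / (y2 - y1)
        # Toggle membership for pixels left of the crossing point.
        mask ^= crosses & (xs < x_at)
    return mask

# A 10x10 square "tooth" annotation inside a 16x16 image.
square = [(2, 2), (12, 2), (12, 12), (2, 12)]
tooth_mask = polygon_to_mask(square, 16, 16)
```

Horizontal edges never satisfy the crossing test, so they contribute nothing and the division-by-zero they would cause is safely masked out.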
The color calibration card detection model of the deep-neural-network-based tooth whiteness detection algorithm can be trained as follows:
a. take sample photographs of the color calibration card;
b. mark the area of the color calibration card in each photograph; and
c. train a deep learning object detection model based on the labeled sample photographs.
Preferably, the sample photographs of the color calibration card are taken at different brightnesses, color temperatures, photographing angles, and distances. More preferably, the sample photographs are taken at a resolution greater than 800p.
Preferably, the area of the color calibration card in the photographed picture is marked with a rectangular frame.
Examples
The following examples are presented to facilitate the understanding of the present invention. The examples are not intended to limit the scope of the claims.
An embodiment according to the method described previously with reference to fig. 2 is provided below. The inventors have determined that various technical challenges may be overcome in order to ensure reliability of tooth whiteness assessment. Tooth-by-tooth segmentation can be used to accurately identify regions of interest and reduce the impact on other regions of the oral cavity.
In one embodiment, the user takes the following steps:
i) The user takes a photograph of their teeth and of a color calibration card held by the user, through an application on the user's personal device (e.g., a smartphone or tablet computer). The color calibration card includes a plurality of color samples.
ii) Each individual tooth is segmented by a pre-trained tooth-by-tooth segmentation model, obtaining an image of all visible teeth.
iii) The color calibration card and each color sample are detected by a pre-trained color calibration card (and color sample) detection model.
iv) The color of each individual tooth in the photograph is calibrated based on the color and brightness variations of the sample matrix in the color calibration card, and the color is converted (to CIELAB color space, for comparison with the VITA shade guide) into a standard tooth whiteness score.
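The four steps above can be sketched as a pipeline skeleton. The heavy components (segmentation model, card detector, correction fit) are injected as callables; the stand-ins below are trivial placeholders for demonstration, not the trained models described in the text.

```python
import numpy as np

def assess_whiteness(image, segment, detect_card, fit_correction):
    """Skeleton of the four-step pipeline. Everything here is an
    illustrative assumption about how the steps could be wired up."""
    masks = segment(image)                      # step ii: per-tooth masks
    observed, known = detect_card(image)        # step iii: card patches
    correct = fit_correction(observed, known)   # step iv: correction model
    # Mean color of each segmented tooth, passed through the correction.
    return [correct(image[m].mean(axis=0)) for m in masks]

# Trivial stand-ins so the skeleton runs end to end.
image = np.zeros((4, 4, 3))
image[0, 0] = [200.0, 190.0, 170.0]            # one bright "tooth" pixel

def segment(img):
    mask = np.zeros(img.shape[:2], dtype=bool)
    mask[0, 0] = True
    return [mask]

def detect_card(img):
    patches = np.array([[10.0, 10.0, 10.0]])
    return patches, patches                     # observed == known

def fit_correction(observed, known):
    return lambda c: c                          # identity correction

colors = assess_whiteness(image, segment, detect_card, fit_correction)
```

Injecting the models as callables keeps the orchestration independent of the particular networks (Mask R-CNN, YOLO v3) discussed below.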
Various aspects of the four steps described above are described in further detail below.
In a first step, when taking a photograph of the front of the teeth, the user should show the teeth to the camera, hold the color calibration card close to the teeth, reduce shadows, and ensure that the light falls consistently on both the teeth and the color calibration card (e.g., the same light intensity and angle), to reduce errors caused by relative color changes between the teeth and the card. Typically, the image will include at least 8 teeth. The application may import an image, or directly take a picture of the user exposing their teeth alongside the standard color calibration card.
In a second step, after the image is obtained, the masked areas of the teeth are extracted from the image. This step does not necessarily use information about facial details, nor does it include the gingival area.
The Mask R-CNN model may be used to train the tooth-by-tooth segmentation model used in the second step. Details of the implementation of such a model can be found in: He, Kaiming, Georgia Gkioxari, Piotr Dollár, and Ross Girshick. "Mask R-CNN." arXiv:1703.06870 [cs], March 20, 2017. http://arxiv.org/abs/1703.06870.
In one embodiment, the dataset used to train the model includes 1,500 photographs of teeth taken from 100 users, covering different tooth color intervals. The user may choose to train the model using more or fewer images, or choose deep neural network models other than Mask R-CNN.
In the third step, the color calibration card and all color samples on the card are detected by a pre-trained color calibration card (and color sample) detection model to obtain the area of the card and the color of each sample.
The YOLO v3 model may be used to mark photographs of the color calibration card and to train the color calibration card (and color sample) detection model for the third step. Details of the implementation of such a model can be found in: Redmon, Joseph, and Ali Farhadi. "YOLOv3: An Incremental Improvement." arXiv:1804.02767 [cs], April 8, 2018. http://arxiv.org/abs/1804.02767.
In one embodiment, 1,000 pictures of the calibration card are used as the training dataset. After extracting the color calibration card area, OpenCV (https://opencv.org/) may be used to extract a numerical color matrix of sample color values according to the relative arrangement of the samples. The user may select other deep neural network models, such as Mask R-CNN, to extract the color patch matrix on the color calibration card directly or indirectly, with or without OpenCV. Using the obtained numerical color matrix together with the known true values of the color matrix, a color calibration and white balance model is obtained in a conventional manner. The invention does not necessarily impose a limit on the type of calibration card or the number of color patches on the card, so long as the color range covers a wide range of colors and whiteness levels.
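Extracting the numerical color matrix from a detected card region, given a known grid arrangement of samples, might look like the sketch below. The grid size, the sampling margin, and the synthetic 2x2 card are assumptions for illustration.

```python
import numpy as np

def extract_patch_matrix(card, rows, cols, margin=0.25):
    """Sample the mean color of the central part of each patch in a
    cropped color-card region with a known rows x cols grid layout.
    `margin` is the fraction of each cell skipped on every side."""
    h, w = card.shape[:2]
    ch, cw = h / rows, w / cols
    colors = np.empty((rows, cols, card.shape[2]))
    for r in range(rows):
        for c in range(cols):
            y0 = int(r * ch + margin * ch)
            y1 = int((r + 1) * ch - margin * ch)
            x0 = int(c * cw + margin * cw)
            x1 = int((c + 1) * cw - margin * cw)
            colors[r, c] = card[y0:y1, x0:x1].mean(axis=(0, 1))
    return colors

# Synthetic 2x2 card, each patch 10x10 pixels of a flat color.
card = np.zeros((20, 20, 3))
card[:10, :10] = [255, 0, 0]
card[:10, 10:] = [0, 255, 0]
card[10:, :10] = [0, 0, 255]
card[10:, 10:] = [255, 255, 255]
matrix = extract_patch_matrix(card, rows=2, cols=2)
```

Sampling only the center of each cell makes the extraction tolerant of slight misalignment of the detected card boundary and of bleed between printed patches.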
In a fourth step, the color calibration and white balance model is applied to each individual tooth region segmented in the second step to obtain a color-corrected true tooth color, and the color is converted into a whiteness score for each tooth according to a dental standard. For example, the color calibration method of Finlayson et al. (Finlayson, Graham D., Michal Mackiewicz, and Anya Hurlbert. "Colour correction using root-polynomial regression." IEEE Transactions on Image Processing 24.5 (2015): 1460-1470) may be applied to each tooth, and the RGB color values may then be extracted and converted to CIELAB color values for all pixels.
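A minimal version of the cited degree-2 root-polynomial correction can be sketched as follows. The patch colors and the uniform 0.5x exposure distortion are synthetic test data, not values from this document, and a production implementation would follow the cited paper in full.

```python
import numpy as np

def rp_features(rgb):
    """Degree-2 root-polynomial expansion of RGB values: the channels
    plus the square roots of the pairwise channel products."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.stack([r, g, b,
                     np.sqrt(r * g), np.sqrt(g * b), np.sqrt(r * b)],
                    axis=1)

def fit_root_polynomial(observed, known):
    """Least-squares fit from expanded observed colors to known colors."""
    M, *_ = np.linalg.lstsq(rp_features(observed), known, rcond=None)
    return lambda rgb: rp_features(rgb) @ M

# Synthetic calibration patches (known) seen at half exposure (observed).
known = np.array([[0.8, 0.2, 0.2], [0.2, 0.8, 0.2], [0.2, 0.2, 0.8],
                  [0.7, 0.7, 0.7], [0.4, 0.3, 0.2], [0.1, 0.1, 0.1]])
observed = known * 0.5
correct = fit_root_polynomial(observed, known)
```

Every root-polynomial feature scales linearly with exposure, which is the property (emphasized in the cited paper) that makes the correction robust to overall brightness changes; a uniform dimming is therefore recovered exactly.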
To finally obtain the tooth color values, lighting conditions on different teeth are considered. Dark shadows on the side teeth and white reflections on the incisors are sources of noise that should be removed from the color calculation. To improve the accuracy and consistency of overall color extraction, each tooth region is cut from the image and eroded by 15% along its own contour, since the sides of the teeth tend to be relatively underexposed. That is, the outer 15% of each tooth area is removed. In more stringent cases, the 10% darkest and 10% brightest values in the CIELAB range are also removed. The median of the resulting per-tooth color values may be used as the representative value.
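The trimming-and-median step can be sketched as follows; the example pixel values are synthetic, with one shadow and one specular-highlight outlier.

```python
import numpy as np

def representative_color(lab_pixels, trim=0.10):
    """Per-tooth color from its CIELAB pixels: drop the darkest and
    brightest `trim` fraction by L*, then take the channel-wise median."""
    L = lab_pixels[:, 0]
    lo, hi = np.quantile(L, [trim, 1.0 - trim])
    kept = lab_pixels[(L >= lo) & (L <= hi)]
    return np.median(kept, axis=0)

# Eight well-exposed pixels plus a dark shadow and a bright reflection.
lab = np.array([[0.0, 0.0, 0.0]]
               + [[70.0 + i, 2.0, 10.0] for i in range(8)]
               + [[100.0, 0.0, 0.0]])
color = representative_color(lab)
```

Trimming by L* before taking the median removes exactly the shadow and highlight pixels that the text identifies as noise, so the result reflects only the well-exposed enamel.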
The color values may be provided as the CIE whiteness index (WIO: W. Luo, S. Westland, P. Brunton, R. Ellwood, I. A. Pretty, N. Mohan. "Comparison of the ability of different colour indices to assess changes in tooth whiteness." J. Dent. 35 (2007) 109-116) or as the CIELAB-based whiteness index (WID: M. del Mar Pérez, R. Ghinea, M. J. Rivas, A. Yebra, A. M. Ionescu, R. D. Paravina, L. J. Herrera. "Development of a customized whiteness index for dentistry based on CIELAB colour space." Dent. Mater. 32 (2016) 461-467), with the corresponding formulas below. The VITA shade guide color closest in Euclidean distance (ΔE) in CIELAB space may also be provided as a color value.
WIO = Y + 1075.012(xn - x) + 145.516(yn - y) (1)
where (x, y) and (xn, yn) are the chromaticity coordinates of the sample and the reference white, respectively.
WID = 0.511L* - 2.324a* - 1.100b* (2)
where L*, a* and b* correspond to the lightness, green-red and blue-yellow coordinates, respectively.
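A direct transcription of equations (1) and (2) into code might look as follows; the D65 (2° observer) reference-white chromaticity used as a default is an assumption for the sketch, since the reference white is not specified here:

```python
def wid(L, a, b):
    """CIELAB-based whiteness index, equation (2) (Pérez et al. 2016)."""
    return 0.511 * L - 2.324 * a - 1.100 * b

def wio(Y, x, y, xn=0.3127, yn=0.3290):
    """CIE whiteness index, equation (1) (Luo et al. 2007).
    (xn, yn) default to the D65 2-degree white point as an assumption;
    the reference white actually intended may differ."""
    return Y + 1075.012 * (xn - x) + 145.516 * (yn - y)
```

Both indices increase as a tooth approaches the reference white; for example, a sample whose chromaticity equals the reference white gives WIO = Y.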
The color values may also provide a whiteness score or whiteness grade that may be on any scale.

Claims (14)

1. A computer-implemented method (200) for determining color values of one or more teeth, comprising:
receiving (202) an image of a tooth and a calibration pattern;
identifying (204) one or more teeth from the image using a segmentation model, wherein the segmentation model is a tooth-by-tooth segmentation model configured to detect individual teeth in the image;
determining (206) an observed color of each of the one or more teeth from the image;
identifying (208) a plurality of colored regions of the calibration pattern from the image;
determining (210) an observed color for each colored region of the calibration pattern;
comparing (212) the observed color of each colored region of the calibration pattern with the corresponding known color of a corresponding known pattern, thereby determining a correction model; and
applying (214) the correction model to the observed color of each of the one or more teeth to determine color values of the one or more teeth.
2. The method (200) of claim 1, wherein the color value provides an indication of a whiteness level of the one or more teeth.
3. The method (200) of claim 1 or claim 2, wherein the color value provides an indication of a median color of all of the one or more teeth.
4. The method (200) according to one of the preceding claims, wherein the method comprises determining a chemical treatment from the color values of the one or more teeth.
5. The method (200) according to claim 4, wherein the determined chemical treatment involves recommending a tooth whitening product for the tooth based on the color values of the one or more teeth.
6. The method (200) of one of the preceding claims, wherein the calibration pattern is positioned adjacent to an oral cavity containing the one or more teeth in the image.
7. The method (200) according to one of the preceding claims, wherein the image comprises at least 8 teeth.
8. The method (200) of one of the preceding claims, wherein the image is received from a camera of a user equipment.
9. The method (200) according to one of the preceding claims, wherein the method is provided by a software application, preferably a WeChat applet, a Taobao applet or an application for a mobile device.
10. A method performed by a user using a user device to determine color values of one or more teeth, wherein the user:
holds a calibration pattern adjacent to the user's mouth;
exposes one or more teeth to a camera of the user device; and
operates the user device to perform the method (200) according to one of the preceding claims.
11. A computer-implemented method for training a segmentation model to identify teeth, wherein the segmentation model is implemented by a machine learning algorithm, the method comprising providing training data to the segmentation model, the training data comprising a plurality of annotated images of teeth, wherein the teeth have been manually identified.
12. A computer-readable medium comprising non-transitory computer program code configured to cause a processor to perform the method (200) according to one of the preceding claims.
13. A mobile computing device, comprising:
the computer-readable medium of claim 12;
a processor; and
a camera for acquiring images of teeth and calibration patterns.
14. A tooth whitening kit comprising:
a tooth whitening product; and
a calibration pattern for use in a method according to one of claims 1 to 10.
CN202280072234.XA 2021-10-28 2022-10-21 Method and apparatus for determining tooth color values Pending CN118160046A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CNPCT/CN2021/126965 2021-10-28
CN2021126965 2021-10-28
EP21210915 2021-11-29
EP21210915.1 2021-11-29
PCT/EP2022/079331 WO2023072743A1 (en) 2021-10-28 2022-10-21 Methods and apparatus for determining a colour value of teeth

Publications (1)

Publication Number Publication Date
CN118160046A true CN118160046A (en) 2024-06-07

Family

ID=84273991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280072234.XA Pending CN118160046A (en) 2021-10-28 2022-10-21 Method and apparatus for determining tooth color values

Country Status (3)

Country Link
EP (1) EP4423768A1 (en)
CN (1) CN118160046A (en)
WO (1) WO2023072743A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5766006A (en) * 1995-06-26 1998-06-16 Murljacic; Maryann Lehmann Tooth shade analyzer system and methods
WO2004004554A1 (en) * 2002-07-03 2004-01-15 Kabushiki Kaisha Shofu Tooth surface information system
DE102019201279A1 (en) * 2019-01-31 2020-08-06 Vita Zahnfabrik H. Rauter Gesellschaft mit beschränkter Haftung & Co. Kommanditgesellschaft Support system for dental treatment, especially by changing the tooth color
FI130746B1 (en) * 2019-03-29 2024-02-26 Lumi Dental Ltd Determining tooth color shade based on image obtained with a mobile device
CN113436734B (en) * 2020-03-23 2024-03-05 北京好啦科技有限公司 Tooth health assessment method, equipment and storage medium based on face structure positioning
CN111462114A (en) * 2020-04-26 2020-07-28 广州皓醒湾科技有限公司 Tooth color value determination method and device and electronic equipment

Also Published As

Publication number Publication date
EP4423768A1 (en) 2024-09-04
WO2023072743A1 (en) 2023-05-04

Similar Documents

Publication Publication Date Title
Koschan et al. Digital color image processing
Cal et al. Application of a digital technique in evaluating the reliability of shade guides
KR101140533B1 (en) Method and system for recommending a product based upon skin color estimated from an image
US7463757B2 (en) Tooth locating within dental images
Carter et al. Automated quantification of dental plaque accumulation using digital imaging
US11010894B1 (en) Deriving a skin profile from an image
CN106791759A (en) The bearing calibration of medical display color uniformity and correction system
JP2008532401A (en) Reflection spectrum estimation and color space conversion using reference reflection spectrum
US7341450B2 (en) Tooth shade scan system and method
WO2012038474A1 (en) Determining the colour of teeth
CN113365546B (en) Dental treatment assistance system, in particular for changing the color of teeth
CN110060250A (en) A kind of tongue body image processing method, device and electronic equipment
KR20180114293A (en) Method Comparing The Forged Sealing
EP4170646A1 (en) Image display system and image display method
WO2005124302A1 (en) Image processing program, image processing apparatus, and image processing method
JP3710802B2 (en) Tooth material color tone selection support program and tooth material color tone selection support method
CN118160046A (en) Method and apparatus for determining tooth color values
CN115278186B (en) Controllable uniform projection method, device, equipment and medium based on Internet of things
US20040248057A1 (en) Enhanced tooth shade guide
Beneducci et al. Dental shade matching assisted by computer vision techniques
CN111462114A (en) Tooth color value determination method and device and electronic equipment
JP2019030587A (en) Gingivitis examination system, gingivitis examination method and its program
PREJMEREAN et al. A DECISION SUPPORT SYSTEM FOR COLOR MATCHING IN DENTISTRY
JP2009003581A (en) Image storage-retrieval system and program therefor
Suehara et al. Color calibration for the pressure ulcer image acquired under different illumination: a key step for simple, rapid and quantitative color evaluation of the human skin diseases using a digital camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination