WO2023072743A1 - Methods and apparatus for determining a tooth colour value - Google Patents

Methods and apparatus for determining a tooth colour value

Info

Publication number
WO2023072743A1
WO2023072743A1 (PCT/EP2022/079331)
Authority
WO
WIPO (PCT)
Prior art keywords
teeth
colour
image
tooth
calibration pattern
Prior art date
Application number
PCT/EP2022/079331
Other languages
English (en)
Inventor
Wenchao HU
Yiwen Sun
Shan Wang
Hui Yang
Original Assignee
Unilever Ip Holdings B.V.
Unilever Global Ip Limited
Conopco, Inc., D/B/A Unilever
Priority date
Filing date
Publication date
Application filed by Unilever Ip Holdings B.V., Unilever Global Ip Limited, Conopco, Inc., D/B/A Unilever filed Critical Unilever Ip Holdings B.V.
Publication of WO2023072743A1

Links

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00: ICT specially adapted for the handling or processing of medical images
    • G16H30/40: ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • This invention relates to methods and apparatus for determining a colour value of one or more teeth and in particular, for user self-assessment of tooth whiteness.
  • Some tooth whitening products provide a printed colour calibration card for consumers to measure their tooth whiteness.
  • Conventional colour calibration cards usually contain images of teeth of various shades. A user may hold the colour calibration card next to their mouth to allow another person to assess which image appears to be most similar to the teeth of the user, or the user may view their own teeth at the same time as the images in a mirror. Such colour calibration cards are not validated standard tools. Also, the evaluation accuracy could be compromised due to the printing quality of the card and the accuracy of the user’s visual evaluation.
  • a computer-implemented method for determining a colour value of one or more teeth comprising: receiving an image of teeth and a calibration pattern; identifying one or more teeth from the image using a segmenting model, wherein the segmenting model is a tooth-by-tooth segmentation model configured to detect individual teeth in the image; determining an observed colour of each of the one or more teeth from the image; identifying a plurality of coloured areas of the calibration pattern from the image; determining an observed colour of each of the coloured areas of the calibration pattern; determining a correction model by comparing the observed colour of each of the coloured areas of the calibration pattern with a respective known colour of a corresponding known pattern; and applying the correction model to the observed colour of each of the one or more teeth to determine a colour value of the one or more teeth.
  • the colour value may provide an indication of a level of whiteness of the one or more teeth.
  • the colour value of the one or more teeth may be a single value associated with a plurality of teeth.
  • the method may comprise determining a colour value of each of the one or more teeth by applying the correction model to the observed colour of each of the one or more teeth.
  • the observed colour of the teeth or the colour value may comprise an indication of a colour, tone or shade of the one or more teeth.
  • the coloured areas of the calibration pattern may also be referred to as areas of colour, or swatches.
  • the areas of the calibration pattern may be respective homogenous areas of a single colour, including areas of a particular tone or shade.
  • the method may comprise selecting a central portion of each of the one or more teeth. Each central portion may be surrounded by a peripheral portion. Each central portion may comprise at least 60%, preferably from 65% to 95%, more preferably from 70% to 90% and even more preferably from 75% to 85% of the visible area of a respective tooth within the image. A colour value of each of the one or more teeth may be associated with a respective central portion.
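As a minimal sketch of the central-portion selection described above, a tooth mask can be shrunk by repeated binary erosion until only the stated fraction of the visible area remains. The plain-NumPy erosion and the 80% target fraction are illustrative choices, not a prescribed implementation:

```python
import numpy as np

def binary_erode(mask: np.ndarray) -> np.ndarray:
    """One step of 4-neighbourhood binary erosion (no external dependencies)."""
    padded = np.pad(mask, 1, mode="constant")
    return (
        padded[1:-1, 1:-1]
        & padded[:-2, 1:-1] & padded[2:, 1:-1]
        & padded[1:-1, :-2] & padded[1:-1, 2:]
    )

def central_portion(mask: np.ndarray, target_fraction: float = 0.8) -> np.ndarray:
    """Erode a boolean tooth mask until its area is at most `target_fraction`
    of the original visible area (the 75%-85% band mentioned in the text)."""
    original_area = mask.sum()
    central = mask.copy()
    while central.sum() > target_fraction * original_area and central.any():
        central = binary_erode(central)
    return central
```

Because erosion removes pixels from the contour inwards, the peripheral portion discarded here is exactly the shadow-prone band near the tooth edges.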
  • the method may comprise determining a chemical treatment in accordance with the colour value of the one or more teeth. The chemical treatment may be determined by entering the colour value in a look-up table of chemical treatments. The determined chemical treatment may relate to applying a tooth whitening product to teeth in accordance with the colour value.
  • the tooth whitening product is applied to teeth for a period of time.
  • the determined chemical treatment may relate to recommending a tooth whitening product in accordance with the colour value.
  • the tooth whitening product is for producing a change in colour value of the teeth.
  • the method may comprise storing the colour value with an associated date-stamp or timestamp.
  • the images may be received from a camera of a user device.
  • the method may be performed by a processor of the user device.
  • the method may be performed remotely from the user device.
  • the method may be performed by a computer server.
  • a method performed by a user to determine a colour value of one or more teeth using a user device, wherein the user: holds a calibration pattern adjacent to the user’s mouth; bares one or more teeth to a camera of the user device; and operates the user device to perform the method of the first aspect.
  • a computer-implemented method for training a segmenting model to identify teeth. The segmenting model may be trained to identify individual teeth, for example, on a tooth-by-tooth basis.
  • the method may comprise providing the segmenting model with training data.
  • the training data may comprise a plurality of annotated images of teeth in which the teeth have been manually identified.
  • a computer-readable medium comprising non- transitory computer program code configured to cause a processor to execute any of the above methods.
  • a mobile computing device comprising: a processor; a camera for obtaining an image of teeth and a calibration pattern; and the computer readable medium according to the fourth aspect.
  • the mobile computing device may be a user’s portable computing device, such as a laptop computer, tablet (e.g. Apple® iPad®) or smart phone.
  • a tooth whitening kit comprising: a tooth whitening product; and a calibration pattern for use in the method according to the first and/or second aspects described above.
  • the tooth whitening kit may also provide instructions or a code for accessing a computer program configured to perform the method according to the first and/or second aspects described above, such as the providing of a quick response (QR) code, a uniform resource locator (URL) or details of the program name in an App Store.
  • QR quick response
  • URL uniform resource locator
  • a data processing unit configured to perform any method described herein as computer-implemented.
  • the data processing unit may comprise one or more processors and memory, the memory comprising computer program code configured to cause the processor to perform any method described herein.
  • the computer program may be a software implementation.
  • the computer may comprise appropriate hardware, including one or more processors and memory that are configured to perform the method defined by the computer program.
  • the computer program may be provided on a computer readable medium, which may be a physical computer readable medium such as a disc or a memory device, or may be embodied as a transient signal. Such a transient signal may be a network download, including an internet download.
  • the computer readable medium may be a computer readable storage medium or non-transitory computer readable medium.
  • Figure 1 illustrates a schematic block diagram of a computer system
  • Figure 2 illustrates a flow chart of a method for determining the colour value of one or more teeth
  • Figure 3 illustrates an image of a user’s mouth with areas of individual teeth identified using a tooth segmentation model.
  • Figure 1 illustrates a schematic block diagram of a computer system 100 which may be used to implement the methods described herein.
  • the system may typically be provided by a user device, such as a laptop computer, tablet computer or smart phone.
  • the system 100 comprises one or more processors 102 in communication with memory 104.
  • the memory 104 is an example of a non-transitory computer readable storage medium.
  • the one or more processors 102 are also in communication with one or more input devices 106 and one or more output devices 108.
  • the processor is in communication with a camera 110 for obtaining one or more images.
  • the various components of the system 100 may be implemented using generic means for computing known in the art.
  • the input devices 106 may comprise a keyboard or mouse, or a touch screen interface
  • the output devices 108 may comprise a monitor or display, or an audio output device such as a speaker.
  • Figure 2 illustrates a method 200 for determining a colour value of one or more teeth.
  • the method 200 may be implemented by computing means.
  • the method 200 comprises steps that may be performed by a processor, either locally at a user device or remotely at a server.
  • the computer-implemented method is provided by a software application such as a WeChat Mini Program, Taobao Mini Program or App for a mobile device.
  • the method 200 comprises receiving 202 an image of teeth and a calibration pattern.
  • the image comprises at least 8 teeth.
  • One or more teeth are identified 204 from the image using a segmenting model.
  • the segmentation model is configured to recognize individual teeth and associate a region of the image with each individual tooth. The identification of individual teeth, as opposed to groups of teeth, has been found to improve the accuracy of determination of the colour.
  • the segmentation model may be implemented using a trained machine learning system, for example.
  • Figure 3 illustrates an image 300 of a user’s mouth with areas of individual teeth identified using a tooth segmentation model. The respective identified teeth have been marked by homogeneous masked regions 301-312 in this image 300.
  • an observed colour of each of the one or more teeth is determined 206 from the image.
  • the observed colour may be taken to be the raw colour of a pixel or a plurality of pixels associated with a particular tooth.
  • one or more pixels at a central portion of an image of the tooth may be used to avoid shadow effects towards the edges of the teeth.
  • at least 10% of the segmented region of a single tooth, at its periphery, may be excluded from the overall tooth colour calculation.
  • the central region may comprise up to 85% or 90% of the region of the tooth, and may exclude the peripheral region.
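Once a tooth-by-tooth segmenter has produced a mask for a tooth, an observed colour can be extracted by pooling the masked pixels. A minimal NumPy sketch using the mean (one plausible statistic; the text later prefers a trimmed median):

```python
import numpy as np

def observed_tooth_colour(image: np.ndarray, tooth_mask: np.ndarray) -> np.ndarray:
    """Mean colour over the pixels a segmentation mask assigns to one tooth.

    image:      H x W x 3 array (any colour space)
    tooth_mask: H x W boolean array from the tooth-by-tooth segmenter
    """
    pixels = image[tooth_mask]  # N x 3 array of masked pixels
    return pixels.mean(axis=0)
```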
  • the calibration pattern corresponds to a known pattern and comprises a plurality of areas of different colours.
  • Each area of colour on the calibration pattern sheet (and the known calibration pattern) may be a homogenous area of a single colour, tone or shade, although due to the effects of lighting the areas of colour will not necessarily appear to be homogeneous in the image.
  • a correction model is determined 212 by comparing the observed colour of each of the areas of colour of the calibration pattern with a respective known colour of a corresponding known pattern.
  • the known pattern contains the colours that would be expected to be observed in the calibration pattern if it were viewed under specific lighting conditions by a known device. For example, by comparing an observed red with a corresponding known red, and doing the same for a green area and a blue area, a substantial amount of information is available on the difference between the observed colour in the image and the actual known colour.
  • a difference value for each of the areas of colour can be obtained. The difference values may be used to obtain a correction model using conventional colour filtering algorithms.
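The text leaves the "conventional colour filtering algorithms" unspecified; one minimal possibility, shown here as an assumption rather than the patent's method, is a 3x3 linear correction matrix fitted by least squares to the observed/known swatch pairs:

```python
import numpy as np

def fit_linear_correction(observed: np.ndarray, known: np.ndarray) -> np.ndarray:
    """Fit a 3x3 matrix M minimising ||observed @ M - known||^2,
    given N >= 3 swatches (rows are RGB triples)."""
    M, *_ = np.linalg.lstsq(observed, known, rcond=None)
    return M

def apply_correction(M: np.ndarray, colours: np.ndarray) -> np.ndarray:
    """Map observed colours (rows) to corrected colours."""
    return colours @ M

# Illustrative data: the observed swatches are the known ones dimmed to 50%,
# as if the photo were underexposed.
known = np.array([[200.0, 40.0, 40.0], [40.0, 200.0, 40.0],
                  [40.0, 40.0, 200.0], [180.0, 180.0, 180.0]])
observed = known * 0.5
M = fit_linear_correction(observed, known)
# Applying the same correction to an observed tooth colour undoes the dimming.
tooth = apply_correction(M, np.array([[110.0, 105.0, 95.0]]))
```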
  • the correction model may be configured to provide a mapping between observed colours and corrected colours (thereby accounting for the ambient lighting conditions or camera settings, for example), which may be applied to other parts of the same image.
  • a colour value of the one or more teeth may be determined 214 based on the observed colour of each of the one or more teeth after adjustment using the correction model.
  • the colour value may provide an indication of a level of whiteness, of a single tooth, or a plurality of teeth.
  • the colour value may provide an indication of an average (e.g. mean or median) colour for all of the one or more teeth that are visible.
  • the colour value provides an indication of a median colour for all of the one or more teeth.
  • the colour value may be a definition of a colour in a recognised colour space, such as CIELAB.
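Converting corrected RGB values into CIELAB uses the standard sRGB/D65 formulae; a self-contained sketch (the colour-space constants below are the published sRGB and D65 values, not patent-specific choices):

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert one sRGB triple (0-255) to CIELAB under the D65 white point."""
    rgb = np.asarray(rgb, dtype=float) / 255.0
    # Inverse sRGB companding to linear RGB.
    rgb = np.where(rgb > 0.04045, ((rgb + 0.055) / 1.055) ** 2.4, rgb / 12.92)
    # Linear RGB -> XYZ (standard sRGB/D65 matrix).
    M = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = M @ rgb
    # Normalise by the D65 white point, then apply the CIELAB transfer function.
    xyz /= np.array([0.95047, 1.0, 1.08883])
    f = np.where(xyz > (6 / 29) ** 3, np.cbrt(xyz), xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[1] - 16
    a = 500 * (f[0] - f[1])
    b = 200 * (f[1] - f[2])
    return L, a, b
```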
  • the colour value may provide a whiteness score or a whiteness rating, which may be on an arbitrary scale.
  • a user may determine a colour value of one or more teeth using a user device by holding a calibration pattern adjacent to the user's mouth, baring one or more teeth to a camera of the user device, and operating the user device to perform the method 200. That is, the user may use their own photo-taking device, such as a smartphone, to take a selfie, then process the image using a program to get a tooth whiteness result instantly.
  • the tooth colours obtained directly from photos of the teeth are not captured under consistent, standard lighting conditions, and the images obtained from various smart devices are also not normalised for colour identification purposes.
  • the calibration pattern allows colour calibration and white balancing on teeth regions.
  • the invention may replace the traditional tooth whiteness evaluation method (dentist scoring) with detecting tooth whiteness from photos taken by a mobile device, for example.
  • Consumers can use the invention to know their tooth whiteness at any time and place. Time can be saved by avoiding the need to visit a dentist for the purpose of tooth colour evaluation. Consumers may use the method at home for tracking tooth whitening effects of products like whitening strips and whitening emulsions. The results of score may be available immediately.
  • the above method is configured to allow a user to determine their own tooth whiteness, for example. As such, it may be convenient to provide access to the method 200 alongside a tooth whitening product, so that the method 200 can be applied to determine the result of using the product.
  • a tooth whitening kit comprising a tooth whitening product and a calibration pattern for use in the method 200.
  • the tooth whitening kit may also provide instructions or a code for accessing a computer program configured to perform the method 200.
  • the box of the kit may carry a code such as a QR code, a URL for accessing or obtaining the computer program, or details of the program's name in an App Store.
  • the code or information for accessing the computer program may also be provided on the tooth whitening product container, instructions or part of the calibration pattern.
  • the method 200 may further comprise providing a personalised product recommendation based on a user's colour value.
  • an appropriate chemical treatment is determined in accordance with the colour value determined for the one or more teeth.
  • An appropriate treatment may be determined by entering the colour value in a look-up table of chemical treatments.
  • the determined chemical treatment may relate to applying a tooth whitening product, such as a specific formulation of a tooth whitening agent, to teeth in accordance with the colour value.
  • the tooth whitening product is applied to teeth for a period of time.
  • the determined chemical treatment may relate to recommending a tooth whitening product in accordance with the colour value.
  • the tooth whitening product is for producing a change in colour value of the teeth.
  • the method may be used to track the progress of a tooth whitening treatment over time. For example, a user may wish to see how a particular product has affected their teeth over a period of use, such as a number of days or weeks. To assist in facilitating such comparisons, the method 200 may further comprise storing the colour value with an associated date-stamp or time-stamp.
  • the aggregated data from a period of use may be stored in a database and the software may be configured to display the data to the user in the form of a table or graph, for example.
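A minimal way to store date-stamped colour values for the tables or graphs described above, sketched with Python's built-in sqlite3 (the table and column names are illustrative, not from the source):

```python
import sqlite3
from datetime import datetime, timezone

def open_store(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the tracking database."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS whiteness_log ("
        "recorded_at TEXT NOT NULL, colour_value REAL NOT NULL)"
    )
    return conn

def log_colour_value(conn: sqlite3.Connection, colour_value: float) -> None:
    """Store a colour value with an ISO-8601 UTC timestamp."""
    conn.execute(
        "INSERT INTO whiteness_log VALUES (?, ?)",
        (datetime.now(timezone.utc).isoformat(), colour_value),
    )

def history(conn: sqlite3.Connection):
    """Return (timestamp, colour_value) rows in chronological order,
    ready to render as a table or trend graph."""
    return conn.execute(
        "SELECT recorded_at, colour_value FROM whiteness_log ORDER BY recorded_at"
    ).fetchall()
```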
  • a computer-implemented method for training a segmenting model to identify teeth, wherein the segmenting model is implemented by a machine learning algorithm, the method comprising providing the segmenting model with training data comprising a plurality of annotated images of teeth in which the teeth have been manually identified.
  • the segmenting model is trained to identify individual teeth, for example, on a tooth-by-tooth basis.
  • a teeth segmentation model of a tooth whiteness detection algorithm based on a deep neural network may be trained by: a. taking sample photos of teeth; b. labelling each individual tooth in the taken sample photos (that is, the areas of teeth may be identified manually); and c. training the deep learning tooth-by-tooth segmentation model based on the sample photos with tooth labelling.
  • the sample photos of teeth are taken with abundant and uniformly distributed tooth colours.
  • the labelling is polygonal labelling.
  • a colour calibration card detection model of a tooth whiteness detection algorithm based on a deep neural network may be trained by: a. taking sample photos of the colour calibration card; b. labelling the areas of the colour calibration card in the taken photos; and c. training a deep learning object detection model based on the labelled sample photos.
  • the sample photos of the colour calibration card are taken under different brightness levels, colour temperatures, shooting angles and distances. More preferably, the sample photos are taken at a resolution of more than 800p.
  • the areas of colour calibration card in the taken photos are labelled with rectangular boxes.
  • Tooth-by-tooth segmentation may be used to precisely identify a region of interest and reduce distraction from other areas of the mouth.
  • the following steps are taken by a user: i) the user takes a photo of their teeth and a colour calibration card held by the user, through an application on the user's personal device, such as a smartphone or tablet.
  • the colour calibration card comprises a plurality of swatches.
  • ii) Obtain an image of all visible teeth by segmenting each individual tooth through a pretrained tooth-by-tooth segmentation model.
  • iii) Detect the colour calibration card and the colour of each swatch through a pretrained colour calibration card (and colour swatch) detection model.
  • when a user takes a photo of the front of their teeth, the user should show their teeth to the camera, bring the colour calibration card close to the teeth, reduce shadows, and ensure the lighting on the teeth and the colour calibration card is consistent, such as the same light intensity and angle, to reduce the error caused by relative colour changes between the teeth and the colour calibration card.
  • the image will comprise at least 8 teeth.
  • the application can import an image or directly take a photo of user while exposing teeth along with a standard colour calibration card.
  • in the second step, after the image is obtained, masked regions of teeth are extracted from the image.
  • this step does not necessarily use information regarding facial details, nor does it include gum regions.
  • a Mask R-CNN model may be used to train the tooth-by-tooth segmentation model used in the second step. Details regarding implementation of such a model can be found in He, Kaiming, Georgia Gkioxari, Piotr Dollár, and Ross Girshick, "Mask R-CNN", arXiv:1703.06870.
  • the dataset used for training the model included 1,500 teeth photos taken from 100 users to address different colour intervals of teeth. Users may choose to train the model using more or fewer images, or select deep neural network models other than Mask R-CNN.
  • the colour calibration card and all colour swatches in the card are detected to obtain the area of colour calibration card and the colour of each swatch through the pretrained colour calibration card (and colour swatch) detection model.
  • a YOLO v3 model may be used to label photos of the colour calibration card and train the colour calibration card (and colour swatch) detection model for use in the third step. Details regarding the implementation of such a model can be found in Redmon, Joseph, and Ali Farhadi, "YOLOv3: An Incremental Improvement", arXiv:1804.02767 [cs], April 8, 2018. http://arxiv.org/abs/1804.02767.
  • a colour calibration and white balancing model is applied to each individual tooth area segmented in the second step to obtain the true tooth colour after colour rendition, and the colour is converted into a whiteness score for each tooth according to dental standards.
  • the Finlayson 2015 colour calibration method (Finlayson, Graham D., Michal Mackiewicz, and Anya Hurlbert, "Colour correction using root-polynomial regression", IEEE Transactions on Image Processing 24.5 (2015): 1460-1470) may be applied to each tooth, then the RGB colour values may be extracted and converted to CIELAB colour values for all pixels.
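A degree-2 root-polynomial fit in the spirit of Finlayson et al. 2015 can be sketched as follows. The feature set is the degree-2 root-polynomial expansion from the paper (R, G, B, sqrt(RG), sqrt(GB), sqrt(RB)), whose key property is that every term scales linearly with exposure; the plain least-squares fitting shown here is a simplification of the paper's full method:

```python
import numpy as np

def root_poly_expand(rgb: np.ndarray) -> np.ndarray:
    """Degree-2 root-polynomial features: R, G, B, sqrt(RG), sqrt(GB), sqrt(RB).
    Scaling the input by k scales every feature by k (exposure invariance)."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.stack(
        [r, g, b, np.sqrt(r * g), np.sqrt(g * b), np.sqrt(r * b)], axis=1
    )

def fit_root_poly(observed: np.ndarray, known: np.ndarray) -> np.ndarray:
    """Least-squares map from expanded observed swatch colours to known ones."""
    M, *_ = np.linalg.lstsq(root_poly_expand(observed), known, rcond=None)
    return M

def correct(M: np.ndarray, rgb: np.ndarray) -> np.ndarray:
    """Apply a fitted root-polynomial correction to observed colours."""
    return root_poly_expand(rgb) @ M
```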
  • when determining tooth colour values, the lighting condition on different teeth is accounted for.
  • the dark shading on lateral teeth and white reflections on front teeth are noise sources that should be removed from colour calculation.
  • teeth regions were sliced from the image and eroded by 15% of their individual area, close to their own contours, because the lateral sides of teeth tend to be relatively underexposed. That is, the 15% outer region of each tooth region was removed.
  • the 10% darkest and brightest values from the CIELAB colour ranges were also removed.
  • the median of the resulting colour values of each tooth may be used as the representative value.
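The trim-and-median step above can be sketched directly: drop the 10% darkest and 10% brightest values, then take the median of what remains. Shown for the CIELAB L* channel only; applying the same trimming per channel is an assumption:

```python
import numpy as np

def representative_lightness(l_values: np.ndarray) -> float:
    """Trim the 10% darkest and 10% brightest L* values of one tooth's
    pixels, then return the median of the remaining values."""
    lo, hi = np.percentile(l_values, [10, 90])
    trimmed = l_values[(l_values >= lo) & (l_values <= hi)]
    return float(np.median(trimmed))
```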
  • the colour value may provide a CIE whiteness index (WIO: W. Luo, S. Westland, P. Brunton, R. Ellwood, I.A. Pretty, N. Mohan, "Comparison of the ability of different colour indices to assess changes in tooth whiteness", J. Dent. 35 (2007) 109-116) and a CIELAB-based whiteness index (WID: M. del Mar Pérez, R. Ghinea, M.J. Rivas, A. Yebra, A.M. Ionescu, R.D. Paravina, L.J. Herrera, "Development of a customized whiteness index for dentistry based on CIELAB colour space", Dent. Mater. 32 (2016) 461-467) based on their corresponding formulae.
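As a sketch of the CIELAB-based index, WID is commonly reported as a linear combination of L*, a* and b*; the coefficients below are the values widely quoted from Pérez et al. 2016 and should be treated as an assumption to verify against the cited paper:

```python
def whiteness_wid(L: float, a: float, b: float) -> float:
    """CIELAB-based whiteness index (WID) as commonly reported:
    WID = 0.511*L* - 2.324*a* - 1.100*b*.
    Higher values mean whiter; yellowing (larger b*) lowers the score."""
    return 0.511 * L - 2.324 * a - 1.100 * b
```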
  • the closest colour on the Vita Shade Guide, by Euclidean distance of the CIELAB representatives, may also be provided as the colour value.
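The nearest-shade lookup is a simple nearest-neighbour search in CIELAB. The shade coordinates below are hypothetical placeholders; real values would come from measured shade-guide data:

```python
import math

# Hypothetical CIELAB coordinates for a few Vita Classical shades;
# real reference values must come from measured shade-guide data.
VITA_LAB = {
    "A1": (76.0, 1.0, 15.0),
    "A2": (73.0, 2.0, 18.0),
    "B1": (77.0, 0.5, 13.0),
    "C2": (68.0, 1.5, 17.0),
}

def closest_vita_shade(lab) -> str:
    """Return the shade whose CIELAB coordinates are nearest by
    Euclidean distance to the measured (L*, a*, b*) triple."""
    return min(VITA_LAB, key=lambda shade: math.dist(lab, VITA_LAB[shade]))
```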
  • L*, a* and b* relate to the colour coordinates for lightness, green-red and blue-yellow, respectively.
  • the colour value may also provide a whiteness score or a whiteness rating, which may be on an arbitrary scale.

Abstract

Methods and corresponding apparatus are provided for determining a colour value of one or more teeth. A computer-implemented method comprises: receiving an image of teeth and a calibration pattern; identifying one or more teeth from the image using a segmenting model, namely a tooth-by-tooth segmentation model configured to detect individual teeth in the image; determining an observed colour of each tooth from the image; identifying a plurality of coloured areas of the calibration pattern from the image; determining an observed colour of each of the coloured areas of the calibration pattern; determining a correction model by comparing the observed colour of each of the coloured areas of the calibration pattern with a respective known colour of a corresponding known pattern; and applying the correction model to the observed colour of each tooth to determine a corresponding colour value.
PCT/EP2022/079331 2021-10-28 2022-10-21 Methods and apparatus for determining a tooth colour value WO2023072743A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CNPCT/CN2021/126965 2021-10-28
CN2021126965 2021-10-28
EP21210915.1 2021-11-29
EP21210915 2021-11-29

Publications (1)

Publication Number Publication Date
WO2023072743A1 (fr) 2023-05-04

Family

ID=84273991

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/079331 WO2023072743A1 (fr) 2021-10-28 2022-10-21 Methods and apparatus for determining a tooth colour value

Country Status (1)

Country Link
WO (1) WO2023072743A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020064751A1 (en) * 1995-06-26 2002-05-30 Shade Analyzing Technologies, Inc. Tooth shade analyzer system and methods
JP2008149117A (ja) * 2002-07-03 2008-07-03 Shiyoufuu:Kk Control system for instrument devices
CN111462114A (zh) * 2020-04-26 2020-07-28 广州皓醒湾科技有限公司 Tooth colour value determination method, device and electronic equipment
US20200246121A1 (en) * 2019-01-31 2020-08-06 Vita Zahnfabrik H. Rauter Gmbh & Co. Kg Assistance System for Dental Treatment, in Particular by Changing a Tooth Color
WO2020201623A1 (fr) * 2019-03-29 2020-10-08 Lumi Dental Oy Determination of tooth shade based on an image obtained using a mobile device
CN113436734A (zh) * 2020-03-23 2021-09-24 北京好啦科技有限公司 Tooth health assessment method, device and storage medium based on facial structure localisation


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Finlayson, Graham D., Michal Mackiewicz, Anya Hurlbert: "Colour correction using root-polynomial regression", IEEE Transactions on Image Processing, vol. 24, no. 5, 2015, pages 1460-1470, DOI: 10.1109/TIP.2015.2405336
He, Kaiming, Georgia Gkioxari, Piotr Dollár, Ross Girshick: "Mask R-CNN", arXiv:1703.06870 [cs], 20 March 2017, retrieved from the Internet <URL:http://arxiv.org/abs/1703.06870>
Pérez, M. del Mar, R. Ghinea, M.J. Rivas, A. Yebra, A.M. Ionescu, R.D. Paravina, L.J. Herrera: "Development of a customized whiteness index for dentistry based on CIELAB colour space", Dent. Mater., vol. 32, 2016, pages 461-467, DOI: 10.1016/j.dental.2015.12.008
Redmon, Joseph, Ali Farhadi: "YOLOv3: An Incremental Improvement", arXiv:1804.02767 [cs], 8 April 2018, retrieved from the Internet <URL:http://arxiv.org/abs/1804.02767>
Luo, W., S. Westland, P. Brunton, R. Ellwood, I.A. Pretty, N. Mohan: "Comparison of the ability of different colour indices to assess changes in tooth whiteness", J. Dent., vol. 35, 2007, pages 109-116, DOI: 10.1016/j.jdent.2006.06.006

Similar Documents

Publication Publication Date Title
EP3906404B1 Use of image analysis to track the wound healing process
KR101140533B1 Computer-implemented method for recommending products based on skin colour estimated from an image
AU2014251373B2 Skin diagnostic and image processing methods
CN108020519B A virtual multi-light-source spectral reconstruction method based on colour constancy
US9687155B2 System, method and application for skin health visualization and quantification
US11010894B1 Deriving a skin profile from an image
BR112012017253B1 Method and apparatus for determining colorimetry data of a colour sample from an image thereof
JP2008532401A Reflectance spectrum estimation and colour space conversion using a reference reflectance spectrum
US20140267782A1 Apparatus And Method For Automated Self-Training Of White Balance By Electronic Cameras
WO2020208421A1 System and method for creating topical agents with improved image capture
Mendoza et al. Automated prediction of sensory scores for color and appearance in canned black beans (Phaseolus vulgaris L.) using machine vision
Ghinea et al. Gingival shade guides: Colorimetric and spectral modeling
KR102634812B1 Method, apparatus and electronic device for estimating the amount of screen light leakage
Wannous et al. Improving color correction across camera and illumination changes by contextual sample selection
US10878941B2 Perpetual bioinformatics and virtual colorimeter expert system
WO2023072743A1 Methods and apparatus for determining a tooth colour value
CN108451501A A pixel-analysis-based method for evaluating the colour and area of port-wine stains
CN113642358A Skin colour detection method, apparatus, terminal and storage medium
WO2020201623A1 Determination of tooth shade based on an image obtained using a mobile device
Azmi et al. Color correction of baby images for cyanosis detection
Ye et al. Rapid determination of lycopene content and fruit grading in tomatoes using a smart device camera
CN112504442B Parameter determination method, apparatus, device and storage medium
Beneducci et al. Dental shade matching assisted by computer vision techniques
JP2023039539A Colour determination system, colour determination method, and colour determination program
JP2009003581A Image storage and retrieval system and program for image storage and retrieval system

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22809015

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112024007928

Country of ref document: BR