US20170042451A1 - Measuring Teeth Whiteness System and Method - Google Patents
- Publication number
- US20170042451A1
- Authority
- US
- United States
- Prior art keywords
- teeth
- whiteness
- user
- value
- image
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion)
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1032—Determining colour for diagnostic purposes
- A61B5/1034—Determining colour for diagnostic purposes by means of colour cards
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0088—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C19/00—Dental auxiliary appliances
- A61C19/06—Implements for therapeutic treatment
- A61C19/063—Medicament applicators for teeth or gums, e.g. treatment with fluorides
- A61C19/066—Bleaching devices; Whitening agent applicators for teeth, e.g. trays or strips
-
- G06T7/408—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
- A61B2576/02—Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C13/00—Dental prostheses; Making same
- A61C13/08—Artificial teeth; Making same
- A61C13/082—Cosmetic aspects, e.g. inlays; Determination of the colour
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C19/00—Dental auxiliary appliances
- A61C19/10—Supports for artificial teeth for transport or for comparison of the colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30036—Dental; Teeth
Definitions
- the present invention relates to a method of measuring teeth whiteness and, more particularly, to a method of generating a single numerical teeth whiteness value using a mobile device.
- a system and method are presented that allow a user to easily develop a “whiteness” score for their teeth.
- Traditional systems overemphasized color variations, needlessly complicating what most users want to know—“how white are my teeth, and are they getting whiter?”
- One embodiment of the present invention allows a user to use a camera on a mobile device to take a photograph of their teeth. The user examines the photograph and selects a point on a tooth. The system examines the color value (such as R-G-B color values) of pixels surrounding that point, rejects pixels that vary greatly from the typical value, develops an average color value for the remaining pixels, and applies an algorithm to develop a 1-100 score based on this color value.
- the mobile device can share this score with a centralized server. The server can compare this value with other users, and return a normalized percentile score compared to other users. Historical values for a user can be compared with the most recent value to determine any improvement in a user's whiteness score.
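The scoring pipeline just described (select pixels near a chosen point, reject outliers, average the rest, normalize into a score) can be sketched as follows. This is an illustrative sketch, not code from the patent; the 20% brightness band used for outlier rejection and all function names are assumptions:

```python
def whiteness_score(pixels, max_value=255):
    """Score a list of (R, G, B) tuples on a 1-100 whiteness scale.

    Sketch of the pipeline described above: reject pixels that vary
    greatly from the typical value, average the remainder, and
    normalize the averaged brightness into a 1-100 score.
    """
    # Typical brightness: the median of the per-pixel channel averages.
    brightness = sorted(sum(p) / 3 for p in pixels)
    median = brightness[len(brightness) // 2]

    # Reject outliers: drop pixels more than 20% of full scale from the
    # median (these may be lips, gums, or shadows rather than teeth).
    kept = [p for p in pixels if abs(sum(p) / 3 - median) <= 0.2 * max_value]

    # Average the remaining pixels channel by channel.
    avg = tuple(sum(p[i] for p in kept) / len(kept) for i in range(3))

    # Normalize the channel average to a 1-100 score.
    return max(1, round(100 * sum(avg) / (3 * max_value)))
```

Here a dark stray pixel (shadow or gum) is discarded before averaging, so it does not drag the score down.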
- Because the R-G-B values of pixels in an image depend heavily on the color and intensity of the light and on the exposure of the image, it can be difficult to compare the whiteness value of teeth in one image with the whiteness of teeth in a different image taken under different lighting conditions or with different exposure values.
- One embodiment overcomes this difficulty by including a reference item within the image. For instance, a standard gray card or other reference paper or device can be included in the same image as the user's teeth. Such a device can be used to normalize both the exposure and white balance of an image, ensuring that the overall lighting and color balance of images will be consistent with one another.
- the reference item would include a plurality of shaded areas, each having a different whiteness value that is typical of most teeth.
- the mobile device can compare the whiteness calculated for a user's teeth against this comparison chart to find the closest match.
- FIG. 1 is a schematic view of a mobile device and its internal components as it takes an image of a person's teeth.
- FIG. 2 is a schematic representation of a touchscreen display showing an image taken by the mobile device and the elements necessary to select a location for analysis.
- FIG. 3 is a flow chart showing a method that can be used to implement an embodiment of the present invention.
- FIG. 4 is a schematic representation of the touchscreen display of FIG. 2 with the addition of a reference device within the image.
- In FIG. 1, a mobile device 100 is shown that can take images of a user 10 in order to establish a value for the whiteness of the user's teeth.
- the mobile device 100 takes the image through an integrated camera 110 .
- the user can use a touchscreen display 120 to aim the camera 110 and to record the image.
- touch screen 120 both presents visual information to the user over the display portion of the touch screen 120 and also receives touch input from the user.
- the recording of an image using the camera 110 and display 120 is allowed through use of a mobile device processor 130 , which operates under the guidance of programming instructions 150 that are stored on non-volatile memory 140 of the device 100 .
- the instructions 150 can instruct the processor 130 on how to record an image, and also on how to calculate a whiteness value for teeth found in the image.
- the mobile device 100 uses a network interface 170 to communicate over network 180 with a remote server 190 .
- the remote server can store and recall data from a database 192 , which may be locally connected to the server 190 or can be accessed by the server 190 over a network.
- the network 180 is a wide area network such as the Internet.
- the data network interface 170 connects the device 100 to a local wireless network that provides connection to the wide area data network 180 .
- the network interface 170 preferably connects via one of the Institute of Electrical and Electronics Engineers' (IEEE) 802.11 standards.
- the local network is based on TCP/IP, and the data network interface 170 utilizes a TCP/IP protocol stack.
- the network 180 is formed in part by a cellular data network.
- the network interface 170 takes the form of a cellular communication interface that communicates over the cellular network 180 .
- the programming that operates the method described below is stored remotely and is accessed over the network 180 .
- a web browser could operate on the mobile device 100 using locally stored programming 150 , and could access a website or other remote programming over network 180 .
- This remote programming could provide the interface for creating an image, determining a whiteness value for a user's teeth, and storing and comparing these values to other values stored on the remote server 190 .
- the mobile device 100 can take the form of a smart phone or tablet computer.
- the mobile device processor 130 can be a general purpose CPU, such as those provided by Intel Corporation (Mountain View, Calif.) or Advanced Micro Devices, Inc. (Sunnyvale, Calif.), or a mobile specific processor, such as those designed by ARM Holdings (Cambridge, UK).
- Mobile devices such as device 100 generally use operating systems designed specifically for such devices, such as iOS from Apple Inc. (Cupertino, Calif.) or ANDROID OS from Google Inc. (Menlo Park, Calif.).
- the operating system programming 150 is stored on memory 140 and is used by the processor 130 to provide a user interface for the touch screen display 120 , handle communications for the device 100 , and to manage and provide services to applications programming (or apps) 150 that are stored in the memory 140 .
- In other embodiments, the image is taken with a laptop or desktop computer (not shown in FIG. 1 ).
- Such computing devices frequently have integrated cameras, but may also include external cameras connected to the computer through an expansion bus, such as a USB connection. These computers can then perform the method described below in connection with FIG. 3 .
- For ease in describing the present invention, the description below will refer to the use of a mobile device to acquire an image, determine a whiteness value for the user's teeth, and to communicate with the server 190 .
- FIG. 2 shows an image 200 of the user 10 taken with camera 110 .
- This image 200 is displayed by the mobile device 100 on the touchscreen display 120 after the image is acquired.
- the present invention uses an image such as this to determine a whiteness value for the user's teeth 210 .
- a method 300 for determining the whiteness value using this image 200 is shown in the flow chart of FIG. 3 .
- the method 300 begins at step 305 by having the programming 150 direct the user 10 to acquire an image 200 of their teeth 210 .
- This image 200 can be created using the camera 110 built into the mobile device 100 .
- the user can select a previously recorded image that is stored either in the memory 140 of the device 100 or selected and retrieved from a remote photographic storage location over the network 180 and the network interface 170 .
- the image 200 is shown in the touchscreen display 120 of the mobile device. The user is then asked to select a location or point 220 on the image that corresponds to one or more of the user's teeth 210 .
- the user 10 manipulates a pointing device 230 on the display 120 until the pointing device is pointed at the teeth 210 in the image 200 .
- the user simply presses on the touchscreen display at this location 220 .
- the device 100 uses programming to analyze the context of the image (such as through the use of facial recognition software) and to automatically identify the teeth of the user in the image 200 .
- At step 320 , the method 300 selects a group of pixels 240 surrounding this location 220 .
- the programming 150 automatically selects a group of pixels 240 surrounding the selected point.
- the programming 150 could select a circular region having a radius of approximately 10 pixels centered on the selected point 220 .
- Other shapes and sizes of region could also be automatically selected, such as a 10×10 pixel square area centered on the point 220 , or a circle with a radius of 5 or 20 pixels.
- the user is allowed to select a group of pixels 240 instead of only a single point 220 on the image, such as by dragging their finger across an area of the image 200 presented in the touchscreen display 120 .
- the user is allowed to select non-contiguous pixels on the image 200 as part of step 320 , such as by individually selecting the center of each tooth displayed in the image 200 .
- the selected pixels may be from a single tooth in the image 200 and in other cases the pixels may be from multiple teeth.
- pixels relating to a user's “teeth” should be understood to relate to both situations unless otherwise specified.
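The automatic circular selection around the chosen point can be sketched as a simple lattice scan; the helper name and default radius of 10 pixels follow the example above:

```python
def pixels_in_circle(cx, cy, radius=10):
    """Return the (x, y) lattice coordinates within `radius` pixels of
    the selected point (cx, cy)."""
    return [(x, y)
            for x in range(cx - radius, cx + radius + 1)
            for y in range(cy - radius, cy + radius + 1)
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]
```

The square or user-dragged regions described above would simply produce a different coordinate list for the same downstream analysis.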
- At step 325 , statistical analysis is used to reject pixels within the selected group of pixels 240 that represent outlier values.
- There are numerous statistical techniques that can be used in this analysis, such as Dixon's Q test. To perform this test, the value of each pixel must be compared with the values of the other pixels in the group 240 .
- Pixels are typically represented by three or four separate values. For instance, in RGB representations, each pixel has a red (R) value, a green (G) value, and a blue (B) value.
- CMY, having values for cyan, magenta, and yellow
- CMYK, similar to the CMY color scheme with an added key or black value
- HSL, having values for hue, saturation, and lightness
- HSV, having values for hue, saturation, and value
- the test for outliers can be performed on each of these separate elements (R, G, and B values, for instance, in an RGB color representation scheme).
- this test can be performed on a universal number derived from these separate elements, such as the whiteness value described below in connection with steps 340 - 344 . Either way, this analysis removes pixels that may not accurately reflect the whiteness value of the user's teeth 210 . For instance, the rejected pixels may actually be part of the user's lips or gums, or may represent a shadow area inside the user's mouth.
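Dixon's Q test mentioned above compares the gap between a suspect value and its nearest neighbor to the overall range of the data. A minimal sketch for a single channel's values follows; the critical value 0.29 (roughly the 95% threshold for ten samples) is an assumption, and a real implementation would look the critical value up by sample size:

```python
def dixon_q_reject(values, q_critical=0.29):
    """Flag the most extreme value in `values` as an outlier per Dixon's Q.

    Q = gap / range, where gap is the distance from the suspect value to
    its nearest neighbor and range is the spread of all values. Returns
    (suspect_value, rejected_flag).
    """
    s = sorted(values)
    value_range = s[-1] - s[0]
    if value_range == 0:
        return s[0], False  # all values identical; nothing to reject
    # The suspect sits at whichever end of the sorted list has the larger gap.
    low_gap, high_gap = s[1] - s[0], s[-1] - s[-2]
    suspect, gap = (s[-1], high_gap) if high_gap >= low_gap else (s[0], low_gap)
    return suspect, gap / value_range > q_critical
```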
- Once outlier pixels are removed from the selected pixels 240 , the remaining pixels are averaged to create values for an average pixel.
- the individual elements of a pixel (such as the R, G, B values) are averaged across all of the remaining pixels to create a single (R,G,B) value for the pixel set 240 . This occurs at step 330 .
- this average pixel value may not be an accurate representation of the brightness or color of a user's actual teeth 210 . This is because the pixel value of the teeth in the image 210 is heavily dependent on the lighting that existed on the user's face when the image was taken, and on the exposure of the image taken by the camera 110 . If the camera 110 is designed to automatically expose the image 200 , the exposure of the user's teeth will depend on the average brightness of the rest of the image 200 , as the camera 110 will try to balance the exposure of the overall image 200 . In other words, a dark background may overexpose a user's teeth, creating higher values relating to the brightness or individual color values of the average pixel determined at step 330 .
- this limitation of the pixel values in the image 200 is overcome by requesting that the user take the image 200 under a standard lighting condition.
- the actual lighting recommended makes little difference, as long as the lighting conditions remain relatively constant from image to image.
- The user can then track whiteness changes in their teeth over time in an accurate manner by simply repeating the images 200 under the same lighting conditions using the same exposure settings on their camera 110 .
- Even when users repeat their lighting conditions over multiple images, it is still difficult to compare the whiteness of teeth calculated for one user with the whiteness calculated for a different user, as different users will likely be unable to create identical lighting conditions.
- one embodiment of the present invention has the user take their image 200 with a reference card 400 also visible in the image, as seen in FIG. 4 .
- This reference card may be a standard “gray card,” such as a neutral gray card having 18% reflectivity across the visible spectrum.
- the reference card 400 may contain a pattern of blocks or an icon 410 that can be detected through digital analysis of the image 200 .
- the card contains both a pattern of blocks, going from white to darker gray as one moves up the card 400 in the image 200 , and a computer recognizable icon 410 .
- the programming scans the image 200 for the presence of the card 400 .
- a location on the card having a pure gray color of a known reflectivity is identified. Pixels on this card location are then analyzed in order to standardize both the exposure of the image 200 and the white balance of the image 200 . This standardization is then applied to the average value selected in step 330 , in order to create a corrected pixel value in step 335 .
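The exposure and white-balance standardization in this step can be sketched as per-channel gains computed from the gray-card patch. The target value of 118 (a common 8-bit sRGB rendering of an 18% gray card) and the function name are assumptions:

```python
def correct_pixel(pixel, card_pixel, card_target=118):
    """Standardize exposure and white balance using the gray-card patch.

    Each channel is scaled by the gain that maps the observed card pixel
    to its known neutral value, so images taken under different lighting
    and exposure become comparable.
    """
    return tuple(min(255, round(channel * card_target / card_channel))
                 for channel, card_channel in zip(pixel, card_pixel))
```

Because each channel gets its own gain, a color cast on the card (unequal R, G, B readings) is corrected along with the overall exposure.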
- a whiteness value is created for this corrected pixel value.
- a single number is created for this whiteness value in order to inform the user how “white” their teeth are.
- Multiple algorithms can be applied to the corrected pixel value in order to create this whiteness value.
- FIG. 3 shows two different options.
- The first option, shown as step 342 , examines the individual color values of the pixel. This works best in an RGB or CMY type of color scheme, where the brightness or blackness of the pixel is not a separate value.
- The RGB color scheme is ideal for describing the present invention, as higher numbers in this color scheme mean brighter values, and maximum values for all three colors result in a pure white pixel. In this type of environment, one can simply take an average of the three colors, and then normalize this number against the maximum value. When the RGB values are recorded as 8 bits each, each of the three colors for a pixel has a range from 0-255.
- the normalized average would be the sum of the three color values divided by three (to create the average), with the total divided by 255 (creating a number from 0-1). This number can again be multiplied by 100 to create a value from 1-100.
- The algorithm can be expressed as:

  Whiteness Score = 100 × (R + G + B) / (3 × 255)
- In a 16-bit environment, each RGB color value can range from 0 to 65,535, resulting in the following algorithm to create the same normalized average result:

  Whiteness Score = 100 × (R + G + B) / (3 × 65,535)
- the pixel can be converted to RGB with the above algorithm applied.
- Because CMY values are the inverse of RGB values (a value of zero in each channel is pure white), the 8-bit algorithm for CMY colors would be:

  Whiteness Score = 100 × (1 − (C + M + Y) / (3 × 255))
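The first option's normalized channel average can be written as a one-line function; this is a sketch, not code from the patent:

```python
def whiteness_option1(r, g, b, max_value=255):
    """Normalized channel average: 100 * (R + G + B) / (3 * MaxValue)."""
    return 100 * (r + g + b) / (3 * max_value)
```

A drawback of this option is that a bright but strongly colored pixel, such as a saturated yellow (255, 255, 0), still averages to a fairly high score of about 67 despite not being white.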
- A second option corrects a shortcoming of the first option (a bright but strongly colored pixel can still receive a high score) by lowering scores for teeth whose color is not a pure white/grey.
- the difference of each color value from the maximum value (such as 255 − R for the red color in an 8-bit RGB color scheme) is squared.
- the squared value for each color are then summed together, and the square root of this value is then taken. This number is then normalized compared to the maximum possible value.
- In an 8-bit RGB scheme, pure black is (0, 0, 0) and pure white is (255, 255, 255).
- The maximum score (“MaxScore”) for the square root of the sum of the squares can be determined by the following formula, based on the maximum value (“MaxValue”) for each color of a pixel (65,535 in a 16-bit environment):

  MaxScore = √(3 × MaxValue²) = MaxValue × √3
- In a 16-bit environment, the MaxValue is 65,535, meaning that the MaxScore is approximately 113,510.
- the MaxScore in this calculation need not be exact, as this formula will typically result in a number that is not accurately represented in only a few digits.
- the MaxScore can be approximated, such as by using a MaxScore value of 440 in an 8-bit RGB environment, or a MaxScore value of 113,000 in a 16-bit RGB environment.
- using a MaxScore value having at least two significant digits in common with the calculation above is sufficient.
- In addition, the formula may treat a pixel color value slightly below the maximum as the maximum for purposes of the formula.
- For example, the cutoff for any color in an 8-bit RGB color scheme may be set below the theoretical maximum, such as at 250.
- In this case, an 8-bit RGB value of (251, 254, 253) would be treated as the maximum value, and be given a whiteness score of 100.
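The second option's distance-from-white score, including the near-maximum clamping described above, can be sketched as follows; the clamp threshold of 250 follows the 8-bit example in the text:

```python
import math

def whiteness_option2(r, g, b, max_value=255, clamp=250):
    """Distance-from-white score: square each channel's shortfall from
    the maximum, sum the squares, take the square root, and normalize
    by MaxScore. Channels at or above `clamp` are treated as the
    maximum, per the example above."""
    adjusted = [max_value if c >= clamp else c for c in (r, g, b)]
    distance = math.sqrt(sum((max_value - c) ** 2 for c in adjusted))
    max_score = math.sqrt(3 * max_value ** 2)  # about 441.67 in 8-bit
    return 100 * (1 - distance / max_score)
```

Unlike the first option, a saturated color such as pure red is heavily penalized here, since its distance from white is large even though its channel average is not especially low.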
- method 300 standardizes and quantifies the level of whiteness for teeth. This provides a defined metric for a near infinite spectrum of teeth whiteness, which allows the consumer to measure the whiteness of their teeth against pure white. At step 350 , this value is displayed to the user, such as on display 120 , giving the user an understandable answer to the question as to the whiteness of their teeth.
- the reference card 400 can include a plurality of different shaded areas. Each of these shaded areas will have a different whiteness value. Preferably, these different areas would have whiteness values that extended through a range of whiteness that is common in human teeth.
- the processor 130 can then compare the selected area 240 of the user's teeth with the various shaded areas on the reference card. Because the processor 130 would be aware of the whiteness score for each of these different shaded areas, a whiteness match between the teeth and a particular area on the reference card would establish a definitive whiteness score for the user's teeth. This can be accomplished without the need to correct for exposure and white balance, as the reference card has received the same exposure and lighting as the user's teeth.
- the whiteness score for the teeth would be known to be between the two pre-established whiteness values for those areas of the card.
- This whiteness comparison can be accomplished by calculating the whiteness value for both the teeth and the various areas on the card using the calculations described above.
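The closest-match comparison against the card's shaded areas can be sketched as a nearest-neighbor lookup; the mapping structure and the sample values in the test are invented for illustration:

```python
def closest_shade(tooth_whiteness, card_shades):
    """Return the pre-established score of the card area whose measured
    whiteness is closest to the whiteness measured for the teeth.

    `card_shades` maps each shaded area's known whiteness score to the
    whiteness value measured for that area in the same image, so no
    exposure or white-balance correction is needed.
    """
    return min(card_shades,
               key=lambda known: abs(card_shades[known] - tooth_whiteness))
```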
- the mobile device transmits this calculated value, along with a user identifier, to the remote server 190 .
- the server 190 will store this data in its database 192 , allowing the user to compare this value to similar values previously calculated for their teeth, and against similar values calculated for other users.
- the server 190 compares the received value against all other values for all users in the database 192 in order to give the score a percentile rank. This rank is returned to and received by the mobile device 100 in step 360 .
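The server-side percentile rank can be sketched as the fraction of stored scores at or below the newly received score; names are illustrative:

```python
def percentile_rank(score, all_scores):
    """Percentage of stored scores that are at or below this score."""
    at_or_below = sum(1 for s in all_scores if s <= score)
    return 100 * at_or_below / len(all_scores)
```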
- the server 190 can also send all previous scores for this user back to the mobile device 100 , which is received at step 365 .
- the mobile device 100 keeps historical data and images 160 from previous calculations in its local memory. In these embodiments, the mobile device 100 would not need to receive such historical information from the remote server 190 in step 365 .
- the percentile ranking and the comparison to historical values are displayed to the user on the display 120 of the mobile device.
- the user can then examine this data, and even request to see historical pictures of themselves. In this manner, the user can track their progression as they attempt to whiten their teeth.
- the mobile device 100 may present advertisements to the user as they perform method 300 . This is shown in FIG. 3 as step 375 , although the advertisements may be displayed at any step in the method 300 . Preferably, these advertisements relate to products and services that allow the user 10 to whiten their teeth. These advertisements can be transmitted when needed from the server 190 , or can be stored locally on the mobile device 100 .
- the user is able to use the mobile device to track cosmetic treatments to their teeth.
- the mobile device can then compare and contrast numerically the whiteness of the user's teeth before and after these cosmetic treatments.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Veterinary Medicine (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Animal Behavior & Ethology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Quality & Reliability (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Epidemiology (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
A method and system are presented for measuring the whiteness of teeth. This is accomplished by analyzing an average pixel value of teeth taken from a digital image. The pixel value is mathematically standardized to create an indicator that quantifies teeth whiteness. This eliminates subjectivity in measuring teeth whiteness and permits precise communication of the level of teeth whiteness.
Description
- This application is a continuation of U.S. patent application Ser. No. 14/607,652, filed on Jan. 28, 2015, which in turn claimed the benefit of U.S. Provisional Application Ser. No. 61/933,112, filed on Jan. 28, 2014.
- The present invention relates to a method of measuring teeth whiteness and, more particularly, to a method of generating a single numerical teeth whiteness value using a mobile device.
- The current systems for measuring the whiteness of teeth are confusing and subjective. All existing systems, including the most pervasive Vita guide, do not cover all colors of the visible electromagnetic spectrum. As can be seen, there is a need for a standardized method of measuring teeth whiteness.
- A system and method are presented that allow a user to easily develop a “whiteness” score for their teeth. Traditional systems overemphasized color variations, needlessly complicating what most users want to know—“how white are my teeth, and are they getting whiter?” One embodiment of the present invention allows a user to use a camera on a mobile device to take a photograph of their teeth. The user examines the photograph and selects a point on a tooth. The system examines the color value (such as R-G-B color values) of pixels surrounding that point, rejects pixels that vary greatly from the typical value, develops an average color value for the remaining pixels, and applies an algorithm to develop a 1-100 score based on this color value. The mobile device can share this score with a centralized server. The server can compare this value with other users, and return a normalized percentile score compared to other users. Historical values for a user can be compared with the most recent value to determine any improvement in a user's whiteness score.
- Because the R-G-B value of pixels in an image depend heavily on the color and intensity of the light and the exposure of the image, it can be difficult to compare the whiteness value of teeth in one image with the whiteness of teeth in a different image taken in different lighting conditions or different exposure values. One embodiment overcomes this difficulty by including reference item within the image. For instance a standard gray card or other reference paper or device can be included in the same image as the user's teeth. Such a device can be used to normalize both the exposure and white balance of an image, ensuring that the overall lighting and color balance of images will be consistent with one another. In other embodiments, the reference item would include a plurality of shaded areas, each have a different whiteness value that is typical for most teeth. In this embodiment, the mobile device can compare the whiteness calculated for a user's teeth against this comparison chart to find the closest match.
-
FIG. 1 is an schematic view of a mobile device and its internal components as it takes an image of a person's teeth. -
FIG. 2 is schematic representation of a touchscreen display showing an image taken by the mobile device and the elements necessary to select a location for analysis. -
FIG. 3 is flow chart showing a method that can be use to implement an embodiment of the present invention. -
FIG. 4 is a schematic representation of the touchscreen display ofFIG. 2 with the addition of a reference device within the image. - In
FIG. 1 , amobile device 100 is shown that can take images of auser 10 in order to establish a value for the whiteness of the user's teeth. Themobile device 100 takes the image through an integratedcamera 110. The user can use atouchscreen display 120 to aim thecamera 120 and to record the image. In the preferred embodiment,touch screen 120 both presents visual information to the user over the display portion of thetouch screen 120 and also receives touch input from the user. The recording of an image using thecamera 110 anddisplay 120 is allowed through use of amobile device processor 130, which operates under the guidance ofprogramming instructions 150 that are stored onnon-volatile memory 140 of thedevice 100. Theinstructions 150 can instruct theprocessor 130 on how to record an image, and also on how to calculate a whiteness value for teeth found in the image. - In
FIG. 1 , themobile device 100 uses anetwork interface 170 to communicate overnetwork 180 with aremote server 190. The remote server can store and recall data from adatabase 192, which may be locally connected to theserver 190 or can be accessed by theserver 190 over a network. In one embodiment, thenetwork 180 is a wide area network such as the Internet. In one embodiment, thedata network interface 170 connects thedevice 100 to a local wireless network that provides connection to the widearea data network 180. Thenetwork interface 170 preferably connects via one of the Institute of Electrical and Electronics Engineers' (IEEE) 802.11 standards. In one embodiment, the local network is based on TCP/IP, and thedata network interface 170 utilizes a TCP/IP protocol stack. In other embodiments, thenetwork 180 is formed in part by a cellular data network. In these embodiments, thenetwork interface 170 takes the form of a cellular communication interface that communicates over thecellular network 170. - In other embodiments, the programming that operates the method described below is stored remotely and is accessed over the
network 180. For instance, a web browser could operate on themobile device 100 using locally storedprogramming 150, and could access a website or other remote programming overnetwork 180. This remote programming could provide the interface for creating an image, determining a whiteness value for a user's teeth, and storing and comparing this values to other values stored on theremote server 190. - The
mobile device 100 can take the form of a smart phone or tablet computer. Themobile device processor 130 can be a general purpose CPU, such as those provided by Intel Corporation (Mountain View, Calif.) or Advanced Micro Devices, Inc. (Sunnyvale, Calif.), or a mobile specific processor, such as those designed by ARM Holdings (Cambridge, UK). Mobile devices such asdevice 100 generally usespecific operating systems 140 designed for such devices, such as iOS from Apple Inc. (Cupertino, Calif.) or ANDROID OS from Google Inc. (Menlo Park, Calif.). Theoperating system programming 150 is stored onmemory 140 and is used by theprocessor 130 to provide a user interface for thetouch screen display 120, handle communications for thedevice 100, and to manage and provide services to applications programming (or apps) 150 that are stored in thememory 140. - In other embodiments, the image is taken with a laptop or desktop computer (not shown in
FIG. 1 ). Such computing devices frequently have integrated cameras, but may also include external cameras connected to the computer through an expansion bus, such as a USB connection. These computers can then perform the method described below in connection with FIG. 3 . For ease in describing the present invention, the description below will refer to the use of a mobile device to acquire an image, determine a whiteness value for the user's teeth, and communicate with the server 190. -
FIG. 2 shows an image 200 of the user 10 taken with camera 110. This image 200 is displayed by the mobile device 100 on the touchscreen display 120 after the image is acquired. The present invention uses an image such as this to determine a whiteness value for the user's teeth 210. - A
method 300 for determining the whiteness value using this image 200 is shown in the flow chart of FIG. 3 . The method 300 begins at step 305 by having the programming 150 direct the user 10 to acquire an image 200 of their teeth 210. This image 200 can be created using the camera 110 built into the mobile device 100. Alternatively, the user can select a previously recorded image that is stored either in the memory 140 of the device 100 or selected and retrieved from a remote photographic storage location over the network 180 and the network interface 170. At step 310, the image 200 is shown on the touchscreen display 120 of the mobile device. The user is then asked to select a location or point 220 on the image that corresponds to one or more of the user's teeth 210. In FIG. 2 , the user 10 manipulates a pointing device 230 on the display 120 until the pointing device is pointed at the teeth 210 in the image 200. In other embodiments, the user simply presses on the touchscreen display at this location 220. In still other embodiments, the device 100 uses programming to analyze the context of the image (such as through the use of facial recognition software) and to automatically identify the teeth of the user in the image 200. - At
step 320, the method 300 selects a group of pixels 240 surrounding this location 220. In one embodiment, the programming 150 automatically selects a group of pixels 240 surrounding the selected point. For example, the programming 150 could select a circular region having a radius of approximately 10 pixels centered on the selected point 220. Regions of other shapes and sizes could also be automatically selected, such as a 10×10 pixel square area centered on the point 220, or a circle with a radius of 5 or 20 pixels. In still other embodiments, the user is allowed to select a group of pixels 240 instead of only a single point 220 on the image, such as by dragging their finger across an area of the image 200 presented in the touchscreen display 120. In still further embodiments, the user is allowed to select non-contiguous pixels on the image 200 as part of step 320, such as by individually selecting the center of each tooth displayed in the image 200. Note that in some cases, the selected pixels may be from a single tooth in the image 200 and in other cases the pixels may be from multiple teeth. For purposes of this disclosure, pixels relating to a user's "teeth" should be understood to relate to both situations unless otherwise specified. - In
step 325, statistical analysis is used to reject pixels within the selected group of pixels 240 that represent outlier values. There are numerous statistical techniques that can be used in this analysis, such as Dixon's Q test. To perform this test, the value of each pixel must be compared with the value of the other pixels in the group 240. In most color images, pixels are represented by three or four separate values. For instance, in RGB representations, each pixel has a red (R) value, a green (G) value, and a blue (B) value. Other color schemes are also possible, such as CMY (having values for cyan, magenta, and yellow), CMYK (similar to the CMY color scheme with an added key or black color), HSL (having values for hue, saturation, and lightness), or HSV (hue, saturation, and value). The test for outliers can be performed on each of these separate elements (the R, G, and B values, for instance, in an RGB color representation scheme). Alternatively, this test can be performed on a single number derived from these separate elements, such as the whiteness value described below in connection with steps 340-344. Either way, this analysis removes pixels that may not accurately reflect the whiteness value of the user's teeth 210. For instance, the rejected pixels may actually be part of the user's lips or gums, or may represent a shadow area inside the user's mouth. - Once outlier pixels are removed from the selected
pixels 240, the remaining pixels are averaged to create values for an average pixel. In most cases, the individual elements of a pixel (such as the R, G, and B values) are averaged across all of the remaining pixels to create a single (R,G,B) value for the pixel set 240. This occurs at step 330. - Unfortunately, this average pixel value may not be an accurate representation of the brightness or color of a user's
actual teeth 210. This is because the pixel value of the teeth 210 in the image 200 is heavily dependent on the lighting that existed on the user's face when the image was taken, and on the exposure of the image taken by the camera 110. If the camera 110 is designed to automatically expose the image 200, the exposure of the user's teeth will depend on the average brightness of the rest of the image 200, as the camera 110 will try to balance the exposure of the overall image 200. In other words, a dark background may overexpose a user's teeth, creating higher values for the brightness or individual color values of the average pixel determined at step 330. In addition, as different lighting sources have different colors, these different light sources may change the overall color or white balance of the image. In one embodiment, this limitation of the pixel values in the image 200 is overcome by requesting that the user take the image 200 under a standard lighting condition. The actual lighting recommended makes little difference, as long as the lighting conditions remain relatively constant from image to image. A single user can track whiteness changes in their teeth over time in an accurate manner by simply repeating the images 200 under the same lighting conditions using the same exposure settings on their camera 110. Even when users repeat their lighting conditions over multiple images, it is still difficult to compare the whiteness of teeth calculated for one user with the whiteness calculated for a different user, as the different users will likely be unable to create identical lighting conditions. One can give instructions to all users on how to take the image, but each user will likely interpret the instructions differently. - To overcome this limitation, one embodiment of the present invention has the user take their
image 200 with a reference card 400 also visible in the image, as seen in FIG. 4 . This reference card may be a standard "gray card," such as a neutral gray card having 18% reflectivity across the visible spectrum. In order to make the card 400 easy for the programming 150 to automatically detect in the image, the reference card 400 may contain a pattern of blocks or an icon 410 that can be detected through digital analysis of the image 200. In FIG. 4 , the card contains both a pattern of blocks, going from white to darker gray as one moves up the card 400 in the image 200, and a computer-recognizable icon 410. The programming scans the image 200 for the presence of the card 400. If the card is identified, a location on the card having a pure gray color of a known reflectivity is identified. Pixels at this card location are then analyzed in order to standardize both the exposure and the white balance of the image 200. This standardization is then applied to the average value selected in step 330, in order to create a corrected pixel value in step 335. - At
step 340, a whiteness value is created for this corrected pixel value. In the preferred embodiment, a single number is created for this whiteness value in order to inform the user how "white" their teeth are. Multiple algorithms can be applied to the corrected pixel value in order to create this whiteness value. FIG. 3 shows two different options. - The
first option 342 examines the individual color values of the pixel. This works best in an RGB or CMY type of color scheme, where the brightness or blackness of the pixel is not a separate value. The RGB color scheme is ideal for describing the present invention, as higher numbers in this color scheme equal brighter values, and maximum values for each color result in a pure white color for a pixel. In this type of environment, one can simply take an average of the three colors, and then normalize this number against a maximum value. When the RGB values are recorded as 8 bits each, each of the three colors for a pixel has a range from 0-255. The normalized average would be the sum of the three color values divided by three (to create the average), with the total divided by 255 (creating a number from 0-1). This number can then be multiplied by 100 to create a value from 0-100. The algorithm can be expressed as: -
(((R+G+B)/3)/255)*100 - In a 16-bit environment, each RGB color value can range from 0 to 65,535, resulting in the following algorithm to create the same normalized average result:
-
(((R+G+B)/3)/65,535)*100 - In a CMY color scheme, the pixel can be converted to RGB with the above algorithm applied. Alternatively, with the assumption that C=Max−R, M=Max−G, and Y=Max−B, the 8-bit algorithm for CMY colors would be:
-
((((255−C)+(255−M)+(255−Y))/3)/255)*100 - The conversion of the 16-bit RGB algorithm to CMY would be done in the same manner.
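As an illustration only (the patent provides no source code, and the function names here are invented for this sketch), the first option can be written out for 8-bit values:

```python
def whiteness_avg_rgb(r, g, b):
    """Option 342: normalized average of the 8-bit RGB channels, scaled to 0-100."""
    return ((r + g + b) / 3) / 255 * 100

def whiteness_avg_cmy(c, m, y):
    """The same score from 8-bit CMY values, using C = 255 - R, and so on."""
    return (((255 - c) + (255 - m) + (255 - y)) / 3) / 255 * 100

# Pure white scores 100, pure black scores 0, and the CMY form of a
# pixel produces the same score as its RGB form.
print(whiteness_avg_rgb(255, 255, 255))  # 100.0
```

The 16-bit variants differ only in the divisor used for normalization.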
- One issue with the average value created by option 342 is that discolored teeth are given the same score as pure white/grey teeth of the same overall brightness or intensity. A second option (step 344) corrects this problem by lowering the scores of teeth whose color is not pure white/grey. In this option, the difference of each color value from the maximum value (such as 255−R for the red color in an 8-bit RGB color scheme) is squared. The squared values for each color are then summed together, and the square root of this sum is taken. This number is then normalized against the maximum possible value. In an 8-bit RGB color scheme, pure black is 0,0,0 and pure white is 255,255,255. To create a whiteness value where pure black is still 0, pure white is 100, and discolored teeth have a lower value than pure white/grey teeth of the same brightness, the following formula can be applied: -
((441.673−((R−255)^2+(G−255)^2+(B−255)^2)^0.5)/441.673)*100 - In this formula, the difference of each value from the maximum (255) is determined and squared. These numbers are added together, and the square root is taken. The range of this value from pure black to pure white is from 441.673 (pure black) to 0 (pure white). This number is subtracted from 441.673 (so that pure white is now 441.673 and pure black is 0), and the resulting value is divided by 441.673 and multiplied by 100 to get a range from 0 (pure black) to 100 (pure white).
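A minimal sketch of this second option for 8-bit values (`math.sqrt` replaces the ^0.5 notation; the function name is illustrative):

```python
import math

def whiteness_distance(r, g, b):
    """Option 344: score based on distance from pure white (255, 255, 255).
    441.673 is the black-to-white distance, i.e. 255 * sqrt(3)."""
    dist = math.sqrt((r - 255) ** 2 + (g - 255) ** 2 + (b - 255) ** 2)
    return (441.673 - dist) / 441.673 * 100

# A discolored pixel scores lower than a neutral grey pixel with the
# same average brightness, which the simple average cannot distinguish:
# both (128, 128, 128) and (200, 128, 56) average to 128.
print(whiteness_distance(128, 128, 128) > whiteness_distance(200, 128, 56))  # True
```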
- In a 16-bit RGB environment, the formula is the same with the constants changed to represent the new maximum values. The maximum score ("MaxScore") for the square root of the sum of the squares can be determined by the following formula based on the maximum value ("MaxValue") for each color of a pixel (65,536 colors in a 16-bit environment):
-
MaxScore=((MaxValue^2)*3)^0.5 - In a 16-bit environment, the MaxValue is 65,535, meaning that the MaxScore is approximately 113,509.95. Note that the MaxScore in this calculation need not be exact, as this formula will typically result in a number that is not accurately represented in only a few digits. Thus, the MaxScore can be approximated, such as by using a MaxScore value of 440 in an 8-bit RGB environment, or a MaxScore value of 113,000 in a 16-bit RGB environment. Generally, using a MaxScore value having at least two significant digits in common with the calculation above is sufficient.
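The MaxScore calculation can be checked directly (a sketch; note that 65,535, not 65,536, is the largest value a 16-bit channel can hold, even though a 16-bit environment has 65,536 possible values):

```python
import math

def max_score(max_value):
    """((MaxValue^2)*3)^0.5: the distance from pure black to pure white
    when each color channel tops out at max_value."""
    return math.sqrt(3 * max_value ** 2)

print(round(max_score(255), 3))    # 8-bit: 441.673
print(round(max_score(65535), 2))  # 16-bit: 113509.95
```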
- If we use the MaxScore and MaxValue variable names, the resulting normalization formula for an RGB color scheme is:
-
((MaxScore−((R−MaxValue)^2+(G−MaxValue)^2+(B−MaxValue)^2)^0.5)/MaxScore)*100 - The conversion of the general formula to CMY would be clear to one of ordinary skill in the art. Note that this formula can be implemented with several variations while still keeping within the scope of the present invention. For instance, the formula may treat a pixel color value slightly below the maximum value as if it were the maximum value. Thus, the effective maximum for any color in an 8-bit RGB color scheme may be set below the theoretical maximum (such as at 250 or above). An 8-bit RGB value of (251, 254, 253) would then be treated as the maximum of (250, 250, 250), and be given a whiteness score of 100.
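A sketch of the generalized formula, including the near-white clamping variation described above (the threshold of 250 is the example value from the text, not a fixed part of the method, and the parameter names are invented here):

```python
import math

def whiteness(r, g, b, max_value=255, white_threshold=None):
    """Generalized distance-from-white score: 100 = pure white, 0 = pure black.
    Channel values at or above white_threshold (if given) are treated as
    max_value, so near-white pixels score a full 100."""
    if white_threshold is not None:
        r, g, b = (max_value if v >= white_threshold else v for v in (r, g, b))
    max_score = math.sqrt(3 * max_value ** 2)
    dist = math.sqrt((r - max_value) ** 2
                     + (g - max_value) ** 2
                     + (b - max_value) ** 2)
    return (max_score - dist) / max_score * 100

print(whiteness(251, 254, 253, white_threshold=250))  # 100.0
```

Passing `max_value=65535` gives the 16-bit form of the same score.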
- By applying one of these formulas, method 300 standardizes and quantifies the level of whiteness of teeth. This provides a defined metric for a near-infinite spectrum of teeth whiteness, which allows the consumer to measure the whiteness of their teeth against pure white. At step 350, this value is displayed to the user, such as on display 120, giving the user an understandable answer to the question of how white their teeth are. - In another embodiment, the
reference card 400 can include a plurality of differently shaded areas. Each of these shaded areas will have a different whiteness value. Preferably, these different areas would have whiteness values that extend through the range of whiteness that is common in human teeth. The processor 130 can then compare the selected area 240 of the user's teeth with the various shaded areas on the reference card. Because the processor 130 would be aware of the whiteness score for each of these different shaded areas, a whiteness match between the teeth and a particular area on the reference card would establish a definitive whiteness score for the user's teeth. This can be accomplished without the need to correct for exposure and white balance, as the reference card has received the same exposure and lighting as the user's teeth. If the teeth have a whiteness level that falls between two areas on the reference card, the whiteness score for the teeth would be known to lie between the two pre-established whiteness values for those areas of the card. This whiteness comparison can be accomplished by calculating the whiteness value for both the teeth and the various areas on the card using the calculations described above. - In
step 355, the mobile device transmits this calculated value, along with a user identifier, to the remote server 190. The server 190 will store this data in its database 192, allowing the user to compare this value to similar values previously calculated for their teeth, and against similar values calculated for other users. In one embodiment, the server 190 compares the received value against all other values for all users in the database 192 in order to give the score a percentile rank. This rank is returned to and received by the mobile device 100 in step 360. The server 190 can also send all previous scores for this user back to the mobile device 100, which are received at step 365. In alternate embodiments, the mobile device 100 keeps historical data and images 160 from previous calculations in its local memory. In these embodiments, the mobile device 100 would not need to receive such historical information from the remote server 190 in step 365. - At
step 370, the percentile ranking and the comparison to historical values are displayed to the user on the display 120 of the mobile device. The user can then examine this data, and even request to see historical pictures of themselves. In this manner, the user can track their progression as they attempt to whiten their teeth. - In some instances, the
mobile device 100 may present advertisements to the user as they perform method 300. This is shown in FIG. 3 as step 375, although the advertisements may be displayed at any step in the method 300. Preferably, these advertisements relate to products and services that allow the user 10 to brighten her teeth. These advertisements can be transmitted when needed from the server 190, or can be stored locally on the mobile device 100. - In other embodiments, the user is able to use the mobile device to track cosmetic treatments to their teeth. The mobile device can then compare and contrast numerically the whiteness of the user's teeth before and after these cosmetic treatments.
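The server-side percentile ranking described above in steps 355-360 is not spelled out in the text; one simple reading, sketched here with an invented function name, is the percentage of stored whiteness scores at or below the new score:

```python
def percentile_rank(new_score, stored_scores):
    """Percent of previously stored whiteness scores at or below new_score."""
    at_or_below = sum(1 for s in stored_scores if s <= new_score)
    return 100.0 * at_or_below / len(stored_scores)

print(percentile_rank(80.0, [60.0, 70.0, 80.0, 90.0]))  # 75.0
```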
- While the invention has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiments described herein, but that the invention will include all embodiments falling within the scope of the appended claims.
Claims (1)
1. A computerized method for determining a whiteness score for a user's teeth comprising:
a) at a computing device, identifying a plurality of pixels from an image that show a portion of the user's teeth;
b) at the computing device, determining an average pixel value for the plurality of pixels;
c) at the computing device, determining a plurality of color values for the average pixel;
d) at the computing device, evaluating the plurality of color values to generate the whiteness score for the average pixel; and
e) at the computing device, displaying the whiteness score.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/296,360 US20170042451A1 (en) | 2014-01-29 | 2016-10-18 | Measuring Teeth Whiteness System and Method |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461933112P | 2014-01-29 | 2014-01-29 | |
US14/607,652 US9478043B2 (en) | 2014-01-29 | 2015-01-28 | Measuring teeth whiteness system and method |
US15/296,360 US20170042451A1 (en) | 2014-01-29 | 2016-10-18 | Measuring Teeth Whiteness System and Method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/607,652 Continuation US9478043B2 (en) | 2014-01-29 | 2015-01-28 | Measuring teeth whiteness system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170042451A1 true US20170042451A1 (en) | 2017-02-16 |
Family
ID=53679520
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/607,652 Active 2035-04-02 US9478043B2 (en) | 2014-01-29 | 2015-01-28 | Measuring teeth whiteness system and method |
US15/296,360 Abandoned US20170042451A1 (en) | 2014-01-29 | 2016-10-18 | Measuring Teeth Whiteness System and Method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/607,652 Active 2035-04-02 US9478043B2 (en) | 2014-01-29 | 2015-01-28 | Measuring teeth whiteness system and method |
Country Status (1)
Country | Link |
---|---|
US (2) | US9478043B2 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9478043B2 (en) * | 2014-01-29 | 2016-10-25 | Abdullaibrahim Abdulwaheed | Measuring teeth whiteness system and method |
CN105160329B (en) * | 2015-09-18 | 2018-09-21 | 厦门美图之家科技有限公司 | A kind of tooth recognition methods, system and camera terminal based on YUV color spaces |
US10271732B2 (en) | 2015-12-14 | 2019-04-30 | Colgate-Palmolive Company | Color measurement jig |
RU2020135294A (en) * | 2018-03-28 | 2022-04-28 | Конинклейке Филипс Н.В. | METHOD AND SYSTEM FOR ASSESSING TEETH SHADES IN UNMANAGED ENVIRONMENT |
US10547780B2 (en) | 2018-05-14 | 2020-01-28 | Abdul Abdulwaheed | Body part color measurement detection and method |
US20220398731A1 (en) * | 2021-06-03 | 2022-12-15 | The Procter & Gamble Company | Oral Care Based Digital Imaging Systems And Methods For Determining Perceived Attractiveness Of A Facial Image Portion |
WO2021243640A1 (en) * | 2020-06-04 | 2021-12-09 | The Procter & Gamble Company | Oral care based digital imaging systems and methods for determining perceived attractiveness of facial image portion |
AU2021401669A1 (en) * | 2020-12-17 | 2023-06-29 | Colgate-Palmolive Company | System and device for measuring a color value, and methods thereof |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030179392A1 (en) * | 2002-03-20 | 2003-09-25 | Eastman Kodak Company | Digital color image processing method for improved tone scale reproduction |
US8316052B2 (en) * | 2006-12-12 | 2012-11-20 | Verizon Patent And Licensing Inc. | Method, computer program product and apparatus for providing media sharing services |
US9478043B2 (en) * | 2014-01-29 | 2016-10-25 | Abdullaibrahim Abdulwaheed | Measuring teeth whiteness system and method |
US20170228986A1 (en) * | 2011-10-17 | 2017-08-10 | Gamblit Gaming, Llc | Head-to-head and tournament play for enriched game play environment |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5766006A (en) * | 1995-06-26 | 1998-06-16 | Murljacic; Maryann Lehmann | Tooth shade analyzer system and methods |
EP0777113A1 (en) * | 1995-12-01 | 1997-06-04 | MHT Optic Research AG | Method and device for determination of colour value of transparent bodies |
US6038024A (en) * | 1998-01-09 | 2000-03-14 | Mht Optic Research | Method and an apparatus for determining the color stimulus specification of an object |
US6190170B1 (en) * | 1998-05-05 | 2001-02-20 | Dentech, Llc | Automated tooth shade analysis and matching system |
US6925205B2 (en) * | 2000-08-07 | 2005-08-02 | Digital Colour Measurement Limited | Methods, systems and computer program products for color matching |
EP2433555A3 (en) * | 2002-07-26 | 2013-01-16 | Olympus Corporation | Image processing system |
US7064830B2 (en) * | 2003-06-12 | 2006-06-20 | Eastman Kodak Company | Dental color imaging system |
US7463757B2 (en) * | 2003-12-09 | 2008-12-09 | Carestream Health, Inc. | Tooth locating within dental images |
US20080270175A1 (en) * | 2003-12-31 | 2008-10-30 | Klinger Advanced Aesthetics, Inc. | Systems and methods using a dynamic expert system to provide patients with aesthetic improvement procedures |
EP1621857A3 (en) * | 2004-07-29 | 2006-04-12 | MHT Optic Research AG | Device for the determination of the colour value of transparent objects, in particular teeth |
US20070255589A1 (en) * | 2006-04-27 | 2007-11-01 | Klinger Advanced Aesthetics, Inc. | Systems and methods using a dynamic database to provide aesthetic improvement procedures |
JP4883783B2 (en) * | 2006-12-22 | 2012-02-22 | キヤノン株式会社 | Image processing apparatus and method |
US20080310712A1 (en) * | 2007-06-12 | 2008-12-18 | Edgar Albert D | Method and system to detect and correct whiteness with a digital image |
US7929151B2 (en) * | 2008-01-11 | 2011-04-19 | Carestream Health, Inc. | Intra-oral camera for diagnostic and cosmetic imaging |
US20100284616A1 (en) * | 2008-02-01 | 2010-11-11 | Dan Dalton | Teeth locating and whitening in a digital image |
BR112013002794B1 (en) * | 2010-08-06 | 2018-03-20 | Unilever N. V. | COMPOSITION OF ORAL CARE AND USE OF A COLOR PASTE |
US20120156634A1 (en) * | 2010-12-21 | 2012-06-21 | The Procter & Gamble Company | Intraoral Imaging Devices And Methods |
FR2977469B1 (en) * | 2011-07-08 | 2013-08-02 | Francois Duret | THREE-DIMENSIONAL MEASURING DEVICE USED IN THE DENTAL FIELD |
US20140330577A1 (en) * | 2013-05-06 | 2014-11-06 | MouthWatch, LLC | Apparatus And Method For A Post-Treatment Patient Compliance System |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110349224A (en) * | 2019-06-14 | 2019-10-18 | 众安信息技术服务有限公司 | A kind of color of teeth value judgment method and system based on deep learning |
EP3985382A1 (en) | 2020-10-14 | 2022-04-20 | Roche Diabetes Care GmbH | A method of controlling auto-exposure settings of a mobile device having a camera |
WO2022078977A1 (en) | 2020-10-14 | 2022-04-21 | F. Hoffmann-La Roche Ag | A method of controlling auto-exposure settings of a mobile device having a camera |
Also Published As
Publication number | Publication date |
---|---|
US20150213622A1 (en) | 2015-07-30 |
US9478043B2 (en) | 2016-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9478043B2 (en) | Measuring teeth whiteness system and method | |
EP2131697B1 (en) | Method and system for recommending a product based upon skin color estimated from an image | |
US7522768B2 (en) | Capture and systematic use of expert color analysis | |
JP2019000680A (en) | System and method for embodying and blending custom external preparation | |
US20070058858A1 (en) | Method and system for recommending a product based upon skin color estimated from an image | |
KR20170132644A (en) | Method for obtaining care information, method for sharing care information, and electronic apparatus therefor | |
Wesolkowski | Color image edge detection and segmentation: A comparison of the vector angle and the euclidean distance color similarity measures | |
AU2015201623A1 (en) | Choosing optimal images with preference distributions | |
Falomir et al. | A model for colour naming and comparing based on conceptual neighbourhood. An application for comparing art compositions | |
US20220335614A1 (en) | Digital Imaging and Learning Systems and Methods for Analyzing Pixel Data of a Scalp Region of a Users Scalp to Generate One or More User-Specific Scalp Classifications | |
CN112580433A (en) | Living body detection method and device | |
Barbero-Álvarez et al. | An adaptive colour calibration for crowdsourced images in heritage preservation science | |
KR102002622B1 (en) | System for determining a personal color and method thereof | |
CN113642358B (en) | Skin color detection method, device, terminal and storage medium | |
CN112016621B (en) | Training method of classification model, color classification method and electronic equipment | |
CN110458232B (en) | Method and equipment for determining image style similarity | |
JP5824423B2 (en) | Illumination light color estimation device, illumination light color estimation method, and illumination light color estimation program | |
CN112215808A (en) | Method and related device for generating human face skin sensitive image | |
CN109493830B (en) | Adjusting method and adjusting system of display panel and display device | |
US20140333659A1 (en) | Image processing apparatus and control method thereof | |
KR20180061629A (en) | Evaluation method for skin condition using image and evaluation apparatus for skin condition using image | |
KR101329136B1 (en) | Methdo and system of immersive enhancement for video sequence displaying | |
US7019789B2 (en) | Apparatus and method for calculating color temperature | |
US11055881B2 (en) | System and a method for providing color vision deficiency assistance | |
JP6591595B2 (en) | Skin undertone determination method and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |