WO2019244254A1 - Image processing device, operation method for image processing device, and operation program for image processing device - Google Patents
- Publication number
- WO2019244254A1 (PCT/JP2018/023339)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- color
- difference
- color information
- subject
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C19/00—Dental auxiliary appliances
- A61C19/04—Measuring instruments specially adapted for dentistry
Definitions
- the present invention relates to an image processing device, an operation method of the image processing device, and an operation program of the image processing device.
- at a dental clinic or the like, the tooth surface of the vital tooth for which a crown is to be produced is photographed. This photographing is performed using a digital camera or the like developed for dentistry.
- the dentist selects, from a plurality of tooth-shaped color samples of different colors (hereinafter referred to as "shade guides"), the shade guide having the color closest to the patient's vital tooth (hereinafter, this operation is referred to as a "shade take").
- a shade guide is formed, for example, by processing ceramics of different colors into a tooth shape.
- the dentist sends the photograph of the patient's vital tooth taken and the unique identification number assigned to the selected shade guide to the laboratory that produces the crown. At the laboratory, the crown is made based on the information sent.
- the above-described shade take lacks objectivity because it depends on the dentist's subjective judgment. Further, the apparent color of the shade guide and the apparent color of the patient's tooth vary with many factors, such as the shade of the gum, the surrounding environment, the lighting conditions (for example, the direction and color of the lighting), and the state of the observer. It is therefore very difficult to select the shade guide closest to the color of the patient's tooth, and the burden on the doctor is great.
- the conventional technique has a problem that it is difficult to intuitively recognize the difference in color between the patient's vital teeth and the selected shade guide.
- the present invention has been made in view of the above, and an object of the present invention is to provide an image processing apparatus, an operation method of the image processing apparatus, and an operation program of the image processing apparatus that make it possible to intuitively grasp the difference in color between a patient's vital tooth and a selected shade guide.
- an image processing apparatus according to the present invention includes: an image information acquisition unit that acquires a subject image capturing a subject, a color sample image to be compared with the subject, first color information that is color information for each pixel of the subject image, and second color information that is color information based on the color sample image; a calculation unit that calculates, for each pixel, a difference between the first color information and the second color information; an image generation unit that generates data of a difference image in which a part having no difference is normalized to a reference color having a saturation of zero; and a display control unit that causes a display device to display the difference image.
- the difference may be a value obtained by subtracting a value corresponding to the second color information from a value corresponding to the first color information for each pixel.
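As an illustration of this per-pixel subtraction, the following sketch uses NumPy with made-up 2x2 single-channel values (the array names and numbers are assumptions for illustration, not data from the patent):

```python
import numpy as np

# Hypothetical 2x2 images with one color value per pixel (e.g. L* values).
first_color = np.array([[70.0, 71.0],
                        [69.5, 70.0]])   # subject (vital tooth) image
second_color = np.array([[70.0, 70.0],
                         [70.0, 70.0]])  # color sample (shade guide) image

# The difference: the value corresponding to the second color information is
# subtracted from the value corresponding to the first, pixel by pixel.
difference = first_color - second_color
print(difference)  # zero wherever the two images match exactly
```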
- the image information acquisition unit acquires the color sample image of a color close to the subject and the second color information for each part of the subject.
- the calculation unit may calculate the difference for each of the parts.
- the image generation unit calculates an XYZ value based on the first color information and the second color information, calculates an RGB value based on the XYZ value, and generates the difference image data based on the RGB value.
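The XYZ-to-RGB step above can be sketched as follows. The patent does not specify which RGB space is used; this example assumes the standard linear sRGB matrix as one plausible choice:

```python
import numpy as np

# Standard XYZ -> linear sRGB matrix (an assumption; the patent does not
# name a particular RGB color space).
XYZ_TO_RGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xyz_to_rgb(xyz):
    """Convert one XYZ triple to linear RGB, clipped to [0, 1]."""
    return np.clip(XYZ_TO_RGB @ np.asarray(xyz), 0.0, 1.0)

# The D65 white point maps to approximately (1, 1, 1) in linear sRGB.
print(xyz_to_rgb([0.9505, 1.0, 1.089]))
```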
- the image processing device is characterized in that the display control unit displays the reference color around a region of the display device where the difference image is displayed.
- the image processing device is characterized in that the image generation unit generates data of the difference image having different lightness of the reference color.
- the image processing device is characterized in that the image generation unit generates data of the difference image by modulating the difference so as to increase the contrast with respect to the reference color.
- the image processing apparatus includes a color sample selection unit that selects a color sample closest to the subject image.
- an operation method of the image processing apparatus according to the present invention includes: an image information acquisition step in which the image information acquisition unit acquires a subject image obtained by capturing the subject, a color sample image to be compared with the subject, first color information that is color information for each pixel of the subject image, and second color information based on the color sample image; and a display control step in which the display control unit causes a display device to display the difference image.
- an operation program of the image processing apparatus according to the present invention causes the image processing apparatus to execute: an image information acquisition step in which the image information acquisition unit acquires a subject image obtained by capturing the subject, a color sample image to be compared with the subject, first color information that is color information for each pixel of the subject image, and second color information based on the color sample image; and a display control step in which the display control unit displays the difference image on a display device.
- FIG. 1 is a block diagram illustrating a schematic configuration of an imaging device and a cradle according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram illustrating a schematic configuration of the dental colorimetric device according to Embodiment 1 of the present invention.
- FIG. 3 is a diagram showing a spectrum of the light source shown in FIG.
- FIG. 4 is a diagram showing a schematic internal configuration of the spectrum estimation calculation unit shown in FIG.
- FIG. 5 is a flowchart showing a procedure of processing realized by the dental colorimetric device according to Embodiment 1 of the present invention.
- FIG. 6 is a flowchart illustrating a procedure of a process of capturing the multiband image illustrated in FIG.
- FIG. 7 is a diagram illustrating signal correction.
- FIG. 8 is a diagram illustrating the input γ correction.
- FIG. 9 is a diagram illustrating the input γ correction.
- FIG. 10 is a diagram illustrating an example of a low-pass filter applied to the R signal and the B signal in the pixel interpolation calculation.
- FIG. 11 is a diagram illustrating an example of a low-pass filter applied to a G signal in a pixel interpolation operation.
- FIG. 14 is an explanatory diagram illustrating a method of specifying a tooth region.
- FIG. 15 is a diagram illustrating an example of the measurement area set in the measurement area setting process.
- FIG. 16 is a diagram illustrating a state where a difference value between the color sample image and the subject image is calculated for each part.
- FIG. 17 is a diagram illustrating an example of a spectrum based on the first color information and the second color information.
- FIG. 18 is a diagram illustrating an example of a difference between the first color information and the second color information.
- FIG. 19 is a diagram illustrating an example of the difference image.
- FIG. 20 is a diagram illustrating an example of a difference between the first color information and the second color information.
- FIG. 21 is a diagram illustrating an example of the difference image.
- FIG. 22 is a diagram illustrating a state where a difference between the first color information and the second color information is calculated.
- FIG. 23 is a diagram illustrating how to calculate the difference between the first color information and the second color information.
- FIG. 24 is a diagram illustrating how glare is detected in the subject image.
- FIG. 25 is a diagram illustrating a difference image in which glare has been corrected.
- FIG. 26 is a diagram illustrating a difference image in which glare has been corrected.
- FIG. 27 is a diagram illustrating a difference image in which glare has been corrected.
- FIG. 28 is a diagram illustrating a state in which a reference color is displayed around the difference image.
- FIG. 29 is a diagram illustrating a state in which a reference color is displayed around the difference image.
- FIG. 30 is a diagram illustrating a state in which a patch is superimposed on a difference image.
- FIG. 31 is a diagram illustrating how the reference color is changed.
- FIG. 32 is a diagram illustrating how the reference color is changed.
- FIG. 33 is a diagram illustrating a state in which difference images of a plurality of reference colors are displayed side by side.
- FIG. 34 is a diagram illustrating a state where the contrast with respect to the reference color is enhanced.
- FIG. 35 is a diagram illustrating a state in which the contrast with respect to the reference color is enhanced.
- FIG. 36 is a diagram illustrating a state in which the contrast with respect to the reference color is enhanced.
- FIG. 37 is a block diagram illustrating a schematic configuration of the image processing apparatus according to the second embodiment.
- FIG. 38 is a diagram illustrating a manner in which a reference color is calculated from a subject image.
- FIG. 39 is a diagram illustrating an example of a difference image.
- a dental colorimetric system according to Embodiment 1 of the present invention includes an imaging device 1, a cradle 2, a dental colorimetric device 3 as an image processing device, and a display device 4.
- the imaging device 1 includes a light source 10, an imaging unit 20, an imaging control unit 30, a display unit 40, and an operation unit 50.
- the light source 10 is disposed near the tip of the imaging device 1 and emits illumination light having four or more different wavelength bands for illuminating a subject. Specifically, the light source 10 has seven light sources 10a to 10g that emit light of different wavelength bands. Each of the light sources 10a to 10g has four LEDs (Light Emitting Diode).
- FIG. 3 is a diagram showing a spectrum of the light source shown in FIG.
- the center wavelengths of the light sources 10a to 10g are about 450 nm for the light source 10a, about 465 nm for the light source 10b, about 505 nm for the light source 10c, about 525 nm for the light source 10d, about 575 nm for the light source 10e, about 605 nm for the light source 10f, and about 630 nm for the light source 10g.
- the emission spectrum information of the LED is stored in the LED memory 11 and used in the dental colorimeter 3 described later.
- These light sources 10a to 10g are arranged, for example, in a ring shape.
- the arrangement is not particularly limited; for example, the four LEDs may be arranged in ascending order of wavelength, in reverse order, or randomly. Further, instead of all the LEDs forming one ring, the LEDs may be divided into a plurality of groups, with each group arranged to form one ring. Note that the arrangement of the LEDs is not limited to the above-described ring shape; a cross-shaped arrangement, a rectangular arrangement, a left-right arrangement, a vertical arrangement, a random arrangement, or the like can be appropriately adopted as long as it does not interfere with imaging by the imaging unit 20 described later.
- the light emitting element of the light source 10 is not limited to the LED; for example, a semiconductor laser such as an LD (Laser Diode) or another light emitting element may be used.
- an illumination optical system (not shown) for irradiating illumination light from the light source 10 to the subject surface substantially uniformly is provided.
- a temperature sensor 12 that detects the temperature of the LED is provided near the light source 10.
- the imaging unit 20 includes an imaging lens 21, an RGB color imaging element 22, a signal processing unit 23, an AD converter 24, a focus lever 25, and a position detection unit 26.
- the imaging lens 21 forms a subject image irradiated by the light source 10.
- the RGB color image sensor 22 captures a subject image formed by the imaging lens 21 and outputs an image signal.
- the RGB color image pickup device 22 is constituted by, for example, a color CCD whose sensor sensitivity substantially covers the visible region. Further, the RGB color image pickup device 22 is not limited to a CCD; a CMOS type or other various image pickup devices can be widely used. Further, instead of the RGB color image sensor 22 for imaging with RGB values, an image sensor for imaging with XYZ values may be provided.
- the signal processing unit 23 performs processing such as gain correction and offset correction on the analog signal output from the RGB color image sensor 22.
- the AD converter 24 converts an analog signal output from the signal processing unit 23 into a digital signal.
- the focus lever 25 is connected to the imaging lens 21 and adjusts the focus of the imaging device 1.
- the focus lever 25 is used to manually adjust the focus.
- the position detector 26 detects the position of the focus lever 25.
- the imaging control unit 30 includes a CPU (Central Processing Unit) 31, an LED driver 32, a data I / F 33, a communication I / F controller 34, an image memory 35, and an operation unit I / F 36. These units are connected to the local bus 37 and can exchange data via the local bus 37.
- the CPU 31 controls the imaging unit 20, records the spectral image of the subject acquired and processed by the imaging unit 20 in the image memory 35 via the local bus 37, and outputs the spectral image to the LCD controller 41 described later.
- the LED driver 32 controls light emission of various LEDs included in the light source 10.
- the data I / F 33 is an interface for receiving the contents of the LED memory 11 of the light source 10 and the information of the temperature sensor 12.
- the communication I / F controller 34 is connected to a communication I / F contact 61 serving as a connection point with the outside, and has, for example, a function for realizing communication conforming to USB 2.0.
- the image memory 35 temporarily stores image data captured by the imaging unit 20.
- the image memory 35 has a memory capacity capable of recording at least seven spectral images and one RGB color image.
- the operation unit I / F 36 is connected to various operation buttons included in the operation unit 50 described later, and serves as an interface that transfers an input instruction input via the operation unit 50 to the CPU 31 via the local bus 37. Function.
- the display unit 40 includes an LCD controller 41, an LCD 42, and an overlay memory 43.
- the LCD controller 41 causes an LCD (Liquid Crystal Display) 42 to display an image based on the image signal transferred from the CPU 31, for example, an image currently captured by the imaging unit 20 or a captured image.
- the image pattern stored in the overlay memory 43 may be superimposed on the image obtained from the CPU 31 and displayed on the LCD 42.
- the image pattern recorded in the overlay memory 43 is, for example, a horizontal line for photographing the entire tooth horizontally, a cross line crossing the horizontal line, a photographing mode, an identification number of the photographed tooth, and the like.
- the operation unit 50 includes various operation switches and operation buttons for the user to input an instruction to start a spectral image capturing operation and to input an instruction to start or end a moving image capturing operation.
- the operation unit 50 has a shooting mode changeover switch 51, a shutter button 52, and a viewer control button 53.
- the shooting mode switch 51 is a switch for switching between normal RGB shooting and multi-band shooting.
- the viewer control button 53 is a switch for performing operations such as changing an image displayed on the LCD 42.
- the imaging device 1 includes a lithium battery 60, a communication I / F contact 61, a charging contact 62, a battery LED 63, a power LED 64, and an alarm buzzer 65.
- the lithium battery 60 is a storage battery for supplying power to each unit included in the imaging device 1, and is connected to a charging contact 62 which is a contact for charging.
- the battery LED 63 notifies the charge state of the lithium battery 60.
- the battery LED 63 includes, for example, three LEDs of red, yellow, and green. Lighting in green notifies that the battery capacity of the lithium battery 60 is sufficient; lighting in yellow notifies that the battery capacity is low and charging is necessary; and lighting in red notifies that the battery capacity is extremely low, in other words, that charging is urgently required.
- the power LED 64 indicates the status of the camera.
- the power LED 64 includes, for example, two LEDs including red and green.
- the power LED 64 lights up in green to notify that shooting preparation is complete, blinks in green to notify that shooting preparation is in progress (initial warm-up or the like), and lights up in red to notify that the battery is being charged.
- the alarm buzzer 65 notifies the danger at the time of shooting.
- the alarm buzzer 65 issues a warning to notify that the captured image data is invalid.
- the cradle 2 supports the mounted imaging device 1.
- the cradle 2 includes a color chart 100, a micro switch 101, a power switch 102, a power lamp 103, a mounting lamp 104, a power connector 105, and an AC adapter 106.
- the color chart 100 is used to calibrate the imaging unit 20.
- the micro switch 101 is used to confirm whether or not the imaging device 1 is mounted at a normal position.
- the power switch 102 is used to turn the power on and off.
- the power lamp 103 is turned on / off in conjunction with turning on / off of the power switch 102.
- the mounting lamp 104 is turned on / off in conjunction with the microswitch 101 to notify whether or not the imaging apparatus 1 is mounted at a normal position. For example, the mounting lamp 104 lights green when the imaging device 1 is mounted in a normal position, and lights red when the imaging device 1 is not mounted.
- the power supply connector 105 is connected to an AC power supply via an AC adapter 106.
- the dental colorimeter 3 includes a chromaticity calculation unit 70, a shade guide calculation processing unit 80, a multiband image memory 110, an RGB image memory 111, a color image creation processing unit 112, an image filing unit 113, a shade guide information storage unit 114, and an image display GUI unit 115.
- the multi-band image memory 110 records the multi-band image and the color chart image transferred from the imaging device 1.
- the RGB image memory 111 records the RGB image transferred from the imaging device 1.
- the color image creation processing unit 112 creates RGB image data of the teeth of the subject to be displayed on the monitor.
- the image filing unit 113 stores the RGB image data of the tooth of the subject generated by the color image creation processing unit 112.
- the shade guide information storage unit 114 stores, for each manufacturer of shade guides in which color samples are arranged in a row, photographed image data of the shade guides manufactured by that manufacturer (hereinafter also referred to as "color sample images") in association with the unique shade guide numbers, together with the spectral spectrum (specifically, the spectral reflectance spectrum) of a predetermined area of each shade guide and a gingival shade guide image. Further, the shade guide information storage unit 114 stores second color information, which is color information of the color sample image (for example, spectral reflectance, L*a*b* value, L*C*h value, CMYK value, etc.), for each pixel of the color sample image.
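One way to picture what such a storage unit holds per entry is the following sketch. The keys, shade numbers, and all values are hypothetical illustrations, not data from the patent:

```python
# Hypothetical per-shade-guide record: identification number -> stored data.
shade_guide_store = {
    "A1": {                      # unique shade guide identification number
        "maker": "ExampleMaker",
        "spectral_reflectance": [0.62, 0.65, 0.68, 0.70],  # per band (toy)
        "lab": (74.0, 1.5, 16.0),                          # L*a*b* value
    },
    "A2": {
        "maker": "ExampleMaker",
        "spectral_reflectance": [0.58, 0.61, 0.64, 0.66],
        "lab": (71.0, 2.0, 18.5),
    },
}

entry = shade_guide_store["A1"]
print(entry["lab"])
```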
- the image display GUI unit 115 creates screen image data based on various images and displays the screen image data on the display screen of the display device 4.
- the chromaticity calculation unit 70 includes a spectrum estimation calculation unit 71, an observation spectrum calculation unit 72, and a chromaticity value calculation unit 73.
- the chromaticity calculation unit 70 reads the multiband image and the color chart image from the multiband image memory 110, and the spectrum estimation calculation unit 71 performs a spectrum estimation process (in the first embodiment, estimation of the spectral reflectance spectrum).
- FIG. 4 is a diagram showing a schematic internal configuration of the spectrum estimation calculation unit shown in FIG.
- the spectrum estimation calculation unit 71 includes a conversion table creation unit 711, a conversion table 712, an input γ correction unit 713, a pixel interpolation calculation unit 714, an in-plane unevenness correction unit 715, a matrix calculation unit 716, and a spectrum estimation matrix creation unit 717.
- the input γ correction unit 713 and the pixel interpolation calculation unit 714 are provided separately for the multiband image and the color chart image: an input γ correction unit 713a and a pixel interpolation calculation unit 714a for the multiband image, and an input γ correction unit 713b and a pixel interpolation calculation unit 714b for the color chart image.
- the multiband image and the color chart image are respectively transferred to the separately provided input γ correction units 713a and 713b, and the input γ correction is performed.
- image interpolation calculation processing is performed by each of the pixel interpolation calculation units 714a and 714b.
- the signals after these processes are transferred to the in-plane unevenness correction unit 715, where the in-plane unevenness correction processing of the multi-band image using the color chart image is performed.
- the multiband image is transferred to the matrix calculation unit 716, and the spectral reflectance as the subject image is calculated using the matrix created by the spectrum estimation matrix creation unit 717.
- the observation spectrum calculation unit 72 multiplies the tooth spectrum obtained by the spectrum estimation calculation unit 71 by the spectrum of the illumination light under which the subject is to be observed, thereby obtaining the spectrum of the subject under that illumination light.
- the chromaticity value calculation unit 73 calculates, as the first color information, color information of the subject image (for example, spectral reflectance, L*a*b* value, L*C*h value, CMYK value, etc.) from the spectrum of the subject under the illumination light to be observed.
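The chain through these two units can be sketched numerically: multiply the estimated reflectance spectrum by the observation illuminant, then weight the result by color matching functions to get tristimulus values. The four-band spectra and the matching-function weights below are made-up toy numbers, not real CIE data:

```python
import numpy as np

reflectance = np.array([0.60, 0.65, 0.70, 0.68])   # from spectrum estimation
illuminant  = np.array([0.9, 1.0, 1.0, 0.95])      # observation light source

# Observation spectrum: reflectance under the chosen illumination light.
observed = reflectance * illuminant

cmf = np.array([                                    # rows: x-bar, y-bar, z-bar (toy)
    [0.10, 0.30, 0.80, 0.40],
    [0.05, 0.50, 0.90, 0.30],
    [0.90, 0.40, 0.05, 0.00],
])
XYZ = cmf @ observed                                # tristimulus values
print(XYZ)
```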
- the shade guide calculation processing unit 80 includes a shade guide selection unit 81 as a color sample selection unit and a comparison calculation processing unit 82.
- the shade guide selection unit 81 selects the shade guide closest to the subject's tooth. Specifically, the shade guide selection unit 81 compares the tint of the tooth to be measured (in the first embodiment, its spectral reflectance spectrum) with the tints of the shade guides registered in advance in the shade guide information storage unit 114 (in the first embodiment, their spectral reflectance spectra), and selects the shade guide with the smallest difference between the two.
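A minimal sketch of this selection rule follows: pick the guide whose stored spectrum differs least from the measured tooth spectrum. The RMS metric and all spectra are illustrative assumptions; the patent only requires "the smallest difference":

```python
import numpy as np

tooth = np.array([0.60, 0.64, 0.70, 0.69])          # measured reflectance (toy)

guides = {                                          # stored guide spectra (toy)
    "A1": np.array([0.62, 0.65, 0.68, 0.70]),
    "A2": np.array([0.50, 0.55, 0.60, 0.62]),
    "B1": np.array([0.70, 0.72, 0.75, 0.78]),
}

def rms_diff(a, b):
    """Root-mean-square difference between two spectra."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Select the shade guide with the smallest spectral difference.
best = min(guides, key=lambda k: rms_diff(tooth, guides[k]))
print(best)  # "A1" is closest for these example spectra
```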
- the comparison operation processing unit 82 includes an image information acquisition unit 821, a calculation unit 822, an image generation unit 823, and a display control unit 824.
- the image information acquisition unit 821 acquires a subject image obtained by capturing the subject, a color sample image to be compared with the subject, first color information that is color information for each pixel of the subject image, and second color information that is color information based on the color sample image. Further, the image information acquisition unit 821 may acquire a color sample image of a color close to the subject and the second color information for each part of the subject (for example, the upper part, the center, and the lower part of the tooth).
- the calculation unit 822 calculates a difference based on the first color information and the second color information for each pixel.
- the difference is a value obtained by subtracting a value corresponding to the second color information from a value corresponding to the first color information for each pixel. Further, the calculating unit 822 may calculate a difference based on the first color information and the second color information for each part (for example, the upper part, the center, and the lower part).
- the image generation unit 823 generates difference image data in which a part having a difference of zero is normalized to a reference color having a saturation of zero. Specifically, the image generation unit 823 calculates an XYZ value based on the first color information and the second color information, calculates an RGB value based on the calculated XYZ value, and generates the difference image data based on the calculated RGB value. In the case where an image sensor that captures images with XYZ values is provided instead of the RGB color image sensor 22, the RGB values may be calculated based on the captured XYZ values and the difference image data may be generated based on the calculated RGB values.
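The normalization idea can be sketched in one line: a zero-difference pixel becomes a zero-saturation reference color (a neutral gray), and nonzero differences shift the pixel away from that gray. The gray level and scaling factor below are assumptions for illustration:

```python
import numpy as np

REFERENCE_GRAY = 0.5   # zero-saturation reference color (mid gray, assumed)
SCALE = 0.05           # how strongly a unit difference shifts a pixel (assumed)

difference = np.array([[0.0, 2.0],
                       [-1.0, 0.0]])   # per-pixel difference (toy values)

# Zero difference maps exactly to the reference gray; other pixels deviate.
difference_image = np.clip(REFERENCE_GRAY + SCALE * difference, 0.0, 1.0)
print(difference_image)
```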
- the display control unit 824 causes the display device to display the difference image via the image display GUI unit 115.
- FIG. 5 is a flowchart showing a procedure of processing realized by the dental colorimetric device according to Embodiment 1 of the present invention.
- a multiband image, which is an image of a tooth of the subject, is captured by the imaging device 1 (step S1).
- FIG. 6 is a flowchart showing a procedure of a process of capturing the multiband image shown in FIG.
- a user such as a dentist lifts the imaging device 1 from the cradle 2 and attaches a contact cap to a mounting hole (not shown) provided on the projection port side of the housing of the imaging device 1 (step S101).
- This contact cap is formed in a substantially cylindrical shape from a flexible material.
- next, the user sets the shooting mode to "colorimetric mode" (step S102). Then, the subject image is displayed on the LCD 42 as a moving image.
- the user performs focus adjustment using the focus lever 25 (step S104).
- since the contact cap is formed in a shape that guides the vital tooth to be measured to an appropriate imaging position, positioning can be performed easily.
- the LED driver 32 sequentially drives the light sources 10a to 10g, so that LED irradiation light beams having different wavelength bands are sequentially irradiated on the subject (step S105).
- the reflected light of the subject is imaged on the surface of the RGB color image sensor 22 of the imaging unit 20, and is captured as an RGB image.
- the captured RGB images are transferred to the signal processing unit 23.
- the signal processing unit 23 performs predetermined image processing on the input RGB image signal (step S107). Further, the signal processing unit 23 selects, from the RGB image signals, image data of the predetermined single color corresponding to the wavelength band of each of the light sources 10a to 10g. Specifically, the signal processing unit 23 selects B image data from the image signals corresponding to the light sources 10a and 10b, G image data from the image signals corresponding to the light sources 10c to 10e, and R image data from the image signals corresponding to the light sources 10f and 10g. As described above, the signal processing unit 23 selects image data having a wavelength substantially matching the center wavelength of the irradiation light.
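The band-selection rule above can be sketched as a simple mapping from LED to sensor channel. The mapping follows the center wavelengths stated earlier (450–465 nm to B, 505–575 nm to G, 605–630 nm to R); the data structure itself is an illustrative assumption:

```python
# Assumed LED -> channel mapping based on the stated center wavelengths.
LED_TO_CHANNEL = {
    "10a": "B", "10b": "B",               # ~450, ~465 nm -> blue channel
    "10c": "G", "10d": "G", "10e": "G",   # ~505 to ~575 nm -> green channel
    "10f": "R", "10g": "R",               # ~605, ~630 nm -> red channel
}

def select_band(led, rgb_image):
    """Keep only the channel matching the LED's wavelength band.

    rgb_image: dict of channel name -> 2D pixel data (toy structure).
    """
    return rgb_image[LED_TO_CHANNEL[led]]

frame = {"R": [[1]], "G": [[2]], "B": [[3]]}
print(select_band("10d", frame))  # the green channel data
```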
- the image data selected by the signal processing unit 23 is transferred to the AD converter 24, and is recorded in the image memory 35 via the CPU 31 (Step S108).
- in the image memory 35, an image of the color selected from the RGB images in accordance with the center wavelength of each LED is recorded as a multiband image.
- the irradiation time and intensity of the LEDs, the electronic shutter speed of the image sensor, and the like are controlled by the CPU 31 so that the photographing at each wavelength is properly exposed. If proper exposure cannot be obtained, the alarm buzzer 65 sounds and a warning is issued.
- next, the calibration image is measured (step S109).
- the measurement of the calibration image is to photograph the color chart 100 in the same procedure as the above-described multi-band photographing.
- the multiband image of the color chart 100 is recorded in the image memory 35 as a color chart image.
- the color chart 100 is also photographed with no LED lit (in the dark), and this image is recorded in the image memory 35 as a dark current image (step S110).
- the dark current image may be photographed a plurality of times, and an image obtained by averaging these images may be used.
- signal correction is performed on each of the multiband image and the color chart image using the external light image and the dark current image recorded in the image memory 35 (step S111).
- the signal correction for the multiband image can be performed, for example, by subtracting the signal value of the external light image data, pixel by pixel, from each image data of the multiband image, thereby removing the influence of external light at the time of shooting.
- the signal correction for the color chart image is performed, for example, by subtracting the signal value of the dark current image data, pixel by pixel, from the image data of the color chart image, thereby removing the dark current noise of the CCD, which varies with temperature.
- FIG. 7 is a diagram for explaining the signal correction. In FIG. 7, the vertical axis represents the sensor signal value, the horizontal axis represents the input light intensity, the solid line represents the original signal before correction, and the broken line represents the signal after correction.
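The two subtractions described above amount to per-pixel arithmetic, which can be sketched with toy 2x2 arrays (all values are illustrative assumptions):

```python
import numpy as np

multiband = np.array([[120, 130], [125, 128]], dtype=float)  # raw multiband
external  = np.array([[10, 10], [10, 10]], dtype=float)      # external light
chart     = np.array([[200, 205], [198, 202]], dtype=float)  # raw color chart
dark      = np.array([[5, 5], [5, 5]], dtype=float)          # dark current

# External light removed from the multiband image; dark current removed
# from the color chart image, pixel by pixel (clipped at zero).
corrected_multiband = np.clip(multiband - external, 0, None)
corrected_chart     = np.clip(chart - dark, 0, None)
print(corrected_multiband[0, 0], corrected_chart[0, 0])
```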
- the multi-band image and the color chart image after the signal correction are transmitted via the local bus 37, the communication I/F controller 34, and the communication I/F contact 61 to the dental colorimeter 3, and are recorded in the multi-band image memory 110 in the dental colorimeter 3 (step S2).
- alternatively, the multi-band image and the dark current image of the color chart 100 may be transmitted directly, without being recorded in the image memory 35 of the imaging device 1, through the local bus 37, the communication I/F controller 34, and the communication I/F contact 61 to the dental colorimetric device 3 and recorded in the multi-band image memory 110 in the dental colorimetric device 3. In this case, the above-described signal correction is performed in the dental colorimeter 3.
- the imaging apparatus 1 can not only sequentially irradiate the subject with illumination light of seven types of wavelength bands (illumination light of seven primary colors) and capture seven subject spectral images as still images, but can also capture an RGB image.
- one method of capturing an RGB image is to use the RGB color CCD to image the subject illuminated by natural light or room light, without using the illumination light of the seven primary colors, in the same way as a normal digital camera.
- alternatively, illumination lights corresponding to the three colors of RGB are selected from the seven primary-color illumination lights and irradiated sequentially, so that a frame-sequential still image can be captured.
- RGB imaging is used when imaging a wide area such as when imaging the entire face of a patient or when imaging the entire chin.
- multi-band imaging is used when accurately measuring the color of one or two teeth of a patient, that is, when performing color measurement of teeth.
- the conversion table creation unit 711 creates the conversion table 712 as a stage prior to the input gamma correction (step S3). Specifically, the conversion table creation unit 711 holds data in which the input light intensity is associated with the sensor signal value, and creates the conversion table 712 based on that data, that is, from the relationship between the input light intensity and the output signal value. FIGS. 8 and 9 are diagrams illustrating the input gamma correction. As shown by the solid line in FIG. 8, the conversion table 712 is created so that the input light intensity and the sensor signal value have a substantially proportional relationship.
- Each of the input gamma correction units 713a and 713b performs input gamma correction on the multiband image and the color chart image by referring to the conversion table 712 (step S4).
- this conversion table is created such that the input light intensity D corresponding to the current sensor value A is obtained, and the output sensor value B corresponding to that input light intensity D is output.
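A minimal sketch of such a look-up-table correction, assuming (purely for illustration) a square-root sensor response; the real conversion table 712 is built from the measured intensity-signal data.

```python
import numpy as np

MAX = 1023
intensity = np.linspace(0.0, 1.0, MAX + 1)               # input light intensity D
sensor = np.round(MAX * np.sqrt(intensity)).astype(int)  # assumed raw response A

# conversion_table[A] ~ MAX * D: invert the response so the output sensor
# value B is substantially proportional to the input light intensity.
conversion_table = np.interp(np.arange(MAX + 1), sensor, MAX * intensity)

def input_gamma(image):
    """Apply the conversion table to an image of integer sensor values."""
    return conversion_table[image]

raw = np.array([[0, 512, 1023]])
linearized = input_gamma(raw)
```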
- the corrected image data is transferred to the pixel interpolation calculation units 714a and 714b, respectively.
- the pixel interpolation calculation units 714a and 714b perform pixel interpolation on the multi-band image data and the color chart image data after the input gamma correction (step S5). Specifically, the pixel interpolation calculation units 714a and 714b perform pixel interpolation by applying a low-pass filter.
- FIG. 10 is a diagram illustrating an example of a low-pass filter applied to the R signal and the B signal in the pixel interpolation calculation.
- FIG. 11 is a diagram illustrating an example of a low-pass filter applied to a G signal in a pixel interpolation operation.
- an image of 144 pixels ⁇ 144 pixels is converted into an image of 288 pixels ⁇ 288 pixels.
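The doubling of resolution can be sketched as zero-insertion upsampling followed by a low-pass filter; the bilinear kernel used here is an assumption, since the actual filter coefficients of FIGS. 10 and 11 are not reproduced in this text.

```python
import numpy as np

def conv2(img, k):
    """Same-size 2-D convolution with zero padding (kernel is symmetric)."""
    ph, pw = k.shape[0] // 2, k.shape[1] // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape)
    for i in range(k.shape[0]):
        for j in range(k.shape[1]):
            out += k[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def upsample2x(img):
    h, w = img.shape
    up = np.zeros((2 * h, 2 * w))
    up[::2, ::2] = img                                          # zero insertion
    kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0  # bilinear low-pass
    return conv2(up, kernel)

up_img = upsample2x(np.ones((144, 144)))   # 144 x 144 -> 288 x 288
```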
- the image data g k (x, y) on which the image interpolation operation has been performed is transferred to the in-plane unevenness correction unit 715.
- the in-plane unevenness correction unit 715 corrects in-plane unevenness of the multi-band image data (Step S6). Specifically, the in-plane unevenness correction unit 715 corrects the luminance at the center of the screen of the multi-band image data using the following equation (1).
- c k (x, y) is color chart image data
- g k (x, y) is the multiband image data after input gamma correction
- (x 0 , y 0 ) is a central pixel position
- This in-plane unevenness correction is performed on each image data of the multi-band image data.
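Since equation (1) itself is not reproduced here, the sketch below assumes the standard flat-field form, in which each pixel of the multi-band data is scaled by the ratio of the color chart's central value to its local value.

```python
import numpy as np

def flatfield(g_k, c_k):
    """Assumed flat-field correction: scale each pixel by the chart's
    central value over its local value (equation (1) is not reproduced
    in the text, so this form is an assumption)."""
    x0, y0 = c_k.shape[0] // 2, c_k.shape[1] // 2   # central pixel position
    return g_k * c_k[x0, y0] / c_k

c = np.array([[2.0, 2.0], [2.0, 4.0]])   # toy color chart band c_k(x, y)
g = np.full((2, 2), 10.0)                # toy multi-band band g_k(x, y)
g_corr = flatfield(g, c)
```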
- the multi-band image data g ′ k (x, y) after the in-plane unevenness correction is transferred to the matrix calculation unit 716.
- the matrix calculation unit 716 performs a spectrum estimation process (spectral reflectance estimation in the first embodiment) using the multiband image data g′ k (x, y) from the in-plane unevenness correction unit 715 (step S7).
- the spectral reflectance is estimated at 1 nm intervals in the wavelength band from 380 nm to 780 nm. That is, in the first embodiment, the 401-dimensional spectral reflectance is estimated.
- generally, obtaining the spectral reflectance for each wavelength requires a heavy and expensive spectrometer or the like.
- in the first embodiment, however, the subject is limited to teeth; by exploiting the spectral features peculiar to this subject, the 401-dimensional spectral reflectance can be estimated from a small number of bands.
- a 401-dimensional spectrum signal is calculated by performing a matrix operation using the multi-band image data g ′ k (x, y) and the spectrum estimation matrix Mspe.
- the spectrum estimation matrix Mspe is created by the spectrum estimation matrix creation section 717 based on the spectral sensitivity data of the camera, the LED spectrum data, and the statistical data of the subject (tooth).
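The per-pixel matrix operation can be sketched as a single einsum over the image; the random Mspe below is only a placeholder for the matrix actually built from the camera sensitivities, LED spectra, and tooth statistics.

```python
import numpy as np

rng = np.random.default_rng(0)
Mspe = rng.random((401, 7))        # placeholder for the spectrum estimation matrix
g = rng.random((7, 64, 64))        # corrected multi-band image data g'_k(x, y)

# spectrum[:, x, y] = Mspe @ g[:, x, y] for every pixel at once:
# a 401-dimensional spectral reflectance, 380-780 nm at 1 nm intervals.
spectrum = np.einsum('sk,kxy->sxy', Mspe, g)
```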
- the method of creating the spectrum estimation matrix is not particularly limited, and a known method can be used. One example is S. K. Park and F. O. Huck, "Estimation of spectral reflectance curves from multispectral image data," Applied Optics, 16, pp. 3107-3114 (1977).
- the spectral sensitivity data of the camera, the LED spectrum data, the statistical data of the subject (tooth), and the like are stored in advance in the image filing unit 113 shown in FIG. 2. Further, when the spectral sensitivity of the camera changes depending on the position on the sensor, the spectral sensitivity data may be acquired according to the position, or the data for the central position may be appropriately corrected and used.
- when the spectral reflectance constituting the subject image is calculated by the spectrum estimation calculation unit 71, the calculation result is transferred, together with the multi-band image data, to the shade guide selection unit 81 in the shade guide calculation processing unit 80 and to the observation spectrum calculation unit 72 in the chromaticity calculation unit 70.
- an area specifying process for specifying an area of a tooth to be measured is performed (step S8).
- the multi-band image data captured by the imaging device 1 includes information on a tooth to be measured, a tooth next to the tooth, a gum, and the like. Therefore, in the region specifying process, a process of specifying the region of the tooth to be measured from these intraoral image data is performed.
- in FIGS. 12 and 13, the horizontal axis represents wavelength and the vertical axis represents reflectance. Since the teeth are entirely white and the gums red, the spectra of the two differ markedly, as can be seen from FIGS. 12 and 13, in a blue wavelength band (e.g., 400 nm to 450 nm) and a green wavelength band (e.g., 530 nm to 580 nm).
- [Tooth area identification method 1] In this method, attention is paid to the fact that a tooth has a specific reflection spectrum, and the tooth area is identified by extracting from the image data the pixels having the reflection spectrum peculiar to a tooth.
- specifically, a wavelength band characteristic value, defined by the signal values of each of the n types of wavelength bands in a region (a pixel or a group of pixels) of the image represented by the captured multi-band image data, is represented in an n-dimensional space. A plane region representing the features of the measurement target is defined, and when the wavelength band characteristic value represented in the n-dimensional space is projected onto this plane region and falls within it, the region in the image is judged to be included in the region of the tooth to be measured; in this way, the region (contour) to be measured is determined.
- FIG. 14 is an explanatory diagram for explaining a method of specifying a tooth area.
- in FIG. 14, a seven-dimensional space is formed by the seven wavelengths λ1 to λ7. In this space, a classification plane that best separates the tooth to be measured from the other subjects (such as the gums) is set, and classification spectra d1(λ) and d2(λ) for projection onto this plane are obtained.
- a predetermined area is cut out from the captured multi-band image data, and a feature amount represented in a seven-dimensional space is calculated as a wavelength band characteristic value.
- the feature amount is the set of seven signal values obtained by averaging each band within this region.
- the size of the region to be cut out is, for example, 2 pixels ⁇ 2 pixels, but is not limited thereto, and may be 1 pixel ⁇ 1 pixel or 3 pixels ⁇ 3 pixels or more.
- the feature amount is represented by one point in the seven-dimensional space shown in FIG. 14.
- One point in the seven-dimensional space represented by the feature amount is projected on the classification plane to obtain one point on the classification plane.
- the coordinates of one point on the classification plane can be obtained from an inner product operation with the classification spectra d1 ( ⁇ ) and d2 ( ⁇ ). If one point on the classification plane is included in the area T on the classification plane defined by the characteristic spectrum of the tooth, that is, the plane area representing the characteristic of the measurement target, the cut-out area is included in the contour of the tooth. It is determined that it is an area. On the other hand, if one point on the classification plane is included in the area G on the classification plane defined by the characteristic spectrum of the gum, it is determined that the cut-out area is an area included in the contour of the gum.
- the region of the tooth is specified by sequentially performing such a determination while changing the region to be cut out.
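The projection and region test described above can be sketched as follows; the classification spectra, the 2 x 2 block size, and the rectangular shape of region T are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d1, d2 = rng.random(7), rng.random(7)    # classification spectra d1(λ), d2(λ)

def classify_block(image, x, y, t_region):
    """Project a 2 x 2 block's 7-band feature onto the classification plane
    and test whether it falls inside region T (assumed rectangular)."""
    feature = image[:, x:x + 2, y:y + 2].mean(axis=(1, 2))  # 7-dim feature amount
    u, v = feature @ d1, feature @ d2    # inner products give plane coordinates
    (u0, u1), (v0, v1) = t_region
    return bool(u0 <= u <= u1 and v0 <= v <= v1)

img = np.ones((7, 4, 4))                 # toy multi-band image
u_t, v_t = np.ones(7) @ d1, np.ones(7) @ d2
inside = classify_block(img, 0, 0, ((u_t - 1, u_t + 1), (v_t - 1, v_t + 1)))
```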
- since the region of the tooth to be measured is usually located near the center of the image represented by the captured multi-band image data, the cut-out region is changed from near the center of the image toward the periphery.
- the region of the tooth to be measured (that is, the contour of the tooth to be measured) is specified.
- since the feature amount is defined in a seven-dimensional space, which is larger than the three-dimensional space of ordinary RGB, the region (contour) to be measured can be specified more accurately, which is preferable.
- [Tooth area identification method 2] Instead of specifying an area based on the classification spectra, this method extracts, for example, only the specific signal values (spectra) corresponding to a blue wavelength band and a green wavelength band, compares these signal values, and specifies as the tooth region a region having the signal values (spectrum) peculiar to a tooth. With such a method, the number of samples to be compared is reduced, so the area can be specified easily and in a short time.
- in this method, the position of the inflection point at which the spectral feature value changes rapidly is detected while scanning from the center of the image toward the periphery, and that position is determined as the contour of the target tooth.
- specifically, the subject to be detected (the tooth) is compared with the subjects to be separated (subjects other than teeth, for example, the gums), characteristic bands λ1 and λ2 are selected, and their ratio is used as the spectral feature value.
- a ratio between two points is calculated, and an inflection point of the ratio is obtained.
- in this way, the contour with the adjacent tooth is also determined, and the pixels of the tooth to be measured can be obtained.
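The inflection-point search can be sketched on a synthetic scan line; the ratio values for tooth and gum below are invented for illustration.

```python
import numpy as np

tooth_ratio, gum_ratio = 2.0, 0.5  # λ1/λ2 spectral feature values (illustrative)
# Scan line from the image centre toward the periphery: tooth, then gum.
ratio = np.array([tooth_ratio] * 10 + [gum_ratio] * 10)

# The contour is taken where the ratio changes most steeply (inflection point):
contour = int(np.argmax(np.abs(np.diff(ratio))))
```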
- an average of a pixel group including a plurality of pixels may be obtained, and the specification may be performed for each pixel group based on the average.
- the example in which the area is specified for the tooth has been described, but the gum area may be specified.
- as another area specifying method, for example, the tooth to be measured may be displayed on the display device 4 and the user may set the contour on the screen displayed on the display device 4.
- FIG. 15 is a diagram illustrating an example of the measurement areas set in the measurement area setting process. As shown in FIG. 15, the measurement areas are set as rectangular areas at the upper part (Cervical) C1, the center (Body) C2, and the lower part (Incisal) C3 of the tooth surface. For example, each area is set with a fixed ratio of its size and position to the tooth height; that is, the areas and their positions are set at fixed ratios regardless of whether the tooth is small or large.
- the shape of the measurement region is not limited to a rectangle as shown in FIG. 15, and may be, for example, a circle, an ellipse, an asymmetric shape, or the like.
- a shade guide selection process for selecting the closest shade guide for each set measurement area is performed (step S10).
- in the shade guide selection process, the tint of the tooth to be measured is compared with the tints of the shade guides. This comparison is performed for each of the measurement areas set previously: the spectrum of the target measurement area (the spectral reflectance in the first embodiment) is compared with the spectral reflectance spectrum of each shade guide registered in advance in the shade guide information storage unit 114, and the difference between the two is determined.
- Such a shade guide selection process is performed, for example, by obtaining a spectrum determination value Jvalue based on the following equation (2).
- Jvalue is a spectrum determination value
- C is a normalization coefficient
- n is a statistical number (the number of ⁇ used in the calculation)
- ⁇ is a wavelength
- f 1 (λ) is the spectral reflectance spectrum value of the tooth to be determined, f 2 (λ) is the spectral reflectance spectrum value of the shade guide, and E(λ) is the judgment sensitivity correction value.
- E(λ) weights the spectral sensitivity according to λ.
- the spectrum value of each manufacturer's shade guide is substituted into f 2 (λ) of equation (2), and the spectrum determination value Jvalue is calculated for each. The shade guide having the smallest spectrum determination value Jvalue is then determined to have the shade guide number closest to the tooth.
- a plurality (for example, three) of candidates are extracted in ascending order of the spectrum determination value Jvalue. Of course, one candidate can be extracted.
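Since equation (2) is not reproduced in this text, the sketch below assumes a normalized, sensitivity-weighted squared-difference form for Jvalue; the shade guide spectra and names are synthetic.

```python
import numpy as np

wl = np.arange(380, 781)           # wavelengths λ at 1 nm intervals
E = np.ones(len(wl))               # judgment sensitivity correction E(λ)
C, n = 1.0, len(wl)                # normalization coefficient, statistical number

def jvalue(f1, f2):
    """Assumed form: Jvalue = C/n * sum_λ E(λ) * (f1(λ) - f2(λ))**2."""
    return C / n * np.sum(E * (f1 - f2) ** 2)

f_tooth = np.linspace(0.2, 0.8, n)      # tooth spectrum f1(λ) (synthetic)
guides = {"A1": f_tooth + 0.01,         # shade guide spectra f2(λ) (synthetic)
          "A2": f_tooth + 0.10,
          "B1": f_tooth + 0.05}
scores = {name: jvalue(f_tooth, f2) for name, f2 in guides.items()}
candidates = sorted(scores, key=scores.get)   # ascending Jvalue order
```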
- the determination sensitivity correction value E ( ⁇ ) may be given various weights.
- the shade guide selecting unit 81 acquires the chromaticity values (colorimetric information) L2*a2*b2* at each pixel of the selected shade guide and the captured image data of the selected shade guide from the shade guide information storage unit 114.
- the chromaticity values L2 * a2 * b2 *, the captured image data, and the shade guide number acquired from the shade guide information storage unit 114 are output to the image display GUI unit 115 and the comparison operation processing unit 82.
- the observation spectrum calculation unit 72 of the chromaticity calculation unit 70 multiplies the spectrum of the tooth obtained by the spectrum estimation calculation unit 71 by the illumination light S(λ) to be observed, thereby obtaining the spectrum of the subject under the illumination light to be observed.
- this S(λ) is the light source under which the user wants to observe the color of the teeth, such as a D65 or D55 light source or a fluorescent light source.
- the spectrum of the subject under the illumination light to be observed obtained by the observation spectrum calculation unit 72 is transferred to the chromaticity value calculation unit 73.
- the chromaticity value calculator 73 calculates a chromaticity value L1*a1*b1* for each pixel from the spectrum of the subject under the illumination light to be observed, as the first color information. Further, the chromaticity value calculation unit 73 transfers values obtained by averaging the chromaticity values over predetermined areas to the image display GUI unit 115.
- the predetermined area is set at, for example, three positions of an upper part, a center, and a lower part of the tooth.
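The XYZ-to-L*a*b* step of the chromaticity calculation can be sketched as follows; the D65 white point is an assumption, and the integration of the observed spectrum against the colour-matching functions that yields XYZ is taken as given.

```python
import numpy as np

XN, YN, ZN = 95.047, 100.0, 108.883    # assumed D65 white point (2-degree observer)

def xyz_to_lab(X, Y, Z):
    """Standard CIE XYZ -> L*a*b* conversion for a single pixel."""
    def f(t):
        d = 6.0 / 29.0
        return np.cbrt(t) if t > d ** 3 else t / (3 * d ** 2) + 4.0 / 29.0
    fx, fy, fz = f(X / XN), f(Y / YN), f(Z / ZN)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

# The white point itself must map to L* = 100, a* = b* = 0:
L1, a1, b1 = xyz_to_lab(XN, YN, ZN)
```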
- the spectrum G(x, y, λ) of the subject under the illumination light to be observed is also sent to the color image creation processing unit 112, and is converted into RGB image data RGB2(x, y) (captured image data of the living tooth) to be displayed on a monitor.
- the RGB image data of the vital tooth created by the color image creation processing unit 112 is output to the image display GUI unit 115 and transferred to the comparison operation processing unit 82 in the shade guide operation processing unit 80. Further, the RGB image data of the living tooth is output to the image filing unit 113 to be stored.
- the color image creation processing unit 112 may create RGB image data of a desired color tone by further performing correction such as contour enhancement on the RGB image data.
- the image information acquisition unit 821 acquires the subject image from the spectrum estimation computation unit 71 and the first color information from the chromaticity value computation unit 73, and also acquires, from the shade guide information storage unit 114, a color sample image of a color close to the subject for each part of the subject (for example, the upper part C1, the center C2, and the lower part C3) and second color information, which is color information based on the color sample image (step S11).
- FIG. 16 is a diagram illustrating a manner of calculating a difference value between the color sample image and the subject image for each part.
- the image information acquisition unit 821 acquires the second color information at the center of the upper part C1 of the color sample image A1 from the shade guide information storage unit 114.
- similarly, the image information acquisition unit 821 acquires the second color information at the center of the center C2 of the color sample image A1 and the second color information at the center of the lower part C3 of the color sample image A1 from the shade guide information storage unit 114.
- the calculation unit 822 calculates average values Av1 to Av3 of the second color information for each part (upper part C1, middle part C2, lower part C3). Further, the calculation unit 822 calculates a value obtained by subtracting a value corresponding to the second color information from a value corresponding to the first color information for each pixel included in each part (step S12). Specifically, the calculation unit 822 subtracts the average value Av1 of the second color information from the first color information for each pixel included in the upper part C1 of the subject image B1. Similarly, the calculation unit 822 subtracts the average value Av2 of the second color information from the first color information for each pixel included in the center C2 of the subject image B1, and calculates each pixel included in the lower part C3 of the subject image B1. In the pixel, the average value Av3 of the second color information is subtracted from the first color information.
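Step S12 can be sketched as below; representing the three parts C1 to C3 as equal horizontal bands of an L*a*b* image is an illustrative simplification.

```python
import numpy as np

def part_differences(subject_lab, guide_lab):
    """For each part, subtract the part's average second color information
    (Av1..Av3) from the first color information of every subject pixel."""
    h = subject_lab.shape[0]
    bounds = [0, h // 3, 2 * h // 3, h]      # upper C1 / center C2 / lower C3
    diff = np.empty_like(subject_lab)
    for i in range(3):
        s = slice(bounds[i], bounds[i + 1])
        av = guide_lab[s].mean(axis=(0, 1))  # average over the guide's part
        diff[s] = subject_lab[s] - av
    return diff

subject = np.full((6, 2, 3), 50.0)           # subject image B1 (L*a*b*)
guide = np.full((6, 2, 3), 48.0)             # color sample image A1 (L*a*b*)
d_img = part_differences(subject, guide)
```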
- FIG. 17 is a diagram illustrating an example of a spectrum based on the first color information and the second color information. As shown in FIG. 17, the calculation unit 822 subtracts the spectrum D2 drawn based on the second color information from the spectrum D1 drawn based on the first color information.
- FIG. 18 is a diagram illustrating an example of a difference between the first color information and the second color information. As shown in FIG. 18, when the long-wavelength component (red component) of the spectrum D1 based on the first color information is larger than the spectrum D2 based on the second color information, the difference includes many red components.
- based on this difference, the image generation unit 823 generates difference image data in which portions having no difference are normalized to a reference color having zero saturation (step S13). Specifically, the image generation unit 823 calculates XYZ values based on the first color information and the second color information, calculates RGB values based on the calculated XYZ values, and generates the difference image data based on the calculated RGB values.
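A minimal sketch of such a normalization, in which zero difference maps to a mid-grey of zero saturation; the direct mapping of the a*/b* difference onto RGB offsets is an assumption standing in for the XYZ-to-RGB path described above.

```python
import numpy as np

def difference_image(diff_lab, reference=128, gain=10.0):
    """Map an L*a*b* difference onto RGB around a zero-saturation grey:
    a* drives red/green and b* drives blue/yellow; no difference -> grey."""
    r = reference + gain * diff_lab[..., 1]
    g = reference - gain * diff_lab[..., 1]
    b = reference - gain * diff_lab[..., 2]
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)

no_diff = np.zeros((2, 2, 3))        # zero difference everywhere
grey_img = difference_image(no_diff) # uniform reference grey
```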
- the display control unit 824 causes the display device to display the difference image via the image display GUI unit 115 (step S14).
- FIG. 19 is a diagram illustrating an example of a difference image.
- in the difference image F1, a coloring Col1 of the reference color having zero saturation is applied to the central portion of the tooth where there is no difference. Where the difference contains many red components, a red coloring Col2 is applied to the difference image F1; where it contains many green components, a green coloring Col3 is applied; and where it contains many blue components, a blue coloring Col4 is applied.
- in FIG. 19, the colorings Col1 to Col4 of each color are represented using hatching patterns of differing fineness; the finer the hatching, the deeper the color component of each color and the higher the saturation.
- according to the first embodiment, the user can visually recognize, for each part of the subject, how far the color sample image A1 of the selected shade guide deviates from the subject image B1, by means of the color difference from the reference color. As a result, a user such as a dentist can intuitively recognize how much of which color must be added to the color sample image A1.
- in the first embodiment, a shade guide whose spectral reflectance spectrum is similar to that of the vital tooth is selected as a candidate, and the difference image F1 is created only for the selected shade guide.
- however, difference images may be created for all the shade guides stored in the shade guide information storage unit 114. In this case, there is no need to select the shade guide whose spectral reflectance spectrum is closest to that of the vital tooth, and the shade guide selecting unit 81 can therefore be eliminated.
- alternatively, a shade guide specifying unit may be provided on the screen for inputting a shade guide number that the user desires to compare, and a difference image may be created using the shade guide identified by the input shade guide number as the comparison target.
- the dental colorimetric device 3 includes a CPU, a main storage device such as a RAM, and a computer-readable recording medium in which a program for realizing all or part of the above processing is recorded. The CPU reads out the program recorded on the recording medium and executes information processing and arithmetic processing, thereby realizing the processing described above.
- the computer-readable recording medium refers to a magnetic disk, a magneto-optical disk, a CD-ROM, a DVD-ROM, a semiconductor memory, or the like.
- the computer program may be distributed to a computer via a communication line, and the computer that has received the distribution may execute the program.
- the calculation unit 822 may calculate a value obtained by subtracting the first color information from the second color information for each part (upper part C1, center C2, lower part C3).
- FIG. 20 is a diagram illustrating an example of the difference between the first color information and the second color information. As illustrated in FIG. 20, the calculation unit 822 subtracts the spectrum D1 drawn based on the first color information from the spectrum D2 drawn based on the second color information, and the difference shown in FIG. 20 is obtained.
- FIG. 21 is a diagram illustrating an example of a difference image.
- in the difference image F2, a coloring Col11 of the reference color having zero saturation is applied to the central portion of the tooth where there is no difference. Where the difference contains many red components, a red coloring Col12 is applied to the difference image F2; where it contains many green components, a green coloring Col13 is applied; and where the difference is a blue component, a blue coloring Col14 is applied. According to the difference image F2, the user can intuitively recognize which colors are excessive in the color sample image A1 with respect to the subject image B1.
- FIG. 22 is a diagram illustrating a state where a difference between the first color information and the second color information is calculated.
- the shade guide selecting unit 81 sets one measurement area in the center of the tooth area to be measured, and selects the shade guide that is the closest to this measurement area.
- the calculation unit 822 subtracts the average value of the second color information of the measurement area from the first color information for each pixel included in the subject image B1.
- the shade guide selecting unit 81 may select at least one shade guide.
- FIG. 23 is a diagram illustrating how to calculate the difference between the first color information and the second color information.
- the shade guide selecting unit 81 sets one measurement area at the center of the tooth area to be measured, and selects the shade guide that is the closest to this measurement area.
- the calculating unit 822 deforms and maps the color sample image A1 so as to match the shape of the subject image B1 and, for each pixel included in the subject image B1, subtracts the second color information of the corresponding pixel from the first color information. In this case, the second color information need not be an average value.
- FIG. 24 is a diagram illustrating how glare is detected in the subject image.
- the subject image B1 includes glare Gl due to regular reflection of the illumination light.
- FIGS. 25 to 27 are diagrams showing difference images in which glare has been corrected.
- the image generation unit 823 may detect the glare Gl and generate a difference image F31 in which the glare Gl is white.
- the image generation unit 823 may detect the glare Gl and generate a difference image F32 in which the glare Gl is blackened.
- the image generation unit 823 may detect the glare Gl, and generate a difference image F33 in which the glare Gl is colored by surrounding colors.
- the configuration may be such that the calculation unit 822 detects glare Gl and does not calculate the difference value of glare Gl.
- the difference in tint between the subject image B1 and the color sample image A1 can be intuitively recognized in consideration of the influence of the glare Gl.
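The glare variants of FIGS. 25 to 27 can be sketched with a simple lightness threshold; the threshold value and the surrounding-colour fill (a mean over the non-glare pixels) are assumptions.

```python
import numpy as np

def mask_glare(lightness, mode, threshold=95.0):
    """Pixels brighter than the threshold are treated as glare Gl and are
    whitened, blackened, or filled with the mean of the other pixels."""
    out = lightness.copy()
    glare = lightness > threshold
    if mode == "white":
        out[glare] = 100.0
    elif mode == "black":
        out[glare] = 0.0
    elif mode == "surround":
        out[glare] = lightness[~glare].mean()
    return out, glare

L_img = np.array([[50.0, 99.0], [52.0, 51.0]])  # toy lightness image with glare
whitened, glare_mask = mask_glare(L_img, "white")
filled, _ = mask_glare(L_img, "surround")
```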
- FIG. 28 and FIG. 29 are diagrams illustrating the appearance of displaying the reference color around the difference image.
- the display control unit 824 displays a reference color bar H11 that is a bar colored with the reference color on the right side of the difference image F1.
- the display control unit 824 displays a reference color frame H12 that is a frame in which the reference color is applied around the difference image F1.
- the display control unit 824 may display a reference color around the area of the display device 4 where the difference image F1 is displayed. As a result, a region where the color of the subject image B1 matches the color of the color sample image A1 can be more easily recognized, and the influence of metamerism can be reduced.
- FIG. 30 is a diagram illustrating a state in which a patch is superimposed on a difference image.
- the display control unit 824 displays a bar including the reference color and gradually changing color from white to black on the right side of the area of the display device 4 where the difference image F1 is displayed. Further, when a desired color in the bar is dragged onto the difference image F1 with an input device such as a mouse, the patch I of that color is displayed so as to be superimposed on the difference image F1. As a result, a desired color in the difference image F1 can be easily recognized, and the influence of metamerism can be reduced.
- FIG. 31 and FIG. 32 are diagrams showing how the reference color is changed.
- the image generation unit 823 causes the display control unit 824 to display a bar whose color changes stepwise from white to black on the right side of the area of the display device 4 where the difference image F41 is displayed.
- Each color in this bar has zero saturation and different lightness.
- the image generation unit 823 generates a plurality of difference images having different lightness of the reference color, and causes the display device 4 to display the difference image F41 selected according to the input.
- the display control unit 824 may display a reference color frame H2 that is a frame in which the selected reference color is applied around the difference image F41. According to Modification 7, the user can set the reference color in the difference image F41 to a desired color, and can reduce the influence of metamerism.
- FIG. 33 is a diagram illustrating a state in which difference images of a plurality of reference colors are displayed side by side.
- the image generating unit 823 displays the difference images F51 to F53 having different lightness of the reference color side by side. Further, the image generation unit 823 may display a bar whose color changes stepwise from white to black on the right side of the difference images F51 to F53.
- a desired reference color can be set from the difference images F51 to F53 having different lightness, and the influence of metamerism can be reduced.
- FIG. 34 to FIG. 36 are diagrams illustrating a state in which the contrast with respect to the reference color is enhanced.
- the image generation unit 823 generates difference images by modulating the difference so as to increase the contrast of each of the RGB colors with respect to the reference color. More specifically, the image generation unit 823 generates the difference image F61 having normal contrast shown in FIG. 34, the difference image F62 shown in FIG. 35, in which the contrast of each of the RGB colors with respect to the reference color is larger than in the difference image F61, and the difference image F63 shown in FIG. 36, in which that contrast is larger still than in the difference image F62. By switching among and observing the difference images F61 to F63, the user can recognize slight differences in color between the subject image B1 and the color sample image A1.
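The stepwise contrast increase can be sketched as scaling each channel's deviation from the reference colour; gain = 1 corresponds to F61 and larger gains to F62 and F63.

```python
import numpy as np

def enhance(diff_img, reference=128, gain=2.0):
    """Scale each channel's deviation from the reference colour by `gain`;
    gain = 1 leaves the image unchanged, larger gains raise the contrast."""
    stretched = reference + gain * (diff_img.astype(float) - reference)
    return np.clip(stretched, 0, 255).astype(np.uint8)

img = np.array([[[120, 130, 128]]], dtype=np.uint8)
img_hi = enhance(img, gain=2.0)      # deviations from 128 are doubled
```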
- FIG. 37 is a block diagram illustrating a schematic configuration of the image processing apparatus according to the second embodiment.
- the image processing device 3A according to the second embodiment calculates second color information corresponding to the color sample from the subject image, and generates a difference image.
- the subject is, for example, a leaf of a plant.
- the comparison operation processing unit 82A of the image processing device 3A includes a color information calculation unit 825A that calculates the second color information based on the first color information.
- the image processing apparatus 3A does not include the shade guide information storage unit 114 and the shade guide selection unit 81.
- FIG. 38 is a diagram illustrating a manner in which a reference color is calculated from a subject image.
- Here, the subject is a leaf of a plant. The color information calculation unit 825A calculates the second color information, which corresponds to a color sample, from the first color information, that is, the color information of the subject image B2 including the color information of the leaf.
- the color information calculation unit 825A extracts, for example, a subject portion (a leaf portion of a plant in this case) of the subject image B2, and calculates the average value of the first color information of the subject portion as the second color information.
- the color information calculation unit 825A may calculate the second color information using a statistical value such as a mode value of the first color information of the subject portion.
- the color information calculation unit 825A may calculate the second color information using the first color information of a desired area such as the center of the subject.
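The role of the color information calculation unit 825A described above can be sketched as follows. This is an illustrative sketch, not the unit's actual implementation: the function name, the boolean-mask interface for selecting the subject portion, and the per-channel mode are assumptions.

```python
import numpy as np

def second_color_from_subject(image, mask, stat="mean"):
    """Derive a single reference color (second color information) from the
    subject-portion pixels selected by `mask` (first color information)."""
    pixels = image[mask]  # (N, 3) array of subject-portion pixels
    if stat == "mean":
        return pixels.mean(axis=0)
    if stat == "mode":  # per-channel mode, one possible statistical value
        return np.array([np.bincount(pixels[:, c]).argmax() for c in range(3)])
    raise ValueError(stat)

# Hypothetical 2x2 image whose top row is the subject (e.g. leaf) portion.
img = np.array([[[10, 20, 30], [30, 20, 10]],
                [[0, 0, 0], [0, 0, 0]]], dtype=np.uint8)
mask = np.array([[True, True], [False, False]])
mean_color = second_color_from_subject(img, mask)
```

Restricting `mask` to a desired region, such as the center of the subject, would correspond to the last variant described above.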
- FIG. 39 is a diagram illustrating an example of a difference image.
- Because the color information calculation unit 825A calculates the second color information from the first color information, a desired area of the subject (for example, an area having a color different from the average color of the subject) can be extracted even when no color sample image data exists for a subject such as a plant leaf. As a result, the user can intuitively recognize, for example, a discolored portion of a leaf serving as the subject.
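Extracting an area whose color differs from the subject's average color, as described above, might be sketched like this. The Euclidean RGB distance and the threshold value are assumptions made for illustration; the publication does not specify a distance metric.

```python
import numpy as np

def discolored_mask(image, threshold=40.0):
    """Flag pixels whose color deviates from the image's average color
    by more than `threshold` (Euclidean distance in RGB)."""
    avg = image.reshape(-1, 3).mean(axis=0)
    dist = np.linalg.norm(image.astype(np.float64) - avg, axis=-1)
    return dist > threshold

# Hypothetical leaf patch: uniform pixels with one discolored spot.
img = np.full((3, 3, 3), 100, dtype=np.uint8)
img[1, 1] = (200, 50, 100)
mask = discolored_mask(img)  # True only at the discolored pixel
```

Highlighting the pixels where `mask` is true would let a user see at a glance which part of the leaf is discolored.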
Imaging device; 2 cradle; 3 dental colorimeter; 3A image processing device; 4 display device; 10 light source; 11 LED memory; 12 temperature sensor; 20 imaging unit; 21 imaging lens; 22 RGB color imaging device; 23 signal processing unit; 24 A/D converter; 25 focus lever; 26 position detection unit; 30 imaging control unit; 31 CPU; 32 LED driver; 33 data I/F; 34 communication I/F controller; 35 image memory; 36 operation unit I/F; 37 local bus; 40 display unit; 41 LCD controller; 42 LCD; 43 overlay memory; 50 operation unit; 51 shooting mode changeover switch; 52 shutter button; 53 viewer control button; 60 lithium battery; 61 communication I/F contact; 62 charging contact; 63 battery LED; 64 power LED; 65 alarm buzzer; 70 chromaticity calculating section; 71 spectrum estimation calculating section; 72 observation spectrum calculating section; 73 chromaticity value calculating section; 80 shade guide calculation processing section; 81 shade guide selecting section; 82, 82A comparison calculation processing section; 100 color chart; 101 micro switch; 102 power supply switch; 103 power lamp; 104 mounting lamp; 105 power connector; 106 AC adapter; 110 multiband image
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Epidemiology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
- Image Processing (AREA)
Abstract
The invention relates to an image processing device comprising: an image information acquisition unit that acquires a subject image obtained by imaging a photographic subject, a color sample image to be compared with the subject, first color information being the color information of each pixel of the subject image, and second color information being color information based on the color sample image; a calculation unit that calculates, for each pixel, the difference between the first color information and the second color information; an image generation unit that generates data of a difference image obtained by normalizing portions in which the difference is zero to a reference color having zero saturation; and a display control unit that causes the difference image to be displayed on a display device. This configuration provides an image processing device with which the difference in shade between a patient's vital teeth and a selected shade guide can be determined intuitively.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/023339 WO2019244254A1 (fr) | 2018-06-19 | 2018-06-19 | Image processing device, operation method for image processing device, and operation program for image processing device
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/023339 WO2019244254A1 (fr) | 2018-06-19 | 2018-06-19 | Image processing device, operation method for image processing device, and operation program for image processing device
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019244254A1 true WO2019244254A1 (fr) | 2019-12-26 |
Family
ID=68983890
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/023339 WO2019244254A1 (fr) | 2018-06-19 | 2018-06-19 | Dispositif de traitement d'image, procédé de fonctionnement pour dispositif de traitement d'image, et programme de fonctionnement pour dispositif de traitement d'image |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019244254A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004030158A (ja) * | 2002-06-25 | 2004-01-29 | Dent Craft:Kk | Method, device, and program for specifying the material of a dental prosthesis |
JP2005018628A (ja) * | 2003-06-27 | 2005-01-20 | Casio Comput Co Ltd | Electronic device, and monitor display method and program used in the electronic device |
JP2007047045A (ja) * | 2005-08-10 | 2007-02-22 | Olympus Corp | Image processing device, method, and program |
JP2015066192A (ja) * | 2013-09-30 | 2015-04-13 | 株式会社ジーシー | Tooth colorimetric method, tooth colorimetric device, and tooth colorimetric program |
JP2016051277A (ja) * | 2014-08-29 | 2016-04-11 | 株式会社ニコン | Subject search device, subject search method, and subject search program |
- 2018-06-19: WO application PCT/JP2018/023339 filed as WO2019244254A1 (status: active, Application Filing)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR100998805B1 (ko) | Image synthesis apparatus | |
JP5968944B2 (ja) | Endoscope system, processor device, light source device, and methods for operating the endoscope system, the processor device, and the light source device | |
JP6254826B2 (ja) | Camera system, color conversion device and method used therein, and color conversion program | |
JP2007047045A (ja) | Image processing device, method, and program | |
JP6045130B2 (ja) | Method for tooth shade mapping | |
US20070140553A1 (en) | Dental colorimetry apparatus | |
JP4800591B2 (ja) | Imaging system | |
JP5190944B2 (ja) | Endoscope device and method for operating endoscope device | |
JP2012020130A (ja) | Tooth shade mapping | |
US20110188086A1 (en) | Tooth shade analyzer system and methods | |
JP2005000652A (ja) | Dental color imaging system | |
JP2010081057A (ja) | Image processing device, image processing method, and colorimetric system | |
US20050196039A1 (en) | Method for color determination using a digital camera | |
JP3989522B2 (ja) | Dental colorimetric device, system, method, and program | |
JP3989521B2 (ja) | Image synthesis device, method, and program | |
JP2009195495A (ja) | Dental colorimetric device | |
JP7015382B2 (ja) | Endoscope system | |
WO2019244254A1 (fr) | Image processing device, operation method for image processing device, and operation program for image processing device | |
JP4831962B2 (ja) | Imaging device | |
JP2009053160A (ja) | Dental colorimetric device | |
JP2019153931A (ja) | Measuring device, method for setting colorimetric conversion parameters in the measuring device, and industrial product inspected by the measuring device | |
JP2005131066A (ja) | Color correction method and device for quantifying cataract symptoms based on color image data of a cataractous eye acquired with a slit lamp microscope | |
JP6960773B2 (ja) | Captured image processing system | |
KR20210017096A (ko) | Shade guide toolkit for natural tooth color matching | |
Kadhim et al. | Object Color Estimation Using Digital Camera for Noncontact Imaging Applications (Case Study: Teeth Color Estimation) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18923467 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18923467 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |