WO2015156039A1 - Organ imaging apparatus - Google Patents

Organ imaging apparatus

Info

Publication number
WO2015156039A1
Authority
WO
WIPO (PCT)
Prior art keywords
tongue
organ
image
image data
unit
Application number
PCT/JP2015/054706
Other languages
French (fr)
Japanese (ja)
Inventor
松田 伸也
楠田 将之
Original Assignee
コニカミノルタ株式会社 (Konica Minolta, Inc.)
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Publication of WO2015156039A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons

Definitions

  • The present invention relates to an organ imaging apparatus that photographs an organ of a living body and detects diagnostic items of the organ from the captured image.
  • Observing the state of the tongue is a known diagnostic method for assessing a person's health condition or medical condition.
  • In this method, the physical condition and level of health are diagnosed based on the color and shape of the tongue and its moss (coating); the diagnostic items include the color of the tongue, the thickness of the tongue, and cracks in the tongue surface.
  • In a healthy state the tongue is light red, but it turns blue when there is a problem with the respiratory or circulatory system, and red in the case of fever or dehydration. Further, when the metabolism of water is poor, the tongue swells and becomes thick (the so-called swollen state); conversely, when blood flow or water is insufficient, the tongue becomes thin. Furthermore, when immunity declines due to poor nutrition, poor blood flow, stress, or the like, the regenerative power of the tongue cells decreases and the tongue surface cracks.
  • In Patent Document 1, the tongue is photographed with a camera, regions of interest such as the tongue apex and the tongue base are extracted, and the tongue quality (tongue color) and tongue coating (moss color) of each region of interest are determined, making it easy to diagnose an individual's health status.
  • In Patent Document 2, an index for diagnosing the state of the blood or blood vessels is obtained by photographing the tongue with a camera and separately detecting the color and the gloss of the tongue.
  • Specifically, the ratio of the red (R) image data to the green (G) image data in each pixel of the captured image of the tongue is calculated and compared with a threshold value to judge the color of the tongue and the color of the moss. Furthermore, the numbers of vertically and horizontally connected pixels on the tongue base (the portions other than the papillae) are counted, and a portion where the number of connections in one direction (for example, the vertical direction) is at least a predetermined value while the number of connections in the other direction (for example, the horizontal direction) is at most a predetermined value is judged to be a crack in the tongue.
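  • As an illustration, the prior-art decision rules above can be sketched as follows; the threshold value and run-length limits are placeholders, since the patent text does not publish concrete numbers, and the run counted here (the connected run ending at each pixel) is a simple proxy for the connection counts described in the text.

```python
import numpy as np

def classify_tongue_color_rg(image, threshold=1.5):
    """Compare the per-pixel ratio of R to G image data against a
    threshold (the value 1.5 is an illustrative placeholder)."""
    r = image[..., 0].astype(float)
    g = image[..., 1].astype(float) + 1e-6  # avoid division by zero
    return (r / g) > threshold  # True where the pixel reads as red

def find_crack_pixels(base_mask, v_min=10, h_max=3):
    """Flag pixels whose vertical run of connected base pixels is long
    (>= v_min) while the horizontal run is short (<= h_max)."""
    h, w = base_mask.shape
    v_run = np.zeros((h, w), dtype=int)
    h_run = np.zeros((h, w), dtype=int)
    for x in range(w):          # vertical run length ending at each pixel
        run = 0
        for y in range(h):
            run = run + 1 if base_mask[y, x] else 0
            v_run[y, x] = run
    for y in range(h):          # horizontal run length ending at each pixel
        run = 0
        for x in range(w):
            run = run + 1 if base_mask[y, x] else 0
            h_run[y, x] = run
    return (v_run >= v_min) & (h_run <= h_max) & base_mask
```

A tall, narrow column of base pixels is then flagged as a crack, while broad moss-free patches are not.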
  • JP 2011-239926 A (refer to claim 1, paragraph [0028] etc.)
  • JP 2005-137756 A (refer to claim 3, paragraphs [0059] to [0063], [0079] to [0081], FIG. 4 to FIG. 6, FIG. 8, etc.)
  • In these conventional techniques, a captured image (visible image) of the tongue is acquired by irradiating the tongue with visible light.
  • Since a portion of the tongue surface where moss is present appears white or yellow and the color of the underlying tongue cannot be detected there, the color of the tongue is detected by selecting, from the visible image, image data of portions where no moss is present on the surface.
  • However, when the entire surface of the tongue is covered with moss, it is impossible to select image data of a moss-free portion, and the color of the tongue cannot be detected.
  • On the other hand, the thickness of the tongue can be detected by using a method such as the light-section method.
  • The light-section method detects the shape of a subject by irradiating it with linear light (e.g., visible light) and measuring the shape of the reflected line. However, even if this method is used, when the moss distribution on the tongue surface is uneven, the reflected light is disturbed in the portions where moss is present, making it impossible to detect the thickness of the tongue accurately.
  • The present invention has been made to solve the above problems, and its object is to provide an organ imaging apparatus capable of accurately detecting a diagnostic item of an organ regardless of the surface state of the organ of a living body.
  • An organ imaging apparatus of the present invention includes: an illumination unit that illuminates an organ of a living body with illumination light including near-infrared light; an imaging unit that acquires an image of the organ by receiving the illumination light reflected from the surface of the organ; and a calculation unit that classifies the degree of a diagnostic item of the organ based on at least the image data of a near-infrared image, obtained by receiving the near-infrared light contained in the illumination light, among the images acquired by the imaging unit.
  • Since the calculation unit classifies the degree of the diagnostic item of the organ based on the image data of the near-infrared image, the diagnostic item can be detected accurately regardless of the state of the organ surface that would otherwise cause noise.
  • An explanatory diagram showing the positional relationship of the illumination unit with respect to the imaging target.
  • FIG. 9 is a graph showing the distribution of RGB image data in one direction of the visible image in FIG. 8; a graph showing the distribution of image data in one direction of the near-infrared image of FIG. 8; and an explanatory diagram showing the cross-sectional shape of the tongue together with the distribution of image data in one direction of the near-infrared image of the tongue.
  • FIG. 16 is an explanatory diagram illustrating the distribution of the image data in FIG. 15 together with the region set when detecting the thickness of the tongue; a graph showing the frequency distribution of the image data in a partial region of the near-infrared image.
  • In the present specification, a numerical range expressed as "A to B" includes the values of the lower limit A and the upper limit B.
  • FIG. 1 is a perspective view showing an external appearance of an organ image photographing apparatus 1 of the present embodiment
  • FIG. 2 is a block diagram showing a schematic configuration of the organ image photographing apparatus 1.
  • the organ image capturing apparatus 1 captures an organ of a living body and detects information (diagnostic items) necessary for health diagnosis.
  • Examples of diagnostic items include the color of the tongue, the thickness of the tongue, and cracks in the tongue surface.
  • the organ image photographing apparatus 1 includes an illumination unit 2, an imaging unit 3, a display unit 4, an operation unit 5, a communication unit 6, and an audio output unit 7.
  • The illumination unit 2 is provided in a housing 21, while the other components (for example, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7) are provided in a housing 22. The housings 21 and 22 are connected so that they can rotate relative to each other, but rotation is not essential; one may be completely fixed to the other. All of these components may also be provided in a single housing.
  • The organ imaging apparatus 1 may also be configured as a multifunctional portable information terminal.
  • The illumination unit 2 illuminates the living organ to be imaged (here, the tongue) with illumination light including near-infrared light, and includes an illuminator that illuminates the imaging target from above.
  • As the light source of the illumination unit 2, a source that emits daylight-color light, such as a xenon lamp, is used.
  • FIG. 3 shows the light emission characteristics of the xenon lamp used as the light source of the illumination unit 2.
  • Xenon lamps are often used as camera flashes, and their emission characteristic is nearly flat from the visible wavelength range (400 nm to 700 nm) through the near-infrared wavelength range (700 nm to 1000 nm).
  • The illumination unit 2 includes a lighting circuit and a dimming circuit in addition to the light source; turning the light on and off and dimming are controlled by commands from the illumination control unit 11.
  • the imaging unit 3 acquires an image of the organ by receiving illumination light from the illumination unit 2 reflected from the surface of the organ.
  • FIG. 4 is an explanatory diagram showing a positional relationship between the illumination unit 2 and the imaging unit 3 with respect to the imaging target.
  • the imaging unit 3 is disposed so as to face the tongue that is the subject of imaging.
  • the imaging unit 3 includes an imaging lens 31, an infrared reflecting mirror 32, a visible sensor 33, and an infrared sensor 34.
  • Illumination light emitted from the illumination unit 2 and reflected by the tongue passes through the imaging lens 31 and is separated by the infrared reflecting mirror 32 into visible light and infrared light (including near-infrared light).
  • the visible light separated by the infrared reflecting mirror 32 is guided to the visible sensor 33 and forms an image there.
  • The visible sensor 33 has a color filter 33a, shown in FIG. 5, on the light incident side of its light receiving surface, so that each pixel of the sensor receives red (R), green (G), or blue (B) light.
  • the visible sensor 33 obtains a visible image composed of each color of RGB.
  • the infrared light separated by the infrared reflecting mirror 32 is guided to the infrared sensor 34 and forms an image there.
  • the infrared sensor 34 acquires a near-infrared image by receiving near-infrared (IR) light.
  • the aperture (lens brightness), shutter speed, and focal length of the imaging lens 31 are set so that the entire range to be photographed is in focus.
  • As an example: F-number 16, shutter speed 1/120 s, focal length 20 mm.
  • The area sensors (the visible sensor 33 and the infrared sensor 34) are each composed of an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor, with the sensitivity, resolution, and so on set high enough that the color and shape of the imaging target can be detected reliably. As an example: sensitivity 60 dB, resolution 10 megapixels.
  • Imaging by the imaging unit 3 is controlled by the imaging control unit 12.
  • The imaging unit 3 includes a focusing mechanism (not shown), an aperture mechanism, a drive circuit, an A/D conversion circuit, and the like; focusing, aperture control, A/D conversion, and so on are controlled in response to commands from the imaging control unit 12.
  • As captured image data, 8-bit values from 0 to 255 are acquired for each of R, G, B, and IR.
  • the illumination unit 2 described above is arranged so as to illuminate the imaging target at an angle A of, for example, 0 ° to 45 ° with respect to the imaging optical axis X of the imaging unit 3 passing through the imaging target.
  • the imaging optical axis X refers to the optical axis of the imaging lens 31.
  • FIG. 6 is an explanatory diagram showing another configuration of the imaging unit 3.
  • the imaging unit 3 may include an imaging lens 31, a color filter 35, and an imaging element 36.
  • the image sensor 36 is composed of an image sensor such as a CCD or CMOS, and has a Si photodiode having detection sensitivity from visible light to near infrared light.
  • the color filter 35 is disposed on the light incident side of the image sensor 36, and transmits visible light and near infrared light to the image sensor 36.
  • FIG. 7 schematically shows the configuration of the color filter 35.
  • The color filter 33a shown in FIG. 5 is a Bayer-type array: of every 4 pixels, 2 carry a G filter that transmits green light, and the remaining 2 carry a B filter that transmits blue light and an R filter that transmits red light, arranged in a 1:1 ratio.
  • The color filter 35 in FIG. 7 replaces one of the two G filters arranged in each group of 4 pixels with an IR filter that transmits near-infrared light. By using such a color filter 35, the detection of visible light and the detection of near-infrared light can be shared by the single image sensor 36.
  • With this configuration, both visible light and near-infrared light can be detected by one sensor. Moreover, since the infrared reflecting mirror 32 and the infrared sensor 34 of FIG. 4 can be omitted, the apparatus can be made smaller and less expensive.
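  • The modified mosaic can be illustrated with a small sketch; the exact placement of the IR cell within the 2×2 unit is an assumption, since the text only states that one of the two G positions is replaced.

```python
import numpy as np

# Assumed 2x2 cell layout: R/G on the top row, IR/B on the bottom.
CELL = np.array([["R", "G"],
                 ["IR", "B"]])

def split_planes(raw):
    """Split a raw mosaic frame into sparse R, G, B, IR planes.
    Unsampled positions are left as NaN; a real pipeline would
    interpolate (demosaic) them."""
    h, w = raw.shape
    planes = {}
    for name in ("R", "G", "B", "IR"):
        plane = np.full((h, w), np.nan)
        for dy in range(2):
            for dx in range(2):
                if CELL[dy, dx] == name:
                    plane[dy::2, dx::2] = raw[dy::2, dx::2]
        planes[name] = plane
    return planes
```

Each plane thus holds one quarter of the pixels, which is the price of sharing one sensor among four channels.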
  • the display unit 4 includes a liquid crystal panel (not shown), a backlight, a lighting circuit, and a control circuit.
  • The display unit 4 displays images acquired by the imaging unit 3 and information calculated and output by the calculation unit 16, described later. The display of various kinds of information on the display unit 4 is controlled by the display control unit 13.
  • the operation unit 5 is an input unit for instructing imaging by the imaging unit 3, and includes an OK button (imaging execution button) 5a and a CANCEL button 5b.
  • the display unit 4 and the operation unit 5 are configured by a common touch panel display device 41, and the display area of the display unit 4 and the display area of the operation unit 5 in the touch panel display device 41 are separated.
  • the display of the operation unit 5 on the touch panel display device 41 is controlled by the operation control unit 14.
  • the operation unit 5 may be configured by an input unit other than the touch panel display device 41 (the operation unit 5 may be provided at a position outside the display area of the touch panel display device 41).
  • The communication unit 6 is an interface for transmitting the image data acquired by the imaging unit 3 and the information calculated and output by the calculation unit 16, described later, to the outside via a communication line (wired or wireless), and for receiving information from the outside. Transmission and reception of information by the communication unit 6 are controlled by the communication control unit 18.
  • the audio output unit 7 outputs various types of information as audio, and is composed of, for example, a speaker.
  • the information output by voice includes the result calculated by the calculation unit 16.
  • the sound output in the sound output unit 7 is controlled by the sound output control unit 19.
  • The organ imaging apparatus 1 further includes an illumination control unit 11, an imaging control unit 12, a display control unit 13, an operation control unit 14, an image processing unit 15, a calculation unit 16, a storage unit 17, a communication control unit 18, an audio output control unit 19, and an overall control unit 20 that controls these units.
  • The illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the communication control unit 18, and the audio output control unit 19 control the illumination unit 2, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7, respectively.
  • the overall control unit 20 is composed of, for example, a CPU (Central Processing Unit).
  • the illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the image processing unit 15, the calculation unit 16, the communication control unit 18, the audio output control unit 19, and the overall control unit 20 are: It may be configured integrally (for example, with one CPU).
  • the image processing unit 15 has a function of performing various types of image processing, such as extracting an outline of an organ from an image acquired by the imaging unit 3.
  • For example, the contour of the organ can be extracted by using an edge extraction filter to pick out the luminance edges of the captured image (the portions where the brightness changes abruptly).
  • An edge extraction filter weights the pixels in the vicinity of the pixel of interest when taking the first derivative (that is, when computing the difference in image data between adjacent pixels).
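  • As one concrete and common choice of such a neighbor-weighted first-derivative filter, a Sobel kernel can be applied; the kernel and threshold below are illustrative, not taken from the patent.

```python
import numpy as np

def sobel_edges(gray, thresh=50.0):
    """Neighbor-weighted first-difference edge extraction (Sobel kernels):
    gradient magnitude per interior pixel, thresholded to a binary edge map."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = gray.shape
    g = gray.astype(float)
    mag = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = g[y - 1:y + 2, x - 1:x + 2]
            gx = np.sum(kx * patch)  # horizontal luminance difference
            gy = np.sum(ky * patch)  # vertical luminance difference
            mag[y, x] = np.hypot(gx, gy)
    return mag > thresh
```

Tracing the outermost True pixels of the resulting edge map would then give a contour line of the organ.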
  • the storage unit 17 stores image data acquired by the imaging unit 3, data acquired by the image processing unit 15, data calculated by the calculation unit 16, information received from the outside, and the like.
  • The calculation unit 16 classifies the degree of a diagnostic item of the organ based on at least the image data of the near-infrared image, obtained by receiving the near-infrared light contained in the illumination light, among the images acquired by the imaging unit 3.
  • The above classification may be performed by quantifying the degree of the diagnostic item, or by an output other than a numerical value.
  • a specific example for classifying the degree of diagnosis items will be described.
  • FIG. 8 shows a visible image and a near-infrared image of the tongue acquired by the imaging unit 3 when the imaging unit 3 images the tongue under illumination by the illumination unit 2.
  • the moss region on the tongue surface is indicated by hatching for convenience.
  • In the visible image, it can be confirmed that moss is present on almost the entire surface of the tongue and that its distribution is uneven.
  • In the near-infrared image, the difference between the color of the tongue and the color of the moss disappears, and the moss cannot be seen.
  • FIG. 9 shows the spectral distribution of tongue color and moss color.
  • In the B wavelength range (400 to 500 nm) and the G wavelength range (500 to 600 nm), there is a large difference in reflectance between the tongue color and the moss color, but in the wavelength range above 650 nm there is almost no difference between the two. Accordingly, when the tongue is illuminated with near-infrared light of wavelengths above 650 nm and photographed to obtain a near-infrared image, the tongue and the moss give the same output (the same color) in that image, and the difference between the tongue color and the moss color disappears.
  • Therefore, by using the near-infrared image, the color of the tongue can be detected accurately without being affected by the moss.
  • Since the unevenness of the moss does not appear in the near-infrared image, the thickness and cracks of the tongue, described later, can also be detected accurately using light of any wavelength from 650 to 1000 nm.
  • The color of the tongue reflects the color of the blood.
  • the color of blood changes mainly depending on the degree of oxidation of hemoglobin.
  • FIG. 10 shows the absorption characteristics of oxygenated hemoglobin (HbO2) and reduced hemoglobin (Hb). The figure shows that in the wavelength region of 600 to 800 nm, reduced hemoglobin has a larger absorption coefficient than oxygenated hemoglobin, that is, it absorbs light in this region more strongly. For this reason, blood containing a large amount of reduced hemoglobin absorbs more red light and looks relatively blue. Therefore, the color of the tongue, which reflects the color of the blood, can be detected by determining whether the proportion of reduced hemoglobin is high or low.
  • FIG. 10 also shows that the difference between the absorption coefficient of reduced hemoglobin and that of oxygenated hemoglobin is largest at a wavelength of 660 nm, and that the two coefficients become equal at 805 nm.
  • Conventionally, the color of the tongue has been detected in a moss-free region at the left or right edge of the tongue or at its lower part (tongue apex); in this embodiment, however, detection is not affected by moss, so the color may be detected in any region of the tongue.
  • Here, the central region C of the near-infrared image of the tongue shown in FIG. 8 (the region of the tongue between the tongue apex and the tongue base) is set as the diagnosis target region, and the color of the tongue is detected from the image data of this region C as follows.
  • The calculation unit 16 obtains the average value Bm of the B image data and the average value IRm of the IR image data in the region C of the images acquired by the imaging unit 3, and uses the ratio IRm/Bm as an index for detecting the color of the tongue.
  • When the ratio IRm/Bm is large, the blue component is relatively small, so the color of the tongue is red (the degree of oxidation of the blood is high).
  • Conversely, when the ratio IRm/Bm is small, the blue component is relatively large, so the color of the tongue is blue (the degree of oxidation of the blood is low).
  • The calculation unit 16 classifies the tongue color into a plurality of ranks according to the value of the ratio IRm/Bm. For example, with thresholds L1 < L2 < L3: a value of IRm/Bm less than L1 is rank "1"; a value of at least L1 and less than L2 is rank "2"; and a value of at least L2 and less than L3 is rank "3". The calculation unit 16 thus classifies the tongue color into one of "1" to "3" according to the value of IRm/Bm.
  • In addition to the numerical values "1" to "3", this classification may express the degree of tongue color in a non-numerical form such as "dark red", "normal red", or "light red". Thereby, even if the entire tongue surface is covered with moss or the moss distribution is uneven, the color of the tongue (including its degree of redness) can be detected accurately.
  • Since the degree of oxidation of hemoglobin appears in the 600 to 800 nm band, it is desirable to use illumination light including wavelengths of 600 to 800 nm when detecting the color of the tongue.
  • Note that the index used for detecting the color of the tongue is not limited to the ratio IRm/Bm; for example, the ratio IRm/(IRm + Bm) may be used instead.
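  • A minimal sketch of this rank classification follows, assuming placeholder thresholds L1 < L2 (the patent leaves the actual boundary values unspecified, and the mapping of ranks to color labels in the comments is likewise an assumption):

```python
import numpy as np

L1, L2 = 1.5, 2.5  # placeholder rank boundaries, not values from the patent

def classify_tongue_color(ir_plane, b_plane, region):
    """Rank the tongue color from the mean IR (IRm) and mean B (Bm)
    image data inside the diagnosis region C (a boolean mask).
    A larger IRm/Bm means a redder tongue (higher blood oxidation)."""
    ratio = float(np.mean(ir_plane[region])) / float(np.mean(b_plane[region]))
    if ratio < L1:
        return 1  # bluish / light red
    elif ratio < L2:
        return 2  # normal red
    return 3      # dark red
```

Because IRm and Bm are averages over the whole region C, individual moss-covered pixels no longer dominate the result.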
  • FIGS. 11 and 12 show the visible image of the tongue, the cross-sectional shape of the tongue along the line AA ′, and the distribution of RGB image data of each pixel aligned in the AA ′ direction in the visible image.
  • FIG. 11 shows a case where the tongue is thin and the moss is thin
  • FIG. 12 shows a case where the tongue is thick and the moss is thick.
  • Here, the A-A′ direction is a direction that passes through the tongue between the tongue apex and the tongue base in the visible image and is perpendicular to the line connecting the apex and the base; it corresponds to the horizontal direction.
  • When the tongue is thin, its surface forms a valley with the central portion recessed downward. Therefore, when the illumination light reflected from the tongue surface is received by the imaging unit and the distribution of image data along the A-A′ direction is plotted, the distribution has a downward-recessed (concave) shape for all of R, G, and B (see FIG. 11).
  • Conversely, when the tongue is thick, its surface bulges at the center into a mountain shape (convex upward). The corresponding distribution of image data along the A-A′ direction is convex, with a high central portion, for all of R, G, and B (see FIG. 12).
  • When moss is present on the tongue surface, however, the RGB image data distribution contains noise caused by the light reflected from the moss.
  • FIGS. 13 and 14 show the distribution of image data of each pixel arranged in the A-A ′ direction of the visible image and the near-infrared image shown in FIG. 8, respectively.
  • The A-A′ direction of the near-infrared image is the direction that passes through the tongue between the tongue apex and the tongue base of the near-infrared image and is perpendicular to the direction connecting the apex and the base; it corresponds to the horizontal direction.
  • the calculation unit 16 detects the thickness of the tongue as follows using the distribution of the image data of the near-infrared image.
  • FIG. 15 shows a sectional shape of the tongue and a distribution of image data of each pixel arranged in the A-A ′ direction of the near-infrared image of the tongue.
  • FIG. 16 shows the distribution of the image data in FIG. 15 together with the region S set when the thickness is detected.
  • This region S is a region closer to the end than the central portion in the AA ′ direction.
  • For example, where W is the width of the tongue determined from its contour line, the region S can be taken, following the dimensional relationship shown in the figure, as a region of width W/4 located at a distance of W/8 from the end of the tongue. The contour line of the tongue is obtained by the image processing unit 15 described above.
  • As the tongue moves, its cross-sectional shape changes in various ways. When the tongue is thin, however, the distribution in the region S remains horizontal, or changes only between horizontal and concave, no matter how the tongue moves; the quadratic coefficient of a polynomial approximating the distribution in the region S is therefore zero or positive. When the tongue is thick, the distribution in the region S is convex, and the quadratic coefficient of the approximating polynomial is negative; the thicker the tongue, the more negative this coefficient becomes. Therefore, the thickness of the tongue can be detected from the quadratic coefficient of the polynomial approximating the distribution of the near-infrared image data in the region S.
  • The calculation unit 16 classifies the thickness of the tongue into one of "1" to "3" according to the quadratic coefficient of the polynomial approximating the image data distribution in the region S.
  • In addition to the numerical values, this classification may express the degree of tongue thickness in a non-numerical form such as "thin", "normal", or "thick". Thereby, the thickness of the tongue (including its degree) can be detected accurately regardless of the presence or unevenness of moss on the tongue surface.
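  • The thickness ranking from the quadratic coefficient can be sketched as below; the placement of the region S follows the W/8 and W/4 example in the text, while the rank boundaries a1 and a2 are placeholders not given in the patent.

```python
import numpy as np

def thickness_rank(profile, a1=-0.002, a2=-0.01):
    """Fit a quadratic to the near-infrared profile in the region S
    (here: the span from W/8 to W/8 + W/4 from the left end, with W
    the profile length) and rank thickness by the quadratic coefficient:
    ~zero or positive means thin, strongly negative means thick."""
    w = len(profile)
    s = np.asarray(profile[w // 8 : w // 8 + w // 4], dtype=float)
    x = np.arange(len(s))
    coeff = np.polyfit(x, s, 2)[0]  # quadratic coefficient of the fit
    if coeff >= a1:
        return 1   # flat or concave in S: thin tongue
    elif coeff >= a2:
        return 2   # mildly convex: normal
    return 3       # strongly negative coefficient: thick, bulging tongue
```

Fitting only within S, away from the tongue edge and the center, keeps the sign of the coefficient a stable indicator of the overall convexity.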
  • (Detection of cracks in the tongue surface) FIGS. 17 and 18 show the frequency distributions of the image data in the region C of the visible image and of the near-infrared image shown in FIG. 8, respectively. Since many cracks occur near the center of the tongue surface, in the present embodiment a region near the vertical center of the tongue, whose vertical length is 1/4 of the vertical length of the tongue, is set as the crack detection region (diagnosis region) C. Note that the frequency distribution for the visible image in FIG. 17 is that of the B image data among R, G, and B.
  • When there is a crack, more of the tongue base appears than when there is none, so the range of values taken by the image data of the pixels making up the base widens for all of R, G, and B. For this reason, when the frequency distribution of the image data of the captured image is created, the width of the distribution becomes broader.
  • the crack on the tongue surface is caused by a decrease in immunity due to lack of nutrition, poor blood flow, stress, or the like.
  • Moss is keratinized papillary tissue of the lingual mucosa to which exfoliated cells, mucus, food debris, and bacteria adhere. Since the tongue base shows through wherever moss is absent, the range of values taken by the image data of those pixels is also widened for all of R, G, and B. For this reason, the frequency distribution of the visible image in FIG. 17 includes both the data of moss-free portions and the data of crack portions, and its standard deviation (or variance) is therefore large; the standard deviation of the distribution in FIG. 17 is 26.78 (the variance being the square of the standard deviation).
  • In the near-infrared image, by contrast, the moss is not visible, so the standard deviation (or variance) of the frequency distribution is smaller than that of the visible image, as in the frequency distribution of FIG. 18; the standard deviation of the distribution in FIG. 18 is 13.18. Therefore, by creating the frequency distribution of the near-infrared image data and examining its standard deviation (or variance), cracks in the tongue can be detected accurately regardless of the surface condition of the tongue (the uneven distribution of moss).
  • For example, with thresholds N1 < N2 < N3 set for the standard deviation: a value less than N1 is rank "1"; a value of at least N1 and less than N2 is rank "2"; and a value of at least N2 and less than N3 is rank "3".
  • The calculation unit 16 classifies the cracking of the tongue into one of "1" to "3" according to the value of the standard deviation σ (or variance σ²) of the frequency distribution. In addition to the numerical values "1" to "3", this classification may express the degree of cracking of the tongue in a non-numerical form such as "small", "medium", or "large". This makes it possible to detect cracks in the tongue (including their degree) accurately regardless of the presence or unevenness of moss on the tongue surface.
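  • A minimal sketch of the crack ranking follows, assuming placeholder boundaries N1 < N2 for the standard deviation (the patent does not publish the actual values):

```python
import numpy as np

N1, N2 = 10.0, 20.0  # placeholder rank boundaries, not values from the patent

def crack_rank(ir_plane, region):
    """Rank tongue-surface cracking from the spread (standard deviation)
    of the near-infrared image data in the diagnosis region C (a boolean
    mask): cracks widen the range of pixel values, so a larger sigma
    means more cracking."""
    sigma = float(np.std(ir_plane[region]))
    if sigma < N1:
        return 1   # little or no cracking
    elif sigma < N2:
        return 2
    return 3       # pronounced cracking
```

Because the moss does not appear in the near-infrared data, the baseline sigma stays low and any widening can be attributed to cracks.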
  • FIG. 19 is a flowchart showing an operation flow in the organ image photographing apparatus 1 of the present embodiment.
  • First, the illumination control unit 11 turns on the illumination unit 2 (S1) and sets imaging conditions such as illuminance (S2).
  • Next, the imaging control unit 12 controls the imaging unit 3 to photograph the tongue, which is the imaging target (S3). A visible image and a near-infrared image of the tongue are thereby obtained.
  • Subsequently, the image processing unit 15 extracts the contour line of the tongue from the captured image of the tongue (S4). The calculation unit 16 then detects the upper, lower, left, and right ends of the tongue from the extracted contour and sets the regions used to detect the diagnostic items (tongue color, thickness, cracks) (S5). Specifically, the calculation unit 16 sets the region C in the tongue (see FIG. 8) when detecting the color of the tongue, sets the row of pixels in the horizontal (A-A′) direction when detecting the thickness of the tongue, and sets the region C in the tongue when detecting cracks in the tongue surface.
  • Next, based on the captured images of the tongue (the visible image and the near-infrared image), the calculation unit 16 extracts the feature quantities used to detect the color, thickness, and cracks of the tongue (the ratio IRm/Bm, the quadratic coefficient of the approximating polynomial in the region S, and the standard deviation σ of the frequency distribution) (S6), and classifies (for example, quantifies) them by the methods described above (S7).
  • The classified values are output from the calculation unit 16, displayed on the display unit 4, and stored in the storage unit 17. As necessary, they are also output as audio from the audio output unit 7, output to an output device (not shown), or transferred to the outside via the communication unit 6 (S8).
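  • The flow from S1 through S8 can be sketched end to end as below; the device stub, its method names, its synthetic images, and the rank thresholds are purely illustrative stand-ins for the real hardware and the unpublished values in the patent.

```python
import numpy as np

class StubDevice:
    """Stand-in for the illumination/imaging hardware; every method
    here is an assumption for illustration, not an API from the patent."""
    def illumination_on(self):                    # S1
        self.lit = True
    def set_conditions(self):                     # S2
        self.conditions = {"illuminance": "auto"}
    def capture(self):                            # S3: synthetic B and IR frames
        rng = np.random.default_rng(0)
        nir = rng.normal(120, 5, (64, 64))
        vis_b = rng.normal(60, 5, (64, 64))
        return vis_b, nir

def run_diagnosis(device):
    device.illumination_on()                      # S1
    device.set_conditions()                       # S2
    vis_b, nir = device.capture()                 # S3
    region = np.ones(nir.shape, dtype=bool)       # S4/S5: whole frame as region C
    ratio = float(nir[region].mean() / vis_b[region].mean())      # S6: color index
    sigma = float(np.std(nir[region]))                            # S6: crack index
    color_rank = 1 if ratio < 1.5 else (2 if ratio < 2.5 else 3)  # S7
    crack_rank = 1 if sigma < 10 else (2 if sigma < 20 else 3)    # S7
    return {"color_rank": color_rank, "crack_rank": crack_rank}   # S8
```

A real implementation would replace the stub capture with sensor readout and the whole-frame region with the contour-derived regions of S4/S5.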
  • As described above, the calculation unit 16 classifies the diagnostic items of the tongue (color, thickness, cracks) based on at least the image data of the near-infrared image among the images obtained by photographing the tongue with the imaging unit 3. In the near-infrared image, noise that was visible in the visible image (e.g., moss) can no longer be seen, so the diagnostic items of the tongue can be detected accurately regardless of the state of the tongue surface (whether moss is present or unevenly distributed). The accuracy of diagnosis based on the detected diagnostic items is therefore improved.
  • the calculation unit 16 classifies the degree of tongue thickness as a diagnostic item based on the distribution of image data in one direction in the near-infrared image.
  • in the near-infrared image, the influence of moss on the tongue surface, which becomes noise, is eliminated, so the distribution of the image data in one direction is smoother than in the visible image. Accordingly, the cross-sectional shape of the tongue surface can be grasped accurately and easily from the distribution of the image data in one direction of the near-infrared image, and the thickness of the tongue can be detected accurately.
  • the horizontal surface shape of the tongue is obtained by using the distribution of the image data of the pixels that pass through the tongue and are aligned in the direction perpendicular to the line connecting the tongue tip and the tongue base.
  • the calculation unit 16 obtains a second-order polynomial that approximates the distribution over a predetermined region (region S) of the image-data distribution, and classifies the degree of tongue thickness based on the second-order coefficient. The thickness of the tongue can therefore easily be classified into a plurality of ranks based on the magnitude and sign (positive/negative) of the second-order coefficient of the approximating polynomial.
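The quadratic-fit ranking described above can be sketched as follows. The rank thresholds and the function name are illustrative assumptions; the patent only states that ranks are assigned from the magnitude and sign of the second-order coefficient.

```python
import numpy as np

def classify_thickness(profile, thick_below=-0.002, thin_above=0.002):
    """Rank tongue thickness from a horizontal near-IR intensity profile.

    Fits y = a*x^2 + b*x + c over the profile (region S) by least squares.
    A thick, bulging tongue is brightest at the center, giving a convex
    profile (a < 0); a thin tongue gives a flat or concave profile.
    The two thresholds are illustrative, not values from the patent.
    """
    x = np.arange(len(profile))
    a, _, _ = np.polyfit(x, profile, 2)  # a is the second-order coefficient
    if a < thick_below:
        return a, "thick"
    if a > thin_above:
        return a, "thin"
    return a, "normal"
```

More than three ranks could be produced the same way by comparing `a` against additional cut-off values.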
  • the calculation unit 16 classifies the degree of cracking of the tongue as a diagnostic item based on the frequency distribution of the image data of a partial region of the tongue in the near-infrared image.
  • in the near-infrared image, the influence of moss on the tongue surface, which becomes noise, is eliminated, so the spread of the frequency distribution of the near-infrared image is smaller than that of the frequency distribution of the visible image. Therefore, the presence or absence of cracks on the tongue surface can be detected accurately based on the frequency distribution of the near-infrared image.
  • the calculation unit 16 determines the degree of cracking of the tongue based on the standard deviation or variance of the frequency distribution. This makes it possible to accurately and reliably detect cracks on the tongue surface.
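The spread-based crack ranking above amounts to computing the standard deviation of the pixel values in region C; the two rank thresholds below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def crack_degree(region_c, mild=8.0, severe=16.0):
    """Rank tongue-surface cracking from a near-IR region C.

    Cracks appear as dark lines in the otherwise uniform near-IR image,
    widening the gray-level frequency distribution; the standard deviation
    measures that spread. The thresholds `mild` and `severe` are
    illustrative and would need calibration in practice.
    """
    sigma = float(np.std(np.asarray(region_c, dtype=np.float64)))
    if sigma < mild:
        return sigma, 0      # smooth surface, no cracks
    if sigma < severe:
        return sigma, 1      # mild cracking
    return sigma, 2          # pronounced cracking
```

Using the variance instead of the standard deviation, as the text also permits, only changes the scale of the thresholds.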
  • the calculation unit 16 classifies the degree of tongue color as a diagnostic item based on the near-infrared image data and the B image data in the visible image.
  • B image data is less likely than the G and R image data to fluctuate due to the presence of moss, so the near-infrared image data and the B image data are used for tongue color detection.
  • the color of the tongue can be accurately detected regardless of the presence or absence of moss.
  • the calculation unit 16 classifies the degree of tongue color based on the ratio (IRm/Bm) of the average value IRm of the image data in a partial region of the near-infrared image to the average value Bm of the B image data in the same region of the visible image. In this case, whether the tongue is reddish or bluish can be determined by a relative comparison of IRm and Bm, so the color of the tongue can be detected easily.
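The ratio computation above reduces to a few lines. The cut-off values that would map the ratio onto color ranks are not reproduced here, since the excerpt does not give them and they would need calibration.

```python
import numpy as np

def tongue_color_index(ir_region_c, b_region_c):
    """Compute IRm/Bm for region C of the near-IR and visible images.

    IRm is the mean near-IR value and Bm the mean blue (B) value of the
    same region; a larger ratio suggests a redder tongue and a smaller
    one a bluish tongue (relative comparison, as described in the text).
    """
    ir_m = float(np.mean(ir_region_c))
    b_m = float(np.mean(b_region_c))
    return ir_m / b_m
```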
  • in the present embodiment, the subject to be photographed is a human tongue, but the subject need not be human and may be an animal other than a human. In that case too, the diagnostic items can be detected accurately by applying the method of the present embodiment, making it possible to determine quickly and accurately the poor physical condition of an animal that cannot communicate its state.
  • in the present embodiment, the case where the organ of the living body is the tongue has been described as an example, but the organ to be diagnosed may also be the eyelid or the like.
  • the organ image capturing apparatus described above can be expressed as follows, and has the following effects.
  • the organ imaging apparatus described above includes an illumination unit that illuminates an organ of a living body with illumination light including near-infrared light; an imaging unit that acquires an image of the organ by receiving the illumination light reflected from the surface of the organ; and an arithmetic unit that classifies the degree of a diagnostic item of the organ based on at least the image data of a near-infrared image, obtained by receiving the near-infrared light included in the illumination light, among the images acquired by the imaging unit.
  • the image acquired by the imaging unit includes a near-infrared image of the organ.
  • in the near-infrared image, noise on the organ surface (for example, moss, if the organ is a tongue) is no longer visible. Therefore, by classifying the degree of the organ's diagnostic item based on the image data of the near-infrared image, the operation unit can detect the diagnostic item accurately regardless of the state of the organ surface that would otherwise cause noise.
  • the organ may be a tongue.
  • by classifying the degree of the tongue's diagnostic items (for example, the color and thickness of the tongue, and cracks on its surface) based on the image data of the near-infrared image, the arithmetic unit can detect the diagnostic items accurately without being affected by moss on the tongue surface. That is, even if the entire surface of the tongue is covered with moss, or the moss distribution is uneven, the diagnostic items of the tongue can be detected accurately.
  • the calculation unit may classify the degree of the thickness of the tongue as the diagnostic item based on the distribution of image data in one direction in the near-infrared image.
  • in the near-infrared image, the influence of moss on the tongue surface, which would otherwise be noise, is eliminated, so the distribution of the image data in one direction in the near-infrared image is smoother than the corresponding distribution in the visible image. Therefore, the cross-sectional shape of the tongue surface can be grasped accurately and easily based on the former distribution, and the thickness of the tongue can be detected accurately.
  • the calculation unit may obtain a second-order polynomial that approximates the distribution over a predetermined region of the image-data distribution, and classify the degree of the thickness of the tongue based on the second-order coefficient of the polynomial.
  • the thickness of the tongue can be detected based on the magnitude and sign (positive / negative) of the second-order coefficient of the polynomial, and the level of the thickness of the tongue can be easily classified.
  • the distribution of the image data may be the distribution of the image data of the pixels that pass through the tongue between the tongue tip and the tongue base in the near-infrared image and are aligned in the direction perpendicular to the line connecting the tongue tip and the tongue base.
  • in this case, the horizontal surface shape of the tongue is grasped by classifying the degree of tongue thickness based on the distribution of horizontal image data passing through the center of the tongue (in the tongue) in the near-infrared image, so the thickness of the tongue can be detected accurately.
  • the calculation unit may classify the degree of cracking of the tongue as the diagnostic item based on a frequency distribution of image data of a partial region of the tongue in the near-infrared image.
  • in the near-infrared image, the influence of moss on the tongue surface, which causes noise, is eliminated, so the spread of the frequency distribution of the image data of a partial region of the tongue in the near-infrared image is smaller than the spread of the frequency distribution of the image data of the same region in the visible image. Therefore, the presence or absence of cracks on the tongue surface can be detected accurately based on the former frequency distribution.
  • the calculation unit may classify the degree of tongue cracking based on the standard deviation or variance of the frequency distribution. Since the standard deviation or variance indicates the extent of the frequency distribution, it is possible to accurately and reliably detect cracks on the tongue surface.
  • the partial region may be a region in the tongue between the tongue tip and the base of the tongue in the near-infrared image. In this case, cracks in the tongue, which tend to occur near the center of the tongue on the tongue surface, can be reliably detected based on the frequency distribution.
  • the illumination light may include visible light, and the calculation unit may classify the degree of tongue color as the diagnostic item based on the image data of the near-infrared image and the blue image data of a visible image obtained by receiving the visible light included in the illumination light reflected from the surface of the organ.
  • the blue (B) image data of the visible image is less likely than the green (G) and red (R) image data to fluctuate due to the presence of moss (it is less susceptible to moss). Therefore, by classifying the degree of tongue color based on the image data of the near-infrared image and the B image data of the visible image, the color of the tongue can be detected accurately regardless of the presence of moss.
  • the calculation unit may classify the degree of tongue color based on the ratio of the average value of the image data in a target region of the near-infrared image to the average value of the blue image data in the target region of the visible image.
  • redness or blueness of the tongue is detected by relative comparison between the average value (IRm) of the image data in the target region of the near-infrared image and the average value (Bm) of the B image data in the target region of the visible image, so the color of the tongue can be detected easily.
  • the present invention can be used in an apparatus for photographing a living organ and detecting a diagnosis item of the organ from the photographed image.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

An organ imaging apparatus (1) provided with an illumination unit (2), an imaging unit (3), and a calculating unit (16). The illumination unit (2) illuminates an organ of a living body with illumination light including near-infrared light. The imaging unit (3) acquires images of the organ by receiving the illumination light reflected at the surface of the organ. The calculating unit (16) classifies the degree of the organ's diagnostic items on the basis of at least the image data of the near-infrared image, obtained by receiving the near-infrared light contained in the illumination light, from among the images acquired by the imaging unit (3).

Description

Organ imaging apparatus
 The present invention relates to an organ image photographing apparatus that photographs an organ of a living body and detects diagnostic items of the organ from the photographed image.
 In oriental medicine, a diagnostic method (tongue diagnosis) is known in which a health condition or medical condition is diagnosed by observing the state of the tongue. In tongue diagnosis, physical condition and degree of health are diagnosed based on the color and shape of the tongue and the moss (tongue coating), and the diagnostic items include the color of the tongue, the thickness of the tongue, and cracks on the tongue surface.
 In a healthy state the tongue is pale red, but if there is a problem with the respiratory or circulatory system the tongue turns bluish, and with fever or dehydration it turns red. When water metabolism is poor, the tongue becomes swollen and thick; conversely, with insufficient blood flow or insufficient water, the tongue becomes thin. Furthermore, when immunity declines due to malnutrition, poor blood flow, stress, and the like, the regenerative capacity of the tongue cells decreases and cracks form on the tongue surface.
 At present, these diagnoses are carried out by specialist physicians, but because they rely on experience and intuition, results vary between individuals and lack objectivity.
 Systems have therefore been proposed that photograph the subject with a digital camera and quantify, record, and diagnose features from the photographed image. For example, in Patent Document 1, the tongue is photographed with a camera, regions of interest such as the tongue tip, tongue center, tongue margins, and tongue base are extracted, and the tongue body color and coating color of each region of interest are determined so that an individual's health condition can be diagnosed easily. In Patent Document 2, the tongue is photographed with a camera and its color and gloss are separated and detected to obtain an index for diagnosing the state of the blood and blood vessels.
 In Patent Document 3, the ratio of the red (R) image data to the green (G) image data at each pixel of the photographed tongue image is calculated and compared with a threshold to judge the color of the tongue and the color of the moss. Furthermore, the number of vertically and horizontally connected pixels of the tongue base (portions other than the papillae) is counted, and a portion where the number of connections in one direction (for example, the vertical direction) is equal to or greater than a predetermined value while the number in the other direction (for example, the horizontal direction) is equal to or less than a predetermined value is judged to be a crack in the tongue.
Japanese Patent No. 3854966 (see claim 1, paragraphs [0004], [0009], [0022] to [0024], etc.)
JP 2011-239926 A (see claim 1, paragraph [0028], etc.)
JP 2005-137756 A (see claim 3, paragraphs [0059] to [0063], [0079] to [0081], FIGS. 4 to 6 and FIG. 8, etc.)
 However, with the systems of Patent Documents 1 to 3, depending on the state of the tongue surface, the diagnostic items of the tongue cannot always be detected accurately.
 That is, the systems of Patent Documents 1 to 3 acquire a photographed image of the tongue (a visible image) by irradiating the tongue with visible light. In the visible image, portions of the tongue surface where moss is present appear white or yellow, and the color of the underlying tongue cannot be detected there; the color of the tongue must therefore be detected by selecting, from the visible image, image data of portions where no moss is present on the surface. Consequently, when the entire surface of the tongue is covered with moss, no such image data can be selected, and the tongue color cannot be detected at all, let alone accurately.
 Also, in the visible image, if the distribution of moss on the tongue surface is uneven, moss-free portions cannot be distinguished from cracks, so cracks in the tongue cannot be detected accurately.
 Furthermore, since the cross-sectional shape of the tongue differs between a thick tongue and a thin tongue, it should also be possible to detect the thickness of the tongue using, for example, the light-section method, in which the shape of a subject is detected by irradiating it with linear light (for example, visible light) and detecting the shape of the reflected light. Even with this method, however, if the moss on the tongue surface is unevenly distributed, the reflected light is disturbed where moss is present, and the cross-sectional shape of the tongue, that is, its thickness, cannot be detected accurately.
 The present invention has been made to solve the above problems, and an object thereof is to provide an organ imaging apparatus capable of accurately detecting the diagnostic items of an organ of a living body regardless of the surface state of the organ.
 An organ imaging apparatus according to one aspect of the present invention includes: an illumination unit that illuminates an organ of a living body with illumination light including near-infrared light; an imaging unit that acquires an image of the organ by receiving the illumination light reflected from the surface of the organ; and a calculation unit that classifies the degree of a diagnostic item of the organ based on at least the image data of a near-infrared image, obtained by receiving the near-infrared light included in the illumination light, among the images acquired by the imaging unit.
 With the above configuration, the calculation unit classifies the degree of the organ's diagnostic item based on the image data of the near-infrared image, making it possible to detect the diagnostic item accurately regardless of the state of the organ surface that would otherwise cause noise.
FIG. 1 is a perspective view showing the external appearance of an organ imaging apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the schematic configuration of the organ imaging apparatus.
FIG. 3 is a graph showing the emission characteristics of a xenon lamp used in the illumination unit of the organ imaging apparatus.
FIG. 4 is an explanatory diagram showing the positional relationship of the illumination unit and the imaging unit with respect to the subject.
FIG. 5 is an explanatory diagram schematically showing the configuration of a color filter of the imaging unit.
FIG. 6 is an explanatory diagram showing another configuration of the imaging unit.
FIG. 7 is an explanatory diagram schematically showing the configuration of a color filter of the imaging unit of FIG. 6.
FIG. 8 is an explanatory diagram showing a visible image and a near-infrared image of the tongue obtained by photographing the tongue under illumination by the illumination unit.
FIG. 9 is a graph showing the spectral distributions of the tongue color and the moss color.
FIG. 10 is a graph showing the absorption characteristics of hemoglobin.
FIG. 11 is an explanatory diagram showing, for a thin tongue with thin moss, the visible image of the tongue, the cross-sectional shape of the tongue, and the distribution of RGB image data in one direction.
FIG. 12 is an explanatory diagram showing, for a thick tongue with thick moss, the visible image of the tongue, the cross-sectional shape of the tongue, and the distribution of RGB image data in one direction.
FIG. 13 is a graph showing the distribution of RGB image data in one direction of the visible image in FIG. 8.
FIG. 14 is a graph showing the distribution of image data in one direction of the near-infrared image in FIG. 8.
FIG. 15 is an explanatory diagram showing the cross-sectional shape of the tongue and the distribution of image data in one direction of a near-infrared image of the tongue.
FIG. 16 is an explanatory diagram showing the distribution of the image data in FIG. 15 together with the region set when detecting the thickness of the tongue.
FIG. 17 is a graph showing the frequency distribution of the image data of a partial region of the visible image shown in FIG. 8.
FIG. 18 is a graph showing the frequency distribution of the image data of a partial region of the near-infrared image shown in FIG. 8.
FIG. 19 is a flowchart showing the flow of operations in the organ imaging apparatus.
 An embodiment of the present invention will be described below with reference to the drawings. In this specification, when a numerical range is written as A to B, the range includes the lower-limit value A and the upper-limit value B.
 [Overall configuration of the organ imaging apparatus]
 FIG. 1 is a perspective view showing the external appearance of the organ imaging apparatus 1 of the present embodiment, and FIG. 2 is a block diagram showing its schematic configuration. The organ imaging apparatus 1 photographs an organ of a living body and detects the information (diagnostic items) needed to assess the degree of health. Here, the case where the organ is the tongue is described as an example; the diagnostic items can then be, for example, the color of the tongue, the thickness of the tongue, and cracks on the tongue surface.
 The organ imaging apparatus 1 includes an illumination unit 2, an imaging unit 3, a display unit 4, an operation unit 5, a communication unit 6, and an audio output unit 7. The illumination unit 2 is provided in a housing 21, and the components other than the illumination unit 2 (for example, the imaging unit 3, display unit 4, operation unit 5, communication unit 6, and audio output unit 7) are provided in a housing 22. The housings 21 and 22 are connected so as to be rotatable relative to each other, but rotation is not essential, and one may be completely fixed to the other. The illumination unit 2 and the other components may also be provided in a single housing, and the organ imaging apparatus 1 may be implemented as a multifunctional portable information terminal.
 The illumination unit 2 illuminates the organ to be photographed (here, the tongue) with illumination light including near-infrared light, and is an illuminator that illuminates the subject from above. To improve color reproducibility and to allow photographing under near-infrared irradiation, a light source that emits a daylight color, such as a xenon lamp, is used as the light source of the illumination unit 2.
 FIG. 3 shows the emission characteristics of the xenon lamp used as the light source of the illumination unit 2. Xenon lamps are widely used in camera flashes; their output is nearly flat from the visible wavelength range (400 nm to 700 nm) to the near-infrared wavelength range (700 nm to 1000 nm), with a spectral distribution close to that of natural daylight.
 The required brightness of the light source depends on the sensitivity of the imaging unit 3 and the distance to the subject; as an example, a brightness giving the subject an illuminance of 1000 to 10000 lx can be considered. In addition to the light source, the illumination unit 2 has a lighting circuit and a dimming circuit, and turning on/off and dimming are controlled by commands from an illumination control unit 11.
 The imaging unit 3 acquires an image of the organ by receiving the illumination light from the illumination unit 2 reflected from the surface of the organ. FIG. 4 is an explanatory diagram showing the positional relationship of the illumination unit 2 and the imaging unit 3 with respect to the subject. The imaging unit 3 is arranged to face the tongue to be photographed.
 The imaging unit 3 has an imaging lens 31, an infrared reflecting mirror 32, a visible-light sensor 33, and an infrared sensor 34. Illumination light emitted from the illumination unit 2 and reflected by the tongue passes through the imaging lens 31 and is separated by the infrared reflecting mirror 32 into visible light and infrared light (including near-infrared light). The visible light separated by the infrared reflecting mirror 32 is guided to the visible-light sensor 33, where it forms an image. The visible-light sensor 33 has, on the light-incident side of its light-receiving surface, a color filter 33a shown in FIG. 5, so that each pixel of the sensor receives the red (R), green (G), or blue (B) light transmitted by the color filter 33a. As a result, the visible-light sensor 33 acquires a visible image composed of the RGB colors. Meanwhile, the infrared light separated by the infrared reflecting mirror 32 is guided to the infrared sensor 34, where it forms an image; the infrared sensor 34 acquires a near-infrared image by receiving the near-infrared (IR) light.
 The aperture (lens speed), shutter speed, and focal length of the imaging lens 31 are set so that the entire range of the subject is in focus. As an example: F-number 16, shutter speed 1/120 s, focal length 20 mm.
 The area sensors (the visible-light sensor 33 and the infrared sensor 34) are composed of image sensors such as CCDs (Charge Coupled Devices) or CMOS (Complementary Metal Oxide Semiconductor) sensors, and their sensitivity, resolution, and so on are set so that the color and shape of the subject can be detected adequately. As an example: sensitivity 60 dB, resolution 10 megapixels.
 Photographing by the imaging unit 3 is controlled by an imaging control unit 12. In addition to the imaging lens 31 and the area sensors, the imaging unit 3 has a focus mechanism, an aperture mechanism, a drive circuit, an A/D conversion circuit, and the like (not shown), and focus control, aperture control, A/D conversion, and so on are controlled by commands from the imaging control unit 12. The imaging unit 3 acquires, as photographed-image data, values of 0 to 255 in 8 bits for each of R, G, B, and IR.
 The illumination unit 2 described above is arranged to illuminate the subject at an angle A of, for example, 0° to 45° with respect to the imaging optical axis X of the imaging unit 3 passing through the subject, where the imaging optical axis X is the optical axis of the imaging lens 31. A large illumination angle A improves the measurement accuracy of surface unevenness, but the shadow of the upper lip narrows the range over which the tongue can be photographed. Conversely, a small angle A enlarges the photographing range, but lowers measurement accuracy and increases highlight clipping due to specular reflection. Considering the above, the preferable range of the illumination angle A is 15° to 30°.
 FIG. 6 is an explanatory diagram showing another configuration of the imaging unit 3. The imaging unit 3 may instead comprise the imaging lens 31, a color filter 35, and an image sensor 36. The image sensor 36 is an image sensor such as a CCD or CMOS sensor and has Si photodiodes with detection sensitivity from visible light to near-infrared light. The color filter 35 is arranged on the light-incident side of the image sensor 36 and transmits visible light and near-infrared light to the image sensor 36.
 FIG. 7 schematically shows the configuration of the color filter 35. In the color filter 33a of FIG. 5, a G filter that transmits green light is assigned to two of every four pixels, and a B filter that transmits blue light and an R filter that transmits red light are assigned to the remaining pixels in a 1:1 ratio. In the color filter 35 of FIG. 7, one of the two G filters in each group of four pixels is replaced with an IR filter that transmits near-infrared light. By using such a color filter 35, a single image sensor 36 can serve for both visible-light detection and near-infrared-light detection. That is, by thinning out the G filters of the color filter 33a, both visible light and near-infrared light can be detected with one sensor. Furthermore, since this configuration makes the infrared reflection mirror 32 and the infrared sensor 34 of FIG. 4 unnecessary, it allows the apparatus to be made smaller and less expensive.
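The filter arrangement described above can be sketched as a mosaic pattern. A minimal illustration (the array layout and function names are assumptions for demonstration, not from the patent):

```python
import numpy as np

# Standard Bayer unit cell: two G sites per 2x2 block of pixels.
BAYER_CELL = np.array([["R", "G"],
                       ["G", "B"]])

# The modified filter replaces one of the two G sites with an IR filter,
# so each 2x2 block samples R, G, B and near-infrared once each.
RGB_IR_CELL = np.array([["R", "G"],
                        ["IR", "B"]])

def mosaic(cell, height, width):
    """Tile a 2x2 unit cell over a height x width sensor (both even)."""
    return np.tile(cell, (height // 2, width // 2))

pattern = mosaic(RGB_IR_CELL, 4, 4)
# Every filter type appears equally often, so one sensor delivers both
# a visible (R, G, B) image and a near-infrared (IR) image.
```

Each filter type occupies one quarter of the pixels, which is why the single sensor 36 can replace the separate visible and infrared sensors of FIG. 4.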
 Returning to FIGS. 1 and 2, the description continues. The display unit 4 includes a liquid crystal panel, a backlight, a lighting circuit, and a control circuit (none of which are shown), and displays images acquired by photographing with the imaging unit 3 as well as information calculated and output by the calculation unit 16 described later. The display of various kinds of information on the display unit 4 is controlled by the display control unit 13.
 The operation unit 5 is an input unit for instructing the imaging unit 3 to perform photographing, and consists of an OK button (shoot button) 5a and a CANCEL button 5b. In the present embodiment, the display unit 4 and the operation unit 5 are implemented on a common touch-panel display device 41, with the display area of the display unit 4 and that of the operation unit 5 kept separate. The display of the operation unit 5 on the touch-panel display device 41 is controlled by the operation control unit 14. The operation unit 5 may instead be implemented as an input unit other than the touch-panel display device 41 (the operation unit 5 may be provided at a position outside the display area of the touch-panel display device 41).
 The communication unit 6 is an interface for transmitting the image data acquired by the imaging unit 3 and the information calculated and output by the calculation unit 16 described later to the outside via a communication line (wired or wireless), and for receiving information from the outside. Transmission and reception of information by the communication unit 6 are controlled by the communication control unit 18.
 The audio output unit 7 outputs various kinds of information as audio and is composed of, for example, a speaker. The information output as audio includes the results computed by the calculation unit 16. Audio output from the audio output unit 7 is controlled by the audio output control unit 19.
 The organ imaging apparatus 1 further includes an illumination control unit 11, an imaging control unit 12, a display control unit 13, an operation control unit 14, an image processing unit 15, a calculation unit 16, a storage unit 17, a communication control unit 18, an audio output control unit 19, and an overall control unit 20 that controls all of these units. As described above, the illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the communication control unit 18, and the audio output control unit 19 control the illumination unit 2, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7, respectively. The overall control unit 20 is composed of, for example, a CPU (Central Processing Unit). The illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the image processing unit 15, the calculation unit 16, the communication control unit 18, the audio output control unit 19, and the overall control unit 20 may be implemented integrally (for example, as a single CPU).
 The image processing unit 15 has the function of performing various kinds of image processing, such as extracting the contour line of an organ from an image acquired by the imaging unit 3. The contour line of the organ can be extracted by using a filter to extract luminance edges of the captured image (portions of the image where the brightness changes abruptly). The edge extraction filter is a filter that, when taking the first derivative (that is, when taking differences of image data between adjacent pixels), assigns weights to the pixels in the vicinity of the pixel of interest.
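The patent specifies the edge filter only as a first-derivative filter that weights pixels near the pixel of interest; the Sobel operator is a well-known filter of exactly that form, used here as a representative example. A minimal sketch under that assumption (the synthetic test image is illustrative):

```python
import numpy as np

# Sobel kernels: first-derivative filters that weight the rows/columns
# adjacent to the pixel of interest, matching the family of filters the
# text describes.
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def convolve2d(img, kernel):
    """Naive 'valid'-mode 2-D correlation for small kernels."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def edge_magnitude(img):
    """Luminance-edge strength from horizontal and vertical gradients."""
    gx = convolve2d(img, SOBEL_X)
    gy = convolve2d(img, SOBEL_Y)
    return np.hypot(gx, gy)

# Synthetic image with a vertical brightness step: the edge magnitude
# peaks along the step, which is how a contour line would be located.
img = np.zeros((5, 6))
img[:, 3:] = 255.0
mag = edge_magnitude(img)
```

In the apparatus, the same idea would be applied to the captured tongue image, with the locations of large edge magnitude traced to form the contour line.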
 The storage unit 17 is a memory that stores the image data acquired by the imaging unit 3, the data obtained by the image processing unit 15, the data calculated by the calculation unit 16, information received from the outside, and the like, and that stores the programs for operating the various control units described above.
 The calculation unit 16 classifies the degree of a diagnostic item of the organ based on at least the image data of the near-infrared image, obtained by receiving the near-infrared light included in the illumination light, among the images acquired by the imaging unit 3. This classification may be performed by quantifying the degree of the diagnostic item, or by an output other than a numerical value. Specific examples of classifying the degree of diagnostic items are described below.
 [Detection of tongue color]
 FIG. 8 shows a visible image and a near-infrared image of the tongue acquired by the imaging unit 3 when the tongue is photographed under illumination by the illumination unit 2. In FIG. 8, the moss (tongue coating) region on the tongue surface is indicated by hatching for convenience. The visible image confirms that moss covers almost the entire tongue surface and that its distribution is uneven. In the near-infrared image, on the other hand, the difference between the color of the tongue and the color of the moss disappears, so the moss is no longer visible.
 FIG. 9 shows the spectral distributions of the tongue color and the moss color. In the B wavelength range (400 to 500 nm) and the G wavelength range (500 to 600 nm), there is a large difference in reflectance between the tongue and the moss, but in the wavelength range above 650 nm there is almost no difference. Accordingly, when the tongue is illuminated with near-infrared light at wavelengths above 650 nm and photographed to obtain a near-infrared image, the tongue and the moss produce the same output (the same color) in that image, and the difference between the tongue color and the moss color disappears. By exploiting this property of the near-infrared image, the tongue color can be detected accurately regardless of the surface state of the tongue (whether the entire surface is covered with moss or the moss distribution is uneven), and even when the moss is thick. Moreover, since moss unevenness does not appear in the near-infrared image, the tongue thickness and surface cracks described later can also be detected accurately by using light at any wavelength from 650 to 1000 nm.
 Here, the color of the tongue reflects the color of the blood. The color of blood changes mainly with the degree of oxidation of hemoglobin. FIG. 10 shows the absorption characteristics of oxyhemoglobin (HbO2) and deoxyhemoglobin (Hb). The figure shows that in the 600 to 800 nm wavelength range, deoxyhemoglobin has a larger absorption coefficient than oxyhemoglobin; that is, deoxyhemoglobin absorbs light in the 600 to 800 nm range more strongly than oxyhemoglobin does. For this reason, blood rich in deoxyhemoglobin absorbs more red light and therefore appears relatively blue. Accordingly, by determining whether deoxyhemoglobin is abundant or scarce, the color of the tongue, which reflects the color of the blood, can be detected.
 Note that FIG. 10 shows that the difference between the absorption coefficients of deoxyhemoglobin and oxyhemoglobin is largest at a wavelength of 660 nm, and that the two coefficients are equal at 805 nm.
 Conventionally, the tongue color has been detected in moss-free areas at the left and right edges of the tongue or at its lower part (the tongue apex). In the present embodiment, however, detection is unaffected by moss, so the color may be detected in any region of the tongue. Here, the central region C of the near-infrared image of the tongue shown in FIG. 8 (the mid-tongue region between the tongue apex and the tongue base) is taken as the diagnosis target region, and the tongue color is detected from the image data of this region C as follows.
 The calculation unit 16 obtains the average value Bm of the B image data and the average value IRm of the IR image data in the region C of the image acquired by the imaging unit 3, and uses the ratio IRm/Bm as the index for detecting the tongue color. When the ratio IRm/Bm is large, blue is relatively scarce, so the tongue is red (the degree of blood oxidation is high). Conversely, when the ratio IRm/Bm is small, blue is relatively abundant, so the tongue is blue (the degree of blood oxidation is low).
 The calculation unit 16 classifies the tongue color into a plurality of ranks according to the value of the ratio IRm/Bm. For example, values of IRm/Bm below L1 are assigned rank "1", values from L1 up to (but not including) L2 are assigned rank "2", and values from L2 up to (but not including) L3 are assigned rank "3"; the calculation unit 16 then classifies the tongue color as "1", "2", or "3" according to the value of IRm/Bm. Instead of the numerical values "1" to "3" shown above, this classification may use non-numerical indications of the degree of tongue color, such as "deep red", "normal red", and "pale red". In this way, the tongue color (including the degree of redness) can be detected accurately even when the entire tongue surface is covered with moss or the moss distribution is uneven.
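The ranking scheme above might be sketched as follows; the threshold values and the handling of values at or above the top threshold are illustrative assumptions, since the patent leaves the concrete values of L1 to L3 open:

```python
import numpy as np

# Illustrative thresholds (assumed; the patent does not fix L1, L2, L3).
L1, L2 = 1.2, 1.6

def classify_tongue_color(ir_region, b_region):
    """Rank the tongue color from region-C image data.

    ir_region: IR image data of region C; b_region: B image data.
    A large IRm/Bm index means blue is relatively scarce, i.e. a red
    tongue; a small index means a bluish tongue.
    """
    ir_m = float(np.mean(ir_region))
    b_m = float(np.mean(b_region))
    index = ir_m / b_m
    if index < L1:
        return 1            # bluish tongue (low blood oxidation)
    elif index < L2:
        return 2            # intermediate
    return 3                # red tongue (high blood oxidation)

# Example: IR data clearly dominating B data -> a red tongue.
rank = classify_tongue_color(np.full((8, 8), 200), np.full((8, 8), 100))
```

Replacing the returned numbers with labels such as "pale red" / "normal red" / "deep red" gives the non-numerical variant of the classification mentioned above.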
 Since the difference between the absorption coefficients of oxyhemoglobin and deoxyhemoglobin is large for light at wavelengths of 600 to 800 nm, it is desirable to use illumination light containing wavelengths of 600 to 800 nm when detecting the tongue color (the degree of hemoglobin oxidation).
 The index used for detecting the tongue color is not limited to the ratio IRm/Bm; it may be, for example, the ratio IRm/(IRm+Bm).
 [Detection of tongue thickness]
 FIGS. 11 and 12 each show a visible image of the tongue, the cross-sectional shape of the tongue along the line A-A', and the distribution of the R, G, and B image data of the pixels lined up in the A-A' direction of the visible image. FIG. 11 shows a case where the tongue is thin and the moss is thin, and FIG. 12 shows a case where the tongue is thick and the moss is thick. The A-A' direction refers to the direction that passes through the mid-tongue region between the tongue apex and the tongue base in the visible image and is perpendicular to the direction connecting the apex and the base; it corresponds to the horizontal direction.
 When the tongue is thin, its surface forms a valley whose central part is recessed downward. Consequently, when the illumination light reflected from the tongue surface is received by the imaging unit and the distribution of the image data along the A-A' direction is plotted, the distribution is concave (recessed downward) for all of R, G, and B (see FIG. 11).
 Conversely, when the tongue is thick, the central part of its surface bulges upward into a mound (convex upward). Consequently, when the illumination light reflected from the tongue surface is received by the imaging unit and the distribution of the image data along the A-A' direction is plotted, the distribution is convex, with a raised central part, for all of R, G, and B (see FIG. 12).
 Furthermore, as shown in FIG. 11, when the moss is thin, the distribution of the R, G, and B image data is smooth, so the shape (thickness) of the tongue can be detected accurately from this distribution. However, as shown in FIG. 12, when the moss is thick, the distribution of the R, G, and B image data contains noise (caused by light reflected from the moss), so detecting the tongue shape from this distribution becomes inaccurate.
 In the present embodiment, therefore, the tongue thickness is detected as follows. FIGS. 13 and 14 show the distributions of the image data of the pixels lined up in the A-A' direction of the visible image and the near-infrared image of FIG. 8, respectively. The A-A' direction of the near-infrared image refers to the direction that passes through the mid-tongue region between the tongue apex and the tongue base in the near-infrared image and is perpendicular to the direction connecting the apex and the base; it corresponds to the horizontal direction.
 In the distribution of the near-infrared image data, the noise caused by moss is reduced compared with the distribution of the visible image data, and the distribution is smoother. Using the distribution of the near-infrared image data, the calculation unit 16 detects the tongue thickness as follows.
 FIG. 15 shows the cross-sectional shape of the tongue and the distribution of the image data of the pixels lined up in the A-A' direction of the near-infrared image of the tongue. FIG. 16 shows the image-data distribution of FIG. 15 together with the region S that is set when detecting the thickness. The region S lies closer to the edges than to the center in the A-A' direction; for example, when the tongue width determined from the tongue contour line is W (mm), a region with the dimensional relationship shown in FIG. 16 (a region including a band of width W/4 located at a distance of W/8 from the edge of the tongue) can be used. The tongue contour line is obtained by the image processing unit 15 described above.
 The cross-sectional shape of the tongue varies with the movement of the tongue muscles, but when the tongue is thin, no matter how the tongue moves, the shape of the distribution in the region S is either flat or changes from flat toward concave. Consequently, when the shape of the distribution in the region S is approximated by a second-order polynomial, the second-order coefficient of the approximating polynomial is zero or positive. When the tongue is thick, on the other hand, the shape of the distribution in the region S is convex no matter how the tongue moves, so the second-order coefficient of the approximating polynomial of the distribution in the region S is negative. The thicker the tongue, the larger the magnitude of this negative coefficient. Therefore, the tongue thickness can be detected by examining the second-order coefficient of the approximating polynomial in the region S of the distribution of the near-infrared image data.
 For example, second-order coefficients of the approximating polynomial below M1 are assigned rank "1", coefficients from M1 up to (but not including) M2 are assigned rank "2", and coefficients from M2 up to (but not including) M3 are assigned rank "3". The calculation unit 16 classifies the tongue thickness as "1", "2", or "3" according to the second-order coefficient of the approximating polynomial in the region S of the image-data distribution. Instead of the numerical values "1" to "3" shown above, this classification may use non-numerical indications of the degree of tongue thickness, such as "thin", "normal", and "thick". In this way, the tongue thickness (including its degree) can be detected accurately regardless of the presence or unevenness of moss on the tongue surface.
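The quadratic-fit ranking described above can be sketched as follows. The threshold values are illustrative assumptions (the patent leaves M1 to M3 open), `numpy.polyfit` stands in for the approximating-polynomial step, and the thick/thin labels follow the sign convention explained in the text:

```python
import numpy as np

# Illustrative thresholds for the 2nd-order coefficient (assumed values).
M1, M2 = -0.5, 0.0

def thickness_rank(profile):
    """Rank tongue thickness from the image-data profile over region S.

    A negative quadratic coefficient means the profile is convex
    (bulging upward: thick tongue); zero or positive means flat or
    concave (thin tongue).
    """
    x = np.arange(len(profile), dtype=float)
    coeff = np.polyfit(x, profile, 2)[0]   # 2nd-order coefficient
    if coeff < M1:
        return 1        # strongly convex profile: thick tongue
    elif coeff < M2:
        return 2
    return 3            # flat or concave profile: thin tongue

# Convex profile y = -(x - 3)^2 + 10 has coefficient -1 (thick tongue);
# concave profile y = (x - 3)^2 has coefficient +1 (thin tongue).
thick_profile = [-(x - 3) ** 2 + 10 for x in range(7)]
thin_profile = [(x - 3) ** 2 for x in range(7)]
```

In the apparatus, `profile` would be the near-infrared image data of the pixels in the region S rather than a synthetic curve.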
 [Detection of cracks in the tongue surface]
 FIGS. 17 and 18 show the frequency distributions of the image data in the region C of the visible image and the near-infrared image of FIG. 8, respectively. Since cracks tend to occur near the center of the tongue surface, in the present embodiment a region near the vertical center of the tongue, with a height of one quarter of the tongue's vertical length, is taken as the mid-tongue region C and set as the crack detection region (diagnosis region). The frequency distribution shown for the region C of the visible image in FIG. 17 is that of the B image data among R, G, and B.
 When there are cracks in the tongue surface, more of the tongue substrate is exposed than when there are none, so the range of values that the image data of the substrate pixels can take widens for all of R, G, and B. Consequently, when the frequency distribution of the image data of the captured image is created, the distribution becomes wider. As described earlier, cracks in the tongue surface arise from reduced immunity caused by poor nutrition, poor blood flow, stress, and the like.
 Moss, on the other hand, is keratinized papillary tissue of the lingual mucosa to which exfoliated cells, mucus, food debris, and bacteria have adhered. In moss-free areas the tongue substrate is exposed, so here too the range of values that the image data of the substrate pixels can take widens for all of R, G, and B. As a result, the frequency distribution of the visible image in FIG. 17 includes both the substrate image data from the moss-free areas and the substrate image data from the cracked areas, which increases the standard deviation (or variance). For reference, the standard deviation of the frequency distribution in FIG. 17 is 26.78 (the variance is the square of the standard deviation).
 In the near-infrared image of FIG. 8, by contrast, the influence of the moss is eliminated, so the standard deviation (or variance) of the frequency distribution, as shown in FIG. 18, is smaller than that of the visible image. For reference, the standard deviation of the frequency distribution in FIG. 18 is 13.18. Therefore, by creating the frequency distribution of the near-infrared image data and examining its standard deviation (or variance), cracks in the tongue can be detected accurately regardless of the surface state of the tongue (uneven moss distribution).
 More specifically, standard deviations σ (or variances σ²) of the frequency distribution of the near-infrared image data below N1 are assigned rank "1", values from N1 up to (but not including) N2 are assigned rank "2", and values from N2 up to (but not including) N3 are assigned rank "3". The calculation unit 16 classifies the tongue cracking as "1", "2", or "3" according to the value of the standard deviation σ (or variance σ²) of the frequency distribution. Instead of the numerical values "1" to "3" shown above, this classification may use non-numerical indications of the degree of cracking, such as "small", "medium", and "large". In this way, cracks in the tongue (including their degree) can be detected accurately regardless of the presence or unevenness of moss on the tongue surface.
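A sketch of this classification, with the thresholds as illustrative assumptions (the patent leaves the concrete values of N1 to N3 open):

```python
import numpy as np

# Illustrative thresholds for the standard deviation sigma (assumed).
N1, N2 = 15.0, 25.0

def crack_rank(region_c):
    """Rank tongue-surface cracking from the region-C image data.

    Cracks expose the tongue substrate and widen the frequency
    distribution of the image data, so a larger standard deviation
    indicates more pronounced cracking.
    """
    sigma = float(np.std(region_c))
    if sigma < N1:
        return 1        # narrow distribution: few or no cracks
    elif sigma < N2:
        return 2
    return 3            # wide distribution: pronounced cracks

# A uniform region has sigma = 0 (rank 1); a region split between two
# widely separated data values has a large sigma (rank 3).
uniform = np.full((10, 10), 128)
split = np.array([100] * 50 + [200] * 50)
```

With these assumed thresholds, the example deviations quoted above (26.78 for the visible image, 13.18 for the near-infrared image of the same tongue) would fall into ranks "3" and "1" respectively, illustrating how strongly the moss inflates the visible-image spread.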
 [Control flow]
 FIG. 19 is a flowchart showing the flow of operations in the organ imaging apparatus 1 of the present embodiment. When the organ imaging apparatus 1 receives a photographing instruction via the operation unit 5 or an input unit (not shown), the illumination control unit 11 turns on the illumination unit 2 (S1) and sets the photographing conditions such as illuminance (S2). When the setting of the photographing conditions is complete, the imaging control unit 12 controls the imaging unit 3 to photograph the tongue, which is the subject (S3). A visible image and a near-infrared image of the tongue are thereby obtained.
 When photographing is complete, the image processing unit 15 extracts the contour line of the tongue from the captured image (S4). The calculation unit 16 then detects the upper, lower, left, and right ends of the tongue from the extracted contour line and sets the regions for detecting the diagnostic items (tongue color, thickness, and cracks) (S5). Specifically, the calculation unit 16 sets the mid-tongue region C (see FIG. 8) when detecting the tongue color, sets the horizontal (A-A' direction) line of pixels passing through the mid-tongue region when detecting the tongue thickness, and sets the mid-tongue region C when detecting cracks in the tongue surface.
 Next, based on the captured images of the tongue (the visible image and the near-infrared image), the calculation unit 16 extracts the feature quantities for detecting the tongue color, thickness, and cracks (the ratio IRm/Bm, the second-order coefficient of the approximating polynomial in the region S, and the standard deviation σ of the frequency distribution) (S6), and classifies them (for example, numerically) by the methods described above (S7). The calculated values are output from the calculation unit 16 and displayed on the display unit 4, and are also stored in the storage unit 17; as needed, they are output as audio from the audio output unit 7, recorded by an output device (not shown), or transferred to the outside via the communication unit 6 (S8).
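The steps S1 to S8 can be summarized as a control-flow sketch; every function below is a trivial placeholder stub standing in for the corresponding unit described in the text, not a real API:

```python
def turn_on_illumination():                # S1: illumination control unit 11
    return True

def set_shooting_conditions():             # S2: illuminance etc.
    return {"illuminance": "auto"}

def shoot_tongue():                        # S3: visible + near-infrared images
    return "visible_image", "near_ir_image"

def extract_contour(visible):              # S4: image processing unit 15
    return "contour"

def set_diagnosis_regions(contour):        # S5: region C and the A-A' line
    return {"C": "region_C", "AA": "line_AA"}

def extract_features(visible, near_ir, regions):   # S6: feature quantities
    return {"IRm/Bm": 1.4, "quad_coeff": -0.8, "sigma": 13.18}

def classify(features):                    # S7: map each feature to a rank 1-3
    return {name: 2 for name in features}

def diagnose():                            # S8: the returned ranks would be
    turn_on_illumination()                 # displayed, stored, spoken, or
    set_shooting_conditions()              # transmitted by the apparatus
    visible, near_ir = shoot_tongue()
    regions = set_diagnosis_regions(extract_contour(visible))
    return classify(extract_features(visible, near_ir, regions))
```

The three feature names mirror the quantities listed in step S6; in the real apparatus each stub would be replaced by the computation performed by the corresponding unit.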
 As described above, in the present embodiment the calculation unit 16 classifies the degree of the diagnostic items of the tongue (tongue color, thickness, and cracks) based on at least the near-infrared image data among the images obtained by photographing the tongue with the imaging unit 3. In the near-infrared image, noise that is visible in the visible image (for example, moss) disappears, so the diagnostic items of the tongue can be detected accurately regardless of the state of the tongue surface (the presence or unevenness of moss). The accuracy of diagnosis based on the detected diagnostic items is therefore improved.
 The calculation unit 16 also classifies the degree of tongue thickness as a diagnostic item based on the distribution of the image data in one direction of the near-infrared image. In the near-infrared image, the influence of the moss on the tongue surface, which acts as noise, is eliminated, so the distribution of the image data in one direction is smoother than in the visible image. Accordingly, the cross-sectional shape of the tongue surface can be grasped accurately and easily from the distribution of the image data in one direction of the near-infrared image, making it possible to detect the tongue thickness accurately.
 In particular, by using as the above image-data distribution the distribution of the image data of the pixels that pass through the mid-tongue region and are lined up in the direction perpendicular to the direction connecting the tongue apex and the tongue base, the horizontal surface shape of the tongue can be grasped and the tongue thickness can be detected accurately.
 The calculation unit 16 also obtains, in a predetermined region of the image-data distribution (the region S), a second-order polynomial approximating the distribution and classifies the degree of tongue thickness based on its second-order coefficient; the magnitude and sign (positive or negative) of the second-order coefficient of the approximating polynomial therefore make it easy to classify the tongue thickness into a plurality of ranks.
 The calculation unit 16 also classifies the degree of tongue cracking as a diagnostic item based on the frequency distribution of the image data of a partial region of the tongue in the near-infrared image. Since the influence of the moss on the tongue surface, which acts as noise, is eliminated in the near-infrared image, its frequency distribution has a smaller spread than that of the visible image. It is therefore possible to detect the presence or absence of cracks in the tongue surface accurately based on the frequency distribution of the near-infrared image.
 In particular, by using the frequency distribution of the central region of the tongue in the near-infrared image, tongue cracks, which tend to occur near the center of the tongue, can be detected reliably.
 Since the standard deviation or variance of the frequency distribution of the near-infrared image indicates how widely the distribution spreads, having the calculation unit 16 classify the degree of tongue cracking based on that standard deviation or variance makes it possible to detect cracks on the tongue surface accurately and reliably.
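A minimal sketch of this crack check follows; the function name and the cutoff value are assumptions for illustration only, since the patent specifies the use of the standard deviation or variance but not a concrete threshold.

```python
import numpy as np

def classify_tongue_cracks(region, threshold=20.0):
    """Classify cracking from the spread of near-infrared pixel values
    in the central region of the tongue.

    region: 2-D array of near-infrared pixel values.
    threshold: illustrative cutoff on the standard deviation; the patent
    leaves the concrete value to the implementation.
    """
    # Cracks appear as dark lines, so they widen the frequency
    # distribution of pixel values; a smooth surface keeps it narrow.
    spread = float(np.std(region))
    return "cracked" if spread > threshold else "no cracks"
```

The variance (`np.var`) could be used in place of the standard deviation with a correspondingly scaled cutoff, as both measure the same spread.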
 The calculation unit 16 also classifies the degree of tongue color, as a diagnostic item, based on the image data of the near-infrared image and the B image data of the visible image. Because the B image data fluctuates less in the presence of moss than the G and R image data, using the near-infrared image data together with the B image data makes it possible to detect the tongue color accurately regardless of whether moss is present.
 In particular, the calculation unit 16 classifies the degree of tongue color based on the ratio (IRm/Bm) between the average value IRm of the image data in a partial region of the near-infrared image and the average value Bm of the B image data in a partial region of the visible image. In this case, the relative comparison of IRm and Bm indicates whether the tongue is reddish or bluish, which makes the tongue color easy to detect.
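The ratio comparison can be sketched as below. The function name and the two cutoff values are hypothetical; the patent states only that IRm and Bm are compared relatively, without fixing numeric boundaries.

```python
import numpy as np

def classify_tongue_color(ir_region, b_region, red_cut=2.0, blue_cut=1.2):
    """Classify tongue color from the ratio IRm / Bm.

    ir_region: near-infrared pixel values in the diagnostic target region.
    b_region:  blue-channel pixel values in the same target region of the
               visible image.
    red_cut, blue_cut: illustrative boundaries, not from the patent.
    """
    ir_m = float(np.mean(ir_region))  # IRm
    b_m = float(np.mean(b_region))    # Bm
    ratio = ir_m / b_m
    if ratio > red_cut:
        # Near-infrared reflection strong relative to blue: reddish tongue.
        return "reddish"
    elif ratio < blue_cut:
        # Blue relatively strong: bluish tongue.
        return "bluish"
    return "normal"
```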
 [Others]
 In the above, the case where the subject to be photographed is a human tongue has been described; however, the subject need only be a living body and may be an animal other than a human. For example, even for the tongue of an animal such as a pet or livestock, the diagnostic items can be detected accurately by applying the method of this embodiment. In this case, the poor physical condition of an animal that cannot communicate its state can be judged quickly and accurately.
 In this embodiment, the case where the organ of the living body is the tongue has been described as an example; however, another organ, such as an eyelid, can also be the organ to be diagnosed.
 The organ imaging apparatus described above can be expressed as follows and thereby provides the following effects.
 The organ imaging apparatus described above includes: an illumination unit that illuminates an organ of a living body with illumination light including near-infrared light; an imaging unit that acquires an image of the organ by receiving the illumination light reflected from the surface of the organ; and a calculation unit that classifies the degree of a diagnostic item of the organ based on at least image data of a near-infrared image obtained, among the images acquired by the imaging unit, by receiving the near-infrared light included in the illumination light.
 The images acquired by the imaging unit include a near-infrared image of the organ. In the near-infrared image, noise that is visible in a visible image acquired by receiving visible light (for example, moss when the organ is a tongue) no longer appears. Therefore, by having the calculation unit classify the degree of the diagnostic item of the organ based on the image data of the near-infrared image, the diagnostic item can be detected accurately regardless of the surface condition of the organ that would otherwise act as noise.
 The organ may be a tongue. In this case, in the near-infrared image there is almost no color difference between the tongue and the moss on its surface, so the moss does not appear. Therefore, by having the calculation unit classify the degree of a diagnostic item of the tongue (for example, tongue color, tongue thickness, or surface cracks) based on the image data of the near-infrared image, the diagnostic item can be detected accurately without being affected by the moss on the tongue surface. That is, even when the entire tongue surface is covered with moss or the moss is unevenly distributed, the diagnostic items of the tongue can be detected accurately.
 The calculation unit may classify the degree of tongue thickness, as the diagnostic item, based on the distribution of image data along one direction in the near-infrared image.
 In the near-infrared image, the moss on the tongue surface, which acts as noise, does not appear, so the one-directional distribution of image data in the near-infrared image is smoother than the corresponding distribution in the visible image. The cross-sectional shape of the tongue surface can therefore be grasped accurately and easily from the former distribution, and the tongue thickness can be detected accurately.
 The calculation unit may obtain a second-order polynomial approximating the distribution in a predetermined region of the image-data distribution and classify the degree of tongue thickness based on the second-order coefficient of the polynomial.
 In this case, the tongue thickness can be detected based on the magnitude and sign (positive or negative) of the second-order coefficient of the polynomial, which makes it easy to classify the degree of tongue thickness.
 The distribution of image data may be the distribution of the image data of the pixels that pass through the center of the tongue, between the tongue tip and the tongue base in the near-infrared image, and are lined up in a direction perpendicular to the direction connecting the tongue tip and the tongue base.
 By classifying the degree of tongue thickness based on the distribution of image data along the horizontal line passing through the center of the tongue in the near-infrared image in this way, the horizontal surface shape of the tongue can be grasped and the tongue thickness detected accurately.
 The calculation unit may classify the degree of tongue cracking, as the diagnostic item, based on the frequency distribution of the image data of a partial region of the tongue in the near-infrared image.
 In the near-infrared image, the moss on the tongue surface, which acts as noise, does not appear, so the frequency distribution of the image data of a partial region of the tongue in the near-infrared image spreads less than the frequency distribution of the image data of the same region in the visible image. The presence or absence of cracks on the tongue surface can therefore be detected accurately from the former frequency distribution.
 The calculation unit may classify the degree of tongue cracking based on the standard deviation or variance of the frequency distribution. Since the standard deviation or variance indicates how widely the frequency distribution spreads, cracks on the tongue surface can be detected accurately and reliably.
 The partial region may be the central region of the tongue, between the tongue tip and the tongue base, in the near-infrared image. In this case, tongue cracks, which tend to occur near the center of the tongue surface, can be detected reliably based on the frequency distribution.
 The illumination light may include visible light, and the calculation unit may classify the degree of tongue color, as the diagnostic item, based on the image data of the near-infrared image and the blue image data of a visible image obtained by receiving, via the surface of the organ, the visible light included in the illumination light.
 The blue (B) image data of the visible image fluctuates less in the presence of moss (is less affected by moss) than the green (G) and red (R) image data. Therefore, by classifying the degree of tongue color based on the image data of the near-infrared image and the B image data of the visible image, the tongue color can be detected accurately regardless of the presence of moss.
 When a partial region of the surface of the organ is taken as the target region for diagnosis, the calculation unit may classify the degree of tongue color based on the ratio between the average value of the image data in the target region of the near-infrared image and the average value of the blue image data in the target region of the visible image.
 Since the redness or blueness of the tongue can be detected by a relative comparison between the average value (IRm) of the image data in the target region of the near-infrared image and the average value (Bm) of the B image data in the target region of the visible image, the tongue color is easy to detect.
 The present invention can be used in an apparatus that photographs an organ of a living body and detects a diagnostic item of the organ from the photographed image.
DESCRIPTION OF SYMBOLS
 1  Organ imaging apparatus
 2  Illumination unit
 3  Imaging unit
16  Calculation unit

Claims (10)

  1.  An organ imaging apparatus comprising:
     an illumination unit that illuminates an organ of a living body with illumination light including near-infrared light;
     an imaging unit that acquires an image of the organ by receiving the illumination light reflected from the surface of the organ; and
     a calculation unit that classifies the degree of a diagnostic item of the organ based on at least image data of a near-infrared image obtained, among the images acquired by the imaging unit, by receiving the near-infrared light included in the illumination light.
  2.  The organ imaging apparatus according to claim 1, wherein the organ is a tongue.
  3.  The organ imaging apparatus according to claim 2, wherein the calculation unit classifies the degree of tongue thickness, as the diagnostic item, based on a distribution of image data along one direction in the near-infrared image.
  4.  The organ imaging apparatus according to claim 3, wherein the calculation unit obtains a second-order polynomial approximating the distribution in a predetermined region of the distribution of the image data, and classifies the degree of tongue thickness based on the second-order coefficient of the polynomial.
  5.  The organ imaging apparatus according to claim 3 or 4, wherein the distribution of the image data is the distribution of the image data of the pixels that pass through the center of the tongue, between the tongue tip and the tongue base in the near-infrared image, and are lined up in a direction perpendicular to the direction connecting the tongue tip and the tongue base.
  6.  The organ imaging apparatus according to any one of claims 2 to 5, wherein the calculation unit classifies the degree of tongue cracking, as the diagnostic item, based on a frequency distribution of the image data of a partial region of the tongue in the near-infrared image.
  7.  The organ imaging apparatus according to claim 6, wherein the calculation unit classifies the degree of tongue cracking based on the standard deviation or variance of the frequency distribution.
  8.  The organ imaging apparatus according to claim 6 or 7, wherein the partial region is a central region of the tongue, between the tongue tip and the tongue base, in the near-infrared image.
  9.  The organ imaging apparatus according to any one of claims 2 to 8, wherein the illumination light includes visible light, and
     the calculation unit classifies the degree of tongue color, as the diagnostic item, based on the image data of the near-infrared image and blue image data of a visible image obtained by receiving, via the surface of the organ, the visible light included in the illumination light.
  10.  The organ imaging apparatus according to claim 9, wherein, when a partial region of the surface of the organ is taken as a target region for diagnosis,
     the calculation unit classifies the degree of tongue color based on a ratio between an average value of the image data in the target region of the near-infrared image and an average value of the blue image data in the target region of the visible image.
PCT/JP2015/054706 2014-04-08 2015-02-20 Organ imaging apparatus WO2015156039A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-079375 2014-04-08
JP2014079375 2014-04-08

Publications (1)

Publication Number Publication Date
WO2015156039A1 true WO2015156039A1 (en) 2015-10-15

Family

ID=54287620

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/054706 WO2015156039A1 (en) 2014-04-08 2015-02-20 Organ imaging apparatus

Country Status (1)

Country Link
WO (1) WO2015156039A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110929740A (en) * 2019-11-21 2020-03-27 中电健康云科技有限公司 LGBM model-based tongue quality and tongue coating separation method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006149679A (en) * 2004-11-29 2006-06-15 Konica Minolta Holdings Inc Method, device and program for determining degree of health
CN101214140A (en) * 2007-12-29 2008-07-09 哈尔滨工业大学 Portable near-infrared sublingual vein image gathering instrument
JP2009028058A (en) * 2007-07-24 2009-02-12 Saieco:Kk System, apparatus, method and program for tongue diagnosis
KR101028999B1 (en) * 2010-07-16 2011-04-14 상지대학교산학협력단 An apparatus for photographing tongue diagnosis image using and method thereof



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15776341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15776341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP