WO2015166692A1 - Health level determination device and health level determination system - Google Patents
- Publication number
- WO2015166692A1 (PCT/JP2015/054703)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- distribution
- feature amount
- unit
- determination
- Prior art date
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0088—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1032—Determining colour for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1079—Measuring physical dimensions, e.g. size of the entire body or parts thereof using optical or photographic means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4538—Evaluating a particular part of the musculoskeletal system or a particular medical condition
- A61B5/4542—Evaluating the mouth, e.g. the jaw
- A61B5/4552—Evaluating soft tissue within the mouth, e.g. gums or tongue
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
Definitions
- the present invention relates to a health level determination device and a health level determination system for determining the health level of a subject.
- Patent Document 1 proposes a system that comprehensively performs health degree determination using the Mahalanobis distance.
- Japanese Patent No. 4487535 (see claim 1, paragraphs [0006], [0018], [0096], etc.)
- Japanese Patent No. 4649965 (see claim 1, paragraph [0008], etc.)
- the present invention has been made to solve the above problems, and an object thereof is to provide a health level determination device and a health level determination system that can accurately determine the health level (physical condition) while reducing the influence of constitutional bias, and that allow the improvement of the constitutional bias to be easily confirmed.
- a health level determination apparatus according to the present invention is an apparatus that determines a subject's health level, and includes a determination unit that, from a feature amount extracted from a captured image of an organ of the subject and used for the health level determination, determines a bias in the subject's constitution and a change in the subject's physical condition.
- when determining the bias in the subject's constitution, the determination unit uses, as the distribution of the feature amount of the population, a distribution of the feature amount of other persons.
- when determining the change in the subject's physical condition, the determination unit uses, as the distribution of the feature amount of the population, a distribution created from the subject's own feature amounts collected within an evaluation period for evaluating the physical condition.
- according to this configuration, the health level can be accurately determined while reducing the influence of constitutional bias, and the effect of improving the constitutional bias can be easily confirmed.
- a numerical range expressed as "A to B" includes the values of the lower limit A and the upper limit B.
- FIG. 1 is an explanatory diagram illustrating a schematic configuration of a health degree determination system 50 according to the present embodiment.
- the health degree determination system 50 includes an organ image capturing device 1 and a terminal device 51.
- the organ image capturing apparatus 1 captures an organ of a living body and extracts (detects) information (feature amounts, diagnosis items) necessary for health diagnosis, and is configured by, for example, a multifunctional portable information terminal.
- the terminal device 51 determines the health level of the living body based on the information acquired by the organ imaging device 1, and is configured by a personal computer or the like.
- the organ image capturing apparatus 1 and the terminal apparatus 51 are communicably connected via a communication line.
- the communication line may be wired or wireless.
- the organ image capturing apparatus 1 may have at least a part of the functions of the terminal device 51, or conversely, the terminal apparatus 51 may have at least a part of the functions of the organ image capturing apparatus 1.
- details of the organ image photographing apparatus 1 and the terminal device 51 will be described. In the following, a case where the living body is a human and the organ is a tongue will be described as an example.
- FIG. 2 is a perspective view showing an appearance of the organ image photographing apparatus 1 of the present embodiment
- FIG. 3 is a block diagram showing a schematic configuration of the organ image photographing apparatus 1.
- the organ imaging apparatus 1 includes an illumination unit 2, an imaging unit 3, a display unit 4, an operation unit 5, a communication unit 6, and an audio output unit 7.
- the illumination unit 2 is provided in a housing 21, and the other components (for example, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7) are provided in a housing 22. The housings 21 and 22 are connected so as to be rotatable relative to each other, although such rotation is not essential and one housing may be completely fixed to the other. Alternatively, all of the above components may be provided in a single housing.
- the illuminating unit 2 illuminates a living organ (herein, tongue) that is a subject to be imaged, and is configured by an illuminator that illuminates the subject to be photographed from above.
- as the light source of the illumination unit 2, a light source that emits daylight color, such as a xenon lamp, is used in order to improve color reproducibility.
- the brightness of the light source depends on the sensitivity of the imaging unit 3 and the distance to the imaging target; as an example, a brightness giving an illuminance of 1000 to 10000 lx at the imaging target can be used.
- the illumination unit 2 includes a lighting circuit and a dimming circuit in addition to the above light source, and lighting / extinguishing and dimming are controlled by a command from the illumination control unit 11.
- the imaging unit 3 captures an image of an organ (here, a tongue) of a subject whose health is to be determined under illumination by the illumination unit 2, and acquires an image.
- the imaging unit 3 has an imaging lens and an area sensor (imaging element).
- the aperture (brightness of the lens), shutter speed, and focal length of the imaging lens are set so that the entire range to be photographed is in focus. As an example, F number: 16, shutter speed: 1/120 seconds, focal length: 20 mm.
- the area sensor is composed of image sensors such as CCD (Charge Coupled Device) and CMOS (Complementary Metal Oxide Semiconductor), and the sensitivity and resolution are set so that the color and shape of the subject can be detected sufficiently.
- as an example, sensitivity: 60 dB, resolution: 10 million pixels.
- Imaging by the imaging unit 3 is controlled by the imaging control unit 12.
- the imaging unit 3 includes a focus mechanism (not shown), a diaphragm mechanism, a drive circuit, an A/D conversion circuit, and the like; the focus, the aperture, the A/D conversion, and the like are controlled by the imaging control unit 12.
- the imaging unit 3 acquires, for example, data of 0 to 255 in 8 bits for each of red (R), green (G), and blue (B) as captured image data.
- FIG. 4 is an explanatory diagram showing the positional relationship between the illumination unit 2 and the imaging unit 3 with respect to the imaging target (tongue or face).
- the imaging unit 3 is arranged to face the subject to be photographed.
- the illumination unit 2 is arranged so as to illuminate the imaging target at an angle A of, for example, 0 ° to 45 ° with respect to the imaging optical axis X of the imaging unit 3 passing through the imaging target.
- the imaging optical axis X refers to the optical axis of the imaging lens that the imaging unit 3 has.
- the preferable range of the angle A at the time of illumination is 15 ° to 30 °.
- the display unit 4 includes a liquid crystal panel (not shown), a backlight, a lighting circuit, and a control circuit, and displays the image acquired by photographing with the imaging unit 3 and the information calculated and output by a feature amount extraction unit 16 described later. The display unit 4 can also display items necessary for an inquiry described later, and information acquired from the outside via the communication unit 6 (for example, a diagnosis result obtained by transmitting information to an external medical institution). Display of various types of information on the display unit 4 is controlled by the display control unit 13.
- the operation unit 5 is an input unit for instructing imaging by the imaging unit 3, and includes an OK button (imaging execution button) 5a and a CANCEL button 5b.
- the display unit 4 and the operation unit 5 are configured by a common touch panel display device 41, and the display area of the display unit 4 and the display area of the operation unit 5 in the touch panel display device 41 are separated.
- the display of the operation unit 5 on the touch panel display device 41 is controlled by the operation control unit 14.
- the operation unit 5 may be configured by an input unit other than the touch panel display device 41 (the operation unit 5 may be provided at a position outside the display area of the touch panel display device 41).
- the operation unit 5 also functions as an information input unit for responding to items necessary for the inquiry displayed on the display unit 4.
- when the display unit 4 and the operation unit 5 are configured by the touch panel display device 41 as described above, the display unit 4 displays the above items together with the keys necessary for the operator to respond (for example, numeric keys, a shift key, and an OK button), so that the operator can input the necessary information via the displayed keys. In this case, therefore, the display unit 4 also functions as the information input unit of the operation unit 5.
- the communication unit 6 is an interface for transmitting the image data acquired by the imaging unit 3 and the information calculated and output by the feature amount extraction unit 16 described later to the outside via the communication line, and for receiving information from the outside.
- the communication unit 6 constitutes a transmission unit that transmits information on the feature amount of the subject extracted by the feature amount extraction unit 16 to the terminal device 51. Transmission / reception of information in the communication unit 6 is controlled by the communication control unit 18.
- the audio output unit 7 outputs various types of information as audio, and is composed of, for example, a speaker.
- the information output by voice includes a result determined by a determination unit 53 (see FIG. 16) described later of the terminal device 51.
- the sound output in the sound output unit 7 is controlled by the sound output control unit 19.
- the organ image capturing apparatus 1 further includes an illumination control unit 11, an imaging control unit 12, a display control unit 13, an operation control unit 14, an image processing unit 15, a feature amount extraction unit 16, a storage unit 17, a communication control unit 18, an audio output control unit 19, and an overall control unit 20 that controls each of these units.
- the illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the communication control unit 18, and the audio output control unit 19 control the illumination unit 2, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7, respectively.
- the overall control unit 20 is composed of, for example, a CPU (Central Processing Unit).
- the illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the communication control unit 18, the audio output control unit 19, and the overall control unit 20 may be configured integrally (for example, as one CPU).
- the image processing unit 15 performs a process of extracting an organ outline from the image acquired by the imaging unit 3.
- the image processing unit 15 extracts the contour line of the tongue as an organ based on luminance edges in the photographed image (portions where the brightness changes abruptly).
- Luminance edge extraction can be performed using an edge extraction filter as shown in FIG. 5, for example.
- the edge extraction filter weights pixels in the vicinity of the pixel of interest when performing first-order differentiation (when obtaining differences in image data between adjacent pixels).
- using this filter, for example on the G image data of each pixel of the photographed image, the difference in image data is calculated between the pixel of interest and its neighboring pixels, and pixels whose difference value exceeds a predetermined threshold are extracted as luminance-edge pixels. Since a luminance difference due to shadow exists around the tongue, the contour line of the tongue can be extracted by extracting such pixels.
- G image data, which has the greatest influence on luminance, is used for this calculation, but R or B image data may be used instead.
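- the first-derivative edge extraction described above can be sketched as follows. This is an illustrative sketch only: the 3×3 kernel weights and the threshold value are assumptions (the patent's FIG. 5 filter coefficients are not reproduced here).

```python
import numpy as np

def luminance_edges(g_channel, threshold=40):
    """Mark luminance-edge pixels in the G channel of a captured image.

    First-order differences with weighted neighboring pixels (a Sobel-like
    kernel, chosen as an assumption). Pixels whose gradient magnitude
    exceeds `threshold` are flagged as edge pixels.
    """
    g = np.asarray(g_channel, dtype=float)
    # Horizontal and vertical first-derivative kernels with neighbor weights.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = g.shape
    edges = np.zeros((h, w), bool)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = g[y - 1:y + 2, x - 1:x + 2]
            gx = np.sum(patch * kx)   # difference to left/right neighbors
            gy = np.sum(patch * ky)   # difference to upper/lower neighbors
            if np.hypot(gx, gy) > threshold:
                edges[y, x] = True
    return edges
```

Tracing the flagged pixels along the shadow around the tongue then yields the contour line.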
- the storage unit 17 is a memory that stores the image data acquired by the imaging unit 3, the data acquired by the image processing unit 15, the data calculated by the feature amount extraction unit 16, the information received from the outside, and the like.
- the feature amount extraction unit 16 extracts, from the image data acquired by the imaging unit 3, feature amounts used for the health level determination. Examples of the feature amounts include the tongue color, the moss color, the moss shape (thickness), tooth marks, dampness (glossiness), fissures (cracks), and the tongue shape (thickness). These feature amounts can be extracted (detected), for example, as follows.
- FIG. 6 shows a photographed image of the tongue.
- in the photographed image of the tongue, when a region extending vertically at the center of the tongue in the left-right direction is divided into three vertically arranged regions, these are referred to, from top to bottom, as an upper region R1, a central region R2, and a lower region R3.
- these regions are defined with the sizes shown in FIG. 6, based on the horizontal width W and the vertical length H of the area surrounded by the tongue contour detected by the image processing unit 15.
- the illustrated sizes are examples, and the present invention is not limited to these.
- the tongue color reflects the color of blood, and therefore mainly the R component or the B component increases or decreases in the RGB image data. Therefore, the color of the tongue can be quantified and extracted by obtaining the ratio of R (R/(R+G+B)) or the ratio of B (B/(R+G+B)) in the lower region R3 of the tongue.
- as the RGB data, the average values of the R, G, and B image data over the plurality of pixels constituting the lower region R3 can be used.
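- the quantification above can be sketched as follows; the `region_mask` argument selecting the lower region R3 is an assumption (the patent derives the region from the detected contour and the W, H dimensions).

```python
import numpy as np

def tongue_color_ratios(rgb_image, region_mask):
    """Quantify tongue color as the ratios R/(R+G+B) and B/(R+G+B),
    computed from per-channel averages over the pixels of region R3.

    rgb_image: (H, W, 3) array; region_mask: (H, W) boolean mask for R3.
    """
    pixels = rgb_image[region_mask]        # (N, 3) pixels inside R3
    r, g, b = pixels.mean(axis=0)          # channel-wise average values
    total = r + g + b
    return r / total, b / total
```

The same pattern applies to the moss color and moss thickness, using the G or (G+R) ratios over regions R1 and R2.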
- the reason why the image data of the lower region R3 of the tongue is used when extracting the tongue color is as follows: in the tongue examination used in Kampo medicine, the tongue color is generally diagnosed at the left and right edges of the tongue, where there is no moss, or at the lower center of the tongue; at the left and right edges, however, the way the illumination light strikes changes due to the unevenness of the tongue, so shading tends to occur and the image data tends to deviate from the value indicating the original tongue color.
- as for the color of the moss, mainly the G component or the (G+R) component increases or decreases in the RGB image data. Therefore, the color of the moss can be quantified and extracted by obtaining the ratio of G (G/(R+G+B)) or the ratio of (G+R) ((G+R)/(R+G+B)) in the upper region R1 of the tongue.
- as the RGB data, the average values of the R, G, and B image data over the plurality of pixels constituting the upper region R1 can be used.
- the image data of the upper region R1 of the tongue is used because the moss is a keratinized papillary tissue of the lingual mucosa that exists from the upper part of the tongue to the center, and is particularly abundant in the upper part.
- the thickness of the moss can be quantified and extracted by obtaining the ratio of R (R/(R+G+B)) or the ratio of G (G/(R+G+B)) in the central region R2 of the tongue.
- as the RGB data, the average values of the R, G, and B image data over the plurality of pixels constituting the central region R2 can be used.
- the image data of the central region R2 of the tongue is used when extracting the thickness of the moss because, as described above, the moss exists in the upper part of the tongue, and the thickness of the moss can be determined from whether the color of the central region R2 is close to the color of the moss in the upper region R1. In this way, the thickness of the moss can also be quantified.
- the feature amount extraction unit 16 approximates the contour line of the tongue obtained by the image processing unit 15 with an approximate curve, and detects the unevenness (smoothness) of the contour line based on the degree of correlation between the contour line and the approximate curve, thereby detecting tooth marks on the tongue. Since the approximate curve is a smooth curve without fine unevenness, the closer the contour line is to the approximate curve, the smoother it is and the fewer the tooth marks. That is, the higher the degree of correlation between the contour line and the approximate curve, the fewer the tooth marks; the lower the degree of correlation, the more the tooth marks.
- the state of the tooth marks shown is intermediate (level 2) between mild and severe.
- for example, the following first to third indices can be used as indices representing the degree of correlation between the two.
- a first index representing the degree of correlation is the coefficient of determination R², given by: R² = 1 − {Σ(yi − fi)²} / {Σ(yi − Y)²}
- here, on the xy plane, yi is the y-coordinate of the contour line at x-coordinate i, fi is the y-coordinate of the approximate curve at x-coordinate i, and Y is the average value of yi. The index i takes any value from j to k, where j is the x-coordinate of one end of the contour line (or approximate curve) and k is the x-coordinate of the other end.
- i, j, and k are all integers, with j < k and j ≤ i ≤ k.
- Σ(yi − fi)² denotes the sum of (yi − fi)² as i is changed from j to k, and Σ(yi − Y)² denotes the sum of (yi − Y)² as i is changed from j to k.
- FIG. 7 shows the contour line of the tongue (solid line), the approximate curve (broken line), the polynomial representing the approximate curve, and the coefficient of determination R².
- the approximate curve is obtained by the least squares method and is expressed by the following polynomial (the notation "E−n" in the figure denotes ×10⁻ⁿ): y = 5×10⁻⁷·x⁴ + 6×10⁻⁶·x³ + 2×10⁻³·x² + 6.29×10⁻²·x + 21.213
- the coefficient of determination R² in this case is 0.9942.
- the difference between the contour line and the approximate curve is reflected in the coefficient of determination R² both in a portion A where the tooth marks are severe (the indentation of the contour is large) and in a portion B where the tooth marks are mild (the indentation of the contour is small). Therefore, with a method that detects tooth marks based on the coefficient of determination R², even slight tooth marks are detected, which makes it possible to improve the detection accuracy of tooth marks.
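- the least-squares fit and the first index R² can be sketched as follows (the helper name `tooth_mark_r2` is illustrative; the degree-4 polynomial matches the example above):

```python
import numpy as np

def tooth_mark_r2(xs, ys, degree=4):
    """Fit a polynomial (least squares) to the contour points (xs, ys)
    and return the coefficient of determination
    R^2 = 1 - sum((yi - fi)^2) / sum((yi - Y)^2).
    A lower R^2 indicates a more uneven contour, i.e. more tooth marks.
    """
    coeffs = np.polyfit(xs, ys, degree)    # least-squares approximate curve
    fs = np.polyval(coeffs, xs)            # fi values along the contour
    ss_res = np.sum((ys - fs) ** 2)        # sum of squared residuals
    ss_tot = np.sum((ys - np.mean(ys)) ** 2)
    return 1.0 - ss_res / ss_tot
```

A smooth contour gives R² close to 1; an indented (tooth-marked) contour lowers it.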
- a second index representing the degree of correlation is a value obtained from the difference in y-coordinate between the contour line and the approximate curve, namely the maximum value of |yi − fi|.
- FIG. 8 is a graph plotting the difference |yi − fi| between the contour line and the approximate curve.
- |yi − fi| is larger in the portion A where the tooth marks are severe and in the portion B where the tooth marks are mild than in portions without tooth marks. Therefore, even slight tooth marks can be detected by detecting the maximum value of |yi − fi|.
- here, i, j, and k are all integers, with j < k and j ≤ i ≤ k. Further, a third index representing the degree of correlation is the sum Σ|yi − fi| obtained as i is changed from j to k.
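- the second and third indices above reduce to a short computation over the paired contour and curve values:

```python
def deviation_indices(y, f):
    """Given contour y-coordinates `y` and approximate-curve values `f`
    at the same x-coordinates i = j..k, return the second index
    (max of |yi - fi|) and the third index (sum of |yi - fi|)."""
    diffs = [abs(yi - fi) for yi, fi in zip(y, f)]
    return max(diffs), sum(diffs)
```

Larger values of either index indicate deeper or more numerous tooth-mark indentations.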
- FIG. 9 shows a photographed image of the tongue and a sectional shape of the tongue.
- the tongue is projected forward from the oral cavity. Since the upper-lip-side surface of the protruding tongue is photographed by the imaging unit 3, the tongue is curved so that the upper-lip-side surface is convex toward the imaging unit 3 (see the C–C′ cross section).
- the photographing position may be defined in a specification or manual so that the tongue is guided to an appropriate position.
- in the left-right direction, the tongue is curved in an M shape, with both the center and the left and right ends recessed (see the D–D′ cross section).
- this cross-sectional shape is substantially the same from the upper part to the lower part of the tongue.
- the central part E of the tongue may have a crack pattern. Therefore, in this embodiment, the illumination angle A is set to 15°, and the remaining areas in the upper half of the tongue, avoiding the center and both left and right ends, are set as detection areas suitable for detecting glossiness.
- the feature amount extraction unit 16 detects the upper, lower, left, and right ends of the tongue from the contour line detected by the image processing unit 15, obtains the vertical length H and the horizontal width W of the tongue, and sets glossiness detection areas R4 and R5 based on the contour line so as to have the positional relationship shown in FIG. 10.
- FIG. 11 is a graph showing the spectral distribution of the tongue. Since the tongue has a mucosal structure and no epidermis, the color of the blood appears as the color of the tongue. Blood has a large amount of R component (wavelength 600 nm to 700 nm) and a small amount of B component (wavelength 500 nm or less). Further, when the color of the tongue is light, the ratio of the R component decreases, and when it is dark, the ratio of the R component increases.
- the moss is formed of keratinized papilla cells and exhibits a white to yellow color. When the moss is thin, the underlying tongue color shows through, so the ratio of the R component increases as shown in the figure; when the moss is white and dense, the ratio of the G component (wavelength 500 nm to 600 nm) increases.
- the glossiness of the tongue surface is detected as follows based on the B image data obtained from the photographed image of the tongue.
- the feature amount extraction unit 16 extracts B image data from each pixel in the detection regions R4 and R5 of the photographed image, and creates a frequency distribution thereof.
- FIG. 12 schematically shows the frequency distribution of the extracted B image data.
- the horizontal axis indicates the B pixel value (image data)
- the vertical axis indicates the frequency (number of pixels).
- the pixel value is a value from 1 to 100, and a larger pixel value indicates a brighter pixel.
- a function that continuously indicates the frequency change may be obtained and smoothed to remove the noise, and then the pixel value Dp corresponding to the maximum frequency Np may be obtained.
- the number of upper pixels may be obtained by integrating the smoothed function over a predetermined interval.
- when there is no specular reflection on the tongue surface at the time of photographing, the frequency distribution of the B image data consists only of a distribution close to a normal distribution (first group G1); when there is specular reflection, a distribution with high frequency on the high-pixel-value side (second group G2) is added to the first group G1.
- the width of the first group G1 (from its minimum pixel value to its maximum pixel value) is narrower than the frequency distribution (normal distribution) of the other R and G image data.
- the boundary between the first group G1 and the second group G2 appears clearly between the pixel value Dp at which the frequency is maximum and the maximum value Dm of the image data, so the first group G1 and the second group G2 can be easily distinguished.
- the feature amount extraction unit 16 sets a threshold value M larger than the pixel value Dp, and obtains the sum of the frequencies between the threshold value M and the pixel value Dm as the upper pixel count, thereby obtaining a value close to the sum of the frequencies of the second group G2.
- based on the upper pixel count, the feature amount extraction unit 16 can detect the glossiness by quantifying it from, for example, level 1 (high glossiness) to level 3 (low glossiness).
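- the upper-pixel-count computation above can be sketched as follows. The offset used to place the threshold M above Dp is an assumption for illustration; the patent only requires M > Dp.

```python
import numpy as np

def upper_pixel_count(b_channel, margin=10):
    """Glossiness measure from the B image data of the detection areas.

    Builds the frequency distribution (histogram) of integer pixel
    values, finds the pixel value Dp of maximum frequency, sets a
    threshold M = Dp + margin (the margin is an assumed offset), and
    counts pixels with values from M up to the maximum value Dm.
    This approximates the size of the specular second group G2.
    """
    values = np.asarray(b_channel).ravel()
    hist = np.bincount(values, minlength=256)  # frequency distribution
    dp = int(np.argmax(hist))                  # pixel value of peak frequency
    m = dp + margin                            # threshold M above Dp
    return int(np.sum(values >= m))            # upper pixel count
```

The count can then be mapped to levels, e.g. a large count to level 1 (high glossiness).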
- FIG. 13 shows a photographed image of a tongue having a crack on the surface.
- the imaging unit 3 captures the surface of the protruding upper lip side of the tongue.
- the upper-lower and left-right central portion of the photographed image (a region including the center in both the vertical and horizontal directions) is suitable for crack detection and is set as the detection area.
- the feature amount extraction unit 16 detects the upper, lower, left, and right ends of the tongue from the contour line extracted by the image processing unit 15, detects the vertical length (vertical dimension) H and the horizontal width (lateral dimension) W of the tongue, and sets a central area of the tongue measuring H/4 vertically and W/4 horizontally, determined by the dimensional relationship shown in the figure, as the detection region D.
- when there is a crack, the base of the tongue appears more than when there is none, so the range of values that the image data of the pixels constituting the base can take widens for all of R, G, and B. For this reason, when the frequency distribution of the image data of the photographed image is created, the width of the frequency distribution is widened.
- in particular, for R and B, which are abundant in the color of blood, the width of the frequency distribution widens remarkably compared with the case where there is no crack. This tendency is known to appear regardless of the thickness of the moss on the tongue surface and the length of the crack.
- the feature amount extraction unit 16 creates a frequency distribution of, for example, the B image data from the photographed image of the tongue (in particular, the image of the detection region D described above), and calculates a standard deviation σ indicating the spread (variation) of the frequency distribution.
- the standard deviation σ is the positive square root of the variance σ² given by the following equation, where the image data takes N values x1, x2, …, xN with average value x̄: σ² = (1/N)·Σ(xi − x̄)²
- FIG. 14 schematically shows the frequency distribution of the B image data created by the feature amount extraction unit 16.
- the upper frequency distribution shows the case where there is no crack on the tongue surface, and the lower frequency distribution shows the case where there is a crack on the tongue surface.
- the horizontal axis of these frequency distributions indicates the B pixel value (image data)
- the vertical axis indicates the frequency (number of pixels).
- the pixel value is a value from 0 to 255, and a larger pixel value indicates a brighter pixel.
- the feature quantity extraction unit 16 can detect the cracks numerically from, for example, level 1 (crack many) to level 3 (crack few) based on the standard deviation of the frequency distribution.
- here, the cracks are detected using the frequency distribution of the B image data, but the cracks can also be detected using the frequency distribution of the R or G image data.
- FIG. 15 shows the distribution of RGB image data along a horizontal line of the captured image passing through the approximate vertical center of the tongue surface, obtained when the surface of the tongue is imaged by the imaging unit 3 under illumination by the illumination unit 2. The upper distribution is for the case where the tongue is thin, and the lower distribution is for the case where the tongue is thick.
- the solid line indicates the distribution of R image data, the alternate long and short dash line indicates the distribution of G image data, and the broken line indicates the distribution of B image data.
- when the tongue is thick, the tongue includes a portion that protrudes upward from its ends toward its center. Since such a convex portion of the tongue surface is closer to the illumination unit 2 and is therefore illuminated brightly, the value of the image data increases in the portion of the captured image corresponding to the convex portion. Conversely, when the tongue is thin, the tongue surface includes a portion that is substantially flat from the ends to the center, or that is concave downward. Since the flat or concave portion of the tongue surface is farther from the illumination unit 2 than the convex portion, it is darker than the convex portion even when illuminated. For this reason, in the captured image of the tongue, the value of the image data is smaller in the portion corresponding to the flat or concave portion of the surface than in the portion corresponding to the convex portion. This tendency is the same for all of the R, G, and B image data.
- accordingly, the feature amount extraction unit 16 can detect whether the tongue is thick or thin based on the unevenness of the horizontal image data distribution (single-color distribution) of any one of the RGB colors in the captured image of the tongue obtained under illumination by the illumination unit 2. That is, by using the horizontal distribution of the RGB image data contained in the captured image of the tongue as a data distribution indicating the degree of unevenness of the tongue surface, and quantifying the degree of unevenness, for example from level 1 (the tongue is thick) to level 5 (the tongue is thin), the tongue thickness can be detected accurately regardless of the outer shape of the tongue.
- the degree of unevenness of the data distribution can be determined, for example, by calculating an approximate curve (for example, a quadratic polynomial) that approximates the central portion of the data distribution by the least squares method or the like, and examining the quadratic coefficient of the approximate curve and its absolute value. Incidentally, if the quadratic coefficient of the approximate curve is positive, the surface is concave; if it is negative, the surface is convex.
- instead of the distribution of the image data itself, a distribution of data indicating the R component ratio (R/(R+G+B)), the G component ratio (G/(R+G+B)), or the B component ratio (B/(R+G+B)) may be used; the tongue thickness can still be detected with high accuracy as described above.
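The least-squares judgment described above can be illustrated with a small sketch (an assumption-laden reconstruction: `np.polyfit` stands in for the least squares fit, and the input is taken to be a one-dimensional horizontal profile of image data or a component ratio such as R/(R+G+B)):

```python
import numpy as np

def tongue_surface_shape(profile):
    """Fit a quadratic approximate curve to a horizontal image-data profile
    by least squares and judge the tongue-surface shape from the sign of the
    quadratic coefficient: positive -> concave (thin tongue),
    negative -> convex (thick tongue).

    profile: 1-D array of image data along a horizontal line through the
    approximate vertical center of the tongue."""
    x = np.arange(len(profile))
    a, b, c = np.polyfit(x, profile, 2)  # a is the quadratic coefficient
    shape = "concave" if a > 0 else "convex"
    # |a| can further be used to grade the degree of unevenness (level 1-5).
    return a, shape
```

For a bright center (convex, thick tongue) the fitted coefficient is negative; for a dark center (concave, thin tongue) it is positive.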
- the method described in Patent Document 1 may be used to extract feature amounts such as the shape of the tongue, tooth marks, and cracks. Further, as described in Patent Document 2, the captured image of the tongue may be divided into a plurality of regions, and the RGB data may be read for each region to extract the feature amount.
- FIG. 16 is a block diagram illustrating a schematic configuration of the terminal device 51.
- the terminal device 51 is a health level determination device that determines the health level of the subject, and includes a storage unit 52, a determination unit 53, a communication unit 54, and a control unit 55.
- the storage unit 52 is a memory that stores captured image data of the organs (here, tongues) of a plurality of other persons, as well as data transmitted from external devices including the organ image capturing apparatus 1.
- the communication unit 54 is an interface for performing communication with an external device.
- the communication unit 54 constitutes a feature amount input unit through which a feature amount, extracted from a photographed image of the subject's organ and used to determine the health level, is input; specifically, it constitutes a receiving unit that receives information on the subject's feature amount transmitted from the organ image capturing apparatus 1.
- the terminal device 51 may include a reading unit that reads a feature amount from a recording medium in which the feature amount extracted from the captured image of the organ of the subject is recorded.
- the terminal device 51 can acquire the feature amount by reading information with the reading unit. Therefore, in this case, the reading unit can function as the feature amount input unit.
- the control unit 55 is configured by a CPU, for example, and controls the operation of each unit of the terminal device 51.
- the determination unit 53 compares the subject's feature amount, transmitted from the organ image capturing apparatus 1 and input via the communication unit 54 serving as the feature amount input unit, with the distribution of the feature amount of a population, thereby determining the subject's constitutional bias, which corresponds to individual differences from others, and changes in the subject's physical condition.
- as the distribution of the feature amount of the above-mentioned population, the distribution of the feature amount of other persons or the distribution of the feature amount of the subject can be used.
- These distributions are created by the determination unit 53. That is, the distribution of the feature quantity of the other person is created by extracting the feature quantity from the captured image data of the other person stored in the storage unit 52 by the same extraction method as the feature quantity extraction unit 16.
- the distribution of the feature amount of the target person is created using the feature amount extracted by the feature amount extraction unit 16 within the evaluation period for evaluating the physical condition.
- FIG. 17 schematically shows the distribution of the feature amount of another person created by the determination unit 53.
- the x mark in the figure indicates the plot position of the feature value of each other person.
- here, the distribution for the case where the feature amount is two-dimensional (Xi, Xj) is shown. For example, the color of the tongue can be taken as Xi and the color of the moss as Xj. On the horizontal axis, a1 and a2 represent values corresponding to different colors, and on the vertical axis, b1 and b2 likewise represent values corresponding to different colors.
- the origin O of this feature quantity distribution is a point indicating the average of Xi and the average of Xj.
- generally, the distribution of Xi and Xj is biased, and there is a correlation between them. That is, when the value of Xi is on the a1 side of the origin O, the value of Xj is more often on the b1 side than on the b2 side, and when the value of Xi is on the a2 side of the origin O, the value of Xj is more often on the b2 side than on the b1 side. Even when the feature amount is multidimensional with three or more dimensions, some kind of correlation is considered to occur.
- the distance from the origin O of the distribution of the feature quantity of the other person to the feature quantity (for example, point A) of the input subject is set as an evaluation value (first evaluation value) of the physical condition.
- This evaluation value is not the Euclidean distance but the Mahalanobis distance.
- the Mahalanobis distance is a distance scale calculated in consideration of variations and correlations for many measurement items (features).
- when the size of the population is n, the number of measurement items (feature amounts) is k, the average value of the data for each measurement item is m1, m2, …, mk, the standard deviation of each item is σ1, σ2, …, σk, and aij is the (i, j) element of the inverse matrix of the correlation matrix representing the correlations between the measurement items, the average vector (m1, m2, …, mk) is the origin of the distribution (space) created by the population, and the Mahalanobis distance D is obtained from the following two formulas:

  ui = (xi − mi) / σi  (i = 1, …, k)

  D² = (1/k) Σi Σj aij · ui · uj
- the average of n Mahalanobis distances that are all objects of the population space is “1”.
- the origin of the scale indicating the physical condition evaluation value and the unit of distance are determined. Note that the smaller the Mahalanobis distance, the closer the subject is to the population, and the greater the Mahalanobis distance, the farther from the population.
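The definitions above can be sketched in code as follows (a minimal illustration using NumPy; the population data is assumed to be given as an n × k array of feature vectors):

```python
import numpy as np

def mahalanobis_setup(population):
    """From an n-by-k array of feature vectors, compute the origin (mean
    vector m), per-item standard deviations sigma, and the inverse A of the
    correlation matrix between the k measurement items."""
    m = population.mean(axis=0)
    sigma = population.std(axis=0)
    z = (population - m) / sigma              # standardized data u_i
    a_inv = np.linalg.inv(np.corrcoef(z, rowvar=False))
    return m, sigma, a_inv

def mahalanobis_distance(x, m, sigma, a_inv):
    """D with D^2 = (1/k) * sum_ij a_ij * u_i * u_j, u_i = (x_i - m_i)/sigma_i.
    Scaled this way, the average of D^2 over the population itself is 1."""
    u = (x - m) / sigma
    k = len(x)
    return np.sqrt(u @ a_inv @ u / k)
```

With this scaling, a feature vector resembling the population yields D near 1, and a larger D means farther from the population, matching the scale described above.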
- FIG. 18 shows an image of a long-term change in physical condition for a patient A with a disease, a healthy person B, and a subject C who is a target for determining the health level.
- the origin O in the figure corresponds to the origin O of the other persons' feature amount distribution shown in FIG. 17; the horizontal axis indicates the time since birth, and the vertical axis indicates the above-described physical condition evaluation value.
- a unit amount U on the vertical axis indicates a unit of distance from the origin O.
- FIG. 19 shows the physical condition change in a certain evaluation period in an enlarged manner for the subject C in FIG.
- the subject's physical condition changes due to, for example, gastrointestinal trouble, palpitations and shortness of breath, or colds and fever. If such a condition exceeds a limit, it becomes manifest as a disease and requires treatment.
- the change in the evaluation value of the subject C within a certain evaluation period is defined as “change in physical condition”.
- a basic processing flow including a determination method by the determination unit 53 will be described.
- FIG. 20 is a flowchart showing the flow of processing in the health level determination system 50 of the present embodiment.
- the following first to third steps are sequentially performed.
- a tongue image of a young healthy person (other person) with no constitution bias is collected by a photographing apparatus (for example, a digital camera) equipped with a flash.
- the healthy persons here belong to the population P1 in the figure. It is said that the appearance of the tongue of persons in their 10s is almost the same as that of adults.
- therefore, images of healthy individuals in their early 10s, who do not have a constitutional bias due to lifestyle habits such as those causing adult diseases, are collected as samples.
- the standard number of samples is preferably about 10 to 100. To obtain the Mahalanobis distance described above, more samples than the number of feature amounts are required (in order to calculate the inverse matrix).
- the collected image data is sent to the terminal device 51 and stored in the storage unit 52.
- the determination unit 53 of the terminal device 51 extracts the feature amounts of the other persons used for the health level determination from the other persons' captured image data stored in the storage unit 52, by the same extraction method as the feature amount extraction unit 16.
- tongue color, moss color, moss shape (thickness), tooth marks, dampness (glossiness), fissures (cracks), tongue shape (thickness), and the like are extracted as the feature amounts of the other persons.
- feature quantities used in the Oriental medicine diagnosis method may be extracted.
- the determination unit 53 sets an origin O 1 and a unit amount U 1 of a distribution (for example, a distribution as shown in FIG. 17 if the feature amount is two-dimensional) having the extracted feature amount of another person as a population.
- the origin O 1 is a point indicating the average of the feature quantities of others, and the unit amount U 1 indicates a unit of distance from the origin O 1 .
- These settings can be made by multivariate analysis using the Mahalanobis distance described above. As other multivariate analysis methods, there are a method using a linear function and a method using regression analysis, but any method may be used.
- the imaging unit 3 of the organ image capturing apparatus 1 collects the daily tongue images of the subject in the same manner as described above.
- the standard for the number of samples is preferably about twice the above (about 20 to 200 cases).
- the feature amount extraction unit 16 extracts a feature amount such as a tongue color from the tongue image data of the subject by the above-described method.
- the image captured by the imaging unit 3 and the feature amount information extracted by the feature amount extraction unit 16 are transmitted to the terminal device 51 and stored in the storage unit 52.
- the determination unit 53 of the terminal device 51 compares the subject's feature amount with the distribution of the population (the distribution of the other persons' feature amounts) to obtain the difference (Mahalanobis distance) from the population, and determines the change of the Mahalanobis distance with respect to time as the change in the evaluation value (first evaluation value) of the subject's physical condition.
- FIG. 21 shows a change in the evaluation value of the physical condition of the subject at this time. Note that the difference between the feature amount of the subject and the population corresponds to the distance (1) in FIG.
- the determination unit 53 obtains the average of the first evaluation values (Mahalanobis distances) within the evaluation period, and determines this average, the distance M, as the bias of the subject's constitution.
- FIG. 22 shows enlarged changes in the evaluation value of the physical condition of the subject during the evaluation period of FIG.
- the determination unit 53 uses the distance M as a threshold, and extracts the images (shooting dates) taken on the dates and times when the evaluation value is equal to or less than the threshold within the evaluation period.
- alternatively, the distance M (average value − standard deviation) or the like may be used as the threshold, and images whose evaluation value is equal to or less than that threshold may be extracted (images with a higher degree of health may be extracted). In this case, it is desirable to increase the number of samples considerably.
- the determination unit 53 extracts the subject's feature amounts from the captured images of the subject for all the dates and times when the evaluation value is equal to or less than the threshold, and sets the origin O 2 and the unit amount U 2 of the distribution having the extracted feature amounts of the subject as a population.
- the origin O 2 is a point indicating the average of the feature quantities of the subject, and the unit quantity U 2 is a unit of distance from the origin O 2 .
- the determination unit 53 compares the feature amount of the subject at the time of determination with the distribution of the population (distribution of the feature amount of the subject), obtains a difference (Mahalanobis distance) from the population, and calculates the Mahalanobis distance with respect to time. Is determined as a change in the evaluation value (second evaluation value) of the subject's physical condition.
- FIG. 23 shows a change in the evaluation value of the physical condition of the subject at this time. Note that the difference between the feature amount of the subject and the population corresponds to the distance (2) in FIG.
- the determination unit 53 determines the change in the distance (2) within the evaluation period as the change in the subject's physical condition. Since the distribution of the subject's own feature amounts is used as the population, the evaluation value is not inflated by the constitutional bias, and daily changes in physical condition can be measured with high sensitivity.
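The two-stage determination described in this third step can be sketched end to end as follows. This is a simplified reconstruction under stated assumptions: daily feature vectors are given as arrays, the Mahalanobis helpers mirror the formulas above, and the period average of the first evaluation values is used both as the constitutional bias (distance M) and as the threshold for building the subject's own population.

```python
import numpy as np

def determine_condition(others, subject_days):
    """others: n-by-k feature vectors of healthy other persons (population P1).
    subject_days: t-by-k daily feature vectors of the subject in one period.
    Returns (distance_m, second_values): the constitutional bias and the
    daily second evaluation values (physical-condition changes)."""
    def setup(pop):
        m, s = pop.mean(axis=0), pop.std(axis=0)
        z = (pop - m) / s
        return m, s, np.linalg.inv(np.corrcoef(z, rowvar=False))

    def dist(x, m, s, a_inv):
        u = (x - m) / s
        return float(np.sqrt(u @ a_inv @ u / len(x)))

    # First evaluation: distance of each day from the other-person population.
    m1, s1, a1 = setup(others)
    first = np.array([dist(x, m1, s1, a1) for x in subject_days])
    distance_m = first.mean()                 # constitutional bias
    # Days at or below the threshold form the subject's own population.
    own_pop = subject_days[first <= distance_m]
    # Second evaluation: distance of each day from the subject's population.
    m2, s2, a2 = setup(own_pop)
    second = np.array([dist(x, m2, s2, a2) for x in subject_days])
    return distance_m, second
```

Because the second evaluation compares each day against the subject's own low-distance days, a constitutional offset common to all days largely cancels out, which is the point of the two-stage design.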
- FIG. 24 shows a medium-term change in the evaluation value (first evaluation value) of the subject's physical condition.
- when a certain evaluation period (for example, evaluation period 1) ends, the process proceeds to the next evaluation period (for example, evaluation period 2).
- that is, the population used for determining the change in physical condition, namely the distribution of the subject's feature amounts whose first evaluation value is equal to or less than the threshold, is updated at regular intervals.
- in the evaluation period 1, the population P2 is the distribution of the subject's feature amounts whose first evaluation value is equal to or less than the distance M, and the distance M is used as the threshold when determining the change in physical condition.
- in the evaluation period 2, the population P3 is the distribution of the subject's feature amounts whose first evaluation value is equal to or less than the distance N, where the distance N is the average of the distances (Mahalanobis distances) from the origin O 1 of the first evaluation value in the evaluation period 2. The population may be updated by replacing all the data at once, or by adding new data every day and deleting old data.
- FIG. 24 shows an example of the former.
- the determination unit 53 uses the distance N described above as the threshold, extracts the images taken on the dates when the first evaluation value is equal to or less than the threshold within the evaluation period 2, and sets the origin O 2 and the unit amount U 2 of the distribution of the subject's feature amounts (the distribution of the population P3). Then, the distance between the subject's feature amount at the time of determination in the evaluation period 2 and the population P3 is taken as the change in the subject's physical condition in the evaluation period 2.
- Q1 indicates the width of the physical condition change in the evaluation period 1
- Q2 indicates the width of the physical condition change in the evaluation period 2.
- the difference between the distance M and the distance N is an improvement in the constitution from the evaluation period 1 to the evaluation period 2. Note that the difference from the population P2 in the previous evaluation period 1 may be the distance from the population in the evaluation period 2.
- the population P1 described above does not have to be changed.
- the distance M is also updated to the distance N by updating from the population P2 to P3, but the distance N is similarly updated in the subsequent evaluation period. Note that the distance (2) shown in FIG. 23 is updated by daily shooting.
- the terminal device 51 may graph daily fluctuations in distance and present it to the subject (organ image capturing apparatus 1).
- the improvement method may be presented together.
- the method of improvement presented may be based on oriental medicine or on Western medicine.
- the improvement method may be specifically shown using the contribution degree of the feature amount. For example, a treatment for improving blood flow can be presented when the contribution of the tongue color is large to a decrease in health, and a treatment for improving immunity can be presented when the contribution of moss thickness is large.
- as the reference at that time, the subject's own past history may be used. For example, if a tongue image of the subject taken during a past illness exists, it is transmitted to the terminal device 51; the terminal device 51 extracts the feature amount from that tongue image, obtains the distance from the origin of the distribution of the feature amounts of the population (the other persons) at that time (the average of the distances when there are a plurality of tongue images from past illnesses), and uses it as the reference.
- by determining whether or not the physical condition is good using such a reference, it is possible to judge whether the subject's current physical condition is close to the physical condition during a past illness, so that a physical condition improvement method can be presented at a more appropriate timing.
- the obtained constitutional deviation (distance M) of the subject may be further compared with a certain standard to determine whether or not the constitutional deviation is large.
- as the reference, for example, if the subject is in their 50s, judging the constitutional bias by comparison with others of the same age rather than with others in their 10s better reflects the individual differences among persons of the same age, and therefore makes it possible to grasp the constitutional bias accurately.
- as described above, when the determination unit 53 of the terminal device 51 determines the bias of the subject's constitution, the distribution of the other persons' feature amounts is set as the population distribution and compared with the subject's feature amount.
- on the other hand, when the determination unit 53 determines a change in the subject's physical condition, the distribution of the subject's own feature amounts collected within the evaluation period is set as the population distribution and compared with the subject's feature amount. Since the distribution of the subject's own feature amounts is used as the comparison target, there is almost no influence of individual differences from others, and the subject's physical condition (health level) can be determined accurately while reducing the influence of the constitutional bias.
- the determination unit 53 obtains the distance from the origin O 1 of the distribution of the feature quantity of the other person as the first evaluation value of the input feature quantity of the subject, and the first change that changes with time in the evaluation period. Since the average of the evaluation values is determined as the bias of the subject's constitution, even if the first evaluation value varies or varies within the evaluation period, the bias of the subject's constitution can be appropriately determined.
- the determination unit 53 collects feature quantities of dates and times when the first evaluation value that changes over time within the evaluation period is equal to or less than a threshold, and creates a distribution of the feature quantities of the subject person.
- the distance from the origin O 2 of the distribution is obtained as a second evaluation value, and a change in the second evaluation value that changes with time in the evaluation period is determined as a change in the physical condition of the subject.
- the above-described distribution of the subject's feature amounts is a distribution of feature amounts whose first evaluation value is equal to or less than the threshold, that is, feature amounts for which the constitutional bias can be judged to be small. Therefore, by obtaining the second evaluation value using this distribution as the comparison target and determining the change in physical condition from the second evaluation value, the change in physical condition can be determined while reliably reducing the influence of the constitutional bias.
- the determination unit 53 compares the input feature quantity of the subject person with the distribution of the feature quantity of the population using the Mahalanobis distance. Even if the feature quantity is multidimensional, it is possible to appropriately determine the constitutional bias and the physical condition change by comparing with the population distribution on a single scale called Mahalanobis distance.
- in the present embodiment, the distribution of the other persons' feature amounts used as the population distribution is the distribution of the feature amounts of other persons in their 10s, that is, aged 10 to 19. Young people in their 10s are considered to have almost no constitutional bias due to lifestyle habits; therefore, using this distribution makes it possible to determine the subject's constitutional bias accurately.
- the distribution of the subject's feature amounts used as the population distribution when the determination unit 53 determines the change in physical condition may be the distribution of the feature amounts of the subject at the age of 20 or older.
- at the age of 20 or older, a constitutional bias arises, and thus the population distribution may include the constitutional bias.
- when it is desired to determine the change in the subject's physical condition including the constitutional bias, the above distribution is effective.
- in particular, if the distribution is that of the feature amounts of the subject in a healthy state at the age of 20 or older, the change in physical condition including the constitutional deviation can be determined with a healthy state as the basis.
- the organ to be imaged is a tongue.
- the organ may be other than the tongue.
- for example, a region where swelling appears depending on the state of water metabolism, such as the eyelids, may be imaged, and the feature amount of such a region may be extracted to determine a constitutional bias or a change in physical condition.
- in the present embodiment, the color of the tongue, the color of the moss, the shape (thickness) of the tongue, the shape (thickness) of the moss, and the like are used as the feature amounts of the subject. That is, the feature amount of the subject includes information about at least one of the color and the shape of the organ. By using information on at least one of the color and the shape of the organ as the feature amount, the constitutional bias and the change in physical condition can be determined.
- the above-mentioned items necessary for the inquiry are displayed on the display unit 4 of the organ imaging apparatus 1.
- the subject operates the operation unit 5 and inputs a response to the above item.
- the input response is transmitted to the terminal device 51.
- the determination unit 53 of the terminal device 51 determines the subject's constitutional bias and the change in the subject's physical condition based on both the comparison of the subject's feature amount with the population distribution and the response results input via the operation unit 5 and transmitted to the terminal device 51. For example, when determining the change in physical condition, whether or not the change in physical condition is large is judged with reference to the response results for the items 1 to 5, 9, 10, and 14.
- the response results to the items 6 to 8 and 11 to 13 are referred to determine whether or not the constitutional bias is large.
- the determination result and the improvement method of the constitution and the physical condition are fed back to the organ imaging apparatus 1 and displayed on the display unit 4.
- a service for obtaining a consideration by providing a determination result and an improvement method to a target person may be constructed.
- the contents to be provided are three patterns: (a) only a change in image feature amount, (b) a change in physical condition in addition to (a), and (c) a change in constitution in addition to (b)
- the price may be set higher in the order of (a), (b) and (c).
- the price may be differentiated according to the time from image capture to answering the health degree determination result.
- FIG. 25 shows a frequency distribution when the feature amount is univariate (for example, tongue color). Note that the value on the vertical axis indicates tongue color image data (for example, R / (R + G + B)). However, the origin corresponds to the average value of the tongue color.
- by making the origin of the distribution in FIG. 25 correspond to the origins in FIGS. 22 and 23, and making 3σ (σ: standard deviation) in FIG. 25 correspond to the unit quantities in FIGS. 22 and 23, the constitutional bias and changes in physical condition can be determined even with a univariate feature amount.
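In the univariate case, the evaluation reduces to a simple scaling (an illustrative sketch only: the population mean plays the role of the origin and 3σ plays the role of the unit quantity):

```python
import numpy as np

def univariate_evaluation(population_values, x):
    """Univariate evaluation value: the deviation of the subject's feature
    value x (e.g. the tongue-color ratio R/(R+G+B)) from the population mean
    (the origin), expressed in units of 3 standard deviations (the unit
    quantity)."""
    m = np.mean(population_values)
    sigma = np.std(population_values)
    return abs(x - m) / (3.0 * sigma)
```

A value at the population mean evaluates to 0, and a value 3σ away evaluates to exactly one unit.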
- any image photographed before, at the same time as, or after the determination of the health level may be used.
- any image taken before, at the same time as, or after the subject's age of 40 may be used.
- any image photographed before, at the same time as, or after the determination of the health level may be used. By comparing with the characteristic amount of another person's unhealthy state, the health state of the subject can be determined more clearly.
- alternatively, a photographed image of the subject himself or herself taken in a healthy or unhealthy state in their 10s may be used. Since it is not necessary to compare with others, a simple comparison is possible.
- alternatively, a distribution of feature amounts extracted from the following images may be used: any image of the subject taken in a healthy or unhealthy state at any age, whether before or after the determination. By increasing the number of image types and comparing with a plurality of unhealthy states, the health state can be determined more clearly.
- the health level determination device and the health level determination system described above can be expressed as follows, thereby producing the following effects.
- the health level determination device described above is a health level determination device that determines the health level of a subject, and includes: a feature amount input unit through which a feature amount, extracted from a photographed image of the subject's organ and used to determine the health level, is input; and a determination unit that determines the subject's constitutional bias, which corresponds to individual differences from others, and changes in the subject's physical condition by comparing the subject's feature amount input via the feature amount input unit with the distribution of the feature amount of a population.
- when determining the subject's constitutional bias, the determination unit uses, as the population distribution, the distribution of feature amounts created for other persons; when determining a change in the subject's physical condition, it uses, as the population distribution, the distribution of the subject's own feature amounts collected within the evaluation period for evaluating the physical condition.
- the determination unit determines the subject's constitutional bias and physical condition change by comparing the feature amount of the subject input via the feature amount input unit with the distribution of the feature amount of the population.
- when determining the constitutional bias, the distribution of the other persons' feature amounts is used as the distribution of the feature amount of the population.
- when determining the change in physical condition, the distribution of the subject's own feature amounts collected within the evaluation period is used as the distribution of the feature amount of the population. Since the distribution of the subject's own feature amounts is used as the comparison target, the influence of individual differences from others is eliminated, and the subject's physical condition (health level) can be determined accurately while reducing the influence of the constitutional bias.
- the determination unit may obtain, as a first evaluation value of the subject's feature amount input via the feature amount input unit, the distance from the origin of the distribution of the other persons' feature amounts, and may determine the average of the first evaluation values, which change over time within the evaluation period, as the bias of the subject's constitution.
- the bias of the subject's constitution can be determined regardless of fluctuations or variations in the first evaluation value.
- the determination unit may collect the feature amounts for the dates and times when the first evaluation value, which changes over time within the evaluation period, is equal to or less than a threshold, create a distribution of the subject's feature amounts, obtain as a second evaluation value the distance from the origin of that distribution to the subject's feature amount input via the feature amount input unit, and determine the change in the second evaluation value over the evaluation period as the change in the subject's physical condition.
- the distribution of feature quantities of the target person which is a collection of feature quantities for the date and time when the first evaluation value is equal to or less than the threshold, is a distribution of feature quantities that can be determined to have a small individual difference from others, that is, a constitutional bias. Therefore, by obtaining a second evaluation value using such a distribution of feature values as a comparison target and determining a change in physical condition using the second evaluation value, while reliably reducing the influence of the constitutional bias, Changes in physical condition can be determined.
- the determination unit may perform the comparison between the subject's feature amounts input via the feature amount input unit and the distribution of feature amounts of the population using the Mahalanobis distance. In this case, even if the input feature amounts are multidimensional, the constitutional bias and changes in physical condition can be determined appropriately.
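As an illustration only (not the patent's implementation; the function and array names are assumptions), the Mahalanobis-distance comparison between one feature vector and a population distribution can be sketched in Python with NumPy:

```python
import numpy as np

def mahalanobis(x, population):
    """Mahalanobis distance of feature vector x from the population.

    population: array of shape (n_samples, n_features); the number of
    samples must exceed the number of features so that the covariance
    matrix is invertible (as the description also notes)."""
    mu = population.mean(axis=0)            # origin of the distribution
    cov = np.cov(population, rowvar=False)  # feature covariance matrix
    diff = x - mu
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))
```

A distance of 0 means the input lies at the origin (the population mean); larger values mean greater deviation, scaled by the population's covariance, which is why this works for multidimensional feature amounts.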
- the distribution of the feature quantity of the other person may be the distribution of the feature quantity of another person who is 10 to 19 years old.
- the distribution of the feature quantity of the other person may be the distribution of the feature quantity of a healthy other person who is 10 to 19 years old.
- the distribution of the feature amount of the subject may be the distribution of the feature amount of the subject who is 20 years old or older.
- since a constitutional bias will generally have appeared by the age of 20 or older, the distribution of this population may contain the constitutional bias.
- when it is desired to determine changes in the subject's physical condition including the constitutional bias, using the above distribution is effective.
- the distribution of the subject's feature amounts may be a distribution of feature amounts of a healthy subject who is 20 years old or older.
- in this case, too, the distribution of the population may include a constitutional bias,
- but since the distribution of the healthy subject's feature amounts is used as the comparison target (the distribution of feature amounts of the population), changes in physical condition, including the constitutional bias, can be determined with the healthy state as the reference.
- the organ may be a tongue.
- the above-described effect can be obtained in the configuration in which the health level of the subject is determined by photographing the tongue.
- the subject's feature amounts input via the feature amount input unit may include information on at least one of the color and the shape of the organ. By using such color or shape information as a feature amount, the constitutional bias and changes in physical condition can be determined.
- the feature amount input unit may include a receiving unit that receives feature amount information of a subject transmitted from the outside.
- the above-described effect can be obtained in the configuration in which the information about the feature amount of the subject is acquired by the receiving unit.
- the health level determination system described above is formed by connecting the health level determination apparatus described above and an organ image capturing apparatus via a communication line so that they can communicate with each other.
- the organ image capturing apparatus includes an imaging unit that photographs the subject's organ, a feature extraction unit that extracts from the photographed image the feature amounts used for determining the health level, and a transmission unit that transmits this information to the health level determination apparatus.
- the organ image capturing apparatus may include a display unit that displays information on the constitutional bias and changes in physical condition determined by the determination unit of the health level determination apparatus. Since this information is displayed on the display unit, the subject can easily grasp, from the displayed information, both how to improve the constitution and how the physical condition (health level) is changing.
- the organ image capturing apparatus may further include a display unit that displays items necessary for a medical interview and an information input unit for responding to the items displayed on the display unit, and the determination unit of the health level determination apparatus may determine the subject's constitutional bias and the changes in the subject's physical condition based on both the comparison between the subject's feature amounts included in the information transmitted from the organ image capturing apparatus and the distribution of feature amounts of the population, and the response results input via the information input unit and transmitted to the health level determination apparatus.
- by making the determination based not only on the comparison of the transmitted feature amounts with the population's feature amount distribution but also on the responses returned via the information input unit for the interview items, a more appropriate determination can be made.
- the present invention can be used in a system that extracts a feature amount from a photographed image of an organ of a subject and determines the health level of the subject.
Abstract
Description
FIG. 1 is an explanatory diagram showing the schematic configuration of the health level determination system 50 of this embodiment. The health level determination system 50 includes an organ image capturing apparatus 1 and a terminal apparatus 51. The organ image capturing apparatus 1 photographs an organ of a living body and extracts (detects) the information required for diagnosing the health level (feature amounts, diagnostic items); it is configured, for example, as a multifunction mobile information terminal. The terminal apparatus 51 determines the health level of the living body based on the information acquired by the organ image capturing apparatus 1, and is configured as a personal computer or the like.
FIG. 2 is a perspective view showing the appearance of the organ image capturing apparatus 1 of this embodiment, and FIG. 3 is a block diagram showing its schematic configuration. The organ image capturing apparatus 1 includes an illumination unit 2, an imaging unit 3, a display unit 4, an operation unit 5, a communication unit 6, and an audio output unit 7. The illumination unit 2 is provided in a housing 21, and the components other than the illumination unit 2 (for example, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7) are provided in a housing 22. The housings 21 and 22 are coupled so as to be rotatable relative to each other, but rotation is not essential; one may be completely fixed to the other. All of the above components, including the illumination unit 2, may also be provided in a single housing.
FIG. 6 shows a photographed image of the tongue. In the following, when the band-shaped region extending vertically at the horizontal center of the tongue in the photographed image is divided into three vertically aligned regions, those regions are referred to as the upper region R1, the central region R2, and the lower region R3. These regions are defined with the sizes shown in FIG. 6 based on the horizontal width W and the vertical length H of the region enclosed by the tongue contour line detected by the image processing unit 15; the illustrated sizes are only an example and are not limiting.
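As a minimal sketch of the region division (the equal-thirds proportions are an assumption for illustration; the actual sizes are those defined in FIG. 6), the central vertical band of the tongue's bounding box can be split into R1, R2, and R3 like this:

```python
import numpy as np

def tongue_regions(img, x0, y0, W, H):
    """Split the central vertical band of the tongue bounding box
    (top-left corner (x0, y0), width W, height H) into three stacked
    regions: upper R1, central R2, lower R3. Proportions hypothetical."""
    band = img[y0:y0 + H, x0 + W // 3 : x0 + 2 * W // 3]  # central band
    h = band.shape[0] // 3
    return band[:h], band[h:2 * h], band[2 * h:]          # R1, R2, R3
```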
The feature extraction unit 16 approximates the tongue contour line obtained by the image processing unit 15 with an approximation curve, and detects the unevenness (smoothness) of the contour line based on the degree of correlation between the contour line and the approximation curve, thereby detecting tooth marks on the tongue. Since the approximation curve is a smooth curve without fine unevenness, the closer the contour line is to the approximation curve, the smoother it is and the fewer tooth marks it has. That is, the higher the correlation between the contour line and the approximation curve, the fewer the tooth marks; the lower the correlation, the more tooth marks there are.
R² = 1 - {Σ(yi - fi)² / Σ(yi - Y)²}
where
i : any value from j to k, where, on the xy plane, j is the x coordinate of one end of the contour line or approximation curve and k is the x coordinate of the other end;
yi : the y coordinate value at x coordinate i of a point on the contour line in the xy plane;
fi : the y coordinate value at x coordinate i of a point on the approximation curve in the xy plane;
Y : the average of yi over all points on the contour line.
Here, i, j, and k are all integers, with j < k and j ≤ i ≤ k. Σ(yi - fi)² denotes the sum of (yi - fi)² as i varies from j to k, and Σ(yi - Y)² denotes the sum of (yi - Y)² as i varies from j to k.
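The correlation R² above can be sketched as follows (an illustrative Python/NumPy implementation; the contour arrays are hypothetical):

```python
import numpy as np

def contour_r2(y, f):
    """R^2 between contour y-values (y) and approximation-curve y-values
    (f), sampled at the same x coordinates i = j..k."""
    y = np.asarray(y, dtype=float)
    f = np.asarray(f, dtype=float)
    ss_res = np.sum((y - f) ** 2)            # sum of (yi - fi)^2
    ss_tot = np.sum((y - np.mean(y)) ** 2)   # sum of (yi - Y)^2
    return 1.0 - ss_res / ss_tot

# A smooth contour that tracks its approximation closely gives R^2 near 1;
# a jagged, tooth-marked contour gives a lower R^2.
```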
y = 5×10⁻⁷·x⁴ + 6×10⁻⁶·x³ + 2×10⁻³·x² + 6.29×10⁻²·x + 21.213
where
i : any value from j to k, where, on the xy plane, j is the x coordinate of one end of the contour line or approximation curve and k is the x coordinate of the other end;
yi : the y coordinate value at x coordinate i of a point on the contour line in the xy plane;
fi : the y coordinate value at x coordinate i of a point on the approximation curve in the xy plane.
Here, i, j, and k are all integers, with j < k and j ≤ i ≤ k.
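A quartic approximation curve of the form shown above can be obtained from contour points by least-squares fitting; a sketch using NumPy's polyfit (the sample points are hypothetical, generated here from the example coefficients):

```python
import numpy as np

# Hypothetical contour points (x, y) along the tongue edge, generated
# from the example quartic above for demonstration purposes
x = np.linspace(0, 100, 50)
y = 5e-7 * x**4 + 6e-6 * x**3 + 2e-3 * x**2 + 6.29e-2 * x + 21.213

coeffs = np.polyfit(x, y, deg=4)   # degree-4 least-squares fit
f = np.polyval(coeffs, x)          # smooth approximation curve f_i
```

In practice the fitted values f would then be compared against the detected contour values y via R² or the sum of coordinate value differences.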
A = Σ|yi - fi|
where
i : any value from j to k, where, on the xy plane, j is the x coordinate of one end of the contour line or approximation curve and k is the x coordinate of the other end;
yi : the y coordinate value at x coordinate i of a point on the contour line in the xy plane;
fi : the y coordinate value at x coordinate i of a point on the approximation curve in the xy plane.
Here, i, j, and k are all integers, with j < k and j ≤ i ≤ k. Σ|yi - fi| denotes the sum of |yi - fi| as i varies from j to k (hereinafter also referred to as the "sum of coordinate value differences").
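The sum of coordinate value differences A can be sketched as (illustrative Python; the input arrays are hypothetical):

```python
import numpy as np

def coordinate_diff_sum(y, f):
    """A = sum over i of |yi - fi|: a larger A indicates a more uneven,
    tooth-marked contour relative to the smooth approximation curve."""
    return float(np.sum(np.abs(np.asarray(y) - np.asarray(f))))
```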
FIG. 9 shows a photographed image of the tongue and the tongue's cross-sectional shape. When the tongue is photographed, it is stuck out forward from the oral cavity. Since the imaging unit 3 photographs the upper-lip-side surface of the protruded tongue, the tongue is curved so that its upper-lip-side surface is convex toward the imaging unit 3 (see cross section C-C'). If necessary, the way the tongue should be stuck out may be specified in a specification or manual so as to guide the tongue to an appropriate photographing position.
FIG. 13 shows a photographed image of a tongue with cracks on its surface. When photographing, the tongue is stuck out forward from the oral cavity, and the imaging unit 3 photographs the upper-lip-side surface of the protruded tongue. Since cracks on the tongue surface generally occur frequently near the center of the tongue, in this embodiment the central portion of the tongue in the photographed image (a region including the vertical and horizontal center) is set as a detection region suitable for crack detection.
FIG. 15 shows the distribution of image data obtained when the surface of the tongue is photographed by the imaging unit 3 under illumination from the illumination unit 2: specifically, the distribution of the RGB image data of the photographed image along a horizontal line passing through approximately the vertical center of the tongue surface. The upper distribution is for a thin tongue, and the lower distribution is for a thick tongue. The solid line shows the distribution of the R image data, the dash-dot line the G image data, and the broken line the B image data.
In extracting feature amounts such as tongue shape, tooth marks, and cracks, the method described in Patent Document 1 may be used. Alternatively, as described in Patent Document 2, the photographed image of the tongue may be divided into a plurality of regions, and the RGB data may be read out for each region to extract the feature amounts.
Next, the terminal apparatus 51 will be described. FIG. 16 is a block diagram showing the schematic configuration of the terminal apparatus 51. The terminal apparatus 51 is a health level determination apparatus that determines the health level of the subject, and includes a storage unit 52, a determination unit 53, a communication unit 54, and a control unit 55.
FIG. 20 is a flowchart showing the flow of processing in the health level determination system 50 of this embodiment. In the health level determination system 50, the following first to third steps are performed in order.
(1-1. Collecting tongue images of healthy persons)
First, tongue images of young, healthy persons (other persons) with no constitutional bias are collected with a photographing apparatus equipped with a flash (for example, a digital camera). These healthy persons belong to the population P1 in FIG. 18. The appearance of the tongue is said to become almost the same as an adult's in the teens. Here, images of healthy persons in their early teens, in whom constitutional bias due to lifestyle habits (such as lifestyle-related diseases) has not yet appeared, are collected as samples. A guideline for the number of samples is about 10 to 100. Obtaining the Mahalanobis distance described above requires at least as many samples as the number of feature amounts (because an inverse matrix must be computed). The collected image data are sent to the terminal apparatus 51 and stored in the storage unit 52.
The determination unit 53 of the terminal apparatus 51 extracts, from the other persons' photographed image data stored in the storage unit 52, the other persons' feature amounts used for determining the health level, by the same extraction method as the feature extraction unit 16. In this way, tongue color, coating color, coating shape (thickness), tooth marks, moisture (glossiness), cracks, tongue shape (thickness), and the like are extracted as the other persons' feature amounts. Feature amounts used in diagnostic methods of Oriental medicine may also be extracted.
Next, the determination unit 53 sets the origin O1 and the unit amount U1 of the distribution whose population is the extracted feature amounts of the other persons (for two-dimensional feature amounts, a distribution such as that in FIG. 17). The origin O1 is the point indicating the average of the other persons' feature amounts, and the unit amount U1 indicates the unit of distance from the origin O1. These settings can be made by multivariate analysis using the Mahalanobis distance described above; other multivariate analysis methods, such as those using a linear function or regression analysis, may also be used.
The imaging unit 3 of the organ image capturing apparatus 1 collects the subject's daily tongue images in the same manner as above. A guideline for the number of samples is about twice the above (about 20 to 200). The feature extraction unit 16 extracts feature amounts such as tongue color from the subject's tongue image data by the method described above. The images photographed by the imaging unit 3 and the feature amount information extracted by the feature extraction unit 16 are transmitted to the terminal apparatus 51 and stored in the storage unit 52.
The determination unit 53 of the terminal apparatus 51 compares the subject's feature amounts with the distribution of the population (the distribution of the other persons' feature amounts) to obtain the difference from the population (the Mahalanobis distance), and obtains the change of this Mahalanobis distance over time as the change of an evaluation value of the subject's physical condition (the first evaluation value). FIG. 21 shows the change of this evaluation value. The difference between the subject's feature amounts and the population corresponds to distance (1) in FIG. 21. The determination unit 53 obtains the average of distance (1) over the evaluation period (distance M) and determines this distance M as the subject's constitutional bias.
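This step, computing the daily first evaluation values and averaging them into distance M, can be sketched as follows (illustrative only; function and variable names are assumptions):

```python
import numpy as np

def constitutional_bias(daily_features, others_population):
    """First evaluation value per day (Mahalanobis distance of the
    subject's daily feature vector from the others' distribution),
    and its average over the evaluation period (distance M)."""
    mu = others_population.mean(axis=0)                     # origin O1
    inv_cov = np.linalg.inv(np.cov(others_population, rowvar=False))
    d = np.array([np.sqrt((x - mu) @ inv_cov @ (x - mu))
                  for x in daily_features])                 # distance (1)
    return d, d.mean()                                      # series, M
```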
(2-1. Collecting tongue images of the subject when healthy)
FIG. 22 shows, in enlarged form, the change of the evaluation value of the subject's physical condition during the evaluation period of FIG. 21. The determination unit 53 uses the above distance M as a threshold and extracts the images of the dates (photographing dates) within the evaluation period on which the evaluation value is equal to or less than the threshold. Instead of the distance M, a value such as (average - standard deviation) may be used as the threshold to extract the images whose evaluation value is equal to or less than it (that is, images of days with a higher health level may be extracted). In this case, it is desirable to increase the number of samples correspondingly.
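The selection of "healthy day" samples described above can be sketched as follows (the daily score values are hypothetical):

```python
import numpy as np

# Hypothetical first evaluation values (Mahalanobis distances), one per day
scores = np.array([1.2, 0.8, 1.5, 0.7, 1.1, 0.9, 2.0])

# Threshold = distance M (the average over the evaluation period)
threshold = scores.mean()
healthy_days = np.where(scores <= threshold)[0]

# Stricter alternative: (average - standard deviation); fewer days pass,
# so more samples should be collected beforehand
strict = np.where(scores <= scores.mean() - scores.std())[0]
```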
Next, the determination unit 53 extracts the subject's feature amounts from the subject's own photographed images of all dates on which the evaluation value is equal to or less than the threshold, and sets the origin O2 and the unit amount U2 of the distribution whose population is the extracted feature amounts of the subject. The origin O2 is the point indicating the average of the subject's feature amounts, and the unit amount U2 indicates the unit of distance from the origin O2. These settings can be made by multivariate analysis using the Mahalanobis distance described above, or by multivariate analysis using a linear function or regression analysis.
The determination unit 53 compares the subject's feature amounts at the time of determination with the distribution of the population (the distribution of the subject's feature amounts) to obtain the difference from the population (the Mahalanobis distance), and obtains the change of this Mahalanobis distance over time as the change of an evaluation value of the subject's physical condition (the second evaluation value). FIG. 23 shows the change of this evaluation value. The difference between the subject's feature amounts and the population corresponds to distance (2) in FIG. 23. The determination unit 53 determines the change of distance (2) over the evaluation period as the change of the subject's physical condition. Because the population is the distribution of the subject's own feature amounts, there is no baseline elevation due to constitutional bias, and daily changes in physical condition can be measured with good sensitivity.
(3-1. Updating the data)
FIG. 24 shows a medium-term change of the evaluation value (first evaluation value) of the subject's physical condition. When one evaluation period (for example, evaluation period 1) ends, the process moves to the next evaluation period (for example, evaluation period 2). At this time, the population used for determining changes in physical condition, that is, the distribution of the subject's feature amounts whose first evaluation value is equal to or less than the threshold, is updated at fixed intervals. In the example of FIG. 24, population P2 (the distribution of the subject's feature amounts whose first evaluation value is equal to or less than distance M) is used in evaluation period 1, while population P3 (the distribution of the subject's feature amounts whose first evaluation value is equal to or less than distance N) is used in evaluation period 2. The distance N is the average, in evaluation period 2, of the distance (Mahalanobis distance) of the first evaluation value from the origin O1. The population may be updated either by replacing all the data at once, or by adding new data daily and deleting old data; FIG. 24 shows the former example.
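Either update strategy, replacing all data at once or sliding day by day, can be sketched as follows (an illustrative container, not the patent's implementation):

```python
from collections import deque

class PopulationWindow:
    """Holds the healthy-day feature samples used as the population.

    Supports wholesale replacement at the end of an evaluation period,
    or a daily add/drop (sliding window) via deque's maxlen."""

    def __init__(self, samples, maxlen=None):
        self.samples = deque(samples, maxlen=maxlen)

    def replace_all(self, new_samples):
        """Swap in a whole new evaluation period's samples at once."""
        self.samples = deque(new_samples, maxlen=self.samples.maxlen)

    def add_daily(self, sample):
        """Append today's sample; the oldest falls out when full."""
        self.samples.append(sample)
```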
The information on the constitutional bias and the change in physical condition determined in the evaluation period (the value of distance M in FIG. 22 or distance N in FIG. 24, and the value of distance (2) in FIG. 23) is transmitted from the terminal apparatus 51 to the organ image capturing apparatus 1 and displayed on the display unit 4.
The terminal apparatus 51 may graph the daily variation of the distance and present it to the subject (via the organ image capturing apparatus 1). When the subject's health level has declined, as indicated by an increase in distance (2), or when distance M indicating the constitutional bias is large, a method for improvement may also be presented. The presented improvement method may be based on either Oriental or Western medicine.
A method of diagnosing the state of the body by a Kampo (traditional Japanese herbal medicine) interview is known (see, for example, Katsutoshi Terasawa, Shorei kara Manabu Wakan Shinryogaku (Japanese-Oriental Medicine Learned from Case Studies), 2nd ed., Igaku-Shoin, 1998). These interview items may be added to the image feature amounts and used as indices of the health level.
1. The body feels sluggish
2. Lack of energy
3. Tires easily
4. Daytime drowsiness
5. Loss of appetite
6. Catches colds easily
7. Easily startled
8. Lack of force in the gaze and voice
9. Pale-red, swollen tongue
10. Weak pulse
11. Weak abdominal tension
12. Atony of the internal organs
13. Weak lower abdomen
14. Tendency toward diarrhea
In the health level determination system 50 of this embodiment, a service may be constructed in which the determination results and improvement methods are provided to the subject for a fee. In this case, the price can be differentiated according to the content provided. For example, when the provided content has three patterns, (a) only changes in the image feature amounts, (b) changes in physical condition in addition to (a), and (c) changes in constitution in addition to (b), the price may be set higher in the order (a), (b), (c). The price may also be differentiated according to the time from photographing the image to returning the health level determination result.
The above describes the case where the feature amounts are multidimensional (multivariate), but the feature amounts may be one-dimensional (univariate). FIG. 25 shows a frequency distribution for a univariate feature amount (for example, tongue color). The value on the vertical axis indicates the tongue color image data (for example, R/(R+G+B)), and the origin corresponds to the average tongue color. By making the origin of the distribution in FIG. 25 correspond to the origins in FIGS. 22 and 23, and ±3σ (σ: standard deviation) in FIG. 25 correspond to the unit amounts in FIGS. 22 and 23, the constitutional bias and changes in physical condition can be determined.
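In the univariate case, the comparison reduces to a normalized deviation from the mean; a sketch (the tongue-color ratio samples are hypothetical):

```python
import numpy as np

# Hypothetical daily tongue-color values, e.g. R / (R + G + B)
values = np.array([0.42, 0.44, 0.41, 0.43, 0.45, 0.40, 0.44])

origin = values.mean()     # corresponds to the distribution's origin
unit = 3 * values.std()    # +/- 3 sigma as the unit amount

def evaluation(v):
    """Deviation of one day's value from the origin, in units of 3 sigma;
    the univariate counterpart of the Mahalanobis distance."""
    return abs(v - origin) / unit
```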
As a comparison target when determining the constitutional bias, a distribution of feature amounts extracted from the following images may also be used.
3 Imaging unit
4 Display unit
5 Operation unit (information input unit)
6 Communication unit (transmission unit)
16 Feature extraction unit
50 Health level determination system
51 Terminal apparatus (health level determination apparatus)
53 Determination unit
54 Communication unit (feature amount input unit, receiving unit)
Claims (14)
- A health level determination apparatus that determines a health level of a subject, comprising:
a feature amount input unit to which feature amounts extracted from a photographed image of an organ of the subject and used for determining the health level are input; and
a determination unit that, by comparing the subject's feature amounts input via the feature amount input unit with a distribution of feature amounts of a population, determines a constitutional bias of the subject corresponding to individual differences from other persons and a change in the subject's physical condition,
wherein the determination unit uses, when determining the subject's constitutional bias, a distribution of other persons' feature amounts created for other persons as the distribution of feature amounts of the population, and uses, when determining the change in the subject's physical condition, a distribution of the subject's own feature amounts collected within an evaluation period for evaluating the physical condition as the distribution of feature amounts of the population.
- The health level determination apparatus according to claim 1, wherein the determination unit obtains, as a first evaluation value, a distance of the subject's feature amounts input via the feature amount input unit from an origin of the distribution of the other persons' feature amounts, and determines an average of the first evaluation value, which changes over time in the evaluation period, as the subject's constitutional bias.
- The health level determination apparatus according to claim 2, wherein the determination unit collects feature amounts of the dates, within the evaluation period, on which the first evaluation value changing over time is equal to or less than a threshold, to create a distribution of the subject's feature amounts, obtains, as a second evaluation value, a distance of the subject's feature amounts input via the feature amount input unit from an origin of said distribution, and determines a change of the second evaluation value, which changes over time in the evaluation period, as the change in the subject's physical condition.
- The health level determination apparatus according to any one of claims 1 to 3, wherein the determination unit performs the comparison between the subject's feature amounts input via the feature amount input unit and the distribution of feature amounts of the population using a Mahalanobis distance.
- The health level determination apparatus according to any one of claims 1 to 4, wherein the distribution of the other persons' feature amounts is a distribution of feature amounts of other persons who are 10 to 19 years old.
- The health level determination apparatus according to any one of claims 1 to 5, wherein the distribution of the other persons' feature amounts is a distribution of feature amounts of healthy other persons who are 10 to 19 years old.
- The health level determination apparatus according to any one of claims 1 to 6, wherein the distribution of the subject's feature amounts is a distribution of feature amounts of a subject who is 20 years old or older.
- The health level determination apparatus according to any one of claims 1 to 7, wherein the distribution of the subject's feature amounts is a distribution of feature amounts of a healthy subject who is 20 years old or older.
- The health level determination apparatus according to any one of claims 1 to 8, wherein the organ is a tongue.
- The health level determination apparatus according to any one of claims 1 to 9, wherein the subject's feature amounts input via the feature amount input unit include information on at least one of a color and a shape of the organ.
- The health level determination apparatus according to any one of claims 1 to 10, wherein the feature amount input unit comprises a receiving unit that receives information on the subject's feature amounts transmitted from outside.
- A health level determination system in which the health level determination apparatus according to any one of claims 1 to 11 and an organ image capturing apparatus are connected via a communication line so as to be able to communicate with each other,
wherein the organ image capturing apparatus comprises:
an imaging unit that photographs an organ of the subject;
a feature extraction unit that extracts, from the image photographed by the imaging unit, feature amounts used for determining the health level; and
a transmission unit that transmits information on the subject's feature amounts extracted by the feature extraction unit to the health level determination apparatus.
- The health level determination system according to claim 12, wherein the organ image capturing apparatus comprises a display unit that displays information on the constitutional bias and the change in physical condition determined by the determination unit of the health level determination apparatus.
- The health level determination system according to claim 12, wherein the organ image capturing apparatus further comprises:
a display unit that displays items necessary for a medical interview; and
an information input unit for responding to the items displayed on the display unit,
and wherein the determination unit of the health level determination apparatus determines the subject's constitutional bias and the change in the subject's physical condition based on a comparison between the subject's feature amounts included in the information transmitted from the organ image capturing apparatus and the distribution of feature amounts of the population, and on the response results input via the information input unit and transmitted to the health level determination apparatus.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15785990.1A EP3139339A1 (en) | 2014-04-30 | 2015-02-20 | Health degree assessing device and health degree assessing system |
JP2015530180A JP5800119B1 (ja) | 2014-04-30 | 2015-02-20 | 健康度判定装置および健康度判定システム |
CN201580022577.5A CN106462926A (zh) | 2014-04-30 | 2015-02-20 | 健康度判定装置以及健康度判定系统 |
US15/307,737 US20170112429A1 (en) | 2014-04-30 | 2015-02-20 | Level-of-health determining device and level-of-health determining system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014093475 | 2014-04-30 | ||
JP2014-093475 | 2014-04-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015166692A1 true WO2015166692A1 (ja) | 2015-11-05 |
Family
ID=54358433
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/054703 WO2015166692A1 (ja) | 2014-04-30 | 2015-02-20 | 健康度判定装置および健康度判定システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170112429A1 (ja) |
EP (1) | EP3139339A1 (ja) |
JP (1) | JP5800119B1 (ja) |
CN (1) | CN106462926A (ja) |
WO (1) | WO2015166692A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI614624B (zh) * | 2017-04-24 | 2018-02-11 | 太豪生醫股份有限公司 | 雲端醫療影像分析系統與方法 |
US20190014996A1 (en) * | 2017-07-14 | 2019-01-17 | Hai Rong Qian | Smart medical diagnosis system |
JP6512648B1 (ja) * | 2017-11-15 | 2019-05-15 | 前田商事株式会社 | ソフトウェア、健康状態判定装置及び健康状態判定方法 |
CN113470820A (zh) * | 2021-08-25 | 2021-10-01 | 郑州大学 | 一种艾灸机器人智能控制方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001314376A (ja) * | 2000-05-11 | 2001-11-13 | Res Dev Corp Of Japan | 舌診用システムと問診用システム、及び舌診問診教育支援装置 |
JP2004113581A (ja) * | 2002-09-27 | 2004-04-15 | Asahi:Kk | 健康管理装置 |
JP2006149679A (ja) * | 2004-11-29 | 2006-06-15 | Konica Minolta Holdings Inc | 健康度判定方法、装置、及びプログラム |
US20060247502A1 (en) * | 2005-04-28 | 2006-11-02 | Eastman Kodak Company | Method for diagnosing disease from tongue image |
JP2008054835A (ja) * | 2006-08-30 | 2008-03-13 | Toshiba Corp | 健康状態診断システム |
-
2015
- 2015-02-20 JP JP2015530180A patent/JP5800119B1/ja not_active Expired - Fee Related
- 2015-02-20 US US15/307,737 patent/US20170112429A1/en not_active Abandoned
- 2015-02-20 CN CN201580022577.5A patent/CN106462926A/zh active Pending
- 2015-02-20 EP EP15785990.1A patent/EP3139339A1/en not_active Withdrawn
- 2015-02-20 WO PCT/JP2015/054703 patent/WO2015166692A1/ja active Application Filing
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019208805A (ja) * | 2018-06-04 | 2019-12-12 | 麻衣子 下郷 | 生命予後予測方法 |
JP2022052910A (ja) * | 2020-09-24 | 2022-04-05 | 株式会社東芝 | 体調評価システム、サーバ、プログラムおよび体調評価サービス提供方法 |
JP7183232B2 (ja) | 2020-09-24 | 2022-12-05 | 株式会社東芝 | 体調評価システム、サーバ、プログラムおよび体調評価サービス提供方法 |
Also Published As
Publication number | Publication date |
---|---|
EP3139339A1 (en) | 2017-03-08 |
JP5800119B1 (ja) | 2015-10-28 |
CN106462926A (zh) | 2017-02-22 |
US20170112429A1 (en) | 2017-04-27 |
JPWO2015166692A1 (ja) | 2017-04-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5800119B1 (ja) | 健康度判定装置および健康度判定システム | |
WO2015114950A1 (ja) | 器官画像撮影装置 | |
WO2016076059A1 (ja) | 器官画像撮影装置およびプログラム | |
KR102317478B1 (ko) | 상처의 평가 및 관리를 위한 방법 및 시스템 | |
US20140313303A1 (en) | Longitudinal dermoscopic study employing smartphone-based image registration | |
US20140378810A1 (en) | Physiologic data acquisition and analysis | |
US20120078113A1 (en) | Convergent parameter instrument | |
CN104244802B (zh) | 图像处理装置、图像处理方法以及图像处理程序 | |
US20080260218A1 (en) | Medical Imaging Method and System | |
US11069057B2 (en) | Skin diagnostic device and skin diagnostic method | |
WO2016067892A1 (ja) | 健康度出力装置、健康度出力システムおよびプログラム | |
JP4649965B2 (ja) | 健康度判定装置、及びプログラム | |
Cassidy et al. | Artificial intelligence for automated detection of diabetic foot ulcers: A real-world proof-of-concept clinical evaluation | |
KR100926769B1 (ko) | 혀 영상을 이용한 후태 박태 판별 방법 | |
WO2015049936A1 (ja) | 器官画像撮影装置 | |
WO2015068494A1 (ja) | 器官画像撮影装置 | |
JP2005094185A (ja) | 画像処理システム、画像処理装置、および撮像制御方法 | |
WO2015060070A1 (ja) | 器官画像撮影装置 | |
WO2019092723A1 (en) | System and method for determining pathological status of a laryngopharyngeal area in a patient | |
JP2016198140A (ja) | 器官画像撮影装置 | |
US20230031995A1 (en) | Motion-Based Cardiopulmonary Function Index Measuring Device, and Senescence Degree Prediction Apparatus and Method | |
KR20090055171A (ko) | 스테레오 영상을 이용한 안면 진단 방법 | |
JP2022000686A (ja) | 画像表示システム、及び画像表示方法 | |
US20220051399A1 (en) | Method and system for determining well-being indicators | |
JP2016198141A (ja) | 器官画像撮影装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2015530180 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15785990 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2015785990 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015785990 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15307737 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |