WO2015114950A1 - Organ image photographing apparatus - Google Patents
Organ image photographing apparatus
- Publication number
- WO2015114950A1 (PCT/JP2014/082393)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- ratio
- tongue
- image
- image data
- calculation
- Prior art date
Classifications
- All of the following classifications fall under section A (HUMAN NECESSITIES), class A61 (MEDICAL OR VETERINARY SCIENCE; HYGIENE), subclass A61B (DIAGNOSIS; SURGERY; IDENTIFICATION):
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/4552—Evaluating soft tissue within the mouth, e.g. gums or tongue
- A61B5/004—Features or image-related aspects of imaging apparatus adapted for image acquisition of a particular organ or body part
- A61B5/015—By temperature mapping of body part
- A61B5/0261—Measuring blood flow using optical means, e.g. infrared light
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/4255—Intestines, colon or appendix
- A61B5/4519—Muscles
- A61B5/682—Mouth, e.g. oral cavity; tongue; lips; teeth
- A61B5/6898—Portable consumer electronic devices, e.g. music players, telephones, tablet computers
- A61B5/7278—Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
- A61B5/7282—Event detection, e.g. detecting unique waveforms indicative of a medical condition
- A61B5/743—Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
- A61B2562/0233—Special features of optical sensors or probes classified in A61B5/00
- A61B2576/02—Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
- A61B5/1032—Determining colour for diagnostic purposes
Definitions
- The present invention relates to an organ image photographing apparatus that photographs the tongue, which is a living organ, and extracts information necessary for diagnosing the degree of health.
- Patent Document 1 proposes a system that comprehensively performs health degree determination using the Mahalanobis distance.
- Japanese Patent No. 4487535 (see claim 1, paragraphs [0006], [0018], [0096], etc.)
- Japanese Patent No. 4649965 (see claim 1, paragraph [0008], etc.)
- However, Patent Documents 1 and 2 derive only a comprehensive (single) health level from the photographed tongue image, and do not provide information on the individual indicators related to the degree of health (for example, blood pressure and body temperature). As a result, users cannot grasp their physical condition and symptoms in detail, and it is not easy for them to decide on a specific response, such as purchasing an over-the-counter drug or consulting a medical institution suited to their physical condition and symptoms.
- The present invention has been made to solve the above problems, and its object is to provide an organ image photographing apparatus that makes it easy for a user to manage daily changes in physical condition, to compare current information with past history and with the information of others, and to decide on a specific response that matches the physical condition and symptoms.
- An organ image photographing apparatus according to the present invention includes an imaging unit that photographs the tongue of a living body, and a calculation unit that, by computation using chromaticity values obtained from the data of the photographed tongue image acquired by the imaging unit, calculates and outputs a numerical value indicating the state of at least one of a plurality of indicators related to the degree of health.
- Based on the numerical values calculated for at least one of the plurality of indicators related to health, it becomes easy for the user to manage daily changes in physical condition, to compare current values with past history and with the data of others, and to decide on a specific response that matches the physical condition and symptoms.
- An explanatory diagram showing the positional relationship of the illumination unit and the imaging unit of the organ image photographing apparatus with respect to the imaging target.
- An explanatory diagram showing the photographed image of the tongue obtained by the imaging unit, the edge extraction filter, and the contour line of the tongue extracted from the photographed image using the edge extraction filter.
- In this description, a numerical range includes the values of its lower limit A and its upper limit B.
- FIG. 1 is a perspective view showing an external appearance of an organ image photographing apparatus 1 of the present embodiment
- FIG. 2 is a block diagram showing a schematic configuration of the organ image photographing apparatus 1.
- the organ image capturing apparatus 1 captures a tongue, which is a living organ, and extracts information necessary for diagnosis of health.
- the organ image photographing apparatus 1 includes an illumination unit 2, an imaging unit 3, a display unit 4, an operation unit 5, a communication unit 6, and an audio output unit 7.
- The illumination unit 2 is provided in a housing 21, and the components other than the illumination unit 2 (for example, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7) are provided in a housing 22.
- The housings 21 and 22 are connected so that they can rotate relative to each other; however, such rotation is not strictly necessary, and one housing may be completely fixed to the other.
- Alternatively, the illumination unit 2 and the other components may be provided in a single housing.
- the organ image photographing device 1 may be composed of a multifunctional portable information terminal.
- the illumination unit 2 is composed of an illuminator that illuminates a subject to be photographed from above.
- As the light source, one that emits daylight-colored light, such as a xenon lamp, is used.
- The required brightness of the light source depends on the sensitivity of the imaging unit 3 and the distance to the imaging target; as an example, a brightness giving an illuminance of 1000 to 10000 lx at the imaging target can be considered.
- The illumination unit 2 includes a lighting circuit and a dimming circuit in addition to the above light source; turning the light on and off and dimming are controlled by commands from the illumination control unit 11.
- the imaging unit 3 captures an image of a living organ (here, tongue) under illumination by the illumination unit 2 and acquires an image, and includes an imaging lens and an area sensor (imaging element).
- the aperture (brightness of the lens), shutter speed, and focal length of the imaging lens are set so that the entire range to be photographed is in focus.
- As an example: F-number 16; shutter speed 1/120 second; focal length 20 mm.
- The area sensor is composed of an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor, and its sensitivity and resolution are set so that the color and shape of the subject can be detected sufficiently.
- As an example: sensitivity 60 dB; resolution 10 million pixels.
- Imaging by the imaging unit 3 is controlled by the imaging control unit 12.
- The imaging unit 3 includes a focus mechanism (not shown), a diaphragm mechanism, a drive circuit, an A/D conversion circuit, and the like; focusing, aperture control, A/D conversion, and so on are controlled by the imaging control unit 12.
- As the photographed image data, the imaging unit 3 acquires, for example, 8-bit data (0 to 255) for each of red (R), green (G), and blue (B).
- FIG. 3 is an explanatory diagram showing a positional relationship between the illumination unit 2 and the imaging unit 3 with respect to an imaging target (tongue or face).
- the imaging unit 3 is arranged to face the subject to be photographed.
- the illumination unit 2 is arranged so as to illuminate the imaging target at an angle A of, for example, 0 ° to 45 ° with respect to the imaging optical axis X of the imaging unit 3 passing through the imaging target.
- the imaging optical axis X refers to the optical axis of the imaging lens that the imaging unit 3 has.
- the preferable range of the angle A at the time of illumination is 15 ° to 30 °.
- the display unit 4 includes a liquid crystal panel (not shown), a backlight, a lighting circuit, and a control circuit.
- The display unit 4 displays the image acquired by photographing with the imaging unit 3 and the information calculated and output by the calculation unit 16 described later.
- The display unit 4 can also display information acquired from the outside via the communication unit 6 (for example, the result of a diagnosis made by transmitting information to an external medical institution). Display of the various types of information on the display unit 4 is controlled by the display control unit 13.
- the operation unit 5 is an input unit for instructing imaging by the imaging unit 3, and includes an OK button (imaging execution button) 5a and a CANCEL button 5b.
- the display unit 4 and the operation unit 5 are configured by a common touch panel display device 31, and the display area of the display unit 4 and the display area of the operation unit 5 in the touch panel display device 31 are separated.
- the display of the operation unit 5 on the touch panel display device 31 is controlled by the operation control unit 14.
- the operation unit 5 may be configured by an input unit other than the touch panel display device 31 (the operation unit 5 may be provided at a position outside the display area of the touch panel display device 31).
- The communication unit 6 is an interface for transmitting the image data acquired by the imaging unit 3 and the information calculated and output by the calculation unit 16 described later to the outside via a communication line (wired or wireless), and for receiving information from the outside. Transmission and reception of information by the communication unit 6 are controlled by the communication control unit 18.
- the audio output unit 7 outputs various types of information as audio, and is composed of, for example, a speaker.
- the information output by voice includes the result (numerical value for each index) calculated by the calculation unit 16.
- the sound output in the sound output unit 7 is controlled by the sound output control unit 19.
- The organ image photographing apparatus 1 further includes an illumination control unit 11, an imaging control unit 12, a display control unit 13, an operation control unit 14, an image processing unit 15, a calculation unit 16, a storage unit 17, a communication control unit 18, an audio output control unit 19, and an overall control unit 20 that controls these units.
- The illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the communication control unit 18, and the audio output control unit 19 control the illumination unit 2, the imaging unit 3, the display unit 4, the operation unit 5, the communication unit 6, and the audio output unit 7, respectively.
- the overall control unit 20 is composed of, for example, a CPU (Central Processing Unit).
- The illumination control unit 11, the imaging control unit 12, the display control unit 13, the operation control unit 14, the communication control unit 18, the audio output control unit 19, and the overall control unit 20 may be configured integrally (for example, as a single CPU).
- The image processing unit 15 has a function of extracting the contour of the organ from the image acquired by the imaging unit 3. The contour of the organ can be extracted by detecting the luminance edges of the photographed image (portions where the brightness changes abruptly). Luminance edges can be extracted using, for example, the edge extraction filter shown in the figure.
- The edge extraction filter weights the pixels in the vicinity of the pixel of interest when performing first-order differentiation (that is, when obtaining the difference in image data between adjacent pixels).
- Using such an edge extraction filter, for example, the difference in G image data is computed between the pixel of interest and its neighboring pixels, and pixels whose difference value exceeds a predetermined threshold are extracted as luminance edge pixels. Since a luminance difference due to shadow exists around the tongue, the contour line of the tongue can be extracted by extracting these luminance edge pixels.
- Although G image data, which has the greatest influence on luminance, is used in this calculation, R or B image data may be used instead.
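As an illustration of the contour-extraction step above, the following sketch applies a Sobel-style weighted first-difference filter to the G channel and thresholds the result. The patent's actual filter (shown in its FIG. 4) is not reproduced here, so the kernel, threshold, and test image are assumptions.

```python
import numpy as np

# Sobel-style kernels: first-order differences that weight the neighbours
# of the pixel of interest (an assumed form of the edge extraction filter).
KX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=float)
KY = KX.T

def luminance_edges(g_channel, threshold):
    """Return a boolean mask of luminance-edge pixels in the G channel."""
    g = g_channel.astype(float)
    h, w = g.shape
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    # Naive 2-D correlation over the 3x3 neighbourhood (borders left zero).
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            gx[1:h-1, 1:w-1] += KX[dy+1, dx+1] * g[1+dy:h-1+dy, 1+dx:w-1+dx]
            gy[1:h-1, 1:w-1] += KY[dy+1, dx+1] * g[1+dy:h-1+dy, 1+dx:w-1+dx]
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold

# Synthetic G channel: dark background with a bright square (a crude "tongue").
img = np.zeros((20, 20))
img[5:15, 5:15] = 200.0
edges = luminance_edges(img, threshold=100.0)
```

The border of the square, where the luminance difference from the shadowed surroundings is large, is marked as edge pixels, while the flat interior is not.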
- The speckled white portion found near the center of the tongue is called the moss (tongue coating), and its color is called the moss color.
- The color of the red portion of the tongue other than the moss is called the tongue color.
- the storage unit 17 stores image data acquired by the imaging unit 3, data acquired by the image processing unit 15, data calculated by the calculation unit 16, information received from the outside, and the like.
- The calculation unit 16 obtains chromaticity values from the data of the photographed tongue image acquired by the imaging unit 3 and, by computation using the obtained chromaticity values, calculates and outputs a numerical value indicating the state of at least one of a plurality of indicators (determination items) related to health.
- The above indicators include, for example, systolic/diastolic blood pressure, constipation state, diarrhea state, state of fatigue (the presence of general fatigue), blood circulation state, body temperature, bronchial state, and muscle fatigue state.
- the numerical value calculated by the calculation unit 16 is displayed on the display unit 4 or output from the voice output unit 7 by voice. Therefore, it can be said that the display unit 4 and the audio output unit 7 constitute an output unit that displays or outputs the above numerical values calculated and output by the calculation unit 16. Further, the numerical data calculated by the calculation unit 16 may be output to an external terminal device via the communication unit 6.
- The chromaticity value used for calculating the numerical value is obtained in whichever of three regions (the upper, middle, and lower parts of the horizontal center of the photographed tongue image) is optimal for the index in question.
- the three areas will be described first.
- FIG. 5 shows a positional relationship between the contour line Q of the tongue and the three regions R1, R2, and R3 for calculating chromaticity values.
- The regions R1, R2, and R3 correspond to the upper, middle, and lower parts, respectively, of the horizontal center of the photographed tongue image.
- The size and position of each region R1, R2, R3 are set as shown in the figure, in terms of the vertical length H and horizontal width W of the contour line Q extracted by the image processing unit 15. Note that the sizes of the regions R1, R2, and R3 shown are merely examples, and the invention is not limited to them. The three regions R1, R2, and R3 are chosen as the regions for computing chromaticity values for the following reasons.
- Moss is keratinized papillary tissue of the lingual mucosa, and it is present from the upper to the middle part of a band-like area at the horizontal center of the tongue, especially in the upper region. A change in moss color therefore appears as a change in the chromaticity of the upper region R1: the moss color shifts between white and brown depending on the amount of keratinized tissue, so it is mainly the G component of the RGB image data that increases or decreases in region R1.
- The tongue color is generally diagnosed at the left and right edges, where there is no moss, or at the lower part of the tongue.
- However, the left and right edge portions of the tongue receive the illumination light unevenly because of surface irregularities, so shading easily occurs there. For this reason, it is desirable to determine the tongue color using the image data of the lower region R3. Since the tongue color reflects the color of blood, it is mainly the R or B component of the RGB image data that increases or decreases in region R3; that is, when the tongue color changes, the chromaticity of the lower region R3 changes.
- The thickness of the moss can be detected from its difference from the underlying tongue color. At the horizontal center of the photographed tongue image, the moss is thick when the color of the middle part is close to that of the upper part, and thin when it is close to that of the lower part. A change in moss thickness therefore appears as a change in the chromaticity of the middle region R2: as the moss thickens, the red of the tongue shifts toward the white of the moss, so it is mainly the R or G component of the RGB image data that increases or decreases in region R2.
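The layout of the three regions can be sketched as follows. The fractional sizes and positions used here are illustrative assumptions, not the actual proportions of the patent's FIG. 5.

```python
def tongue_regions(contour_top, contour_left, H, W, frac_w=0.25, frac_h=0.2):
    """Return (row, col, height, width) boxes for the upper (R1),
    middle (R2), and lower (R3) regions, horizontally centred on the
    tongue contour. H and W are the vertical length and width of the
    extracted contour Q; frac_w and frac_h are illustrative fractions."""
    rw, rh = int(W * frac_w), int(H * frac_h)
    col = contour_left + (W - rw) // 2           # horizontal centre of the tongue
    centres = {"R1": 0.2, "R2": 0.5, "R3": 0.8}  # upper / middle / lower
    return {name: (contour_top + int(H * f) - rh // 2, col, rh, rw)
            for name, f in centres.items()}

boxes = tongue_regions(contour_top=10, contour_left=20, H=100, W=80)
# The three regions share one horizontally centred column band and are
# stacked top to bottom along the tongue contour.
```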
- In this way, the diagnostic method of Oriental medicine (tongue diagnosis), which judges the degree of health from the color and shape of the tongue and moss, can be reproduced from the chromaticity values.
- As the chromaticity values, the ratio of the R image data (R ratio), the ratio of the G image data (G ratio), or the ratio of the B image data (B ratio) to reference data consisting of the sum of the RGB image data is used in each of the regions R1 to R3.
- The R ratio in each of the regions R1 to R3 can be taken as the average of the R ratios of the individual pixels in that region; the G ratio and the B ratio are averaged in the same way.
- The chromaticity values described above may be quantified using color systems other than RGB (for example, Yxy or Lab); in any case, the Oriental medicine diagnostic method (tongue diagnosis) can be reproduced.
- The R ratio, G ratio, and B ratio referred to below are the ratios of the R, G, and B image data to the sum of the RGB image data: R/(R+G+B), G/(R+G+B), and B/(R+G+B), respectively.
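The per-region ratio computation can be sketched as follows; the region box and pixel values are hypothetical.

```python
import numpy as np

def region_ratios(rgb, box):
    """Mean R, G, B ratios (each channel / (R+G+B)) over one region.

    rgb: H x W x 3 array of RGB image data (0 to 255).
    box: (row, col, height, width) of the region.
    """
    r0, c0, h, w = box
    patch = rgb[r0:r0 + h, c0:c0 + w].astype(float)
    total = patch.sum(axis=2)
    total[total == 0] = 1.0                # guard against division by zero
    ratios = patch / total[..., None]      # per-pixel R/G/B ratios
    return ratios.mean(axis=(0, 1))        # region averages

# Hypothetical uniform patch: R=150, G=60, B=90 everywhere,
# so R/(R+G+B) = 0.5, G/(R+G+B) = 0.2, B/(R+G+B) = 0.3.
img = np.zeros((50, 50, 3))
img[...] = (150.0, 60.0, 90.0)
r, g, b = region_ratios(img, (10, 10, 20, 20))
```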
- The inventor of the present application investigated the relationship between feature amounts extracted from photographed tongue images and examination items related to the degree of health.
- The extracted feature amounts were the R ratio, G ratio, and B ratio in each of the three regions R1 to R3 of the photographed tongue image (nine in total).
- The examination items totaled 35: 30 items concerning living conditions, physical condition, and the like, and 5 items such as body temperature, pulse, and blood pressure.
- Answers to the former 30 items were obtained by questioning each subject, and the latter 5 items were investigated by actual measurement. As a result, it was found that some items are related to the feature amounts.
- FIG. 6 is a graph showing the relationship between the B ratio in the upper region (region R1) of the photographed tongue image and the systolic/diastolic blood pressure. The figure shows that the B ratio in the upper region decreases as either the systolic or the diastolic blood pressure increases; that is, the two are correlated. Therefore, if a regression equation (approximate expression) representing this relationship is obtained, the systolic/diastolic blood pressure can be estimated from the B ratio of the upper region using that equation.
- When the regression equations are obtained from the discrete data shown in FIG. 6 by the least squares method, the following equations result, where the variable x is the B ratio of the upper region and the variable y is the blood pressure (mmHg):
- Systolic blood pressure: y = -950x + 400
- Diastolic blood pressure: y = -1000x + 400
- The calculation unit 16 obtains the B ratio of the upper region from the photographed tongue image data (RGB image data) and substitutes it into one of the above regression equations, thereby calculating the value of the systolic/diastolic blood pressure as an indicator of health.
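The regression equations above can be applied directly in code; this sketch uses the coefficients quoted from FIG. 6.

```python
def estimate_blood_pressure(b_ratio_upper):
    """Estimate (systolic, diastolic) blood pressure in mmHg from the
    B ratio of the upper region R1, using the least-squares regression
    equations y = -950x + 400 and y = -1000x + 400."""
    systolic = -950.0 * b_ratio_upper + 400.0
    diastolic = -1000.0 * b_ratio_upper + 400.0
    return systolic, diastolic

sys_bp, dia_bp = estimate_blood_pressure(0.30)
# For a B ratio of 0.30, the equations give roughly 115 and 100 mmHg.
```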
- FIG. 7 is a graph showing the relationship between the R ratio in the upper region (region R1) of the photographed image of the tongue and the state of constipation.
- The constipation state is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, where constipation worsens as the rank goes from 0 to 3. The figure shows that as constipation becomes more severe, the R ratio of the upper region increases; that is, the two are correlated. Therefore, if a regression equation representing this relationship is obtained, the constipation state can be estimated from the R ratio of the upper region using that equation.
- The calculation unit 16 can thus calculate a numerical value (rank) indicating the constipation state, as an indicator of health, by obtaining the R ratio of the upper region from the photographed tongue image data and substituting it into the regression equation.
- Correction for individual differences is performed in the same way as when calculating the systolic/diastolic blood pressure.
- FIG. 8 is a graph showing the relationship between the G ratio in the middle region (region R2) of the photographed image of the tongue and the state of diarrhea.
- The state of diarrhea is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with diarrhea worsening as the rank goes from 0 to 3. The figure shows that as diarrhea becomes more severe, the G ratio in the middle region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of diarrhea can be estimated from the G ratio of the middle region using the regression equation.
- When a regression equation is obtained from the discrete data shown in FIG. 8 by the least-squares method, the following equation results, where the variable x is the G ratio of the middle region and the variable y is a numerical value (rank) indicating the state of diarrhea:
- y = 190x - 60
- The calculation unit 16 can calculate a numerical value (rank) indicating the state of diarrhea as an index related to health by obtaining the G ratio of the middle region from the captured image data of the tongue and substituting it into the regression equation.
- Correction for individual differences is the same as in the calculation of the systolic/diastolic blood pressure.
- FIG. 9 is a graph showing the relationship between the G ratio in the middle region (region R2) of the photographed image of the tongue and general malaise.
- Malaise is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with malaise increasing (worsening) as the rank goes from 0 to 3. The figure shows that as malaise increases, the G ratio in the middle region decreases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of malaise due to fatigue can be estimated from the G ratio of the middle region using the regression equation.
- The calculation unit 16 can calculate a numerical value (rank) indicating the state of malaise due to fatigue as an index related to health by obtaining the G ratio of the middle region from the captured image data of the tongue and substituting it into the regression equation.
- Correction for individual differences is the same as in the calculation of the systolic/diastolic blood pressure.
- FIG. 10 is a graph showing the relationship between the B ratio in the middle region (region R2) of the photographed image of the tongue and the state of dark circles under the eyes.
- The state of the dark circles is indicated by a rank (0 to 3) obtained by quantifying their degree on a four-level scale by self-evaluation, with the dark circles appearing darker, and blood circulation worsening, as the rank goes from 0 to 3. The figure shows that as the dark circles darken and blood circulation worsens, the B ratio in the middle region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of blood circulation can be estimated from the B ratio of the middle region using the regression equation.
- When a regression equation is obtained from the discrete data shown in FIG. 10 by the least-squares method, the following equation results, where the variable x is the B ratio of the middle region and the variable y is a numerical value (rank) indicating the state of the dark circles under the eyes:
- y = 30x - 10
- The calculation unit 16 can calculate a numerical value (rank) indicating the state of blood circulation as an index related to health by obtaining the B ratio of the middle region from the captured image data of the tongue and substituting it into the regression equation.
- Correction for individual differences is the same as in the calculation of the systolic/diastolic blood pressure.
- FIG. 11 is a graph showing the relationship between the R ratio in the lower region (region R3) of the photographed image of the tongue and body temperature. The figure shows that as the body temperature rises, the R ratio in the lower region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the body temperature can be estimated from the R ratio of the lower region using the regression equation.
- The calculation unit 16 can calculate the numerical value of body temperature as an index related to health by obtaining the R ratio of the lower region from the captured image data of the tongue and substituting it into the regression equation.
- Correction for individual differences is the same as in the calculation of the systolic/diastolic blood pressure.
- FIG. 12 is a graph showing the relationship between the G ratio in the lower region (region R3) of the photographed image of the tongue and sore throat.
- Sore throat is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with the sore throat increasing, and the bronchial state worsening, as the rank goes from 0 to 3. The figure shows that as the sore throat increases and the bronchial state worsens, the G ratio in the lower region decreases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the bronchial state can be estimated from the G ratio of the lower region using the regression equation.
- The calculation unit 16 can calculate a numerical value indicating the state of the bronchi as an index related to health by obtaining the G ratio of the lower region from the captured image data of the tongue and substituting it into the regression equation.
- Correction for individual differences is the same as in the calculation of the systolic/diastolic blood pressure.
- FIG. 13 is a graph showing the relationship between the G ratio in the lower region (region R3) of the photographed image of the tongue and the degree of leg cramps (the degree of contraction/spasm of the leg muscles).
- The degree of leg cramps is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with the cramps, and hence muscle fatigue, increasing as the rank goes from 0 to 3. The figure shows that as the cramps and muscle fatigue increase, the G ratio in the lower region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of muscle fatigue can be estimated from the G ratio of the lower region using the regression equation.
- The calculation unit 16 can calculate a numerical value indicating the state of muscle fatigue as an index related to health by obtaining the G ratio of the lower region from the captured image data of the tongue and substituting it into the regression equation.
- Correction for individual differences is the same as in the calculation of the systolic/diastolic blood pressure.
- FIG. 14 is a flowchart showing an operation flow in the organ image photographing apparatus 1 of the present embodiment.
- When the organ image capturing apparatus 1 receives a photographing instruction via the operation unit 5 or an input unit (not shown), the illumination control unit 11 turns on the illumination unit 2 (S1) and sets photographing conditions such as illuminance (S2).
- When the setting of the photographing conditions is completed, the imaging control unit 12 controls the imaging unit 3 to photograph the tongue, which is the photographing target (S3).
- Next, the image processing unit 15 extracts the tongue contour line Q from the photographed image (S4). The calculation unit 16 then detects the upper, lower, left, and right ends of the tongue from the extracted contour line Q and, as shown in FIG. 5, sets the three regions R1 to R3 for obtaining chromaticity values with the contour line Q as a reference (S5).
- Subsequently, the calculation unit 16 obtains the chromaticity values in each of the regions R1 to R3, that is, the R ratio, the G ratio, and the B ratio, from the captured image data of the tongue (S6), substitutes these chromaticity values into the preset regression equations, and calculates numerical values indicating the state of each of a plurality of indices related to health (S7).
- The calculation unit 16 need only calculate a numerical value indicating the state of at least one of the plurality of indices; it may calculate a numerical value for only one index.
- The calculated numerical values are output from the calculation unit 16, displayed on the display unit 4, and stored in the storage unit 17. As necessary, they are also output as voice from the audio output unit 7, sent to an output device (not shown), or transferred to the outside via the communication unit 6 (S8).
- As described above, the calculation unit 16 does not output the states of a plurality of health-related indices lumped together as a single piece of comprehensive information; instead it calculates and outputs the state of at least one of the indices as a specific numerical value. The user can therefore judge the current physical condition and symptoms quantitatively from the information at the output destination (for example, the display unit 4). This makes it easy to compare information on the current physical condition or symptoms with past history or with another person's data, and to decide on a specific response suited to the condition, for example selecting and purchasing a commercially available drug, or selecting a medical institution, that matches the physical condition and symptoms. In particular, when the calculation unit 16 calculates numerical values for a plurality of indices, the user can assess the physical condition and symptoms in more detail than when a numerical value is calculated for only one index.
- Since the calculation unit 16 performs its calculation using the tongue image (data) photographed by the imaging unit 3, the numerical values of the indices can be acquired at low cost and in a short time, without the long wait for results typical of a general health checkup. As a result, the user can easily manage (monitor) daily changes in physical condition using the calculated numerical values.
- Since the numerical value for each index calculated by the calculation unit 16 is displayed on the display unit 4 or output as voice from the audio output unit 7, the user can grasp the value immediately through the display or the voice. As a result, the effects described above, such as easy comparison of current information with past history or another person's data, can be obtained reliably.
- When the numerical values are transferred to an external terminal device, the user can grasp them on that device, and a specialist who obtains them there can also grasp the user's physical condition and symptoms in detail and give appropriate advice in selecting over-the-counter drugs or medical institutions.
- The calculation unit 16 calculates the numerical value indicating the state of each index using, as the chromaticity value, one of the R ratio, G ratio, and B ratio obtained in the region best suited to that index among the upper region (region R1), the middle region (region R2), and the lower region (region R3) of the captured image of the tongue. This makes it easy for the user to grasp the specific state of each index and to decide on a specific response.
- FIG. 15 is a graph showing the relationship between the B ratio in the upper region (region R1) of the photographed image of the tongue, and the maximum blood pressure and the minimum blood pressure.
- The B ratio here is the ratio of the B image data to the reference data when only the R image data is taken as the reference data, that is, the B/R ratio. This figure also shows that as the systolic and diastolic blood pressures rise, the B/R ratio in the upper region decreases, so the two are correlated.
- When regression equations were obtained from the discrete data shown in FIG. 15 by the least-squares method, the following equations resulted, where the variable x is the B/R ratio of the upper region and the variable y is the blood pressure (mmHg).
- Systolic blood pressure: y = -180x + 270
- Diastolic blood pressure: y = -150x + 200
- Accordingly, the numerical value of the systolic or diastolic blood pressure can be calculated from the above regression equations.
- FIG. 16 is a graph showing the relationship between the B ratio in the upper region (region R1) of the photographed image of the tongue, and the maximum blood pressure and the minimum blood pressure.
- The B ratio here is the ratio of the B image data to the reference data when the sum of the R image data and the G image data is taken as the reference data, that is, the B/(R+G) ratio. This figure likewise shows that as the systolic and diastolic blood pressures rise, the B/(R+G) ratio in the upper region decreases, so the two are correlated.
- When regression equations were obtained from the discrete data shown in FIG. 16 by the least-squares method, the following equations resulted, where the variable x is the B/(R+G) ratio of the upper region and the variable y is the blood pressure (mmHg).
- Systolic blood pressure: y = -500x + 340
- Diastolic blood pressure: y = -530x + 300
- Systolic/diastolic blood pressure has been described above as an example of the index, but it has been confirmed that numerical values indicating the index state can likewise be calculated for the other indices (such as the state of constipation). That is, a numerical value indicating the state of an index can be calculated using, as the chromaticity value, the ratio of any of the RGB image data to reference data other than the sum of the RGB image data.
- Note that when only the R image data is taken as the reference data, the ratio of the R image data to that reference data is always 1 regardless of the magnitude of the R image data, so there is no point in using that ratio as the chromaticity value. The same applies to the ratio of the G or B image data to reference data consisting of that same channel alone.
- In short, the chromaticity value used for calculating an index may be the ratio of the R image data, the ratio of the G image data, or the ratio of the B image data to reference data including at least one of the RGB image data, as long as the ratio is not identically 1 (that is, any image-data/reference-data combination other than R/R, G/G, and B/B).
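The rule just stated — any channel-over-reference combination except the trivial R/R, G/G, B/B — can be captured in a small helper. A sketch under assumed names:

```python
# Sketch of the generalized chromaticity value: the ratio of one channel's
# image data to reference data built from at least one of R, G, B. The
# trivial combinations R/R, G/G, B/B (always 1) are rejected.

def chromaticity(rgb, channel, reference):
    """rgb: mean R, G, B image data of a region, as a dict.
    channel: 'R', 'G', or 'B'.
    reference: the channels whose sum forms the reference data."""
    if tuple(reference) == (channel,):
        raise ValueError("ratio is identically 1 and carries no information")
    return rgb[channel] / sum(rgb[c] for c in reference)

region = {"R": 140.0, "G": 90.0, "B": 100.0}
b_over_rgb = chromaticity(region, "B", ("R", "G", "B"))  # B ratio
b_over_r = chromaticity(region, "B", ("R",))             # B/R ratio (FIG. 15)
b_over_rg = chromaticity(region, "B", ("R", "G"))        # B/(R+G) (FIG. 16)
```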
- the organ image capturing apparatus described above can be expressed as follows, and has the following effects.
- The organ image capturing apparatus described above includes an imaging unit that photographs the tongue of a living body, and a calculation unit that calculates and outputs a numerical value indicating the state of at least one of a plurality of indices related to health, by calculation using chromaticity values obtained from the data of the captured image of the tongue acquired by the imaging unit.
- In the above configuration, the state of at least one of the plurality of health-related indices is calculated and output as a numerical value by the calculation unit. If the output destination is a display unit, the user can grasp the numerical value from the display; if it is an audio output unit, from the output sound; and if it is an external terminal device, on that device. The user can thereby judge physical condition and symptoms quantitatively and in detail based on the grasped index values.
- Moreover, the numerical value can be acquired at low cost and in a short time from the start of photographing the tongue, which makes it easy for the user to manage daily changes in physical condition using the value.
- The chromaticity value is a value obtained in the region corresponding to the index among three regions, namely the upper, middle, and lower portions of the horizontally central part of the photographed image of the tongue.
- In Oriental medicine, there is a diagnostic technique (tongue diagnosis) in which a health condition or medical condition is diagnosed by observing the state of the human tongue. In tongue diagnosis, the color of the coating, the shape (thickness) of the coating, and the color of the tongue appear in the chromaticities of the upper, middle, and lower regions of the horizontally central part of the tongue image. Therefore, when the calculation unit performs its calculation using a chromaticity value obtained in at least one of these three regions, the Oriental medical diagnosis (of a health condition or medical condition) can be reproduced.
- In particular, since the calculation unit performs its calculation using the chromaticity value obtained in the region best suited to the index among the three regions, the numerical value indicating the state of the index can be obtained accurately (as a value close to the actual health state).
- The chromaticity value may be the ratio of the red image data, the ratio of the green image data, or the ratio of the blue image data to reference data including at least one of the red, green, and blue image data, where the ratio takes a value other than 1.
- When the image is bright at the time of shooting, at least one of the RGB image data increases. By using the ratio of the image data of a desired color to the reference data as the chromaticity value, the influence of brightness at the time of shooting can be reduced; that is, fluctuations in the calculated index values caused by brightness-dependent fluctuations in the chromaticity values can be suppressed.
- Note that when only the R image data is taken as the reference data, the ratio of the R image data to that reference data is always 1 regardless of the magnitude of the R image data, and there is no point in using it as the chromaticity value; such a choice of reference data is therefore excluded (the above effect is obtained when the chromaticity value is a value other than 1).
- The chromaticity value may be the ratio of the blue image data to the reference data in the upper region of the photographed image of the tongue, and the calculation unit may calculate the numerical value of the systolic or diastolic blood pressure as the index by calculation using that ratio. In this case, it becomes easy for the user to grasp the systolic or diastolic blood pressure specifically from the numerical value and to decide on a specific response.
- The chromaticity value may be the ratio of the red image data to the reference data in the upper region of the photographed image of the tongue, and the calculation unit may calculate a numerical value indicating the state of constipation as the index by calculation using that ratio. In this case, it becomes easy for the user to grasp the state of constipation specifically from the numerical value and to decide on a specific response.
- The chromaticity value may be the ratio of the green image data to the reference data in the middle region of the photographed image of the tongue, and the calculation unit may calculate a numerical value indicating the state of diarrhea as the index by calculation using that ratio. In this case, it becomes easy for the user to grasp the state of diarrhea specifically from the numerical value and to decide on a specific response.
- The chromaticity value may be the ratio of the green image data to the reference data in the middle region of the photographed image of the tongue, and the calculation unit may calculate a numerical value indicating the state of malaise due to fatigue as the index by calculation using that ratio. In this case, it becomes easy for the user to grasp the state of malaise due to fatigue specifically from the numerical value and to decide on a specific response.
- The chromaticity value may be the ratio of the blue image data to the reference data in the middle region of the photographed image of the tongue, and the calculation unit may calculate a numerical value indicating the state of blood circulation as the index by calculation using that ratio. In this case, it becomes easy for the user to grasp the state of blood circulation specifically from the numerical value and to decide on a specific response.
- The chromaticity value may be the ratio of the red image data to the reference data in the lower region of the photographed image of the tongue, and the calculation unit may calculate the numerical value of body temperature as the index by calculation using that ratio. In this case, it becomes easy for the user to grasp the body temperature specifically from the numerical value and to decide on a specific response.
- The chromaticity value may be the ratio of the green image data to the reference data in the lower region of the photographed image of the tongue, and the calculation unit may calculate a numerical value indicating the state of the bronchi as the index by calculation using that ratio. In this case, it becomes easy for the user to grasp the state of the bronchi specifically from the numerical value and to decide on a specific response.
- The chromaticity value may be the ratio of the green image data to the reference data in the lower region of the photographed image of the tongue, and the calculation unit may calculate a numerical value indicating the state of muscle fatigue as the index by calculation using that ratio. In this case, it becomes easy for the user to grasp the state of muscle fatigue specifically from the numerical value and to decide on a specific response.
- The reference data may be the sum of the red, green, and blue image data. Since the RGB image data increase when the scene is bright at the time of shooting, using the ratio of the image data of the desired color to reference data consisting of the sum of the RGB image data as the chromaticity value reliably reduces the influence of brightness at the time of shooting.
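The brightness argument above is easy to verify numerically: if a change in illumination scales all three channels by a common factor, a channel's raw value shifts but its ratio to the RGB sum does not. A minimal check (the channel values are arbitrary illustrative numbers):

```python
# Minimal numerical check of the brightness claim: scaling R, G, B by a
# common factor k (a brighter shot) leaves the ratio to the RGB sum
# essentially unchanged, while the raw channel value changes with k.

def r_ratio(r, g, b):
    return r / (r + g + b)

r, g, b = 140.0, 90.0, 100.0
k = 1.3  # a brighter shot

assert abs(r_ratio(r, g, b) - r_ratio(k * r, k * g, k * b)) < 1e-12
assert k * r != r  # the raw channel value itself is not invariant
```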
- The organ image capturing apparatus may further include an output unit that displays, or outputs as voice, the numerical value output from the calculation unit.
- In this case, the user can immediately grasp the numerical value for each index from the display or output sound of the output unit, which makes it even easier to monitor daily changes in physical condition, compare with past history or another person's data, and decide on specific responses suited to the physical condition and symptoms.
- The present invention can be used for an apparatus that photographs the tongue, an organ of a living body, and extracts information necessary for health diagnosis.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Physiology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Dentistry (AREA)
- Radiology & Medical Imaging (AREA)
- Artificial Intelligence (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Rheumatology (AREA)
- Signal Processing (AREA)
- Psychiatry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Hematology (AREA)
- Gastroenterology & Hepatology (AREA)
- Physical Education & Sports Medicine (AREA)
- Endocrinology (AREA)
- Cardiology (AREA)
- Multimedia (AREA)
- Pulmonology (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
FIG. 1 is a perspective view showing the external appearance of the organ image capturing apparatus 1 of the present embodiment, and FIG. 2 is a block diagram showing its schematic configuration. The organ image capturing apparatus 1 photographs the tongue, which is an organ of a living body, and extracts information necessary for diagnosing the degree of health.
FIG. 5 shows the positional relationship between the tongue contour line Q and the three regions R1, R2, and R3 in which the chromaticity values are calculated. The regions R1, R2, and R3 correspond respectively to the upper, middle, and lower portions of the horizontally central part of the photographed image of the tongue. Each region is set with the size and position shown in the figure, where H is the vertical length and W the horizontal length (width) of the contour line Q extracted by the image processing unit 15. Note that the sizes of the regions R1, R2, and R3 are only an example and are not limited to these. The reason for setting the three regions R1, R2, and R3 as the regions for calculating the chromaticity values is as follows.
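Translated into code, the region setup can be sketched as follows. The regions are defined relative to the contour height H and width W per FIG. 5, but the exact proportions appear only in the figure; the central strip of width W/4 split into equal vertical thirds below is an illustrative assumption, not the figure's actual layout:

```python
# Hedged sketch of setting regions R1-R3 from the tongue contour's bounding
# box. H and W follow the text; the strip width (W / 4) and the equal
# vertical thirds are assumptions standing in for the proportions of FIG. 5.

def set_regions(top, bottom, left, right):
    """Return (x0, y0, x1, y1) boxes for R1 (upper), R2 (middle), R3 (lower)."""
    H = bottom - top                  # vertical length of contour line Q
    W = right - left                  # horizontal length (width) of contour line Q
    cx = left + W / 2                 # horizontally central part of the image
    x0, x1 = cx - W / 8, cx + W / 8   # assumed central strip of width W / 4
    return [
        (x0, top, x1, top + H / 3),              # R1: upper region
        (x0, top + H / 3, x1, top + 2 * H / 3),  # R2: middle region
        (x0, top + 2 * H / 3, x1, bottom),       # R3: lower region
    ]

regions = set_regions(top=50, bottom=350, left=100, right=300)
```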
Next, specific examples will be described in which the calculation unit 16 quantifies a plurality of health-related indices by calculation using the above chromaticity values. As above, the R ratio, G ratio, and B ratio below refer to the ratio of the R image data to the sum of the RGB image data (R/(R+G+B)), the ratio of the G image data (G/(R+G+B)), and the ratio of the B image data (B/(R+G+B)), respectively.
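The three ratios defined above can be sketched directly; by construction they always sum to 1, so any one of them is determined by the other two:

```python
# Sketch of the three chromaticity values defined above: each channel's
# mean image data in a region divided by the sum of the RGB image data.

def rgb_ratios(r, g, b):
    """Return (R ratio, G ratio, B ratio) for one region of the tongue image."""
    total = r + g + b
    return r / total, g / total, b / total

r_ratio, g_ratio, b_ratio = rgb_ratios(140.0, 90.0, 100.0)
# The three ratios sum to 1 by construction.
```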
FIG. 6 is a graph showing the relationship between the B ratio in the upper region (region R1) of the photographed image of the tongue and the systolic/diastolic blood pressure. The figure shows that as either the systolic or the diastolic blood pressure rises, the B ratio in the upper region decreases, so the two are correlated. Therefore, if a regression equation (approximate expression) expressing this relationship can be obtained, the systolic/diastolic blood pressure can be estimated from the B ratio of the upper region using the regression equation.
Systolic blood pressure: y = -950x + 400
Diastolic blood pressure: y = -1000x + 400
FIG. 7 is a graph showing the relationship between the R ratio in the upper region (region R1) of the photographed image of the tongue and the state of constipation. The state of constipation is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with constipation worsening as the rank goes from 0 to 3. The figure shows that as constipation becomes more severe, the R ratio in the upper region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of constipation can be estimated from the R ratio of the upper region using the regression equation.
y = 80x - 30
FIG. 8 is a graph showing the relationship between the G ratio in the middle region (region R2) of the photographed image of the tongue and the state of diarrhea. The state of diarrhea is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with diarrhea worsening as the rank goes from 0 to 3. The figure shows that as diarrhea becomes more severe, the G ratio in the middle region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of diarrhea can be estimated from the G ratio of the middle region using the regression equation.
y = 190x - 60
FIG. 9 is a graph showing the relationship between the G ratio in the middle region (region R2) of the photographed image of the tongue and general malaise. Malaise is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with malaise increasing (worsening) as the rank goes from 0 to 3. The figure shows that as malaise increases, the G ratio in the middle region decreases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of malaise due to fatigue can be estimated from the G ratio of the middle region using the regression equation.
y = -180x + 60
FIG. 10 is a graph showing the relationship between the B ratio in the middle region (region R2) of the photographed image of the tongue and the state of dark circles under the eyes. The state of the dark circles is indicated by a rank (0 to 3) obtained by quantifying their degree on a four-level scale by self-evaluation, with the dark circles appearing darker, and blood circulation worsening, as the rank goes from 0 to 3. The figure shows that as the dark circles darken and blood circulation worsens, the B ratio in the middle region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of blood circulation can be estimated from the B ratio of the middle region using the regression equation.
y = 30x - 10
FIG. 11 is a graph showing the relationship between the R ratio in the lower region (region R3) of the photographed image of the tongue and body temperature. The figure shows that as the body temperature rises, the R ratio in the lower region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the body temperature can be estimated from the R ratio of the lower region using the regression equation.
y = 15x + 30
FIG. 12 is a graph showing the relationship between the G ratio in the lower region (region R3) of the photographed image of the tongue and sore throat. Sore throat is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with the sore throat increasing, and the bronchial state worsening, as the rank goes from 0 to 3. The figure shows that as the sore throat increases and the bronchial state worsens, the G ratio in the lower region decreases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the bronchial state can be estimated from the G ratio of the lower region using the regression equation.
y = -80x + 20
FIG. 13 is a graph showing the relationship between the G ratio in the lower region (region R3) of the photographed image of the tongue and the degree of leg cramps (the degree of contraction/spasm of the leg muscles). The degree of leg cramps is indicated by a rank (0 to 3) obtained by quantifying the subjective symptoms on a four-level scale, with the cramps, and hence muscle fatigue, increasing as the rank goes from 0 to 3. The figure shows that as the cramps and muscle fatigue increase, the G ratio in the lower region increases, so the two are correlated. Therefore, if a regression equation expressing this relationship can be obtained, the state of muscle fatigue can be estimated from the G ratio of the lower region using the regression equation.
y = 170x - 50
FIG. 14 is a flowchart showing the flow of operation in the organ image capturing apparatus 1 of the present embodiment. When the organ image capturing apparatus 1 receives a photographing instruction via the operation unit 5 or an input unit (not shown), the illumination control unit 11 turns on the illumination unit 2 (S1) and sets photographing conditions such as illuminance (S2). When the setting of the photographing conditions is completed, the imaging control unit 12 controls the imaging unit 3 to photograph the tongue, which is the photographing target (S3).
The above description used, as the chromaticity value, the ratio of the R, G, or B image data to reference data consisting of the sum of the RGB image data, but the reference data need not be the sum of the RGB image data.
Systolic blood pressure: y = -180x + 270
Diastolic blood pressure: y = -150x + 200
Systolic blood pressure: y = -500x + 340
Diastolic blood pressure: y = -530x + 300
The above description assumed that the photographing target is a human tongue, but the target need not be human as long as it is a living body, and may be an animal other than a human. For example, even for the tongue of an animal such as a pet or livestock, the method of the present embodiment can be applied to calculate the numerical values of the indices and judge the physical condition and symptoms from those values. In this case, poor physical condition of an animal that cannot communicate its state can be judged promptly and accurately.
3 Imaging unit
4 Display unit (output unit)
7 Audio output unit (output unit)
16 Calculation unit
Q Contour line
R1 Region (upper region)
R2 Region (middle region)
R3 Region (lower region)
Claims (13)
- 1. An organ image capturing apparatus comprising: an imaging unit that photographs the tongue of a living body; and a calculation unit that calculates and outputs, by calculation using a chromaticity value obtained from data of the captured image of the tongue acquired by the imaging unit, a numerical value indicating the state of at least one of a plurality of indices related to health.
- 2. The organ image capturing apparatus according to claim 1, wherein the chromaticity value is a value obtained in the region corresponding to the index among three regions, namely the upper, middle, and lower portions of the horizontally central part of the photographed image of the tongue.
- 3. The organ image capturing apparatus according to claim 2, wherein the chromaticity value is the ratio of red image data, the ratio of green image data, or the ratio of blue image data to reference data including at least one of the red, green, and blue image data, and is a value other than 1.
- 4. The organ image capturing apparatus according to claim 3, wherein the chromaticity value is the ratio of the blue image data to the reference data in the upper region of the photographed image of the tongue, and the calculation unit calculates the numerical value of the systolic or diastolic blood pressure as the index by calculation using the ratio.
- 5. The organ image capturing apparatus according to claim 3 or 4, wherein the chromaticity value is the ratio of the red image data to the reference data in the upper region of the photographed image of the tongue, and the calculation unit calculates a numerical value indicating the state of constipation as the index by calculation using the ratio.
- 6. The organ image capturing apparatus according to any one of claims 3 to 5, wherein the chromaticity value is the ratio of the green image data to the reference data in the middle region of the photographed image of the tongue, and the calculation unit calculates a numerical value indicating the state of diarrhea as the index by calculation using the ratio.
- 7. The organ image capturing apparatus according to any one of claims 3 to 6, wherein the chromaticity value is the ratio of the green image data to the reference data in the middle region of the photographed image of the tongue, and the calculation unit calculates a numerical value indicating the state of malaise due to fatigue as the index by calculation using the ratio.
- 8. The organ image capturing apparatus according to any one of claims 3 to 7, wherein the chromaticity value is the ratio of the blue image data to the reference data in the middle region of the photographed image of the tongue, and the calculation unit calculates a numerical value indicating the state of blood circulation as the index by calculation using the ratio.
- 9. The organ image capturing apparatus according to any one of claims 3 to 8, wherein the chromaticity value is the ratio of the red image data to the reference data in the lower region of the photographed image of the tongue, and the calculation unit calculates the numerical value of body temperature as the index by calculation using the ratio.
- 10. The organ image capturing apparatus according to any one of claims 3 to 9, wherein the chromaticity value is the ratio of the green image data to the reference data in the lower region of the photographed image of the tongue, and the calculation unit calculates a numerical value indicating the state of the bronchi as the index by calculation using the ratio.
- 11. The organ image capturing apparatus according to any one of claims 3 to 10, wherein the chromaticity value is the ratio of the green image data to the reference data in the lower region of the photographed image of the tongue, and the calculation unit calculates a numerical value indicating the state of muscle fatigue as the index by calculation using the ratio.
- 12. The organ image capturing apparatus according to any one of claims 3 to 11, wherein the reference data is the sum of the red, green, and blue image data.
- 13. The organ image capturing apparatus according to any one of claims 1 to 12, further comprising an output unit that displays, or outputs as voice, the numerical value output from the calculation unit.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14881298.5A EP3100672A1 (en) | 2014-01-30 | 2014-12-08 | Organ image capturing device |
US15/115,517 US20170164888A1 (en) | 2014-01-30 | 2014-12-08 | Organ imaging device |
JP2015559768A JPWO2015114950A1 (ja) | 2014-01-30 | 2014-12-08 | 器官画像撮影装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-015411 | 2014-01-30 | ||
JP2014015411 | 2014-01-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015114950A1 true WO2015114950A1 (ja) | 2015-08-06 |
Family
ID=53756544
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/082393 WO2015114950A1 (ja) | 2014-01-30 | 2014-12-08 | 器官画像撮影装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170164888A1 (ja) |
EP (1) | EP3100672A1 (ja) |
JP (1) | JPWO2015114950A1 (ja) |
WO (1) | WO2015114950A1 (ja) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI439960B (zh) | 2010-04-07 | 2014-06-01 | Apple Inc | 虛擬使用者編輯環境 |
DK179978B1 (en) | 2016-09-23 | 2019-11-27 | Apple Inc. | IMAGE DATA FOR ENHANCED USER INTERACTIONS |
KR102210150B1 (ko) | 2016-09-23 | 2021-02-02 | 애플 인크. | 아바타 생성 및 편집 |
KR102435337B1 (ko) | 2017-05-16 | 2022-08-22 | 애플 인크. | 이모지 레코딩 및 전송 |
DK179867B1 (en) | 2017-05-16 | 2019-08-06 | Apple Inc. | RECORDING AND SENDING EMOJI |
US20190014996A1 (en) * | 2017-07-14 | 2019-01-17 | Hai Rong Qian | Smart medical diagnosis system |
CN110403611B (zh) * | 2018-04-28 | 2022-06-07 | 张世平 | 血液中糖化血红蛋白成分值预测方法、装置、计算机设备和存储介质 |
CN110046020B (zh) * | 2018-05-07 | 2021-01-15 | 苹果公司 | 电子设备、计算机可读存储介质及电子设备处执行的方法 |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
DK201870378A1 (en) | 2018-05-07 | 2020-01-13 | Apple Inc. | DISPLAYING USER INTERFACES ASSOCIATED WITH PHYSICAL ACTIVITIES |
DK201870374A1 (en) | 2018-05-07 | 2019-12-04 | Apple Inc. | AVATAR CREATION USER INTERFACE |
US11107261B2 (en) | 2019-01-18 | 2021-08-31 | Apple Inc. | Virtual avatar animation based on facial feature movement |
DK201970530A1 (en) | 2019-05-06 | 2021-01-28 | Apple Inc | Avatar integration with multiple applications |
DK202070624A1 (en) | 2020-05-11 | 2022-01-04 | Apple Inc | User interfaces related to time |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11714536B2 (en) | 2021-05-21 | 2023-08-01 | Apple Inc. | Avatar sticker editor user interfaces |
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004113581A (ja) * | 2002-09-27 | 2004-04-15 | Asahi:Kk | Health management device |
JP2004209245A (ja) * | 2002-12-28 | 2004-07-29 | Samsung Electronics Co Ltd | Method for extracting a region of interest from a tongue image, and health monitoring method and apparatus using the tongue image |
JP2006149679A (ja) * | 2004-11-29 | 2006-06-15 | Konica Minolta Holdings Inc | Health level determination method, apparatus, and program |
JP2006166990A (ja) * | 2004-12-13 | 2006-06-29 | Olympus Corp | Medical image processing method |
JP4487535B2 (ja) | 2003-11-10 | 2010-06-23 | Konica Minolta Holdings, Inc. | Health level measurement system and program |
CN102509312B (zh) * | 2011-09-20 | 2013-10-02 | Harbin Institute of Technology | Color gamut space of human digital tongue images and extraction method therefor |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080139966A1 (en) * | 2006-12-07 | 2008-06-12 | The Hong Kong Polytechnic University | Automatic tongue diagnosis based on chromatic and textural features classification using bayesian belief networks |
2014
- 2014-12-08 WO PCT/JP2014/082393 patent/WO2015114950A1/ja active Application Filing
- 2014-12-08 US US15/115,517 patent/US20170164888A1/en not_active Abandoned
- 2014-12-08 JP JP2015559768A patent/JPWO2015114950A1/ja not_active Withdrawn
- 2014-12-08 EP EP14881298.5A patent/EP3100672A1/en not_active Withdrawn
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004113581A (ja) * | 2002-09-27 | 2004-04-15 | Asahi:Kk | Health management device |
JP2004209245A (ja) * | 2002-12-28 | 2004-07-29 | Samsung Electronics Co Ltd | Method for extracting a region of interest from a tongue image, and health monitoring method and apparatus using the tongue image |
JP4487535B2 (ja) | 2003-11-10 | 2010-06-23 | Konica Minolta Holdings, Inc. | Health level measurement system and program |
JP2006149679A (ja) * | 2004-11-29 | 2006-06-15 | Konica Minolta Holdings Inc | Health level determination method, apparatus, and program |
JP4649965B2 (ja) | 2004-11-29 | 2011-03-16 | Konica Minolta Holdings, Inc. | Health level determination device and program |
JP2006166990A (ja) * | 2004-12-13 | 2006-06-29 | Olympus Corp | Medical image processing method |
CN102509312B (zh) * | 2011-09-20 | 2013-10-02 | Harbin Institute of Technology | Color gamut space of human digital tongue images and extraction method therefor |
Also Published As
Publication number | Publication date |
---|---|
JPWO2015114950A1 (ja) | 2017-03-23 |
EP3100672A1 (en) | 2016-12-07 |
US20170164888A1 (en) | 2017-06-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015114950A1 (ja) | Organ image capturing device | |
US9468356B2 (en) | Lesion evaluation information generator, and method and computer readable medium therefor | |
JP7068487B2 (ja) | Electronic endoscope system | |
WO2016067892A1 (ja) | Health level output device, health level output system, and program | |
KR101998595B1 (ko) | Image-based jaundice diagnosis method and apparatus | |
WO2016076059A1 (ja) | Organ image capturing device and program | |
JP5800119B1 (ja) | Health level determination device and health level determination system | |
JP2022137109A (ja) | Image display system and image display method | |
WO2015037316A1 (ja) | Organ image capturing device and organ image capturing method | |
US20160210746A1 (en) | Organ imaging device | |
WO2015068494A1 (ja) | Organ image capturing device | |
WO2015060070A1 (ja) | Organ image capturing device | |
JP2015226599A (ja) | Biological chromaticity measurement device | |
JP7104913B2 (ja) | Imaging device, imaging program, image determination device, image determination program, and image processing system | |
WO2015068495A1 (ja) | Organ image capturing device | |
JP2016198140A (ja) | Organ image capturing device | |
WO2015156039A1 (ja) | Organ image capturing device | |
CN113747825B (zh) | Electronic endoscope system | |
JP2016168303A (ja) | Organ image capturing device | |
KR20230081189A (ko) | Pupillary response test method using a smartphone |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14881298 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015559768 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2014881298 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014881298 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15115517 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |