WO2006112227A1 - Image processing device and image processing method - Google Patents
Image processing device and image processing method
- Publication number
- WO2006112227A1 (PCT application PCT/JP2006/305022)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- region
- classification
- feature amount
- unit
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/041—Capsule endoscopes for imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00011—Operational features of endoscopes characterised by signal transmission
- A61B1/00016—Operational features of endoscopes characterised by signal transmission using wireless means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
- A61B5/065—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00025—Operational features of endoscopes characterised by power management
- A61B1/00036—Means for power saving, e.g. sleeping mode
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30092—Stomach; Gastric
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Definitions
- The present invention relates to an image processing apparatus and an image processing method, and more particularly to an image processing apparatus and an image processing method capable of excluding images in which the surface of the biological mucous membrane is not satisfactorily captured.
- An endoscope apparatus has, for example, an elongated insertion portion that is inserted into a body cavity of a living body, and forms an image of the interior of the body cavity with an objective optical system disposed at the distal end of the insertion portion.
- The image of the body cavity interior is picked up by a solid-state imaging element or the like and output as an imaging signal, and an image of the body cavity interior is displayed on a monitor or the like based on that imaging signal.
- The user observes, for example, organs in the body cavity based on the image displayed on the monitor or the like. Further, since the endoscope apparatus can directly capture images of the digestive tract mucosa, the user can comprehensively observe various findings such as the color of the mucous membrane, the shape of lesions, and the fine structure of the mucosal surface.
- A capsule endoscope apparatus has been proposed as an imaging apparatus that can be expected to offer substantially the same usefulness as the endoscope apparatus described above.
- A capsule endoscope apparatus is composed of a capsule endoscope that is placed in the body cavity when the subject swallows it from the mouth and that transmits captured images of the body cavity interior to the outside as imaging signals, a receiver that receives the transmitted imaging signals outside the body cavity and accumulates them, and an observation device for observing images of the body cavity interior based on the imaging signals accumulated in the receiver.
- Because the capsule endoscope constituting the capsule endoscope apparatus is advanced by the peristaltic movement of the digestive tract, it usually takes several hours from when it is introduced into the body cavity through the mouth until it is discharged from the anus.
- Since the capsule endoscope continuously outputs imaging signals to the receiver from the time it enters the body cavity until it is discharged, the number of still images (frame images) accumulated in the receiver as several hours of moving images becomes enormous.
- The present invention has been made in view of the above points, and an object of the present invention is to provide an image processing apparatus and an image processing method capable of improving observation efficiency for the user. Means for Solving the Problem
- The first image processing apparatus of the present invention comprises an image input unit that inputs a medical image composed of a plurality of color signals, a determination unit that determines whether or not the input medical image sufficiently captures the biological mucous membrane, and a control unit that controls at least one of display and storage of the medical image based on the determination result of the determination unit.
- The second image processing apparatus of the present invention is the first image processing apparatus further comprising: an image dividing unit that divides the medical image into a plurality of regions; a feature amount calculation unit that calculates a feature amount in each of the plurality of regions of the medical image; a region classification unit that identifies, based on the feature amounts, which of a plurality of classes each of the plurality of regions belongs to and classifies each region according to the identification result; a classification determination value calculation unit that calculates, based on the classification result of the region classification unit, the proportion of the plurality of regions occupied by the group of regions classified into a predetermined class; and an image classification unit that classifies the image having the group of regions classified into the predetermined class based on the proportion calculated by the classification determination value calculation unit and a predetermined threshold relating to the proportion.
- Based on the classification result of the image classification unit, the determination unit determines that the medical image is an image in which the biological mucous membrane is not sufficiently captured when the proportion is equal to or less than the predetermined threshold, and determines that the medical image is an image in which the biological mucous membrane is sufficiently captured when the proportion is greater than the predetermined threshold.
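The sufficiency determination described above reduces to a simple proportion test over the per-region classification results. The following Python sketch illustrates one possible reading of it; the class labels, the 0.7 threshold, and the function name are illustrative assumptions rather than values given in the text.

```python
from typing import List

# Hypothetical label set: the text mentions classes such as gastric mucosa,
# villus, feces and bubble, but the exact labels are an assumption here.
BIOLOGICAL_MUCOSA_CLASSES = {"gastric_mucosa", "villus"}

def is_mucosa_sufficiently_imaged(region_classes: List[str],
                                  threshold: float = 0.7) -> bool:
    """Return True when the proportion of regions classified into biological
    mucosa classes exceeds the (assumed) threshold, mirroring the
    classification determination value calculation unit."""
    if not region_classes:
        return False
    mucosa = sum(1 for c in region_classes if c in BIOLOGICAL_MUCOSA_CLASSES)
    ratio = mucosa / len(region_classes)
    # At or below the threshold: the image is judged not to sufficiently
    # capture the biological mucosa; above it: judged sufficient.
    return ratio > threshold
```

An image judged insufficient in this way would then, per the control unit, be excluded from display or storage.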
- The control unit performs control so that a medical image determined by the determination unit not to have sufficiently captured the biological mucous membrane is not displayed.
- The control unit performs control so that a medical image determined by the determination unit not to have sufficiently captured the biological mucous membrane is not stored.
- The control unit performs control so that a medical image determined by the determination unit not to have sufficiently captured the biological mucous membrane is not stored.
- The sixth image processing apparatus of the present invention is the second image processing apparatus further comprising an image deletion unit that deletes the medical image determined by the determination unit not to have sufficiently captured the biological mucous membrane.
- The seventh image processing apparatus of the present invention is the third image processing apparatus further comprising an image deletion unit that deletes the medical image determined by the determination unit not to have sufficiently captured the biological mucous membrane.
- The eighth image processing apparatus of the present invention is the fourth image processing apparatus further comprising an image deletion unit that deletes the medical image determined by the determination unit not to have sufficiently captured the biological mucous membrane.
- The ninth image processing apparatus of the present invention is the fifth image processing apparatus further comprising an image deletion unit that deletes the medical image determined by the determination unit not to have sufficiently captured the biological mucous membrane.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
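As a concrete illustration of what such color-tone and texture feature amounts could look like, the sketch below computes per-region features from the R, G and B values of a divided block. The particular definitions (component ratios and a coefficient of variation) are assumptions for illustration; the excerpt itself does not specify the formulas.

```python
import numpy as np

def block_features(rgb_block: np.ndarray) -> tuple:
    """Compute simple color-tone and texture features for one divided region.

    rgb_block -- H x W x 3 array holding the R, G, B values of the region.
    """
    r = rgb_block[..., 0].astype(float) + 1e-6
    g = rgb_block[..., 1].astype(float) + 1e-6
    b = rgb_block[..., 2].astype(float) + 1e-6
    # Color-tone features: mean component ratios within the region
    # (assumed stand-ins for the 'feature amount relating to color tone').
    color_features = (float(np.mean(g / r)), float(np.mean(b / g)))
    # Texture feature: coefficient of variation of the green component,
    # a rough stand-in for the 'feature amount relating to texture'.
    texture_feature = float(np.std(g) / np.mean(g))
    return color_features + (texture_feature,)
```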
- The eleventh image processing apparatus of the present invention comprises: an image input unit that inputs a medical image composed of a plurality of color signals; a region setting unit that sets a plurality of regions in the input medical image; a feature amount calculation unit that calculates a feature amount in each of the plurality of regions of the medical image; a region classification unit that classifies, based on the feature amounts, the plurality of regions into a plurality of classes including a class relating to the biological mucosa and a class relating to non-biological mucosa; an edge detection unit that detects, based on the density values of the green component of the medical image, regions having an edge among the plurality of regions; a bleeding part determination unit that determines, based on the density values of the red component of the image, whether or not a region having an edge contains a bleeding part; and a classification result determination unit that determines whether or not the classification result of the region classification unit is correct based on the determination result of the bleeding part determination unit. A detection unit detects a region containing the bleeding part as a region suspected of containing a lesion.
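A rough sketch of the edge and bleeding test described in this apparatus is given below in Python. The Sobel-based gradient, the normalization, and both thresholds are assumptions; the excerpt only states that edges are detected from the green component density values and that a bleeding part is judged from the red component density values.

```python
import numpy as np
from scipy import ndimage  # assumed available for gradient filtering

def bleeding_candidate(region_rgb: np.ndarray,
                       edge_threshold: float = 0.1,
                       red_threshold: float = 0.6) -> bool:
    """Return True if the region both contains an edge (green component)
    and looks like a bleeding part (red component density)."""
    g = region_rgb[..., 1].astype(float) / 255.0
    r = region_rgb[..., 0].astype(float) / 255.0
    # Edge detection on the green component via a Sobel gradient magnitude.
    gx = ndimage.sobel(g, axis=0)
    gy = ndimage.sobel(g, axis=1)
    has_edge = float(np.max(np.hypot(gx, gy))) > edge_threshold
    # A region with an edge is treated as containing a bleeding part when
    # its red component density is high.
    return has_edge and float(np.mean(r)) > red_threshold
```

A region flagged in this way would correspond to the region that the detection unit reports as suspected of containing a lesion.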
- When one region is classified into a class relating to the biological mucous membrane based on the classification result of the region classification unit, the region determination unit determines that the one region is a region in which the biological mucous membrane is imaged.
- The detection result determination unit determines whether or not the detection result of the detection unit is correct based on the detection result of the detection unit and the determination result of the region determination unit.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The sixteenth image processing apparatus of the present invention comprises: an image input unit that inputs a plurality of medical images captured continuously in time; a region setting unit that sets a plurality of regions in the medical image; a determination unit that determines the imaging target in each of the plurality of regions set by the region setting unit; a specifying unit that specifies the organ imaged in the medical image based on the determination result of the determination unit; and a specification result display unit that displays the specification result of the specifying unit.
- The seventeenth image processing apparatus of the present invention is the sixteenth image processing apparatus further comprising: an image dividing unit that divides the medical image into a plurality of regions; a feature amount calculation unit that calculates a feature amount in each of the plurality of regions of the medical image; a region classification unit that identifies, based on the feature amounts, which of a plurality of classes each of the plurality of regions belongs to and classifies each region according to the identification result; a classification determination value calculation unit that calculates, based on the classification result of the region classification unit, the proportion of the plurality of regions occupied by the group of regions classified into a predetermined class among the plurality of classes; and an image classification unit that classifies the image having the group of regions classified into the predetermined class based on the proportion calculated by the classification determination value calculation unit and a predetermined threshold relating to the proportion. The specifying unit specifies the organ imaged in the medical image based on the classification result of the image classification unit.
- The plurality of classes include at least a gastric mucosa class, a villus class, and a feces class.
- In one such apparatus, the specifying unit determines that the organ imaged in the medical image is the stomach.
- In another, the specifying unit determines that the organ imaged in the medical image is the small intestine.
- In another, the specifying unit determines that the organ imaged in the medical image is the large intestine.
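Since the classes listed above (gastric mucosa, villus, feces) and the determined organs (stomach, small intestine, large intestine) are presented in parallel, a one-to-one mapping between the dominant class of an image and the imaged organ is assumed in the sketch below; using the most frequent class is likewise an illustrative choice, since the excerpt only says the organ is specified from the classification result.

```python
from collections import Counter
from typing import List, Optional

# Assumed mapping from the dominant image class to the imaged organ.
CLASS_TO_ORGAN = {
    "gastric_mucosa": "stomach",
    "villus": "small_intestine",
    "feces": "large_intestine",
}

def estimate_organ(region_classes: List[str]) -> Optional[str]:
    """Estimate the imaged organ from the per-region classification results
    of one image by looking at the most frequent relevant class."""
    counts = Counter(c for c in region_classes if c in CLASS_TO_ORGAN)
    if not counts:
        return None
    dominant_class, _ = counts.most_common(1)[0]
    return CLASS_TO_ORGAN[dominant_class]
```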
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The twenty-third image processing apparatus of the present invention comprises: an image signal input unit that inputs an image signal based on an image captured by a medical device having an imaging function; an image dividing unit that divides the image captured by the medical device into a plurality of regions based on the image signal input by the image signal input unit; a feature amount calculation unit that calculates a feature amount in each of the plurality of regions divided by the image dividing unit; a first region classification unit that classifies each of the plurality of regions into one of a plurality of classes based on the feature amounts calculated by the feature amount calculation unit and a predetermined first classification criterion; a classification criterion setting unit that sets a second classification criterion based on the feature amounts and the classification result of the first region classification unit; and a second region classification unit that classifies each of the plurality of regions into one of the plurality of classes based on the feature amounts and the second classification criterion.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The first region classification unit classifies each of the plurality of regions into one of the plurality of classes using a statistical classifier with parameters that define the first classification criterion, and the second region classification unit classifies each of the plurality of regions into one of the plurality of classes using the statistical classifier with parameters that define the second classification criterion.
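The two-step scheme, an initial classification with a predetermined criterion followed by re-classification with a criterion re-estimated from the first pass, could be realized for example with a simple class-wise Gaussian classifier, as sketched below. The Gaussian model, the score formula, and the regularization constant are assumptions; the text only speaks of a statistical classifier whose parameters define each classification criterion.

```python
import numpy as np

class GaussianClassifier:
    """Minimal statistical classifier: one Gaussian per class."""

    def __init__(self, means, covariances):
        self.means = {k: np.asarray(v, float) for k, v in means.items()}
        self.covs = {k: np.asarray(v, float) for k, v in covariances.items()}

    def classify(self, features):
        labels = []
        for x in np.asarray(features, float):
            scores = {}
            for k in self.means:
                d = x - self.means[k]
                inv = np.linalg.inv(self.covs[k])
                # Gaussian log-likelihood up to constants, used as the score.
                scores[k] = -float(d @ inv @ d) - float(np.log(np.linalg.det(self.covs[k])))
            labels.append(max(scores, key=scores.get))
        return labels

def reclassify_with_adapted_criterion(features, initial_clf):
    """First pass with the predetermined (first) criterion, then a second pass
    with parameters re-estimated from the regions assigned to each class,
    playing the role of the second classification criterion."""
    features = np.asarray(features, float)
    first = initial_clf.classify(features)
    means, covs = {}, {}
    for k in initial_clf.means:
        member = np.array([f for f, lab in zip(features, first) if lab == k])
        if len(member) > 1:
            means[k] = member.mean(axis=0)
            covs[k] = np.cov(member, rowvar=False) + 1e-6 * np.eye(features.shape[1])
        else:
            # Too few samples: keep the initial criterion for this class.
            means[k], covs[k] = initial_clf.means[k], initial_clf.covs[k]
    return GaussianClassifier(means, covs).classify(features)
```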
- The twenty-sixth image processing apparatus of the present invention comprises: an image signal input unit that inputs an image signal based on an image captured by a medical device having an imaging function; an image dividing unit that divides the image captured by the medical device into a plurality of regions based on the image signal input by the image signal input unit; a feature amount calculation unit that calculates a feature amount in each of the plurality of regions divided by the image dividing unit; a first region classification unit that classifies each of the plurality of regions into one of a plurality of classes based on the feature amounts calculated by the feature amount calculation unit and a predetermined first classification criterion; an evaluation value calculation unit that evaluates the classification result of the first region classification unit for one of the plurality of regions by calculating an evaluation value based on the classification results of the first region classification unit for the regions located in the vicinity of that region; and a second region classification unit that classifies the one region into one of the plurality of classes based on the evaluation value from the evaluation value calculation unit.
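One straightforward way to evaluate a region against its neighbours' classifications is a vote over the surrounding regions, as in the sketch below. Treating the 8-neighbourhood vote count as the evaluation value and requiring at least five agreeing neighbours are assumptions made for illustration.

```python
import numpy as np

def smooth_labels_by_neighborhood(label_grid: np.ndarray,
                                  min_agreement: int = 5) -> np.ndarray:
    """Re-classify each region from the classification of nearby regions.

    label_grid    -- 2-D array of integer class labels, one per divided region.
    min_agreement -- assumed evaluation threshold on the neighbourhood vote.
    """
    h, w = label_grid.shape
    out = label_grid.copy()
    for i in range(h):
        for j in range(w):
            neighbours = [
                label_grid[y, x]
                for y in range(max(0, i - 1), min(h, i + 2))
                for x in range(max(0, j - 1), min(w, j + 2))
                if (y, x) != (i, j)
            ]
            values, counts = np.unique(neighbours, return_counts=True)
            k = int(np.argmax(counts))
            # Re-assign the region when enough neighbours share one class.
            if counts[k] >= min_agreement:
                out[i, j] = values[k]
    return out
```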
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The twenty-eighth image processing apparatus of the present invention comprises: an image signal input unit that inputs an image signal based on an image captured by a medical device having an imaging function; an image dividing unit that divides the image captured by the medical device into a plurality of regions based on the image signal input by the image signal input unit; a feature amount calculation unit that calculates a feature amount in each of the plurality of regions divided by the image dividing unit; an attention region setting unit that sets one of the plurality of regions as an attention region; a near-outer-periphery region detection unit that detects the near-outer-periphery region, which is the region located a predetermined distance away from the attention region; a substantially circular shape detection unit that detects, based on the feature amounts, that at least a part of a substantially circular contour exists in the near-outer-periphery region; and a region extraction unit that extracts the attention region when the substantially circular shape is detected by the substantially circular shape detection unit.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The substantially circular shape detection unit detects a substantially circular shape when it determines that the proportion of regions in the near-outer-periphery region in which at least a part of the substantially circular contour exists is equal to or greater than a predetermined threshold, and the region extraction unit extracts the attention region as a region in which the central portion of the substantially circular shape exists.
- The substantially circular shape is a bubble.
- The thirty-second image processing apparatus of the present invention is the thirtieth image processing apparatus, wherein the substantially circular shape is a bubble.
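The bubble test above can be pictured as examining a ring of regions at a fixed distance around the attention region and counting how many of them contain part of the circular contour. The following sketch makes that concrete; the ring radius, ring tolerance, and 0.7 ratio threshold are all illustrative assumptions.

```python
import numpy as np

def is_bubble_center(edge_grid: np.ndarray, i: int, j: int,
                     radius: int = 3, ratio_threshold: float = 0.7) -> bool:
    """Check whether the region of interest (i, j) is the centre of a
    substantially circular (bubble-like) contour.

    edge_grid -- 2-D boolean array, True where a divided region was judged
                 to contain a contour fragment.
    """
    h, w = edge_grid.shape
    ring, hits = 0, 0
    for y in range(h):
        for x in range(w):
            # Regions roughly 'radius' away form the near-outer-periphery
            # region examined for the contour.
            d = float(np.hypot(y - i, x - j))
            if radius - 0.5 <= d <= radius + 0.5:
                ring += 1
                hits += int(bool(edge_grid[y, x]))
    if ring == 0:
        return False
    # The attention region is extracted as the bubble centre when the contour
    # is present in a sufficient fraction of the surrounding ring.
    return hits / ring >= ratio_threshold
```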
- The thirty-third image processing apparatus of the present invention comprises: an image signal input unit that inputs an image signal based on an image captured by a medical device having an imaging function; an image dividing unit that divides the image captured by the medical device into a plurality of regions based on the image signal input by the image signal input unit; a feature amount calculation unit that calculates a feature amount in each of the plurality of regions divided by the image dividing unit; a region classification unit that classifies each of the plurality of regions into one of a plurality of classes based on the feature amounts calculated by the feature amount calculation unit and a predetermined classification criterion; a region detection unit that detects, from the plurality of regions, the regions classified into a predetermined class set in advance as a class having structurally clear features among the plurality of classes; and a classification criterion setting unit that sets the predetermined classification criterion of the region classification unit based on the feature amounts of the regions detected by the region detection unit.
- The predetermined class is at least one of a foam class and a villus class.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The first image processing method of the present invention comprises: an image input step of inputting an image captured by a medical device having an imaging function; a region dividing step of dividing the image into a plurality of regions; a feature amount calculating step of calculating a feature amount from each of the plurality of regions; and a region classification step of classifying each of the plurality of regions, based on the feature amounts, as either a region in which the surface of the biological mucosa is imaged or a region in which a non-biological mucosa is imaged.
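The region dividing step that opens each of these methods is simply a partition of the input image into rectangular blocks (the drawings later refer to an (m x n) division). A minimal sketch follows; the 8 x 8 grid is an assumed example, as the excerpt only says the image is divided into a plurality of regions.

```python
import numpy as np

def divide_into_regions(image: np.ndarray, m: int = 8, n: int = 8):
    """Divide an image into (m x n) rectangular regions (blocks)."""
    h, w = image.shape[:2]
    bh, bw = h // m, w // n
    regions = []
    for i in range(m):
        for j in range(n):
            # Each block is a view into the original image; trailing pixels
            # that do not fill a whole block are ignored in this sketch.
            regions.append(image[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw])
    return regions
```

Feature amounts such as those sketched earlier would then be computed for each of these regions before classification.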
- The second image processing method of the present invention comprises: an image input step of inputting an image captured by a medical device having an imaging function; a region dividing step of dividing the image into a plurality of regions; a feature amount calculating step of calculating a feature amount from each of the plurality of regions; and a region classification step of classifying each of the plurality of regions into one of a plurality of different classes based on the feature amounts.
- The region classification step classifies each of the plurality of regions as a region in which the surface of the biological mucosa is imaged, a region in which a non-biological mucosa is imaged, or a region corresponding to neither the biological mucosal surface nor the non-biological mucosa.
- The third image processing method of the present invention comprises: an image input step of inputting an image captured by a medical device having an imaging function; a region dividing step of dividing the image into a plurality of regions; a feature amount calculating step of calculating a feature amount from each of the plurality of regions; a region classification step of classifying each of the plurality of regions into one of a plurality of different classes based on the feature amounts; an exclusive class setting step of setting a combination of exclusive classes that are not allowed to be mixed in one image; and a priority class setting step of setting which class is given priority within the combination of exclusive classes.
- When there are regions classified into the classes of the exclusive class combination set in the exclusive class setting step, the region classification step classifies those regions into the class set in the priority class setting step.
- The fourth image processing method of the present invention is the third image processing method further comprising a classification determination value calculating step of calculating, based on the classification result of the region classification step, the proportion of the plurality of regions occupied by the group of regions classified into one class included in the exclusive class combination, wherein the priority class setting step sets the priority class based on the proportion calculated in the classification determination value calculating step.
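A small sketch of this exclusive-class resolution is given below. The example pair of exclusive classes and the rule of giving priority to the class occupying the larger proportion (as suggested by the fourth method) are assumptions for illustration.

```python
from typing import List, Tuple

def resolve_exclusive_classes(labels: List[str],
                              exclusive_pair: Tuple[str, str] = ("villus", "feces")) -> List[str]:
    """Reassign regions so that two exclusive classes do not coexist in one
    image: every region classified into either member of the pair is moved
    to the priority class."""
    a, b = exclusive_pair
    count_a, count_b = labels.count(a), labels.count(b)
    if count_a == 0 or count_b == 0:
        return labels  # no mixing, nothing to resolve
    # Priority class chosen here as the member with the larger proportion.
    priority = a if count_a >= count_b else b
    return [priority if lab in (a, b) else lab for lab in labels]
```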
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The eighth image processing method of the present invention is the first image processing method further comprising a determination step of determining whether or not the image captures the surface of the biological mucous membrane based on the classification results of the plurality of regions in the region classification step.
- The ninth image processing method of the present invention is the second image processing method further comprising a determination step of determining whether or not the image captures the surface of the biological mucous membrane based on the classification results of the plurality of regions in the region classification step.
- The tenth image processing method of the present invention is the fourth image processing method further comprising a determination step of determining whether or not the image captures the surface of the biological mucous membrane based on the classification results of the plurality of regions in the region classification step.
- The eleventh image processing method of the present invention is the eighth image processing method further comprising: a classification determination value calculating step of calculating, based on the classification result of the region classification step, the proportion of the plurality of regions occupied by the group of regions classified into a predetermined class among the plurality of different classes; and an image classification step of classifying the image having the group of regions classified into the predetermined class based on the calculated proportion and a predetermined threshold relating to the proportion. Based on the classification result of the image classification step, the determination step determines that the image is an image in which the biological mucosa is not sufficiently captured when the proportion is equal to or less than the predetermined threshold, and determines that the image is an image in which the biological mucosa is sufficiently captured when the proportion is greater than the predetermined threshold.
- The twelfth image processing method of the present invention is the ninth image processing method further comprising: a classification determination value calculating step of calculating, based on the classification result of the region classification step, the proportion of the plurality of regions occupied by the group of regions classified into a predetermined class among the plurality of different classes; and an image classification step of classifying the image having the group of regions classified into the predetermined class based on the calculated proportion and a predetermined threshold relating to the proportion. Based on the classification result of the image classification step, the determination step determines that the image is an image in which the biological mucosa is not sufficiently captured when the proportion is equal to or less than the predetermined threshold, and determines that the image is an image in which the biological mucosa is sufficiently captured when the proportion is greater than the predetermined threshold.
- The thirteenth image processing method of the present invention is the tenth image processing method further comprising: a classification determination value calculating step of calculating, based on the classification result of the region classification step, the proportion of the plurality of regions occupied by the group of regions classified into a predetermined class among the plurality of different classes; and an image classification step of classifying the image having the group of regions classified into the predetermined class based on the proportion calculated in the classification determination value calculating step and a predetermined threshold relating to the proportion. Based on the classification result of the image classification step, the determination step determines that the image is an image in which the biological mucosa is not sufficiently captured when the proportion is equal to or less than the predetermined threshold, and determines that the image is an image in which the biological mucosa is sufficiently captured when the proportion is greater than the predetermined threshold.
- The fourteenth image processing method of the present invention comprises: an image input step of inputting an image captured by a medical device having an imaging function; a region dividing step of dividing the image into a plurality of regions; a feature amount calculating step of calculating a feature amount in each of the plurality of regions; a region classification step of identifying, based on the feature amounts, which of a plurality of classes each of the plurality of regions belongs to and classifying each region according to the identification result; and an imaged organ estimation step of estimating the organ imaged by the medical device based on the classification result of the region classification step.
- Further provided are a classification determination value calculating step of calculating, based on the classification result of the region classification step, the proportion of the plurality of regions occupied by the group of regions classified into a predetermined class among the plurality of classes, and an image classification step of classifying the image having the group of regions classified into the predetermined class based on the proportion calculated in the classification determination value calculating step and a predetermined threshold relating to the proportion. The imaged organ estimation step specifies the organ imaged in the image based on the classification result of the image classification step.
- The plurality of classes include at least a gastric mucosa class, a villus class, and a feces class.
- In one such method, it is determined that the organ imaged in the image is the stomach.
- In another, it is determined that the organ imaged in the image is the small intestine.
- In another, it is determined that the organ imaged in the image is the large intestine.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The twenty-first image processing method of the present invention comprises: an image dividing step of dividing an image captured by a medical device having an imaging function into a plurality of regions, based on an image signal input in an image signal input unit that inputs an image signal corresponding to the captured image; a feature amount calculating step of calculating a feature amount in each of the plurality of regions divided in the image dividing step; a first region classification step of classifying each of the plurality of regions into one of a plurality of classes based on the feature amounts calculated in the feature amount calculating step and a predetermined first classification criterion; a classification criterion setting step of setting a second classification criterion based on the classification result of the first region classification step; and a second region classification step of classifying each of the plurality of regions into one of the plurality of classes based on the feature amounts and the second classification criterion.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The first region classification step classifies each of the plurality of regions into one of the plurality of classes using a statistical classifier with parameters that define the first classification criterion, and the second region classification step classifies each of the plurality of regions into one of the plurality of classes using the statistical classifier with parameters that define the second classification criterion.
- Another image processing method of the present invention comprises: an image dividing step of dividing an image captured by a medical device having an imaging function into a plurality of regions; a feature amount calculating step of calculating a feature amount in each of the plurality of divided regions; a first region classification step of classifying each of the plurality of regions into one of a plurality of classes based on the feature amounts calculated in the feature amount calculating step and a predetermined first classification criterion; an evaluation value calculating step of evaluating the classification result of the first region classification step for one of the plurality of regions by calculating an evaluation value based on the classification results of the first region classification step for the regions located in the vicinity of that region; and a second region classification step of classifying the one region into one of the plurality of classes based on the evaluation value from the evaluation value calculating step.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The twenty-sixth image processing method of the present invention comprises: an image dividing step of dividing an image captured by a medical device having an imaging function into a plurality of regions, based on an image signal input in an image signal input unit that inputs an image signal corresponding to the captured image; a feature amount calculating step of calculating a feature amount in each of the plurality of regions divided in the image dividing step; an attention region setting step of setting one of the plurality of regions as an attention region; a step of detecting the near-outer-periphery region, which is the region located a predetermined distance away from the attention region; a substantially circular shape detecting step of detecting, based on the feature amounts, that at least a part of a substantially circular contour exists in the near-outer-periphery region; and a region extracting step of extracting the attention region when the substantially circular shape is detected in the substantially circular shape detecting step.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The substantially circular shape detecting step detects a substantially circular shape when it determines that the proportion of regions in the near-outer-periphery region in which at least a part of the substantially circular contour exists is equal to or greater than a predetermined threshold, and the region extracting step extracts the attention region as a region in which the central portion of the substantially circular shape exists.
- the substantially circular shape is a bubble.
- the substantially circular shape is a bubble.
- The thirty-first image processing method of the present invention comprises: an image dividing step of dividing an image captured by a medical device having an imaging function into a plurality of regions, based on an image signal input in an image signal input unit that inputs an image signal corresponding to the captured image; a feature amount calculating step of calculating a feature amount in each of the plurality of regions divided in the image dividing step; a region classification step of classifying each of the plurality of regions into one of a plurality of classes based on the feature amounts calculated in the feature amount calculating step and a predetermined classification criterion; a region detecting step of detecting, from the plurality of regions, the regions classified into a predetermined class set in advance as a class having structurally clear features among the plurality of classes; and a classification criterion setting step of setting the predetermined classification criterion of the region classification step based on the feature amounts of the regions detected in the region detecting step.
- The feature amount includes at least one of a feature amount relating to color tone and a feature amount relating to texture.
- The thirty-third image processing method of the present invention is the thirty-first image processing method, wherein the predetermined class is at least one of a foam class and a villus class.
- FIG. 1 An image processing apparatus and peripheral devices in which an image processing operation according to the first embodiment is performed.
- FIG. 2 is an enlarged cross-sectional view showing a main part of a capsule endoscope that generates predetermined image information to be processed in the image processing apparatus according to the first embodiment.
- FIG. 3 A block diagram showing a schematic internal configuration of a capsule endoscope apparatus that supplies predetermined image information to the image processing apparatus of the first embodiment.
- FIG. 4 A diagram showing an example of use of the capsule endoscope apparatus that supplies predetermined image information to the image processing apparatus of the first embodiment.
- FIG. 5 A timing chart showing an example of the signal output from the capsule endoscope shown in FIG.
- FIG. 7 An enlarged cross-sectional view showing the main part of the antenna unit when the capsule endoscope apparatus shown in FIG. 3 is used.
- FIG. 9 An explanatory diagram for explaining a state in which the external device of the capsule endoscope apparatus shown in FIG. 3 is attached to the subject.
- FIG. 10 A block diagram showing the electrical configuration of the capsule endoscope shown in FIG.
- FIG. 11 is a flowchart showing an image processing operation according to the first embodiment.
- FIG. 12 is a diagram showing an example when an input image is divided into (m X n) regions in the image processing operation according to the first embodiment.
- FIG. 13 is a diagram showing an example of an image of the gastric mucosa among a plurality of images constituting the teacher data.
- FIG. 14 A diagram showing an example of a villi image among a plurality of images constituting the teacher data.
- FIG. 15 A diagram showing an example of a fecal image among a plurality of images constituting the teacher data.
- FIG. 16 A diagram showing an example of a bubble image among a plurality of images constituting the teacher data.
- FIG. 17 A schematic diagram showing an example of an image of a body cavity imaged by a capsule endoscope.
- FIG. 18 A diagram showing an example of the classification result of the image shown in FIG. 17.
- FIG. 19 is a flowchart showing an image processing operation different from FIG. 11 in the image processing operation according to the first embodiment.
- FIG. 20 is a diagram showing an example of a main menu screen among the images on the view displayed on the display.
- FIG. 21 is a flowchart showing an image processing operation according to the second embodiment.
- FIG. 22 is a flowchart showing an image processing operation according to the third embodiment.
- FIG. 23 is a flowchart showing an image processing operation according to the fourth embodiment.
- FIG. 24 is a flowchart showing a part of an image processing operation according to the fifth embodiment.
- FIG. 25 is a flowchart showing an image display control operation performed after the processing shown in the flowchart of FIG. 24 is performed as part of the image processing operation according to the fifth embodiment.
- FIG. 26 is a flowchart showing a part of an image processing operation according to the sixth embodiment.
- FIG. 29A is a diagram showing one of eight directions serving as an index when determining an edge feature amount (edge feature vector) in an image processing operation according to the sixth embodiment.
- FIG. 29B is a diagram showing one direction different from FIG. 29A among eight directions serving as an index when determining the edge feature amount in the image processing operation according to the sixth embodiment.
- FIG. 29C is a diagram showing one direction different from FIG. 29A and FIG. 29B among the eight directions that serve as indices when determining the edge feature amount in the image processing operation according to the sixth embodiment.
- FIG. 29D is a diagram showing one direction different from FIGS. 29A to 29C out of the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the sixth embodiment.
- FIG. 29E is a diagram showing one direction different from FIGS. 29A to 29D out of the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the sixth embodiment.
- FIG. 29F is a diagram showing one direction different from FIGS. 29A to 29E out of the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the sixth embodiment.
- FIG. 29G is a diagram showing one direction different from FIGS. 29A to 29F out of the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the sixth embodiment.
- FIG. 29H A diagram showing one direction different from FIGS. 29A to 29G out of the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the sixth embodiment.
- FIG. 30 A diagram showing the positional relationship between the central area and the outermost peripheral area set in the image processing operation according to the sixth embodiment.
- FIG. 31 A diagram showing an angle formed by the direction of the edge feature vector and the direction of the vector VI set in the image processing operation according to the sixth embodiment.
- FIG. 32 is a flowchart showing a part of an image processing operation according to the seventh embodiment.
- FIG. 34 A diagram showing an example when an input image is divided into (m X n) regions in the image processing operation according to the seventh embodiment.
- FIG. 35 is a flowchart showing an image processing operation in the image processing apparatus according to the eighth embodiment.
- FIG. 36 is a flowchart showing an image processing operation in the image processing apparatus according to the eighth embodiment.
- FIG. 37 is a diagram showing an example of determining a neighboring region of one region in the image processing operation according to the eighth embodiment.
- FIG. 38 is a schematic view showing an example of an image of the body cavity taken by a capsule endoscope, used in the image processing operation of the eighth embodiment.
- FIG. 39 is a diagram showing an example of the classification result of the image shown in FIG. 38.
- FIG. 40 is a diagram showing the reclassification result after performing the image processing operation of the eighth embodiment based on the classification result shown in FIG. 39.
- FIG. 41 is a flowchart showing an image processing operation in the ninth embodiment.
- FIG. 42 is a flowchart showing an image processing operation in the ninth embodiment.
- FIG. 43 is a diagram showing an example of an array of numbers virtually assigned to each small rectangular area having 4 ⁇ 4 pixels in the image processing operation according to the ninth embodiment.
- FIG. 44 is a diagram showing a positional relationship of a neighboring outer peripheral area Ht with respect to one rectangular area RO in an image processing operation according to the ninth embodiment.
- FIG. 45 is a diagram showing an example of an angle θt formed by an approximate gradient vector Ygi and a direction vector in the image processing operation according to the ninth embodiment.
- FIG. 46 is a schematic diagram showing an example of an image of a body cavity imaged by a capsule endoscope used in the ninth embodiment.
- FIG. 47 is a diagram showing an example of the classification result of the image shown in FIG. 46.
- FIG. 1 is an external front view showing the external appearance of an image processing apparatus and peripheral devices that perform an image processing operation according to the first embodiment of the present invention.
- FIG. 2 is an enlarged cross-sectional view showing a main part of a capsule endoscope that generates predetermined image information to be processed by the image processing apparatus of the present embodiment.
- FIG. 3 is a block diagram showing a schematic internal configuration of the capsule endoscope apparatus that supplies predetermined image information to the image processing apparatus of the present embodiment.
- FIG. 4 is a diagram showing an example of use of the capsule endoscope apparatus that supplies predetermined image information to the image processing apparatus of the present embodiment.
- FIG. 5 is a timing chart showing an example of signals output from the capsule endoscope shown in FIG.
- FIG. 6 is an explanatory diagram for explaining position detection of the capsule endoscope shown in FIG.
- FIG. 7 is an enlarged cross-sectional view showing a main part of the antenna unit when the capsule endoscope apparatus shown in FIG. 3 is used.
- FIG. 8 is an explanatory view for explaining a shield jacket when the capsule endoscope apparatus shown in FIG. 3 is used.
- FIG. 9 is an explanatory view for explaining a state in which the external device of the capsule endoscope apparatus shown in FIG. 3 is attached to the subject.
- FIG. 10 is a block diagram showing an electrical configuration of the capsule endoscope shown in FIG.
- FIG. 11 is a flowchart showing an image processing operation according to the present embodiment.
- FIG. 12 is a diagram showing an example in which the input image is divided into (m x n) regions in the image processing operation according to the present embodiment.
- FIG. 13 is a diagram illustrating an example of an image of the gastric mucosa among a plurality of images constituting the teacher data.
- FIG. 14 is a diagram illustrating an example of a villi image among a plurality of images constituting the teacher data.
- FIG. 15 is a diagram illustrating an example of a fecal image among a plurality of images constituting the teacher data.
- FIG. 16 is a diagram illustrating an example of a bubble image among a plurality of images constituting the teacher data.
- FIG. 17 is a schematic diagram illustrating an example of an image of a body cavity imaged by a capsule endoscope.
- FIG. 18 is a diagram showing an example of the classification result for the image shown in FIG. 17.
- FIG. 19 is a flowchart showing an image processing operation different from FIG. 11 in the image processing operation according to the present embodiment.
- FIG. 20 is a diagram showing an example of the main menu screen among the viewer screens displayed on the display.
- the capsule endoscope apparatus 1 that supplies predetermined image information to the image processing apparatus according to the first embodiment of the present invention includes, as its main parts, a capsule endoscope 3, an antenna unit 4, and an external device 5.
- the capsule endoscope 3 as a medical device is placed in the body cavity by being swallowed from the mouth of the patient 2 as the subject, and then advances through the digestive tract by peristaltic motion. It has an imaging function for imaging the inside of the body cavity and generating captured image information, and a transmission function for transmitting the captured image information to the outside of the body.
- the antenna unit 4 is installed on the body surface of the patient 2 and has a plurality of receiving antennas that receive the captured image information transmitted from the capsule endoscope 3.
- the external device 5 has an outer shape formed in a box shape, which will be described in detail later.
- a liquid crystal monitor 12 for displaying the captured image and an operation unit 13 for giving operation instructions for various functions are provided on the exterior surface of the external device 5.
- the external device 5 is provided with an LED for displaying a warning regarding the remaining amount of the battery for the driving power source and an operation unit 13 including a switch such as a power switch on the surface of the exterior.
- the capsule endoscope 3 is provided with a calculation execution unit using a CPU and a memory.
- the calculation execution unit receives and stores the captured image information, and a configuration may be employed in which the image processing method according to the present invention described later is executed by this calculation execution unit.
- the external device 5 is detachably attached to the body of the patient 2, and by being attached to the cradle 6 as shown in FIG. 1, it is detachably connected to the device that performs the image processing according to the first embodiment of the present invention (hereinafter referred to as a terminal device) 7.
- the terminal device 7 is, for example, a personal computer, and displays a terminal main body 9 having various data processing functions and storage functions, a keyboard 8a and a mouse 8b for various operation processing inputs, and various processing results. And a display 8c having a function as a display unit.
- the terminal device 7 has, as a basic function, a function of taking in the captured image information stored in the external device 5 via the cradle 6, writing and storing it in a rewritable memory built into the terminal body 9 or in a portable memory such as a rewritable semiconductor memory detachable from the terminal body 9, and displaying the stored captured image information on the display 8c.
- the captured image information stored in the external device 5 may be taken into the terminal device 7 by a USB cable or the like instead of the cradle 6.
- as the image processing performed by the terminal device 7, processing for selecting an image to be displayed according to the elapsed time from the captured image information taken in from the external device 5 and stored in a memory or the like having a function as a storage unit (not shown), and processing based on the image processing method according to the present invention described later, are performed in the control unit 9a of the terminal body 9.
- the control unit 9a is composed of a CPU or the like.
- the processing result can be temporarily held in a register or the like (not shown).
- the capsule endoscope 3 consists of an exterior member 14 having a U-shaped cross section and a substantially hemispherical cover member 14a formed of a transparent member that is watertightly attached to the open end of the exterior member 14 with an adhesive. Therefore, the exterior of the capsule endoscope 3 is formed to have a watertight structure and a capsule shape in a state where the exterior member 14 and the cover member 14a are connected.
- in the hollow portion inside the capsule shape formed by the exterior member 14 and the cover member 14a, an objective lens 15 that takes in an image of the observation site incident through the cover member 14a is housed in a lens frame 16 and arranged at a portion corresponding to the approximate center of the arc of the hemisphere of the cover member 14a.
- a charge coupled device (hereinafter referred to as CCD) 17 that is an imaging device is disposed at the imaging position of the objective lens 15.
- four white LEDs 18 that emit illumination light are arranged on the same plane (only two LEDs are shown in the figure).
- further provided are a processing circuit 19 that performs imaging processing, in which the CCD 17 is driven to generate a photoelectrically converted imaging signal and the imaging signal is subjected to predetermined signal processing to generate a captured image signal, and LED driving processing that controls the lighting and non-lighting of the LED 18, and a communication processing circuit 20 that converts the captured image signal generated by the imaging processing of the processing circuit 19 into a wireless signal and transmits it.
- the CCD 17, the LED 18, the processing circuit 19, the communication processing circuit 20, and the transmission antenna 23 are arranged on a substrate (not shown), and the substrates are connected by a flexible substrate (not shown).
- the processing circuit 19 includes an arithmetic circuit (not shown) for performing image processing to be described later. That is, as shown in FIG. 3, the capsule endoscope 3 includes an imaging device 43 having the CCD 17, LED 18, and processing circuit 19, a transmitter 37 having the communication processing circuit 20, and a transmission antenna 23. Have.
- the image pickup device 43 includes an LED driver 18A for controlling the lighting and non-lighting of the LED 18, a CCD driver 17A for controlling the drive of the CCD 17 and transferring the photoelectrically converted charge, a processing circuit 19A that generates an imaging signal using the charge transferred from the CCD 17 and generates a captured image signal by performing predetermined signal processing on the imaging signal, a switch section that supplies the drive power from the battery 21 to the LED driver 18A, the CCD driver 17A, the processing circuit 19A, and the transmitter 37, and a timing generator 19B that supplies a timing signal to the switch section and the CCD driver 17A.
- the switch unit switches on / off the power supply from the battery 21 to the LED driver 18A, and the power supply to the CCD 17, the CCD driver 17A, and the processing circuit 19A.
- the timing generator 19B is always supplied with driving power from the battery 21.
- each part other than the timing generator 19B is in a non-operating state.
- the switch 19D is turned on, whereby the CCD 17, the CCD driver 17A, and the processing circuit 19A supplied with power from the battery 21 are in an operating state.
- the electronic shutter of the CCD 17 is operated to remove unnecessary dark current, and then the timing generator 19B turns on the switch 19C to drive the LED driver 18A, light the LED 18, and expose the CCD 17. The LED 18 is turned on for the predetermined time required for exposure of the CCD 17, and is then turned off at the timing when the switch 19C is turned off in order to reduce power consumption.
- the charge stored within the predetermined time when the CCD 17 is exposed is transferred to the processing circuit 19A under the control of the CCD driver 17A.
- the processing circuit 19A generates an image pickup signal based on the charge transferred by the CCD 17 and performs predetermined signal processing on the image pickup signal to generate an endoscope image signal.
- when the signal transmitted from the transmitter 37 is of an analog wireless system, the processing circuit 19A generates an analog imaging signal in which a composite synchronization signal is superimposed on the CDS output signal, and then outputs the analog imaging signal to the transmitter 37 as an endoscope image signal.
- when the signal transmitted is of a digital wireless system, the processing circuit 19A further performs encoding processing such as scrambling on the serial digital signal generated by an analog/digital converter to generate a digital captured image signal, and outputs the digital captured image signal to the transmitter 37 as an endoscope image signal.
- the transmitter 37 performs modulation processing on the analog captured image signal or the digital captured image signal, which is the endoscope image signal supplied from the processing circuit 19A, and wirelessly transmits it to the outside from the transmission antenna 23.
- the switch 19E is turned on and off by the timing generator 19B so that drive power is supplied to the transmitter 37 only at the timing when the captured image signal is output from the processing circuit 19A.
- the switch 19E may be controlled so that driving power is supplied to the transmitter 37 after a predetermined time has elapsed since the captured image signal was output from the processing circuit 19A.
- the switch 19E may be controlled so that drive power is supplied to the transmitter 37 when a signal is output from the timing generator 19B based on a detection result such as detection of a predetermined pH value by a pH sensor (not shown) provided in the capsule endoscope 3, detection of humidity higher than a predetermined value by a humidity sensor (not shown), or detection of pressure or acceleration exceeding a predetermined value by an acceleration sensor (not shown), that is, when it is detected that the capsule endoscope 3 has been inserted into the body cavity of the patient 2 as the subject.
- the capsule endoscope 3 is provided with a timer circuit (not shown), and this timer circuit, for example, provides high-speed imaging with a large number of images per second within a predetermined time, for a predetermined time. After the elapse of time, the drive of the imaging device 43 is controlled so as to achieve low-speed imaging with a small number of images taken per second.
- the timer circuit is activated when the power of the capsule endoscope 3 is turned on, and the drive of the imaging device 43 may be controlled by this timer circuit so that, for example, high-speed imaging is performed until the capsule endoscope passes through the esophagus immediately after being swallowed by the patient 2. Furthermore, a capsule endoscope for low-speed imaging and a capsule endoscope for high-speed imaging may be provided separately and used selectively according to the site to be observed.
- the antenna unit 4 installed on the body surface of the patient 2 will be described.
- the patient 2 wears a jacket 10 in which an antenna unit 4 including a plurality of receiving antennas 11 is installed.
- this antenna unit 4 has a plurality of receiving antennas 11 with a unidirectional directivity, such as patch antennas used for GPS, arranged with their directivity directed toward the inside of the body of the patient 2.
- the plurality of antennas 11 are disposed so as to surround the capsule main body 3D inside the body. By using these highly directional antennas 11, the antenna unit is made less susceptible to interference from radio waves originating from other than the capsule main body 3D inside the body.
- the jacket 10 is a shield jacket 72 formed of electromagnetic shielding fibers that covers the antenna unit 4 installed on the body surface of the patient 2 and the main body 5D of the external device 5 installed on the waist of the patient 2 by a belt.
- as the electromagnetic shielding fibers forming the shield jacket 72, metal fibers, metal chemical fibers, copper-sulfide-containing fibers, or the like are used.
- the shield jacket 72 is not limited to the jacket shape, and may be, for example, a vest or a one-piece shape.
- a keyhole 74 is provided in the external main body 5D of the external device 5, and a key 75 provided on the shield jacket 72 is inserted into the keyhole 74, so that the shield jacket 72 can be detachably attached to the belt 73.
- a pocket may be simply provided in the shield jacket 72, and the external main body 5D may be accommodated in the pocket, or Velcro (registered trademark) may be attached to the external main body 5D and the shield jacket 72 of the external device 5. It may be installed and fixed with the Velcro (registered trademark).
- by putting the shield jacket 72 over the body on which the antenna unit 4 is arranged, external radio waves directed at the antenna unit 4 are shielded, and the antenna unit is made even less susceptible to interference from external radio waves.
- the antenna unit 4 consists of a plurality of receiving antennas 11a to 11d that receive the radio signal transmitted from the transmission antenna 23 of the capsule endoscope 3, and an antenna switching switch 45 that switches among the antennas 11a to 11d.
- the external device 5 includes a receiving circuit 33 that converts the radio signal from the antenna switching switch 45 into a captured image signal and performs reception processing such as amplification, a signal processing circuit 35 that performs predetermined signal processing on the captured image signal supplied from the receiving circuit 33 to generate a captured-image display signal and captured image data, a liquid crystal monitor 12 that displays the captured image based on the captured-image display signal generated by the signal processing circuit 35, a memory 47 that stores the captured image data generated by the signal processing circuit 35, and an antenna selection circuit 46 that controls the antenna switching switch 45 according to the magnitude of the radio signal received by the receiving circuit 33.
- the plurality of receiving antennas 11 of the antenna unit 4, shown as receiving antennas 11a to 11d in the figure, receive the radio signal transmitted from the transmission antenna 23 of the capsule endoscope 3 with a certain radio wave intensity.
- among the plurality of receiving antennas 11a to 11d, the receiving antenna that receives the radio signal is sequentially switched by controlling the antenna switching switch 45 with the antenna selection signal from the antenna selection circuit 46 of the external device 5.
- the radio signal received by each of the receiving antennas 11a to 11d sequentially switched by the antenna switching switch 45 is output to the receiver 33.
- in the receiver 33, the reception intensity of the radio signal for each of the receiving antennas 11a to 11d is detected, the positional relationship between each of the receiving antennas 11a to 11d and the capsule endoscope 3 is calculated, and the radio signal is demodulated and the captured image signal is output to the signal processing circuit 35.
- the antenna selection circuit 46 is controlled by the output from the receiver 33.
- it is assumed that the radio signal transmitted from the capsule endoscope 3 is transmitted such that, in the transmission period of one frame of the captured image signal, an intensity reception period, which is a transmission period of a reception intensity signal indicating the reception intensity of the radio signal, and a video signal period, which is the transmission period of the captured image signal, are sequentially repeated.
- the antenna selection circuit 46 is supplied, via the receiving circuit 33, with the reception intensity of the reception intensity signal received by each of the receiving antennas 11a to 11d.
- when the reception intensity of the reception intensity signal of another antenna is higher than that of the antenna currently receiving the image signal, the receiving antenna for the video signal period is switched from the next frame onward.
- in this way, the antenna selection circuit 46, receiving the comparison result, designates the antenna 11i with the maximum reception intensity as the antenna for receiving the image signal.
- in FIG. 6, the case where the capsule endoscope 3 is set at the origin of the three-dimensional coordinates X, Y, and Z will be described as an example.
- three receiving antennas 11a, 11b, and 11c are used, and the distance between the receiving antenna 11a and the receiving antenna 11b is Dab, the distance between the receiving antenna 11b and the receiving antenna 11c is Dbc, and the distance between the receiving antenna 11a and the receiving antenna 11c is Dac.
- further, the receiving antennas 11a to 11c and the capsule endoscope 3 are assumed to have a predetermined distance relationship, with a distance Li between the capsule endoscope 3 and each receiving antenna 11j.
- this distance Li is calculated in advance using relational data such as the amount of radio wave attenuation with respect to the distance between the capsule endoscope 3 and the receiving antenna 11j.
- the calculated distance data indicating the positional relationship between the capsule endoscope 3 and each receiving antenna 11j is stored in the memory 47 as position information of the capsule endoscope 3. The captured image information and the position information of the capsule endoscope 3 stored in the memory 47 are useful for setting the position of endoscopic observation findings in the image processing method described later, which is performed by the terminal device 7.
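- the specification does not give the concrete attenuation model or position solver; as a minimal sketch only, the following Python snippet illustrates how a distance Li derived from received intensity could be combined over the antenna positions by least-squares trilateration. The log-distance path-loss model, its parameters, and the antenna coordinates are hypothetical assumptions, not values from the text.

```python
import numpy as np

def distance_from_intensity(rssi_dbm, tx_power_dbm=-20.0, path_loss_exp=2.0):
    """Estimate a distance Li (in metres) from a received intensity value.

    Uses a hypothetical log-distance path-loss model; in the patent the
    relation between attenuation and distance is pre-calibrated data,
    which is not reproduced here.
    """
    return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(antenna_positions, distances):
    """Least-squares position estimate from the distances Li to each
    receiving antenna 11j (linearised trilateration; with only three
    antennas the result is the minimum-norm least-squares solution)."""
    p = np.asarray(antenna_positions, dtype=float)   # shape (k, 3)
    d = np.asarray(distances, dtype=float)           # shape (k,)
    # Subtract the first sphere equation from the others -> linear system.
    a_mat = 2.0 * (p[1:] - p[0])
    b_vec = (d[0] ** 2 - d[1:] ** 2
             + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2))
    pos, *_ = np.linalg.lstsq(a_mat, b_vec, rcond=None)
    return pos

# Hypothetical example with three receiving antennas 11a, 11b, 11c.
antennas = [(0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.0, 0.4, 0.0)]
received = [-48.0, -52.0, -55.0]                      # intensity in dBm
print(trilaterate(antennas, [distance_from_intensity(r) for r in received]))
```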
- in the following description, the i-th image among the N images (1 ≤ N) taken consecutively in time is denoted as Ii (1 ≤ i ≤ N), and its RGB planes are denoted as Ri, Gi, and Bi, respectively. The k-th pixel in each plane is denoted as rik, gik, and bik, respectively.
- the image processing operation in the image processing apparatus in the present embodiment is performed as a process in the control unit 9a of the terminal body 9 of the terminal device 7 described above.
- control unit 9a having a function as an image signal input unit and an image input unit inputs and receives an image signal based on an image of a body cavity imaged by the capsule endoscope 3.
- as preprocessing, for each of the Ri, Gi, and Bi planes, for example, noise removal by median filtering and inverse γ correction are performed, and halation pixels and dark pixels are detected by threshold-based processing in order to exclude them from the subsequent processing (step S1 in FIG. 11).
- the processing based on the threshold values treats a pixel, for example, as a dark pixel if the density values of rik, gik, and bik are all 10 or less, and as a halation pixel if the density values of rik, gik, and bik are all 230 or more.
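- as a concrete illustration only, a minimal sketch of this threshold-based detection is shown below (Python/NumPy, with the thresholds 10 and 230 taken from the text; the median filtering is indicated with SciPy, and the inverse γ correction is omitted):

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess_plane_set(r, g, b, dark_thr=10, halation_thr=230):
    """Median-filter the Ri, Gi, Bi planes and flag dark / halation pixels
    so that they can be excluded from the subsequent processing (step S1)."""
    r, g, b = (median_filter(p, size=3) for p in (r, g, b))
    dark = (r <= dark_thr) & (g <= dark_thr) & (b <= dark_thr)
    halation = (r >= halation_thr) & (g >= halation_thr) & (b >= halation_thr)
    return r, g, b, dark | halation   # combined mask of pixels to exclude
```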
- the control unit 9a having functions as an image dividing unit and a region setting unit divides each of Ri, Gi, and Bi planes into small regions (step S2 in FIG. 11).
- specifically, the control unit 9a divides each of the Ri, Gi, and Bi planes into small rectangular regions each having lx pixels in the x-axis direction and ly pixels in the y-axis direction, thereby dividing each plane into (m × n) regions.
- when m or n is not an integer, the control unit 9a treats the endmost regions whose size is a fraction as regions having a fractional number of pixels, or excludes them from the subsequent processing.
- next, the control unit 9a having a function as a feature amount calculation unit calculates, in each of the divided regions, color tone information reflecting the difference in color of the imaged object on the image and texture information reflecting the difference in structure of the imaged object on the image, as feature amounts (step S3 in FIG. 11).
- in the following, one of the regions divided by the control unit 9a is denoted as Hst (1 ≤ s ≤ m, 1 ≤ t ≤ n).
- the color tone information calculated by the control unit 9a consists of two feature amounts based on the ratios of the RGB values of each pixel included in one region Hst: the average value of gik/rik (hereinafter referred to as μgst) and the average value of bik/rik (hereinafter referred to as μbst).
- the values μgst and μbst both take small values in a region showing a relatively red color tone, such as the gastric mucosa.
- the values μgst and μbst both take large values in a region showing a relatively white color tone, such as the small intestine.
- the values μgst and μbst take values such that μgst > μbst in a region showing a relatively yellow color tone, such as feces.
- the texture information calculated by the control unit 9a reflects the difference in the structure of the imaged object on the image as described above. The structure of the imaged object on the image appears, for example, as fine structures such as villi on the mucosal surface and as the irregular pattern of feces.
- the texture information calculated by the control unit 9a consists of three feature amounts: the coefficients of variation CVrst, CVgst, and CVbst of the RGB values, obtained by dividing the standard deviations σrst, σgst, and σbst of the RGB values of each pixel included in one region Hst by the average values mrst, mgst, and mbst of the RGB values of each pixel included in the one region Hst. The calculation formulas for the coefficients of variation CVrst, CVgst, and CVbst are shown as the following formulas (1), (2), and (3).
- CVrst = σrst / mrst ... (1)
- CVgst = σgst / mgst ... (2)
- CVbst = σbst / mbst ... (3)
- the coefficients of variation CVrst, CVgst, and CVbst calculated using the above formulas (1), (2), and (3) make it possible to quantify the degree of pixel variation due to the texture structure regardless of the effect of the amount of illumination light supplied to the imaged object.
- the values of CVrst, CVgst, and CVbst all take similarly small values in a region where the structure on the image is relatively flat and there is no clear texture structure, such as the gastric mucosa imaged during normal observation without magnification.
- the values of CVrst, CVgst, and CVbst all take similarly large values in a region where the structure on the image includes a relatively large number of edges, such as the villi of the small intestine.
- in the subsequent processing performed by the control unit 9a of the present embodiment, the five feature amounts consisting of the color tone information and the texture information are used. The values constituting the feature amounts can be changed or added to as appropriate according to the user's purpose.
- instead of the values μgst and μbst as the color tone information, the control unit 9a may perform the subsequent processing using values expressed as chromaticity, that is, the ratios of rik, gik, and bik in each pixel of each region, namely the values of rik/(rik + gik + bik), gik/(rik + gik + bik), and bik/(rik + gik + bik).
- the control unit 9a calculates the five feature amounts consisting of the color tone information and the texture information, that is, the values of μgst, μbst, CVrst, CVgst, and CVbst, for each of the (m × n) regions Hst, based on the RGB values of each pixel excluding the halation pixels and the dark pixels. In the present embodiment, for example, when the ratio of the sum of the number of halation pixels and the number of dark pixels among the (lx × ly) pixels included in one region Hst exceeds 50%, control may be performed so as to exclude the one region Hst from the subsequent processing.
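- the specification does not fix an implementation; as one possible sketch, the five feature amounts μgst, μbst, CVrst, CVgst, and CVbst could be computed per (lx × ly) region as follows (Python/NumPy, reusing the dark/halation mask from step S1 and the 50 % exclusion rule mentioned above; the array layout and helper names are assumptions):

```python
import numpy as np

def region_features(r, g, b, invalid, lx, ly, max_invalid_ratio=0.5):
    """Five feature amounts [mu_g, mu_b, CVr, CVg, CVb] for each lx-by-ly
    region Hst; regions whose dark/halation pixels exceed max_invalid_ratio
    are left as NaN (excluded from the subsequent processing).

    r, g, b are float planes with non-zero red values; invalid is the
    boolean exclusion mask from step S1.
    """
    h, w = r.shape
    m, n = w // lx, h // ly                        # number of regions
    feats = np.full((m, n, 5), np.nan)
    for s in range(m):
        for t in range(n):
            win = (slice(t * ly, (t + 1) * ly), slice(s * lx, (s + 1) * lx))
            keep = ~invalid[win]
            if keep.mean() < 1.0 - max_invalid_ratio:
                continue                           # exclude this region Hst
            rr, gg, bb = r[win][keep], g[win][keep], b[win][keep]
            mu_g = np.mean(gg / rr)                # tone feature mu_gst
            mu_b = np.mean(bb / rr)                # tone feature mu_bst
            cv = [np.std(p) / np.mean(p) for p in (rr, gg, bb)]  # (1)-(3)
            feats[s, t] = [mu_g, mu_b, *cv]
    return feats
```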
- subsequently, the control unit 9a having a function as a region classification unit executes identification and classification based on the five feature amounts calculated for each of the divided regions Hst; that is, after identifying, for each region, what the imaged object is, it classifies each region Hst based on the identification result.
- specifically, the control unit 9a calculates the above-mentioned five feature amounts for each region of a plurality of images prepared in advance as teacher data constituting the four classes of gastric mucosa, villi, feces, and bubbles, for example the images shown in FIG. 13, FIG. 14, FIG. 15, and FIG. 16, and then creates a linear discriminant function for each of the four classes.
- using the linear discriminant function created by the above-described procedure as a classifier, the control unit 9a identifies which of the four classes consisting of gastric mucosa, villi, feces, and bubbles one region Hst belongs to, and performs classification based on the identification result.
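- the teacher data and the training procedure are not detailed in the text; purely as a stand-in illustration of training a linear discriminant classifier on per-region feature vectors and applying it region by region, one might write the following (scikit-learn's LinearDiscriminantAnalysis is used as an assumed substitute for the patent's linear discriminant function; the random arrays are placeholders for real teacher data and features):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

CLASSES = ["gastric_mucosa", "villi", "feces", "bubbles"]

# Placeholders standing in for the real teacher data (FIGS. 13-16):
# per-region 5-dim feature vectors and their class indices.
rng = np.random.default_rng(0)
teacher_x = rng.normal(size=(400, 5))
teacher_y = rng.integers(0, len(CLASSES), size=400)

clf = LinearDiscriminantAnalysis().fit(teacher_x, teacher_y)

# feats: (m, n, 5) array from the previous sketch; NaN marks excluded regions.
feats = rng.normal(size=(36, 27, 5))
flat = feats.reshape(-1, 5)
valid = ~np.isnan(flat).any(axis=1)
labels = np.full(flat.shape[0], -1)               # -1 = excluded region
labels[valid] = clf.predict(flat[valid])
label_map = labels.reshape(feats.shape[:2])       # per-region class indices
```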
- the control unit 9a performs the identification and classification described above for all the regions Hst included in the image Ii, and thereby obtains a classification result in which, for example, the regions are classified into villi and bubbles as shown in FIG. 18 (step S4 in FIG. 11).
- the identification and classification performed by the control unit 9a of the present embodiment for each region of the image using the classifier is not limited to that based on the linear discriminant function, and may be based on another technique such as a neural network.
- in the present embodiment, the control unit 9a identifies and classifies each region of the input image as one of the four classes of gastric mucosa, villi, feces, and bubbles; however, the number and types of classes to be classified can be changed or added to as appropriate according to the user's usage.
- for example, the control unit 9a may additionally classify the esophagus or the large intestine mucosa, or may classify the duodenum and the villi as separate classes.
- subsequently, the control unit 9a having a function as a classification determination value calculation unit calculates, based on the above-described classification result, the value of the ratio p of the total number z of regions classified as the biological mucosal surface, that is, classified as gastric mucosa or villi, to the total number of regions (m × n) of the image Ii, based on the following formula (4) (step S5 in FIG. 11).
- p = z / (m × n) ... (4)
- then, in order to determine whether or not the image Ii is an image in which the surface of the biological mucosa is sufficiently captured, the control unit 9a having a function as an image classification unit compares the ratio p calculated by formula (4) with the threshold value thr.
- when the control unit 9a detects that the value of the ratio p in the image Ii is greater than the threshold value thr (step S6 in FIG. 11), the image Ii is identified and classified as an image in which the surface of the biological mucosa is sufficiently captured, that is, an image that needs to be observed, and the flag value flagi as a reference value is set to 1 (step S7 in FIG. 11).
- the value of the threshold value thr is assumed to be 0.5.
- when the control unit 9a detects that the value of the ratio p in the image Ii is equal to or less than the threshold value thr (step S6 in FIG. 11), the image Ii is identified and classified as an image in which the surface of the biological mucosa is not sufficiently captured because of feces, bubbles, and the like, that is, an image that does not need to be observed, and the flag value flagi is set to 0 (step S8 in FIG. 11).
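- as an illustration only, steps S5 to S8 reduce to the following sketch (Python; the per-region label array and the class indices follow the conventions of the earlier sketches and are assumptions):

```python
import numpy as np

def flag_image(label_map, mucosa_classes=(0, 1), thr=0.5):
    """Formula (4): p = z / (m * n), with z the number of regions classified
    as gastric mucosa (0) or villi (1). Returns flagi = 1 when the image
    captures enough biological mucosa and therefore needs to be observed."""
    z = int(np.isin(label_map, mucosa_classes).sum())
    p = z / label_map.size
    return 1 if p > thr else 0
```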
- when the above-described processing has not been completed for all the input images Ii (step S9 in FIG. 11), the control unit 9a adds 1 to the image number i (step S10 in FIG. 11), and continues the processing shown in steps S1 to S9 in FIG. 11 for the next image.
- since the control unit 9a having a function as an image display control unit performs the processing described above, when the user observes the imaged object, the control unit 9a can, based on the flag value flagi, display on the display 8c only the images that need to be observed, whose flag value flagi is 1, and not display on the display 8c the images that do not need to be observed, whose flag value flagi is 0.
- the control unit 9a having a function as an image deletion unit may reduce the size of the image data to be stored by deleting the images whose flag value flagi is 0 and which do not need to be observed.
- in the processing described above, the ratio p is calculated from the total number z of regions classified as gastric mucosa or villi.
- however, the processing performed by the control unit 9a is not limited to this; for example, as described below, the number of regions z1 classified as gastric mucosa and the number of regions z2 classified as villi may be handled individually.
- in other words, the control unit 9a may perform processing such as calculating, individually for each class, the ratio of the number of regions classified into that class to the total number of regions (m × n) of the image Ii.
- first, the control unit 9a obtains the classification results for all the regions Hst included in the image Ii by performing the processing from step S1 to step S4 in FIG. 11 (step S11 in FIG. 19). Then, when the number of regions classified as gastric mucosa is z1, the number of regions classified as villi is z2, and the number of regions classified as feces is z3, the control unit 9a calculates the ratio p1 of the number of regions z1 classified as gastric mucosa to the total number of regions (m × n) of the image Ii, the ratio p2 of the number of regions z2 classified as villi to the total number of regions (m × n) of the image Ii, and the ratio p3 of the number of regions z3 classified as feces to the total number of regions (m × n) of the image Ii (step S12 in FIG. 19).
- then, the control unit 9a compares the value of the ratio p1 with the threshold value thr1.
- when the control unit 9a having a function as an image classification unit detects that the value of the ratio p1 in the image Ii is greater than the threshold value thr1 (step S13 in FIG. 19), the image Ii is identified and classified as an image in which the stomach was imaged as the subject (step S14 in FIG. 19).
- in the present embodiment, the value of the threshold thr1 is assumed to be 0.8.
- when the control unit 9a detects that the value of the ratio p1 in the image Ii is equal to or less than the threshold value thr1 (step S13 in FIG. 19), it then compares the value of the ratio p2 with the threshold value thr2.
- when the control unit 9a having a function as an image classification unit detects that the value of the ratio p2 in the image Ii is greater than the threshold value thr2 (step S15 in FIG. 19), the image Ii is identified and classified as an image in which the small intestine (villi) was imaged as the subject (step S16 in FIG. 19).
- the threshold value thr2 is assumed to be 0.8.
- when the control unit 9a detects that the value of the ratio p2 in the image Ii is equal to or less than the threshold value thr2 (step S15 in FIG. 19), it then compares the value of the ratio p3 with the threshold value thr3.
- when the control unit 9a having a function as an image classification unit detects that the value of the ratio p3 in the image Ii is greater than the threshold value thr3 (step S17 in FIG. 19), the proportion of feces in the image Ii is large, and therefore the image Ii is identified and classified as an image in which the large intestine was imaged as the subject (step S18 in FIG. 19).
- in the present embodiment, the value of the threshold value thr3 is assumed to be 0.8.
- the control unit 9a suspends the identification and classification of images that were not identified and classified as any of the gastric mucosa, the villi, or feces in the above processing. Then, when the above-described processing has not been completed for all the input images Ii (step S19 in FIG. 19), the control unit 9a adds 1 to the image number i (step S20 in FIG. 19), and continues the processing from step S11 to step S19 in FIG. 19 for the next image.
- as described above, by performing the processing from step S11 to step S19 in FIG. 19, the control unit 9a specifies the images identified and classified as images of the small intestine and images of the large intestine. In other words, by performing the processing from step S11 to step S19 in FIG. 19 as described above, the control unit 9a can detect whether the organ imaged as the subject is the stomach, the small intestine, or the large intestine.
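- condensed into a sketch, the organ determination of steps S12 to S18 is three ratio tests against a threshold of 0.8 (Python; the class indices are the placeholder conventions used in the earlier sketches, and None stands in for the "classification suspended" case of step S19):

```python
import numpy as np

def identify_organ(label_map, thr1=0.8, thr2=0.8, thr3=0.8):
    """Return 'stomach', 'small_intestine', 'large_intestine', or None
    (identification suspended) from the per-region classification."""
    total = label_map.size
    p1 = np.count_nonzero(label_map == 0) / total   # gastric mucosa class
    p2 = np.count_nonzero(label_map == 1) / total   # villi class
    p3 = np.count_nonzero(label_map == 2) / total   # feces class
    if p1 > thr1:
        return "stomach"
    if p2 > thr2:
        return "small_intestine"
    if p3 > thr3:
        return "large_intestine"
    return None
```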
- the main screen 101 of the view shown in FIG. 20 includes an image display unit 102 that displays an image of an image to be imaged, a patient / examination information display unit 103 that displays information about the patient and the examination contents, An image information display unit 104 that displays the number of images, an image display control unit 105 that performs display control on the image display unit 102, and a slider 106 are provided.
- the slider 106 has a function of displaying a desired image on the image display unit 102 based on an instruction given with a mouse cursor (not shown). Further, the slider 106 has a guidance display unit 107 for indicating the start positions of the images in which the small intestine is imaged and the images in which the large intestine is imaged. Therefore, the user can easily, preferentially, and efficiently observe a desired site in the body cavity, for example when observing a case in which bleeding of the small intestine is suspected.
- a button (not shown) with a label such as "small intestine" may be provided on the main screen 101 of the viewer shown in FIG. 20, for example.
- in the present embodiment, the control unit 9a of the terminal device 7 divides the image Ii into rectangular regions each having lx pixels in the x-axis direction and ly pixels in the y-axis direction. However, the same processing may be performed while overlapping the rectangular regions, for example by regions having a size of lx/2 pixels in the x-axis direction and ly/2 pixels in the y-axis direction. In this case, misclassification that may occur when the boundary between the classes to be classified is included in a rectangular region can be reduced.
- in the present embodiment, the control unit 9a of the terminal device 7 identifies and classifies the gastric mucosa and the villi as separate classes using the image processing method of the present embodiment. However, for example, the gastric mucosa and the villi may first be identified and classified together as one "biological mucosal surface" class, and the regions Hst classified as "biological mucosal surface" may then be identified and classified again into the two classes of gastric mucosa and villi.
- since the capsule endoscope 3 proceeds sequentially from the stomach to the small intestine, the control unit 9a of the terminal device 7 may, based on the obtained classification results, perform processing for stopping classification into the gastric mucosa class when, for example, the ratio of the regions classified as villi to the total number of regions exceeds 0.7.
- further, the image processing method of the present embodiment can also be used as an application that realizes a discriminator based on the color tone and pattern of the mucosal surface, for example by additionally setting an esophagus class and a large intestine class.
- it can also be used, for example, to distinguish the large intestine on the basis that the ratio of the regions classified as feces to the total number of regions is large and the ratio of the regions classified as villi to the total number of regions is small.
- as described above, according to the image processing method of the present embodiment, images of the gastric mucosa and villi can be identified and classified, image by image, as images of the biological mucosal surface, and images of feces and bubbles as images of foreign matter or non-biological mucosa, and only the images that need to be observed can be displayed on the display 8c. As a result, the user can observe the inside of the body cavity while excluding images in which the surface of the biological mucosa is not well captured, and consequently the efficiency of observation using the capsule endoscope apparatus 1 can be improved.
- further, the image processing method of the present embodiment described above may be used in combination with an image processing method for detecting a lesion site such as bleeding or redness, so that it can be determined whether or not the detection result of the lesion site was obtained from the surface of the biological mucosa; as a result, the detection accuracy of the lesion site can be improved.
- specifically, the control unit 9a of the terminal device 7 refers to the classification result of a region Hst extracted, by an image processing method for detecting a lesion site such as bleeding or redness, as a region in which a lesion is suspected, and when the classification result indicates a non-biological mucosal surface such as feces or foam, treats the extraction as a false detection, whereby the detection accuracy of the lesion site can be improved.
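- as an illustration only, this combination can be sketched as a post-filter over the output of a separate lesion detector; the detector itself and the data layout are not specified in the text and are assumed here:

```python
def filter_lesion_candidates(candidates, label_map, mucosa_classes=(0, 1)):
    """candidates: (s, t) region indices flagged by a separate bleeding /
    redness detector. Keep only candidates whose region Hst was classified
    as a biological mucosal surface; the rest are treated as false
    detections."""
    return [(s, t) for (s, t) in candidates
            if label_map[s, t] in mucosa_classes]
```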
- FIG. 21 relates to the second embodiment of the present invention. A detailed description of the parts having the same configuration as in the first embodiment is omitted; the same components as those in the first embodiment are denoted by the same reference numerals, and description thereof is omitted. Furthermore, the configuration of the capsule endoscope apparatus 1 in the present embodiment is the same as that in the first embodiment, and the image processing method in the present embodiment is likewise realized as a program executed on the terminal device 7, for example a personal computer. The image processing method according to the present embodiment is performed as processing in the control unit 9a included in the terminal body 9.
- FIG. 21 is a flowchart showing an image processing operation according to the present embodiment.
- first, the control unit 9a inputs an image signal based on the image of the inside of the body cavity imaged by the capsule endoscope 3 and, for each of the Ri, Gi, and Bi planes constituting the input i-th image Ii, performs as preprocessing, for example, noise removal by median filtering and inverse γ correction, and detects halation pixels and dark pixels by processing based on threshold values in order to exclude them from the subsequent processing targets (step S21 in FIG. 21).
- the processing based on the threshold values treats a pixel, for example, as a dark pixel if the density values of rik, gik, and bik are all 10 or less, and as a halation pixel if the density values of rik, gik, and bik are all 230 or more.
- the control unit 9a divides the Ri, Gi, and Bi planes into small regions (step S22 in Fig. 21).
- at this time, the control unit 9a processes the endmost regions whose size is a fraction as regions having a fractional number of pixels, or excludes them from the subsequent processing targets.
- next, the control unit 9a calculates, as feature amounts, color tone information reflecting the difference in color of the imaged object on the image and texture information reflecting the difference in structure of the imaged object on the image (step S23 in FIG. 21).
- in the following, one region out of the regions divided by the control unit 9a is denoted as Hj (1 ≤ j ≤ m × n).
- the color tone information calculated by the control unit 9a consists of two feature amounts based on the ratios of the RGB values of each pixel included in one region Hj: the average value of gik/rik (hereinafter referred to as μgj) and the average value of bik/rik (hereinafter referred to as μbj).
- each of the values μgj and μbj takes a value from 0 to 1.
- the values μgj and μbj both take small values in a region showing a relatively red color tone, such as the gastric mucosa.
- the values μgj and μbj both take large values in a region showing a relatively white color tone, such as the small intestine.
- the values μgj and μbj take values such that μgj > μbj in a region showing a relatively yellow color tone, such as feces.
- the texture information calculated by the control unit 9a reflects the difference in the structure of the imaged object on the image as described above. The structure of the imaged object on the image appears, for example, as fine structures such as villi on the mucosal surface and as the irregular pattern of feces.
- the texture information calculated by the control unit 9a consists of three feature amounts: the coefficients of variation CVrj, CVgj, and CVbj of the RGB values, obtained by dividing the standard deviations σrj, σgj, and σbj of the RGB values of each pixel included in one region Hj by the average values mrj, mgj, and mbj of the RGB values of each pixel included in the one region Hj.
- the calculation formulas for the coefficients of variation CVrj, CVgj, and CVbj are shown as the following formulas (6), (7), and (8).
- CVrj = σrj / mrj ... (6)
- CVgj = σgj / mgj ... (7)
- CVbj = σbj / mbj ... (8)
- the coefficients of variation CVrj, CVgj, and CVbj calculated by the above formulas (6), (7), and (8) make it possible to quantify the degree of pixel variation due to the texture structure regardless of the effect of differences in the amount of illumination light supplied to the imaged object.
- the values of CVrj, CVgj, and CVbj all take similarly small values in a region where the structure on the image is relatively flat and there is no clear texture structure, such as the gastric mucosa imaged during normal observation without magnification.
- the values of CVrj, CVgj, and CVbj all take similarly large values in a region where the structure on the image includes a relatively large number of edges, such as the villi of the small intestine.
- the control unit 9a calculates the five feature amounts consisting of the color tone information and the texture information, that is, the values of μgj, μbj, CVrj, CVgj, and CVbj, for each of the (m × n) regions Hj, based on the RGB values of each pixel excluding the halation pixels and the dark pixels. In the present embodiment, for example, when the ratio of the sum of the number of halation pixels and the number of dark pixels among the (lx × ly) pixels included in one region Hj exceeds 50%, control may be performed so as to exclude the one region Hj from the subsequent processing.
- let P(ωa) be the prior probability of occurrence of one class ωa, let x be the feature vector determined from the five feature amounts in one region Hj, let p(x) be the probability density function based on the probability of occurrence of the feature vector x from all classes, and let p(x|ωa) be the state-dependent probability density (multivariate normal probability density) function based on the probability of occurrence of the feature vector x from one class ωa. The posterior probability P(ωa|x) that the generated feature vector x belongs to the class ωa is then shown as the following formula (9).
- P(ωa|x) = p(x|ωa) P(ωa) / p(x) ... (9)
- the state-dependent probability density function p(x|ωa) and the probability density function p(x) are shown as the following formulas (10) and (11).
- p(x|ωa) = (1 / ((2π)^(d/2) |Σa|^(1/2))) exp[(-1/2) (x - μa)^t Σa^(-1) (x - μa)] ... (10)
- p(x) = Σ(over all classes ωa) p(x|ωa) P(ωa) ... (11)
- here, d indicates the number of dimensions, which is the same as the number of feature amounts of x, and μa and Σa denote the mean vector of the feature vectors x and the variance-covariance matrix in one class ωa. (x - μa)^t denotes the transposed matrix of (x - μa), |Σa| denotes the determinant of Σa, and Σa^(-1) denotes the inverse matrix of Σa. Furthermore, the prior probabilities P(ωa) are assumed to be equal in all classes, and the probability density function p(x), calculated by the above formula (11), is expressed as a function common to all classes.
- the mean vector μa and the variance-covariance matrix Σa, which serve as the classification criteria, are the elements constituting the parameters of one class ωa. Before the first image I1 is input to the terminal device 7, they are calculated in advance for each class from the feature vectors x determined for each region of the plurality of images constituting the four classes of teacher data consisting of gastric mucosa, villi, feces, and foam, for example the images shown in FIGS. 13, 14, 15, and 16, and are then stored in the terminal device 7 as initial values.
- the variance-covariance matrix Σa is a matrix indicating the variation and spread of the distribution of the feature vectors x belonging to one class ωa, and is expressed as a d × d matrix for the number of dimensions d, which is the same as the number of feature amounts of the feature vector x.
- then, based on formulas (9), (10), and (11), the control unit 9a calculates the posterior probability P(ωa|x) for each class, and classifies the one region Hj into the class ωa that gives the maximum posterior probability P1(ωa|x).
- after the above processing, in order to determine whether or not the classification result of the one region Hj classified into the class ωa is accurate, the control unit 9a further performs processing based on a threshold for the value of the probability density function p1(x|ωa) that gives the maximum posterior probability P1(ωa|x).
- specifically, among the average values of the five feature amounts constituting the mean vector μa, the control unit 9a first determines a threshold vector xb1 containing a value obtained by adding, to the average value of the feature amount x1, the product of the standard deviation σx1 of the feature amount x1 and a multiplication coefficient given as a predetermined constant.
- such a threshold vector xb1 is expressed, for example, by the mathematical formula (12).
- in the present embodiment, the value of the multiplication coefficient is 1.5.
- the control unit 9a substitutes the threshold vector xb1 for x in formulas (9), (10), and (11), thereby calculating the value of the probability density function p(xb1|ωa) as the threshold for the class ωa.
- when the control unit 9a detects that the value of p1(x|ωa) is larger than the value of p(xb1|ωa), it determines that the classification of the one region Hj into the class ωa is accurate.
- when the control unit 9a detects that the value of p1(x|ωa) is equal to or less than the value of p(xb1|ωa), it determines that the classification is not accurate, and classifies the one region Hj into the unknown class.
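- as an illustration only, the classification rule of formulas (9) to (12) can be sketched as follows (Python with SciPy's multivariate normal density; the equal priors, per-class means and covariances estimated from teacher data, and the multiplication coefficient 1.5 follow the text, while the array layout, the concrete construction of the threshold vector, and the function names are assumptions):

```python
import numpy as np
from scipy.stats import multivariate_normal

def classify_region(x, means, covs, mult_coeff=1.5):
    """Classify a 5-dim feature vector x, or return None (unknown class)
    when the winning density falls below the class threshold built from
    the threshold vector of formula (12)."""
    n_classes = len(means)
    prior = 1.0 / n_classes                          # equal priors P(omega_a)
    dens = np.array([multivariate_normal.pdf(x, mean=m, cov=c)
                     for m, c in zip(means, covs)])
    post = dens * prior / np.sum(dens * prior)       # formulas (9) and (11)
    a = int(np.argmax(post))
    # Threshold vector: the class mean with its first element shifted by
    # mult_coeff * std of the first feature amount (one reading of (12)).
    xb1 = np.array(means[a], dtype=float).copy()
    xb1[0] += mult_coeff * np.sqrt(covs[a][0, 0])
    p_thr = multivariate_normal.pdf(xb1, mean=means[a], cov=covs[a])
    return a if dens[a] > p_thr else None
```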
- when the above-described processing has not been completed for all the regions Hj (step S29 in FIG. 21), the control unit 9a adds 1 to the region number j (step S30 in FIG. 21), and performs the processing shown in steps S25 to S29 in FIG. 21 for the next region.
- further, when the above-described processing has not been completed for all the input images Ii (step S31 in FIG. 21), the control unit 9a adds 1 to the image number i (step S32 in FIG. 21), and continues the processing shown in steps S21 to S31 in FIG. 21 for the next image.
- in the present embodiment, the case has been described in which a five-dimensional multivariate normal probability density is defined with the feature vector x determined by using all of the five feature amounts based on the color tone information and the texture information at once. However, by defining a state-dependent probability density function separately for the color tone information and for the texture information, image classification can be performed with higher accuracy.
- specifically, the control unit 9a sets the state-dependent probability density function for the two feature amounts μgj and μbj constituting the color tone information as pc(xc|ωa), and sets the state-dependent probability density function for the three feature amounts CVrj, CVgj, and CVbj constituting the texture information as pt(xt|ωa).
- the control unit 9a then uses these two state-dependent probability density functions pc(xc|ωa) and pt(xt|ωa), together with the prior probability P(ωa), to calculate the posterior probability, and performs identification and classification of the one region Hj based on that posterior probability.
- in this case, the thresholds for judging the accuracy of the classification result for the class ωa are set, for example, as values of the respective probability density functions on the basis of the mean vectors μc and μt of the feature amounts of the color tone information and the texture information and the standard deviations σc1 and σt1.
- in the present embodiment, the prior probability P(ωa) is assumed to be the same in all classes. However, the prior probability P(ωa) is not limited to this and may be set to a value according to the application; for example, the prior probability P(ωa) of the villi class or the feces class may be set higher based on the time distribution of the parts imaged by the capsule endoscope 3, or the prior probabilities P(ωa) of the gastric mucosa class and the villi class may be set higher than those of the feces class and the foam class, which do not require observation, based on the risk of misclassifying the part imaged by the capsule endoscope 3.
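- a variant of the previous sketch under this factorised model is shown below: two lower-dimensional Gaussians, one over the two tone features and one over the three texture features, whose densities are multiplied with a (possibly application-dependent) prior. The split of the feature vector and the parameter containers are assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def posterior_factorised(x, tone_means, tone_covs, tex_means, tex_covs, priors):
    """Posterior over classes using separate state-dependent densities:
    pc(xc|omega_a) over the two tone features x[:2] and pt(xt|omega_a) over
    the three texture features x[2:], combined with the prior P(omega_a)."""
    xc, xt = x[:2], x[2:]
    joint = np.array([
        multivariate_normal.pdf(xc, mean=mc, cov=cc)
        * multivariate_normal.pdf(xt, mean=mt, cov=ct)
        * p
        for mc, cc, mt, ct, p in zip(tone_means, tone_covs,
                                     tex_means, tex_covs, priors)
    ])
    return joint / joint.sum()
```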
- as described above, by using the image processing method of the present embodiment, images of the gastric mucosa and villi can be identified and classified as images of the surface of the biological mucosa, and images of feces and bubbles as images of foreign matter or non-biological mucosa, for each small region of an image.
- therefore, the user can easily exclude, as images that do not require observation, images in which the surface of the biological mucosa is not well captured, such as images in which foreign matter occupies most of the small regions, and the observation efficiency of the capsule endoscope apparatus 1 can be improved.
- the control unit 9a of the terminal device 7 can obtain a highly reliable image classification result by performing processing using the image processing method in the present embodiment.
- FIG. 22 relates to the third embodiment of the present invention. A detailed description of the parts having the same configuration as in the first and second embodiments is omitted; the same components as those in the first and second embodiments are denoted by the same reference numerals, and description thereof is omitted. Furthermore, the configuration of the capsule endoscope apparatus 1 in the present embodiment is the same as that in the first and second embodiments, and the image processing method in the present embodiment is likewise realized as a program executed on the terminal device 7, for example a personal computer. The image processing method in the present embodiment is performed as processing in the control unit 9a included in the terminal body 9.
- FIG. 22 is a flowchart showing an image processing operation according to the present embodiment.
- before performing the image processing method according to the present embodiment, the control unit 9a first performs, for the input i-th image Ii, the processing from step S21 to step S30 shown in FIG. 21 described in the second embodiment, and obtains the classification result for the image Ii (step S41 in FIG. 22).
- it is assumed that the classification result obtained by the control unit 9a by performing the processing shown in step S41 of FIG. 22 is a classification result in which each region in the image is classified into one of the five classes: the gastric mucosa class, the villi class, the feces class, the foam class, and the unknown class.
- the control unit 9a having a function as a classification determination value calculation unit calculates, based on the classification result for the image Ii, the value of the ratio J that the number of regions classified into the villi class occupies in the total number of regions (m × n) of the image Ii (step S42 in FIG. 22).
- specifically, the ratio J is a value calculated by substituting, in the above formula (4), J for p and the number of regions za classified into the villi class for z.
- control unit 9a compares the value of the ratio J with the threshold value thrj.
- when the control unit 9a having a function as a region classification unit detects that the value of the ratio J in the image Ii is greater than the threshold value thrj (step S43 in FIG. 22), it reclassifies the regions classified into the gastric mucosa class as the villi class in each region Hj of the image Ii (step S44 in FIG. 22).
- the value of the threshold value thrj is assumed to be 0.5.
- when the control unit 9a having a function as a region classification unit detects that the value of the ratio J in the image Ii is equal to or less than the threshold value thrj, it reclassifies the regions classified into the villi class as the gastric mucosa class in each region Hj of the image Ii (step S45 in FIG. 22).
- when the reclassification of the image Ii by the processing described above is completed, the control unit 9a then performs the series of processing from step S41 in FIG. 22 on the next image (step S46 in FIG. 22).
- when processing is performed using the image processing method according to the present embodiment, the control unit 9a of the terminal device 7 further reclassifies, in the images obtained as the classification result, the regions Hj classified into the gastric mucosa class and the villi class by exclusively exchanging them. Therefore, by performing processing using the image processing method according to the present embodiment, the control unit 9a of the terminal device 7 can eliminate misclassification between images of the gastric mucosa and images of the villi (of the small intestine), which cannot coexist in a single image, and as a result can obtain highly accurate image classification results.
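- the reclassification of steps S42 to S45 amounts to an exclusive swap between the gastric mucosa class and the villi class driven by the villi ratio J; a minimal sketch under the label conventions of the earlier sketches:

```python
import numpy as np

def reclassify_stomach_villi(label_map, gastric=0, villi=1, thrj=0.5):
    """If the villi ratio J exceeds thrj, gastric-mucosa regions become
    villi; otherwise villi regions become gastric mucosa (steps S42-S45)."""
    out = label_map.copy()
    j = np.count_nonzero(out == villi) / out.size
    if j > thrj:
        out[out == gastric] = villi
    else:
        out[out == villi] = gastric
    return out
```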
- FIG. 23 relates to the fourth embodiment of the present invention. A detailed description of the parts having the same configurations as those of the first to third embodiments is omitted; the same components as those in the first to third embodiments are denoted by the same reference numerals, and description thereof is omitted. Furthermore, the configuration of the capsule endoscope apparatus 1 in the present embodiment is the same as that in the first to third embodiments, and the image processing method in the present embodiment is likewise realized as a program executed on the terminal device 7, for example a personal computer. The image processing method in this embodiment is performed as processing in the control unit 9a included in the terminal body 9.
- FIG. 23 is a flowchart showing an image processing operation according to the present embodiment.
- before performing the image processing method according to the present embodiment, the control unit 9a first inputs an image signal based on the image of the inside of the body cavity imaged by the capsule endoscope 3, performs the processing from step S21 to step S30 shown in FIG. 21 described in the second embodiment on the input i-th image Ii, and obtains the classification result for the image Ii (step S51 in FIG. 23).
- it is assumed that the classification result obtained by the control unit 9a by performing the processing shown in step S51 of FIG. 23 is a classification result in which each region in the image is classified into one of the four classes: the feces class, the foam class, the unknown class, and the biological mucous membrane class.
- when the above-described processing has not been completed for all the regions Hj (step S55 in FIG. 23), the control unit 9a adds 1 to the region number j (step S56 in FIG. 23), and performs the processing shown in steps S53 to S55 in FIG. 23 for the next region.
- when processing is performed using the image processing method in the present embodiment, the control unit 9a of the terminal device 7 further reclassifies the regions Hj classified into the biological mucous membrane class in the image obtained as the classification result into either the gastric mucosa class or the villi class. Therefore, by performing processing using the image processing method in the present embodiment, the control unit 9a of the terminal device 7 can classify images of the gastric mucosa and images of the villi (of the small intestine) with high accuracy.
- further, by using the image processing method in the present embodiment in combination with the image processing method in the third embodiment, the control unit 9a of the terminal device 7 can obtain a classification result in which images of the gastric mucosa and images of the villi (of the small intestine) are classified with even higher accuracy.
- the image processing methods described in the first to fourth embodiments of the present invention are not limited to images captured by the capsule endoscope 3 of the capsule endoscope apparatus 1, and may also be used for images captured by an endoscope apparatus having an endoscope provided with an imaging element and an objective optical system at the distal end portion of its insertion portion.
- the image processing method in the present embodiment is also realized as a program executed on the personal computer as the terminal device 7, for example, and is performed as processing in the control unit 9a included in the terminal body 9. Furthermore, in the present embodiment, it is assumed that the control unit 9a uses the image processing described below for a series of images input in advance to the terminal device 7.
- FIG. 24 is a flowchart showing a part of the image processing operation according to the present embodiment.
- FIG. 25 is a flowchart showing an image display control operation performed after the processing shown in the flowchart of FIG. 24 is performed as a part of the image processing operation according to the present embodiment.
- the control unit 9a Before performing the image processing method according to the present embodiment, the control unit 9a first inputs the i-th image Ii (Il ⁇ Ii ⁇ of the N images (1 ⁇ N) in total. The classification result for IN) is obtained (step S61 in Fig. 24). In the present embodiment, the control unit 9a performs, for example, the processing from step S1 to step S4 shown in FIG. 11 or the step S shown in FIG. 21 as a processing method for obtaining the classification result for the image Ii. Any processing method from 21 to step S30 can be used.
- It is assumed that the classification result obtained by the control unit 9a for the image Ii classifies each region of the image into one of five classes: the gastric mucosa class, the villus class, the feces class, the foam class, and the unknown class. Furthermore, when obtaining the classification result for the image Ii, if the control unit 9a detects that the number of pixels corresponding to extremely dark portions or halation in one region is greater than or equal to a predetermined threshold, it may classify that region into the feces class, the foam class, or the unknown class.
- The control unit 9a determines whether the image Ii is an image that does not require observation by comparing the value of the ratio K with a threshold value thlx (0.7 in the present embodiment, for example).
- In the processing for making this determination, the control unit 9a may instead calculate a ratio K1, that is, the number of regions classified into the feces class and the foam class divided by the total number of regions (m x n) of the image Ii, and compare the ratio K1 with the threshold value thlx.
- Alternatively, the control unit 9a may calculate a ratio K2 based on the number of regions classified into the gastric mucosa class and the villus class and compare the value of the ratio K2 with a threshold value thly (for example, 0.3).
- The threshold values thlx and thly are not limited to values fixed in advance; the user can set desired values by operating the terminal device 7.
- This allows the user to select to what extent images of the biological mucosa surface are to be observed. For example, the user can emphasize observation efficiency when performing a screening examination to detect lesions, or observe more images in greater detail when conducting a thorough examination, depending on the application, and the control unit 9a executes the image processing method of the present embodiment accordingly.
- When the control unit 9a detects that the value of the ratio K in the image Ii is greater than or equal to the threshold value thlx (step S63 in FIG. 24), the image Ii is regarded as an image that does not require observation, and the flag value kflagi is set to 1 (step S64 in FIG. 24).
- When the control unit 9a detects that the value of the ratio K in the image Ii is smaller than the threshold value thlx (step S63 in FIG. 24), the image Ii is regarded as an image that requires observation, and the flag value kflagi is set to 0, for example (step S65 in FIG. 24).
- control unit 9a holds the flag value kflagi determined by the processing as described above in association with the image Ii (step S66 in FIG. 24).
- The control unit 9a determines whether all the images from image I1 to image IN have been classified by the above processing (step S67 in FIG. 24); if not all images have been classified, it performs the series of processing from step S61 in FIG. 24 on the (i + 1)-th image Ii+1 (step S68 in FIG. 24). When the control unit 9a has classified all the images from image I1 to image IN by the above processing, it ends the processing for the series of images input in advance to the terminal device 7.
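- A minimal sketch of this flagging step is shown below, assuming that the ratio K is the fraction of regions classified into the feces, foam, or unknown classes; the function and label names are illustrative and not taken from the patent text.

```python
# Hypothetical class labels; names are illustrative, not from the patent text.
NON_MUCOSA = {"feces", "foam", "unknown"}

def observation_flag(region_classes, thlx=0.7):
    """Return 1 if the image can be skipped (mostly non-mucosa), else 0.

    region_classes: iterable of per-region class labels for one image Ii
                    (m*n entries produced by the earlier classification step).
    thlx: threshold on the ratio K of non-mucosa regions (0.7 in the text).
    """
    labels = list(region_classes)
    k = sum(lab in NON_MUCOSA for lab in labels) / len(labels)  # ratio K
    return 1 if k >= thlx else 0  # flag value kflagi

# Usage: flags = [observation_flag(c) for c in per_image_classifications]
# Images with flag 1 are withheld from the display in the later control step.
```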
- When the value of the flag kflagi associated with the image Ii is 1 (step S72 in FIG. 25), the control unit 9a controls each unit of the terminal device 7 so that the image Ii is not displayed on the display 8c (step S73 in FIG. 25).
- When the value of the flag kflagi associated with the image Ii is 0 (step S72 in FIG. 25), the control unit 9a controls each unit of the terminal device 7 so that the image Ii is displayed on the display 8c (step S74 in FIG. 25).
- The control unit 9a determines whether or not the above display control processing has been performed for all the images from image I1 to image IN (step S75 in FIG. 25); if the processing has not been performed for all the images, it adds 1 to the image number i and then performs the series of processing from step S71 in FIG. 25 on the next image Ii+1 (step S76 in FIG. 25). When the control unit 9a has performed the display control processing for all the images from image I1 to image IN, that is, when the user has finished sequentially observing the series of images from image I1 to image IN, the display control processing is terminated.
- The image processing method described above is not limited to the case where the user sequentially observes the series of images from image I1 to image IN starting with image I1; it may also be used, for example, when the user selects and observes a desired image from the series of images from image I1 to image IN.
- Furthermore, the control unit 9a may control each unit of the terminal device 7 so as not to store an image classified as an image that does not require observation, or it may control each unit of the terminal device 7 so as to delete such an image from the storage unit (not shown) after classification.
- As described above, with the image processing method of the present embodiment, images of the gastric mucosa and villi can be identified as images of the biological mucosa surface, and images of feces and bubbles can be identified as images of foreign matter or non-biological mucosa, for each image, so that only images that need to be observed are displayed on the display 8c. Therefore, the user can observe the inside of the body cavity with images in which the biological mucosa surface is not well captured excluded, and the observation efficiency in observation using the capsule endoscope apparatus 1 can be improved.
- FIG. 26 to FIG. 31 relate to the sixth embodiment of the present invention. Detailed description of parts having the same configurations as in the first to fifth embodiments is omitted, and the same components as in the first to fifth embodiments are denoted by the same reference numerals. The configuration of the capsule endoscope apparatus 1 in the present embodiment is the same as in the first to fifth embodiments, and the image processing method in the present embodiment is likewise realized, for example, as a program executed on the personal computer serving as the terminal device 7 and is performed as processing in the control unit 9a included in the terminal body 9.
- FIG. 26 is a flowchart showing a part of the image processing operation according to the present embodiment.
- FIG. 27 is a flowchart showing a part of the image processing operation according to the present embodiment, which is performed subsequent to the processing of FIG.
- FIG. 28 is a flowchart showing a part of the image processing operation according to the present embodiment, which is performed subsequent to the processing of FIG.
- FIG. 29A is a diagram showing one of the eight directions used as indices when determining an edge feature amount (also referred to as an edge feature vector) in the image processing operation according to the present embodiment.
- FIG. 29B is a diagram showing one direction, different from FIG. 29A, among the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the present embodiment.
- FIG. 29C is a diagram showing one direction, different from FIGS. 29A and 29B, among the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the present embodiment.
- FIG. 29D is a diagram showing one direction, different from FIGS. 29A to 29C, among the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the present embodiment.
- FIG. 29E is a diagram showing one direction, different from FIGS. 29A to 29D, among the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the present embodiment.
- FIG. 29F is a diagram showing one direction, different from FIGS. 29A to 29E, among the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the present embodiment.
- FIG. 29G is a diagram showing one direction, different from FIGS. 29A to 29F, among the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the present embodiment.
- FIG. 29H is a diagram showing one direction, different from FIGS. 29A to 29G, among the eight directions serving as indices when determining the edge feature amount in the image processing operation according to the present embodiment.
- FIG. 30 is a diagram showing a positional relationship between the center area and the outermost peripheral area set in the image processing operation according to the present embodiment.
- FIG. 31 is a diagram showing an angle formed by the direction of the edge feature vector and the direction of the vector VI set in the image processing operation according to the present embodiment.
- Before performing the image processing method according to the present embodiment, the control unit 9a first inputs an image signal based on the image of the inside of the body cavity captured by the capsule endoscope 3 and obtains the classification result for the input i-th image Ii (step S81 in FIG. 26). In the present embodiment, the control unit 9a may use either the processing from step S1 to step S4 shown in FIG. 11 or the processing from step S21 to step S30 shown in FIG. 21 as the processing method for obtaining this classification result.
- It is assumed that the classification result obtained by the control unit 9a classifies each region of the image into one of five classes: the gastric mucosa class, the villus class, the feces class, the foam class, and the unknown class. Further, in the present embodiment, the control unit 9a divides the input i-th image Ii into M x M regions Hk (1 ≤ k ≤ M x M) when performing the processing of step S81 in FIG. 26, and it is assumed that each of the M x M regions Hk has been classified into one of these five classes.
- Next, the control unit 9a calculates the average value gak of the density values of the G (green) pixels in each of the M x M regions Hk (step S82 in FIG. 26). Then, based on the average value gak of the G pixel density values in the region Hk and the average value gakt of the G pixel density values in each of the regions Hkt (t ≤ 8) adjacent to the region Hk, the control unit 9a calculates the variation amount Gbt of the G pixel density value by equation (14) (step S83 in FIG. 26).
- Among the values of Gbt obtained for the regions Hkt by equation (14), the control unit 9a takes the maximum value as the maximum value Gbm and takes the direction dirGbm in which the region Hktm giving the maximum value Gbm lies with respect to the region Hk, and holds these two values as the edge feature amount of the region Hk (step S84 in FIG. 26). The direction in which the region Hktm giving the maximum value Gbm of the G pixel density variation lies with respect to the region Hk is determined as one of the directions 1 to 8 shown in FIGS. 29A to 29H.
- The control unit 9a, having a function as an edge detection unit, compares the maximum value Gbm with the threshold value threl (step S85 in FIG. 26). If the maximum value Gbm is larger than the threshold value threl, it determines that an edge of the image Ii exists in the region Hk (step S86 in FIG. 26); if the maximum value Gbm is equal to or smaller than the threshold value threl, it determines that no edge of the image Ii exists in the region Hk (step S87 in FIG. 26).
- In the present embodiment, the threshold value threl is assumed to be 0.3, for example.
- The control unit 9a adds 1 to the value of the region number k and repeats the processing using equation (14) shown in steps S83 to S87 in FIG. 26 for all (M x M) regions Hk, thereby specifying the regions where edges exist in the image Ii (steps S88 and S89 in FIG. 26).
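- The per-region edge feature can be sketched as follows; since equation (14) is not reproduced in the text, the variation amount Gbt is approximated here by the relative change of the mean G density between region Hk and each neighbour, and the direction numbering is only illustrative.

```python
import numpy as np

# Mapping of directions 1..8 to row/column offsets is an assumption made for
# illustration; the actual correspondence is defined in FIGS. 29A-29H.
OFFSETS = {1: (-1, 0), 2: (1, 0), 3: (0, -1), 4: (0, 1),
           5: (-1, -1), 6: (-1, 1), 7: (1, -1), 8: (1, 1)}

def edge_feature(ga, k_row, k_col, threl=0.3):
    """ga: 2-D array of mean G densities gak for the M x M regions Hk.
    Returns (Gbm, dirGbm, has_edge) for the region at (k_row, k_col)."""
    M = ga.shape[0]
    gbm, dir_gbm = -np.inf, None
    for d, (dr, dc) in OFFSETS.items():
        r, c = k_row + dr, k_col + dc
        if 0 <= r < M and 0 <= c < M:
            # Assumed form of the variation amount Gbt (equation (14) not given).
            gbt = abs(ga[r, c] - ga[k_row, k_col]) / max(ga[k_row, k_col], 1e-6)
            if gbt > gbm:
                gbm, dir_gbm = gbt, d
    return gbm, dir_gbm, gbm > threl
```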
- The control unit 9a may also perform, for example, inverse gamma correction or shading correction as preprocessing for the image Ii.
- After performing the above processing on all the (M x M) regions Hk to identify the regions where edges exist in the image Ii, the control unit 9a acquires an arrangement evaluation region of M1 x M1 regions (M1 ≤ M) with the region Hk as its central region (step S91 in FIG. 27). Thereafter, the control unit 9a detects the number D of outermost peripheral regions, that is, the number of regions lying on the outermost periphery of the arrangement evaluation region (step S92 in FIG. 27). In addition, for each of the D outermost peripheral regions Hk1 in the arrangement evaluation region, the control unit 9a calculates a vector V1 pointing in the direction in which the region Hk exists (step S93 in FIG. 27). The positional relationship between the region Hk and the outermost peripheral regions Hk1 is, for example, as shown in FIG. 30.
- Then, for each of the regions defined as outermost peripheral regions Hk1, the control unit 9a calculates the angle θ1 formed by the direction of the edge feature vector of that region and the direction of the vector V1 (step S94 in FIG. 27).
- Among the D outermost peripheral regions Hk1, the control unit 9a detects the number E of regions that were determined to contain an edge in the processing from step S81 to step S89 in FIG. 26 and that satisfy θ1 ≤ thre2 (step S95 in FIG. 27).
- In the present embodiment, the value of the threshold thre2 is assumed to be 45°, for example.
- The control unit 9a calculates the value of E/D, and when the value of E/D is larger than the threshold value thre3 (step S96 in FIG. 27), it determines that the region Hk is a bleeding part candidate region in which a bleeding part may be imaged in the image Ii (step S97 in FIG. 27). At the same time, the E outermost peripheral regions Hk1 that were determined to contain an edge in the processing from step S81 to step S89 in FIG. 26 and that satisfy θ1 ≤ thre2 are determined to be bleeding part edge candidate regions in which the edge of a bleeding part may exist (step S97 in FIG. 27).
- In the present embodiment, the value of the threshold value thre3 is assumed to be 0.7, for example.
- The control unit 9a adds 1 to the value of the region number k and repeats the processing shown in steps S91 to S97 in FIG. 27 for all (M x M) regions Hk, thereby specifying the candidate regions where a bleeding part may exist and the candidate regions where the edge of a bleeding part may exist in the image Ii (steps S98 and S99 in FIG. 27).
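- The arrangement test for a bleeding part candidate can be sketched as follows, assuming that θ1 is measured as the smallest angle between each peripheral region's edge direction and its vector V1 toward Hk; the function and argument names are illustrative.

```python
def bleeding_candidate(edge_dirs_deg, has_edge, v1_dirs_deg,
                       thre2=45.0, thre3=0.7):
    """edge_dirs_deg: edge direction (degrees) in each of the D outermost
    peripheral regions Hk1 of the arrangement evaluation region around Hk.
    has_edge:        boolean per peripheral region (result of steps S81-S89).
    v1_dirs_deg:     direction (degrees) of the vector V1 pointing from each
                     peripheral region toward the central region Hk.
    Returns (is_candidate, E, D)."""
    D = len(edge_dirs_deg)
    E = 0
    for e_dir, edge, v_dir in zip(edge_dirs_deg, has_edge, v1_dirs_deg):
        # Smallest angle between the two directions, folded into [0, 180].
        theta = abs((e_dir - v_dir + 180.0) % 360.0 - 180.0)
        if edge and theta <= thre2:
            E += 1
    return (E / D) > thre3, E, D
```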
- After the bleeding part candidate regions and the bleeding part edge candidate regions in the image Ii have been identified, the number H of bleeding part candidate regions is detected (step S101 in FIG. 28).
- the control unit 9a detects E bleeding part edge candidate areas corresponding to the respective bleeding part candidate areas (step S102 in FIG. 28).
- For the outermost peripheral regions Hk1 serving as the bleeding part edge candidate regions corresponding to the region Hk serving as the bleeding part candidate region, the control unit 9a uses the G pixel density variation obtained for the region defined as each region Hk1, and also calculates the average value rak1 of the density values of the R (red) pixels in the region defined as the region Hk1 and the average value rak1t of the density values of the R pixels in each region Hk1t (t ≤ 8) adjacent to the region Hk1.
- Based on the average value rak1 and the average value rak1t, the control unit 9a calculates the variation amount Rb1t of the R pixel density value by equation (16) (step S104 in FIG. 28).
- Based on the maximum value Gbm of the region defined as the region Hk1 and the variation amount Rb1t in the direction dirGbm, the control unit 9a calculates the value of Gbm/Rb1t as the color edge feature amount. The control unit 9a then detects the number F of regions, among the E regions Hk1, for which Gbm/Rb1t > thre4 (step S105 in FIG. 28).
- In the present embodiment, the value of the threshold value thre4 is assumed to be 1.0, for example.
- The value used as the color edge feature amount in the processing performed by the control unit 9a is not limited to Gbm/Rb1t; for example, the control unit 9a may instead use the value of Gbm/Bb1t, based on the variation amount Bb1t of the B pixel density value calculated in substantially the same manner as the variation amount Rb1t.
- The control unit 9a, serving as a bleeding site determination unit, calculates the value of F/E. When the value of F/E is larger than the threshold value thre5 (step S106 in FIG. 28), it determines that the region Hk is a bleeding part in the image Ii and that the regions Hk1 are the bleeding part edge regions corresponding to the region Hk (step S107 in FIG. 28). When the value of F/E is equal to or less than the threshold thre5 (step S106 in FIG. 28), the control unit 9a determines that the region Hk is not a bleeding part (step S108 in FIG. 28). In the present embodiment, the value of the threshold thre5 is assumed to be 0.7, for example.
- The control unit 9a repeats the processing shown in steps S101 to S108 of FIG. 28 for all the H regions Hk detected as bleeding part candidate regions, thereby specifying the regions where a bleeding part exists and the regions where the edge of a bleeding part exists in the image Ii (steps S109 and S110 in FIG. 28).
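- A compact sketch of the colour edge test in steps S105 to S108 might look like the following; the exact form of equation (16) is not reproduced in the text, so the variation amount Rb1t is simply taken as a precomputed input, and all names are illustrative.

```python
def bleeding_decision(gbm_vals, rblt_vals, thre4=1.0, thre5=0.7):
    """gbm_vals: G-density variation Gbm for each of the E bleeding part edge
    candidate regions Hk1 (taken in its own direction dirGbm).
    rblt_vals: R-density variation Rb1t for the same regions in that direction.
    Region Hk is judged to be a bleeding part when the fraction of candidates
    whose colour edge feature Gbm/Rb1t exceeds thre4 is larger than thre5."""
    E = len(gbm_vals)
    if E == 0:
        return False
    F = sum(g / max(r, 1e-6) > thre4 for g, r in zip(gbm_vals, rblt_vals))
    return (F / E) > thre5
```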
- After identifying the regions where a bleeding part exists in the image Ii, the control unit 9a refers to the classification result of the image Ii obtained in advance (FIG. 28).
- The control unit 9a, having a function as a classification result determination unit, determines that the classification result of the region Hk in the image Ii is correct when it detects that the region Hk specified as a region where a bleeding part exists has been classified into the gastric mucosa class or the villus class, which are classes related to the biological mucosa, or into the unknown class (step S112 in FIG. 28).
- Conversely, when the control unit 9a detects that the region Hk specified as a region where a bleeding part exists has been classified into the feces class or the foam class, which are classes related to the non-biological mucosa (step S112 in FIG. 28), it determines that the classification result of the region Hk in the image Ii is an error (step S114 in FIG. 28).
- Instead of performing the correct/incorrect determination, the control unit 9a may make another determination for such a region Hk, for example that it is "a region in which a lesion site may be imaged".
- The control unit 9a determines whether or not the above processing has been performed for all the input images Ii (step S115 in FIG. 28); if the processing has not been performed for all the images, it performs the series of processing from step S81 in FIG. 26 on the next image Ii+1 (step S116 in FIG. 28). When the above processing has been performed on all the input images Ii, the control unit 9a ends the processing.
- As described above, the control unit 9a of the terminal device 7 specifies the region Hk where a bleeding part exists in the image Ii and then judges the correctness of the classification result of that region Hk obtained in advance. Therefore, by using the image processing method in the present embodiment, the control unit 9a of the terminal device 7 can detect, for example, cases in which a bleeding site as a lesion site is detected in a region where a foreign object such as feces or the non-biological mucosa is imaged, and the detection accuracy of the lesion site can thereby be improved.
- the configuration of the capsule endoscope apparatus 1 in the present embodiment is the same as that in the first to sixth embodiments, and the image processing method in the present embodiment is also similar. It is assumed that the terminal device 7 is realized, for example, as a program executed in a personal computer. The image processing method according to the present embodiment is performed as processing in the control unit 9a included in the terminal body 9.
- FIG. 32 is a flowchart showing a part of the image processing operation according to the present embodiment.
- FIG. 33 is a flowchart showing a part of the image processing operation according to the present embodiment, which is performed subsequent to the processing of FIG.
- FIG. 34 is a diagram showing an example when an input image is divided into (m X n) regions in the image processing operation according to the present embodiment.
- The control unit 9a inputs an image signal based on the image of the inside of the body cavity captured by the capsule endoscope 3 and performs preprocessing on the Ri, Gi, and Bi planes constituting the input i-th image Ii (step S201 in FIG. 32).
- In the processing based on threshold values, for example, a pixel is treated as a dark pixel if the density values rik, gik, and bik are all 10 or less, and as a halation pixel if the density values rik, gik, and bik are all 230 or more.
- the control unit 9a divides each of the Ri, Gi, and Bi planes into small regions (step S202 in FIG. 32).
- When the division produces endmost regions of fractional size, the control unit 9a either treats them as regions having the fractional number of pixels or excludes them from the subsequent processing.
- In each of the divided regions, the control unit 9a calculates, as feature amounts, color tone information reflecting differences in color in the image of the object and texture information reflecting differences in structure in the image of the object (step S203 in FIG. 32).
- In the following, one of the regions divided by the control unit 9a is denoted as Hj (1 ≤ j ≤ m x n).
- The color tone information calculated by the control unit 9a consists of two values based on the ratios of the RGB values of the pixels included in one region Hj: the average value of gik/rik (hereinafter referred to as μgj) and the average value of bik/rik (hereinafter referred to as μbj).
- The texture information calculated by the control unit 9a reflects differences in the structure of the image of the object as described above. The structure in the image of the object appears, for example, as fine structures such as villi on the mucosal surface or as the irregular pattern of feces.
- Specifically, the texture information calculated by the control unit 9a consists of the coefficients of variation of the RGB values, CVrj, CVgj, and CVbj, which are three feature values obtained by dividing the standard deviations σrj, σgj, and σbj of the RGB values of the pixels included in one region Hj by the average values mrj, mgj, and mbj of those RGB values.
- The calculation formulas for the coefficients of variation CVrj, CVgj, and CVbj are given by the following equations (17), (18), and (19):
- CVrj = σrj / mrj … (17)
- CVgj = σgj / mgj … (18)
- CVbj = σbj / mbj … (19)
- By using these coefficients of variation, the degree of pixel variation due to the texture structure can be quantified without being affected by differences in the amount of illumination light supplied to each region.
- In a region where the structure in the image is relatively flat, such as the gastric mucosa imaged in normal observation, that is, without magnified observation, there is no clear texture structure, so the values of CVrj, CVgj, and CVbj are all similarly small.
- Conversely, in a region whose image structure contains relatively many edges, such as the villi of the small intestine, the values of CVrj, CVgj, and CVbj all take similarly large values.
- The control unit 9a calculates the values of the five feature amounts consisting of the color tone information and the texture information, that is, μgj, μbj, CVrj, CVgj, and CVbj, for each of the m x n regions Hj based on the RGB values of the pixels excluding halation pixels and dark pixels. In the present embodiment, for example, when the proportion of halation pixels and dark pixels among the (lx x ly) pixels of one region Hj exceeds 50%, control may be performed so as to exclude that region Hj from the subsequent processing.
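- The five feature amounts of one region Hj could be computed roughly as follows, assuming that the halation and dark pixels are supplied as a boolean mask; the function name and layout are illustrative.

```python
import numpy as np

def region_features(r, g, b, valid=None):
    """Five feature amounts for one lx x ly region Hj.
    r, g, b: 2-D arrays of pixel density values for the region;
    valid: optional boolean mask excluding halation and dark pixels.
    Returns (mu_gj, mu_bj, CVrj, CVgj, CVbj)."""
    if valid is None:
        valid = np.ones_like(r, dtype=bool)
    r = r[valid].astype(float)
    g = g[valid].astype(float)
    b = b[valid].astype(float)
    mu_gj = np.mean(g / np.maximum(r, 1e-6))   # colour tone: mean of gik/rik
    mu_bj = np.mean(b / np.maximum(r, 1e-6))   # colour tone: mean of bik/rik
    cv = lambda v: np.std(v) / max(np.mean(v), 1e-6)   # equations (17)-(19)
    return mu_gj, mu_bj, cv(r), cv(g), cv(b)
```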
- Based on these feature amounts, the control unit 9a identifies and classifies each region using a statistical discriminator based on Bayes' theorem. Where p(x|ωa) is the state-dependent probability density (multivariate normal probability density) function giving the probability that a feature vector x occurs in class ωa and P(ωa) is the prior probability, the posterior probability P(ωa|x) that a generated feature vector x belongs to class ωa is calculated by the following equations (20), (21), and (22):
- P(ωa|x) = p(x|ωa) P(ωa) / p(x) … (20)
- p(x|ωa) = (1 / ((2π)^(d/2) |Σa|^(1/2))) exp{ -(1/2) (x - μa)^t Σa^(-1) (x - μa) } … (21)
- p(x) = Σ_a p(x|ωa) P(ωa) … (22)
- Here, d indicates the number of dimensions, which is the same as the number of feature quantities of x, μa is the mean vector of the feature vectors in class ωa, and Σa is the variance-covariance matrix in class ωa. The prior probabilities P(ωa) are assumed to be equal in all classes, and the probability density function p(x), calculated by equation (22), is expressed as a function common to all classes.
- The mean vector μa and the variance-covariance matrix Σa used as classification criteria are the elements constituting the parameters of one class ωa. Using a plurality of images constituting the teacher data of the four classes of gastric mucosa, villi, feces, and foam, as in the first embodiment, these parameters are calculated in advance for each class from the feature vectors x determined in each region of those images, and are recorded in the terminal device 7 as initial values.
- At this time, the control unit 9a may also estimate the parameters by adding the feature vectors of each class in the image Ii to the feature vectors in the teacher data of each class.
- Here, writing the feature vector as x = (x1, x2, x3, x4, x5), the mean vector μa is a vector having the same number of dimensions as the feature vector x and consisting of the average values μx1, μx2, μx3, μx4, and μx5 of the five feature quantities of the feature vector x. The variance-covariance matrix Σa is a matrix indicating the distribution and spread of the feature vectors x belonging to one class ωa, and the number of dimensions d is the same as the number of feature quantities of the feature vector x.
- Using equations (20) to (22), the control unit 9a calculates the posterior probability P1(ωa|x) that the generated feature vector x belongs to each class ωa, identifies the feature vector x as belonging to the class ωa giving the maximum posterior probability P1(ωa|x), and classifies the region Hj in which the feature vector x occurred into that class based on this identification result.
- The control unit 9a then determines whether or not the classification result of the region Hj classified into the class ωa is accurate. To do so, it further performs processing based on a threshold for the value of the probability density function p1(x|ωa) that gives the maximum posterior probability P1(ωa|x).
- Specifically, the control unit 9a determines a threshold vector xb1 in which, among the average values of the five feature amounts of the mean vector μa, the average value μx1 of the feature amount x1, for example, is replaced by the value obtained by adding to μx1 the product of the standard deviation σx1 of the feature amount x1 and a multiplication coefficient α as a predetermined constant. Such a threshold vector xb1 is expressed, for example, by the following equation (23), and in the present embodiment the value of the multiplication coefficient α is 1.5:
- xb1 = (μx1 + α·σx1, μx2, μx3, μx4, μx5) … (23)
- The control unit 9a then substitutes the threshold vector xb1 for x in the above equations (20), (21), and (22) and calculates the value of the probability density function p(xb1|ωa) as the threshold value for the class ωa into which the one region Hj has been classified.
- When the control unit 9a detects that the value of p1(x|ωa) is larger than the value of p(xb1|ωa), it determines that the classification result of the one region Hj into the class ωa is accurate.
- When the control unit 9a detects that the value of p1(x|ωa) is equal to or smaller than the value of p(xb1|ωa), it reclassifies the one region Hj into the unknown class.
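- A sketch of this Bayes classification with the threshold check might look like the following, using a standard multivariate normal density for p(x|ωa); the threshold vector construction follows equation (23) for the first feature only, and all names are illustrative rather than the patent's implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal

def classify_region(x, means, covs, priors, alpha=1.5):
    """Classify one 5-dimensional feature vector x into one of the classes
    (e.g. gastric mucosa, villi, feces, foam), falling back to 'unknown'.
    means/covs/priors: per-class mean vectors, covariance matrices and prior
    probabilities (equal priors in the text); alpha is the multiplication
    coefficient used to build the threshold vector xb1 (1.5 in the text)."""
    classes = list(means)
    dens = {c: multivariate_normal(means[c], covs[c], allow_singular=True).pdf(x)
            for c in classes}                      # p(x|wa), equation (21)
    evidence = sum(dens[c] * priors[c] for c in classes)   # p(x), equation (22)
    post = {c: dens[c] * priors[c] / evidence for c in classes}  # equation (20)
    best = max(post, key=post.get)
    # Threshold vector xb1: mean with the first feature shifted by alpha*sigma_x1.
    xb1 = np.array(means[best], dtype=float)
    xb1[0] += alpha * np.sqrt(covs[best][0][0])
    p_thresh = multivariate_normal(means[best], covs[best],
                                   allow_singular=True).pdf(xb1)
    return best if dens[best] > p_thresh else "unknown"
```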
- When the classification of all the divided m x n regions has not been completed (step S209 in FIG. 32), the control unit 9a adds 1 to the region number j (step S210 in FIG. 32) and performs the processing shown in steps S205 to S209 in FIG. 32 for the next region. When the classification of all the divided m x n regions has been completed (step S209 in FIG. 32), the control unit 9a calculates again, as classification criteria, the mean vector μa and the variance-covariance matrix Σa of each of the four classes, based on the m x n classification results in the image Ii and the feature amounts of the m x n regions (step S211 in FIG. 33).
- Using equations (20) to (22), into which the mean vector μa and the variance-covariance matrix Σa calculated in the processing shown in step S211 of FIG. 33 are substituted, the control unit 9a calculates again the posterior probabilities P2(ω1|x), P2(ω2|x), P2(ω3|x), and P2(ω4|x) that the generated feature vector x belongs to each of the classes ω1 to ω4.
- The control unit 9a then compares these four posterior probabilities, identifies the feature vector x as belonging to the class ωa giving the maximum posterior probability P2(ωa|x), reclassifies the region in which the feature vector x occurred based on this identification result, and calculates the value of the probability density function p2(x|ωa) giving the maximum posterior probability P2(ωa|x).
- Following the above processing, the control unit 9a determines whether or not the reclassification result of the one region Hj reclassified into the class ωa is accurate, by further performing processing based on a threshold for the value of the probability density function p2(x|ωa) that gives the maximum posterior probability P2(ωa|x).
- That is, as before, a threshold vector xb1 is determined that includes the value obtained by adding, to the average value of the feature quantity x1, the product of the standard deviation σx1 of the feature quantity x1 and a multiplication coefficient as a predetermined constant. The control unit 9a substitutes the threshold vector xb1 for x in equations (20), (21), and (22) and calculates the value of the probability density function p(xb1|ωa) as the threshold value for the class ωa into which the one region Hj has been reclassified.
- When the control unit 9a detects that the value of p2(x|ωa) is larger than the value of p(xb1|ωa), it determines that the reclassification result of the one region Hj into the class ωa is accurate; when it detects that the value of p2(x|ωa) is equal to or smaller than the value of p(xb1|ωa), it reclassifies the one region Hj into the unknown class.
- If the classification for all the divided m x n regions has not been completed (step S217 in FIG. 33), the control unit 9a adds 1 to the region number j (step S218 in FIG. 33) and performs the processing shown in steps S213 to S217 in FIG. 33 for the next region.
- When the classification of all the divided m x n regions has been completed, the control unit 9a performs the series of processing from step S201 in FIG. 32 on the (i + 1)-th image Ii+1 (step S219 in FIG. 33).
- The mean vector μa and the variance-covariance matrix Σa calculated in the processing shown in step S211 of FIG. 33 may also be used in the processing shown in step S205 of FIG. 32 for the (i + 1)-th image Ii+1. In this case, image classification can be performed with higher accuracy between temporally continuous images by dynamically changing the parameters used for image identification and classification.
- The control unit 9a is not limited to recalculating the mean vector μa and the variance-covariance matrix Σa for both the color tone information and the texture information in the processing shown in step S211 of FIG. 33; for example, it may recalculate the mean vector μa and the variance-covariance matrix Σa for only one of the color tone information and the texture information.
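- The parameter recalculation of step S211 can be sketched as follows; falling back to the teacher-data parameters for classes with too few assigned regions is an assumption, not something stated in the text.

```python
import numpy as np

def reestimate_parameters(features, labels, classes):
    """Recompute the classification reference (mean vector and variance-
    covariance matrix) for each class from the m x n regions of image Ii that
    were assigned to it in the first pass.
    features: (m*n, 5) array of feature vectors;
    labels:   list of per-region class labels from the first pass."""
    params = {}
    for c in classes:
        xs = features[[lab == c for lab in labels]]
        if len(xs) >= 2:                       # need at least two samples
            params[c] = (xs.mean(axis=0), np.cov(xs, rowvar=False))
    # Classes missing from the dict keep their teacher-data parameters.
    return params
```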
- By using the classification result of the image Ii obtained through the processing from step S201 to step S218 in FIG. 32 and FIG. 33 as described above, the control unit 9a of the terminal device 7 can determine, for example, whether or not the image Ii is an image of the surface of the biological mucosa, that is, an image showing the gastric mucosa, villi, and the like.
- Specifically, the control unit 9a counts the number of regions classified into each class in the classification result of the image Ii obtained through the processing from step S201 to step S218 in FIG. 32 and FIG. 33, and calculates, for example, the proportion of the number A of regions classified into the gastric mucosa class or the villus class to the total number of regions (m x n). When A/(m x n) is equal to or greater than a predetermined threshold (for example, 0.8), the control unit 9a determines that the image Ii is an image of the biological mucosa surface. Thereby, the control unit 9a can extract images that reliably show the biological mucosa surface.
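- As a small illustration of this extraction criterion (the label strings and the function name are assumptions):

```python
def is_mucosa_image(labels, m, n, threshold=0.8):
    """Image Ii is taken as showing the biological mucosa surface when the
    regions classified as gastric mucosa or villi make up at least `threshold`
    of all m x n regions (0.8 in the text)."""
    A = sum(lab in ("gastric mucosa", "villi") for lab in labels)
    return A / (m * n) >= threshold
```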
- In the present embodiment, the case has been described in which the feature vector x is determined using all five feature amounts based on the color tone information and the texture information at once, and a five-dimensional multivariate normal probability density is used. However, by using separate state-dependent probability density functions for the color tone information and the texture information, image classification can be performed with higher accuracy.
- Specifically, the control unit 9a sets a state-dependent probability density function pc(xc|ωa) for the two feature values μgj and μbj constituting the color tone information and a state-dependent probability density function pt(xt|ωa) for the three feature values constituting the texture information.
- The control unit 9a then uses these two state-dependent probability density functions pc(xc|ωa) and pt(xt|ωa) to calculate the posterior probabilities and to classify each region.
- The thresholds for judging the accuracy of the classification result for the class ωa are set based on the mean vectors μc and μt of the feature values of the color tone information and the texture information, respectively, and on the standard deviations σc1 and σt1, for example as p(xcb|ωa) and p(xtb|ωa).
- When both classification results are judged to be accurate, the region Hj having the feature vector xc and the feature vector xt is classified into one of the gastric mucosa, villus, feces, or foam classes, and otherwise it is classified into the unknown class.
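- A sketch of this split-density variant, assuming separate Gaussian parameters for the colour-tone and texture feature vectors, might look like the following; the threshold check described above is omitted here for brevity, and all names are illustrative.

```python
from scipy.stats import multivariate_normal

def classify_split_features(xc, xt, params_c, params_t, priors):
    """xc: colour-tone features (mu_gj, mu_bj); xt: texture features
    (CVrj, CVgj, CVbj). params_c / params_t hold (mean, covariance) per class.
    The class-conditional density used in Bayes' rule is the product of the two
    state-dependent densities pc(xc|wa) and pt(xt|wa), as described above."""
    classes = list(params_c)
    dens = {}
    for c in classes:
        pc = multivariate_normal(*params_c[c], allow_singular=True).pdf(xc)
        pt = multivariate_normal(*params_t[c], allow_singular=True).pdf(xt)
        dens[c] = pc * pt
    evidence = sum(dens[c] * priors[c] for c in classes)
    post = {c: dens[c] * priors[c] / evidence for c in classes}
    return max(post, key=post.get)
```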
- In the present embodiment, the prior probability P(ωa) is assumed to be equal in all classes, but it is not limited to this.
- The prior probability P(ωa) may be set to a value according to the application: for example, the prior probability of the villus class or the feces class may be set higher based on the time distribution of the parts imaged by the capsule endoscope 3, or the prior probabilities of the gastric mucosa class and the villus class may be set higher than those of the feces class and the foam class, which do not require observation, based on the risk of misclassifying the part imaged by the capsule endoscope 3.
- Furthermore, the statistical discriminator used by the control unit 9a when classifying images is not limited to one based on Bayes' theorem; for example, it may be based on a linear discriminant function or the like.
- As described above, with the image processing method of the present embodiment, images of the gastric mucosa and villi as images of the biological mucosa surface and images of feces and bubbles as images of foreign matter or the non-biological mucosa can be identified and classified for each small region. Therefore, the user can easily exclude, as images that do not require observation, images in which the biological mucosa surface is not well captured, such as images in which foreign matter occupies most of the small regions, and the observation efficiency in observation using the capsule endoscope apparatus 1 can be improved.
- Further, with the image processing method of the present embodiment, the parameters of the statistical discriminator can be calculated as optimum values according to the images input to the terminal device 7. Therefore, even when the feature amounts change due to individual differences in the color tone and fine structure of the biological mucosa surface, variations in the characteristics of the parts constituting the capsule endoscope 3, and the like, each region of an image input to the terminal device 7 can be classified with high accuracy.
- Furthermore, by additionally performing processing using an image processing method that classifies images into normal mucosa images and images of diseased, that is, abnormal sites, the detection accuracy of lesion sites can be improved.
- In addition, by further using a classification criterion such as, for example, that an image in which feces occupy most of the image is an image of the large intestine, the control unit 9a can identify the organ imaged in the image.
- FIGS. 35 to 40 relate to the eighth embodiment of the present invention.
- the same components as those in the first embodiment to the seventh embodiment are denoted by the same reference numerals and description thereof is omitted.
- the configuration of the capsule endoscope apparatus 1 in the present embodiment is the same as that in the first to seventh embodiments, and the image processing method in the present embodiment is also similar. It is assumed that the terminal device 7 is realized, for example, as a program executed in a personal computer.
- the image processing method according to the present embodiment is performed as processing in the control unit 9a included in the terminal body 9.
- FIG. 35 is a flowchart showing an image processing operation in the image processing apparatus of the present embodiment.
- FIG. 36 is a flowchart showing an image processing operation in the image processing apparatus of the present embodiment.
- FIG. 37 is a diagram showing an example when determining a neighboring region in one region in the image processing operation of the present embodiment.
- FIG. 38 is a schematic diagram illustrating an example of an image of the inside of a body cavity captured by the capsule endoscope, used in the image processing operation of the present embodiment.
- FIG. 39 is a diagram showing an example of the classification result of the image shown in FIG. 38.
- FIG. 40 is a diagram showing a reclassification result after performing the image processing operation of the present embodiment based on the classification result shown in FIG.
- Before performing the image processing operation in the present embodiment, the control unit 9a first inputs an image signal based on the image of the inside of the body cavity captured by the capsule endoscope 3, performs the processing from step S201 to step S218 shown in FIGS. 32 and 33 and described in the seventh embodiment on the input i-th image Ii, and obtains the classification result for the image Ii (step S221 in FIG. 35).
- countmax indicating the maximum number of region integration iterations is a value set by the user.
- The control unit 9a performs the region integration processing, described in detail later, the number of times given by the countmax value set by the user. In the present embodiment, the following description assumes that the value of countmax is 5.
- Each of the five classes from class 1 to class 5 has a one-to-one correspondence with any of the gastric mucosa class, villus class, stool class, foam class and unknown class.
- Each of the eight neighboring regions of one region Hj corresponds to one of the rectangular regions from region 1 to region 8 as shown in FIG. 37.
- each neighboring region of one region Hj is hereinafter referred to as Hjs.
- the control unit 9a performs the subsequent processing only for the region that can be set as the neighborhood region of the one region Hj.
- the control unit 9a determines whether or not the neighborhood region Hjs has been classified into class c (step S234 in FIG. 36).
- When the determination condition in step S234 is satisfied, the control unit 9a adds 0.2 to the evaluation value Vjc using the cost function represented by the following equation (25) (step S235 in FIG. 36):
- Vjc = Vjc + 0.2 … (25)
- The control unit 9a adds 1 to the value of s and repeats the processing using equation (25) shown in steps S234 and S235 of FIG. 36 for all the neighboring regions Hjs of the one region Hj, thereby calculating the evaluation value for class c (steps S236 and S237 in FIG. 36).
- When the evaluation value for class c has been determined (step S236 in FIG. 36), the control unit 9a adds 1 to the value of class c and repeats the series of processing from step S232 to step S237 in FIG. 36, thereby calculating the evaluation values for all classes from class 1 to class 5 (steps S238 and S239 in FIG. 36).
- The control unit 9a, having a function as a region classification unit, compares the values of Vjc, that is, Vj1, Vj2, Vj3, Vj4, and Vj5, and reclassifies the one region Hj into the class c that gives the minimum value of Vjc (step S224 in FIG. 35). When there are a plurality of classes c that give the minimum value of Vjc, the control unit 9a performs processing such as selecting the class with the smallest c.
- When the reclassification of all the divided m x n regions has not been completed (step S225 in FIG. 35), the control unit 9a adds 1 to the region number j (step S226 in FIG. 35) and, for the next region, repeats the processing shown in steps S223 and S224 in FIG. 35 and the processing from step S231 to step S239 in FIG. 36.
- When the classification of all the divided m x n regions has been completed and the count value is smaller than the countmax value (step S227 in FIG. 35), the control unit 9a adds 1 to the count value (step S228 in FIG. 35) and repeats the processing from step S222 to step S227 in FIG. 35 and the processing from step S231 in FIG. 36. When the classification of all the divided m x n regions has been completed and the count value is equal to or greater than the countmax value (step S227 in FIG. 35), the control unit 9a performs the series of processing from step S221 in FIG. 35 on the (i + 1)-th image Ii+1 (step S229 in FIG. 35).
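- One pass of this neighbourhood-based reclassification could be sketched as below; because the text does not spell out which branch of step S234 triggers the 0.2 increment, the cost here is taken as a penalty for each disagreeing neighbour, so that the minimum-cost class is the one most supported by the neighbourhood, and ties resolve to the smallest class index as described above.

```python
def reclassify_region(label_grid, j_row, j_col, classes, step=0.2):
    """One pass of the region-integration step for region Hj at (j_row, j_col).
    label_grid: 2-D array (e.g. numpy, shape m x n) of current class labels;
    classes:    ordered list of the five classes (class 1 .. class 5).
    Returns the class giving the minimum evaluation value Vjc."""
    m, n = label_grid.shape
    # Up to eight neighbouring regions Hjs; border regions have fewer.
    neigh = [label_grid[r, c]
             for r in range(j_row - 1, j_row + 2)
             for c in range(j_col - 1, j_col + 2)
             if (r, c) != (j_row, j_col) and 0 <= r < m and 0 <= c < n]
    # Assumed reading of equation (25): add 0.2 per neighbour not in class c.
    costs = {c: step * sum(lab != c for lab in neigh) for c in classes}
    return min(costs, key=lambda c: (costs[c], classes.index(c)))
```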
- FIG. 38, FIG. 39, and FIG. 40 show an example when the control unit 9a of the terminal device 7 performs processing using the image processing operation in the present embodiment.
- FIG. 38 is a diagram schematically showing an image corresponding to the image Ii in the image processing operation of the present embodiment described above.
- For the image shown in FIG. 38, the control unit 9a of the terminal device 7 obtains a classification result as shown in FIG. 39 in the processing shown in step S221 of FIG. 35.
- The control unit 9a then repeats the region integration processing shown in step S222 and subsequent steps of FIG. 35 the number of times given by the countmax value (five times in the present embodiment) and obtains a reclassification result as shown in FIG. 40.
- As described above, the control unit 9a of the terminal device 7 reclassifies a given region of an image input to the terminal device 7 based on the classification results of the neighboring regions of that region. Therefore, each region of the image can be classified with higher accuracy while the occurrence of misclassified regions is suppressed.
- FIGS. 41 to 47 relate to the ninth embodiment of the present invention. Detailed description of parts having the same configurations as in the first to eighth embodiments is omitted, and the same components as in the first to eighth embodiments are denoted by the same reference numerals. The configuration of the capsule endoscope apparatus 1 in the present embodiment is the same as in the first to eighth embodiments, and the image processing method in the present embodiment is likewise realized, for example, as a program executed on the personal computer serving as the terminal device 7 and is performed as processing in the control unit 9a included in the terminal body 9.
- FIG. 41 is a flowchart showing an image processing operation in the present embodiment.
- FIG. 42 is a flowchart showing an image processing operation in the present embodiment.
- FIG. 43 is a diagram showing an example of an array of numbers virtually assigned to each small rectangular area having 4 ⁇ 4 pixels in the image processing operation according to the present embodiment.
- FIG. 44 is a diagram showing the positional relationship of the neighboring outer peripheral area Ht with respect to one rectangular area RO in the image processing operation according to the present embodiment.
- FIG. 45 is a diagram illustrating an example of the angle θt formed by the approximate gradient vector Vgt and the direction vector Vdt in the image processing operation according to the present embodiment.
- FIG. 46 is a schematic diagram showing an example of an image of a body cavity imaged by the capsule endoscope used in the present embodiment.
- FIG. 47 is a diagram showing an example of the classification result of the image shown in FIG. 46.
- First, the control unit 9a of the terminal device 7 performs substantially the same processing as steps S201 to S203 in FIG. 32 described above. That is, the control unit 9a inputs an image signal based on the image of the inside of the body cavity captured by the capsule endoscope 3, performs preprocessing on the input i-th image Ii (step S241 in FIG. 41), divides the image Ii into m x n small regions (step S242 in FIG. 41), and calculates the color tone information and texture information as feature amounts in each divided region (step S243 in FIG. 41).
- Next, by performing the image processing described below, the control unit 9a detects, among the divided regions, the regions classified into a class preset as having structurally clear characteristics, for example the bubble class (step S244 in FIG. 41).
- In the plane Gi, which is the green plane among the planes of the image Ii, the control unit 9a further divides each rectangular region having 8 x 8 pixels into four, generating small rectangular regions each having 4 x 4 pixels (step S251 in FIG. 42).
- The control unit 9a calculates the average value ga of the density values of the G (green) pixels in each small rectangular region having 4 x 4 pixels, and a number arranged as shown in FIG. 43 is virtually assigned to each of these small rectangular regions.
- The control unit 9a sets the direction along the line segments connecting region 1 with region 7, or region 2 with region 8, as the vertical direction on the image; the direction along the line segments connecting region 3 with region 4, or region 5 with region 6, as the horizontal direction on the image; the direction along the line segments connecting region 2 with region 5, or region 4 with region 7, as the left diagonal direction on the image; and the direction along the line segments connecting region 1 with region 6, or region 3 with region 8, as the right diagonal direction on the image, and calculates the logarithmic difference of the average value ga of the G pixel density values between the regions in each of these directions.
- The control unit 9a determines that an array of pixels indicating a substantially circular edge exists in the direction on the image of the combination of regions for which the value of the logarithmic difference is maximum, and temporarily holds the maximum value of the logarithmic difference and the direction on the image giving that maximum value.
- The control unit 9a, having functions as an attention region setting unit and a neighboring outer peripheral region detection unit, sets one of the rectangular regions having 8 x 8 pixels as the attention region R0 with coordinates (xo, yo), as shown in FIG. 44, as a region in which the center of the substantially circular shape may exist, and then detects the neighboring outer peripheral regions Ht, indicated by the coordinates (xt, yt), located at a distance Q (Q = 1, 2, 3, ...) from the region R0.
- After all of the coordinates (xt, yt) have been detected, the control unit 9a, having a function as a vector calculation unit, calculates the approximate gradient vector Vgt in each detected neighboring outer peripheral region and the direction vector Vdt connecting the coordinates (xt, yt) and the coordinates (xo, yo) (step S253 in FIG. 42).
- The approximate gradient vector Vgt in each neighboring outer peripheral region Ht is a vector whose magnitude is the maximum value of the logarithmic difference of the average value ga of the G pixel density values held by the control unit 9a and whose direction is the direction on the image giving that maximum value.
- After calculating the approximate gradient vector Vgt and the direction vector Vdt in each neighboring outer peripheral region Ht, the control unit 9a judges whether or not the value of the magnitude |Vgt| of Vgt is equal to or greater than a threshold value (0.4 in the present embodiment).
- When the value of |Vgt| is equal to or greater than the threshold value, the control unit 9a uses the inner product formula to calculate the value of cos θt for the angle θt formed by the approximate gradient vector Vgt and the direction vector Vdt, as shown in FIG. 45, according to the following equation (27) (step S254 in FIG. 42):
- cos θt = (Vgt · Vdt) / (|Vgt| |Vdt|) … (27)
- When the control unit 9a, having a function as an edge determination unit, detects based on the calculation result of cos θt using equation (27) that the value of |cos θt| is larger than 0.7, it determines that the neighboring outer peripheral region Ht has a gradient vector directed radially with respect to the region R0 (FIG. 42).
- The control unit 9a, having a function as a region extraction unit, counts the number L of regions, out of the T neighboring outer peripheral regions, that have a gradient vector directed radially toward the one rectangular region R0 and, based on the value of L/T, judges that the central portion of a substantially circular bubble exists in the one rectangular region R0 if the value of L/T is, for example, 0.7 or more (step S256 in FIG. 42). The control unit 9a then performs the above processing for each value of the distance Q while changing the value of the distance Q up to the value Qmax set in advance by the user (steps S257 and S258 in FIG. 42).
- The control unit 9a sequentially sets each of the rectangular regions having 8 x 8 pixels as the one rectangular region R0 and performs the above processing for each set region R0 (steps S259 and S260 in FIG. 42). By performing the above processing, the control unit 9a detects the regions of the image Ii that fall into the bubble class, in accordance with the various sizes of the bubbles present in the image.
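- The radial-gradient test for a substantially circular bubble centre can be sketched as follows; the vector handling and thresholds mirror the description above, while the function name and data layout are assumptions.

```python
import numpy as np

def bubble_center_score(center_xy, peri_xy, grad_vecs,
                        mag_thresh=0.4, cos_thresh=0.7, ratio_thresh=0.7):
    """Decide whether a roughly circular bubble is centred on one 8x8 region R0.
    center_xy: (xo, yo) of R0;
    peri_xy:   list of (xt, yt) of the T neighbouring outer peripheral regions
               Ht at distance Q;
    grad_vecs: approximate gradient vector Vgt in each Ht (direction of the
               maximum log-difference of mean G density, magnitude = that max).
    Counts the L regions whose gradient is nearly radial (|cos(theta_t)| above
    cos_thresh, equation (27)) and compares L/T with ratio_thresh."""
    xo, yo = center_xy
    L, T = 0, len(peri_xy)
    for (xt, yt), vg in zip(peri_xy, grad_vecs):
        vg = np.asarray(vg, dtype=float)
        if np.linalg.norm(vg) < mag_thresh:
            continue
        vd = np.array([xo - xt, yo - yt], dtype=float)   # direction vector Vdt
        cos_t = vg @ vd / (np.linalg.norm(vg) * np.linalg.norm(vd))
        if abs(cos_t) > cos_thresh:
            L += 1
    return (L / T) >= ratio_thresh if T else False
```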
- Next, the control unit 9a detects the regions classified into the villus class among the regions other than those classified into the bubble class (step S245 in FIG. 41), using processing similar to that shown in FIG. 32.
- Using equations (20) to (22) with the mean vector μa and the variance-covariance matrix Σa calculated in advance for each class based on the teacher data, the control unit 9a calculates the posterior probabilities P1(ω1|x), P1(ω2|x), P1(ω3|x), and P1(ω4|x) that the generated feature vector x belongs to each of the classes ω1 to ω4.
- The control unit 9a identifies the feature vector x as belonging to the class ωa that gives the maximum posterior probability P1(ωa|x) and classifies the region Hj in which the feature vector x occurred into the class ωa based on this identification result. Assuming that the villus class is class ω1 (a = 1), the control unit 9a detects the regions classified into class ω1 by the above processing among the m x n regions and calculates, for each of those regions, the value of the probability density function p1(x|ω1) that gives the maximum posterior probability P1(ω1|x).
- To determine whether or not the classification result of each region classified into class ω1 is accurate, the control unit 9a further performs processing based on a threshold for the value of the probability density function p1(x|ω1) that gives the maximum posterior probability P1(ω1|x).
- Specifically, the control unit 9a determines a threshold vector xb1 in which, among the average values of the five feature amounts of the mean vector μ1 calculated in advance based on the teacher data, the average value μx1 of the feature amount x1, for example, is replaced by the value obtained by adding to μx1 the product of the standard deviation σx1 of the feature amount x1 and the multiplication coefficient α as a predetermined constant.
- In this processing, the value of the multiplication coefficient α is set to 0.8, as a value constituting the threshold for p1(x|ω1) that classifies the villus class reliably.
- The control unit 9a substitutes the threshold vector xb1 for x in equations (20), (21), and (22) and calculates the value of the probability density function p(xb1|ω1) as the threshold value.
- The control unit 9a then determines that the classification into the villus class is accurate for the regions in which the value of p1(x|ω1) is larger than the value of p(xb1|ω1).
- Based on the regions determined in this way, the mean vector μ1 and the variance-covariance matrix Σ1, which are the parameters constituting the villus class, and the mean vector μ2 and the variance-covariance matrix Σ2, which are the parameters constituting the bubble class, are calculated (step S246 in FIG. 41). Thereafter, using these parameters, the control unit 9a further performs, for example, the image processing described in the seventh or eighth embodiment of the present invention on the image Ii as shown in FIG. 46 and obtains a final classification result as shown in FIG. 47 (step S247 in FIG. 41).
- the detection of the region classified into the bubble class in the present embodiment is not limited to the detection as described above, but may be, for example, as described below.
- For a bubble whose edge shape is distorted into an elliptical shape, the control unit 9a detects the neighboring outer peripheral regions Ht indicated by the coordinates (xt, yt) based on the following equation (28), in which λ is an integer equal to or greater than 1 and [ ] denotes the Gauss symbol.
- After detecting all the coordinates (xt, yt) satisfying equation (28), the control unit 9a calculates the approximate gradient vector Vgt and the direction vector Vdt connecting the coordinates (xt, yt) and the coordinates (xo, yo). The control unit 9a then determines whether or not the value of the magnitude |Vgt| of Vgt is equal to or greater than a threshold value (0.4 in the present embodiment). When the control unit 9a detects that the value of |Vgt| is equal to or greater than the threshold value, it calculates, using equation (27) based on the inner product formula, the value of cos θt for the angle θt formed by the approximate gradient vector Vgt and the direction vector Vdt.
- Based on the calculation result of cos θt using equation (27), when the control unit 9a detects that the value of |cos θt| is larger than 0.7, it determines that the neighboring outer peripheral region Ht has a gradient vector directed radially with respect to the one rectangular region R0.
- The control unit 9a calculates the value of L1/T from the number L1 of such regions and the number T of neighboring outer peripheral regions detected using equation (26). Then, based on the value of L1/T, if the value of L1/T is greater than or equal to a threshold value (for example, 0.7), the control unit 9a determines that the central portion of a bubble whose edge shape is distorted into a substantially elliptical shape exists in the one rectangular region R0.
- the control unit 9a changes the distance (Q-) from the distance (Q-) while changing the direction by an angle ⁇ between 0 ° force and 360 ° around the region RO.
- a process may be performed to detect a neighboring outer peripheral region having a radial gradient vector.
- the control unit 9a sets S as the number of directions in which it is determined that there is a neighboring outer peripheral region having a radial gradient vector, and sets the number of neighboring outer peripheral regions T to [360 / ⁇ ], and then sets the value of SZT. calculate. Then, based on the value of S / T, if the value of SZT is equal to or greater than a threshold value (for example, 0.7), the control unit 9a determines that the central portion of the bubble exists in one rectangular region R0.
- the vector used when the control unit 9a detects the region classified into the bubble class is not limited to the approximate gradient vector, and may be, for example, an average gradient vector.
- as described above, according to the present embodiment, it is possible to improve the observation efficiency in observation using the capsule-type endoscope device 1 and to obtain effects similar to those described in the seventh embodiment. Furthermore, since the parameters of the bubble class and the villi class, which have structurally distinct features but are difficult to classify by the feature amount composed of color tone information and texture information, are calculated in advance on the basis of the image input to the terminal device 7, the regions of the bubble class and the villi class can be classified with higher accuracy.
- in the present embodiment, five values based on the color tone information and the texture information are used as the values constituting the feature amount; however, the values constituting the feature amount can be changed or added as appropriate according to the purpose of the user.
- furthermore, by using the present embodiment in combination with image processing for detecting a lesion site such as bleeding or redness, it is possible to determine whether or not the detection result of the lesion site was obtained from the surface of the living mucosa, and as a result, the detection accuracy of the lesion site can be improved.
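A simple way to picture this combination is to mask lesion candidates with the mucosa classification result, so that detections falling on bubble or other non-mucosa regions are discarded. The mask-based formulation and the function name below are illustrative assumptions only.

```python
import numpy as np

def filter_lesion_candidates(lesion_mask, mucosa_mask):
    """Keep only lesion candidates (e.g. bleeding or redness detections) that
    lie on regions classified as living mucosal surface; candidates on bubble
    or other non-mucosa regions are discarded.

    Both arguments are assumed to be boolean 2D arrays of the same shape."""
    return np.logical_and(np.asarray(lesion_mask, bool),
                          np.asarray(mucosa_mask, bool))
```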
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Public Health (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Artificial Intelligence (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Mathematical Physics (AREA)
- Fuzzy Systems (AREA)
- Physiology (AREA)
- Psychiatry (AREA)
- Computer Networks & Wireless Communication (AREA)
- Quality & Reliability (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP06729056.9A EP1870020B1 (en) | 2005-04-13 | 2006-03-14 | Image processing apparatus and image processing method |
CN200680010400.4A CN101150977B (zh) | 2005-04-13 | 2006-03-14 | 图像处理装置 |
US11/666,556 US7953261B2 (en) | 2005-04-13 | 2006-03-14 | Image processing apparatus and image processing method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005-115961 | 2005-04-13 | ||
JP2005115960A JP4624841B2 (ja) | 2005-04-13 | 2005-04-13 | 画像処理装置および当該画像処理装置における画像処理方法 |
JP2005-115960 | 2005-04-13 | ||
JP2005115961A JP4624842B2 (ja) | 2005-04-13 | 2005-04-13 | 画像処理方法、画像処理装置及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006112227A1 true WO2006112227A1 (ja) | 2006-10-26 |
Family
ID=37114959
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/305022 WO2006112227A1 (ja) | 2005-04-13 | 2006-03-14 | 画像処理装置及び画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US7953261B2 (ja) |
EP (2) | EP1870020B1 (ja) |
KR (1) | KR100970295B1 (ja) |
CN (1) | CN101966071B (ja) |
WO (1) | WO2006112227A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010187756A (ja) * | 2009-02-16 | 2010-09-02 | Olympus Corp | 画像処理装置、画像処理方法および画像処理プログラム |
WO2016175178A1 (ja) * | 2015-04-27 | 2016-11-03 | オリンパス株式会社 | 画像解析装置、画像解析システム、画像解析装置の作動方法 |
WO2016175098A1 (ja) * | 2015-04-27 | 2016-11-03 | オリンパス株式会社 | 画像解析装置、画像解析システム、画像解析装置の作動方法 |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005000101A2 (en) * | 2003-06-12 | 2005-01-06 | University Of Utah Research Foundation | Apparatus, systems and methods for diagnosing carpal tunnel syndrome |
US8335362B2 (en) * | 2006-06-12 | 2012-12-18 | Given Imaging Ltd. | Device, system and method for measurement and analysis of contractile activity |
DE102006028646A1 (de) * | 2006-06-22 | 2007-12-27 | Siemens Ag | Auswertungsverfahren für Bilddatensätze mit selbsttätiger Bestimmung von Auswertungsbereichen |
US8213698B2 (en) * | 2006-09-19 | 2012-07-03 | Capso Vision Inc. | Systems and methods for capsule camera control |
US8187174B2 (en) * | 2007-01-22 | 2012-05-29 | Capso Vision, Inc. | Detection of when a capsule camera enters into or goes out of a human body and associated operations |
JP5106928B2 (ja) * | 2007-06-14 | 2012-12-26 | オリンパス株式会社 | 画像処理装置および画像処理プログラム |
JP5078486B2 (ja) * | 2007-07-26 | 2012-11-21 | オリンパスメディカルシステムズ株式会社 | 医療用画像処理装置及び医療用画像処理装置の作動方法 |
JP5065812B2 (ja) * | 2007-08-29 | 2012-11-07 | オリンパスメディカルシステムズ株式会社 | 生体内画像取得装置および生体内画像取得システム |
US20090080768A1 (en) * | 2007-09-20 | 2009-03-26 | Chung Shan Institute Of Science And Technology, Armaments Bureau, M.N.D. | Recognition method for images by probing alimentary canals |
JP5312807B2 (ja) * | 2008-01-08 | 2013-10-09 | オリンパス株式会社 | 画像処理装置および画像処理プログラム |
JP2009225933A (ja) * | 2008-03-21 | 2009-10-08 | Fujifilm Corp | カプセル内視鏡システム及びカプセル内視鏡の動作制御方法 |
JP5281826B2 (ja) * | 2008-06-05 | 2013-09-04 | オリンパス株式会社 | 画像処理装置、画像処理プログラムおよび画像処理方法 |
JP5117353B2 (ja) * | 2008-11-07 | 2013-01-16 | オリンパス株式会社 | 画像処理装置、画像処理プログラムおよび画像処理方法 |
DE102009027275A1 (de) * | 2009-06-29 | 2010-12-30 | Robert Bosch Gmbh | Bildverarbeitungsverfahren für ein Fahrerassistenzsystem eines Kraftfahrzeugs zur Detektion und Klassifikation wenigstens eines Teils wenigstens eines vorgegebenen Bildelements |
JP5388732B2 (ja) * | 2009-07-15 | 2014-01-15 | Hoya株式会社 | 医療用観察システムおよびプロセッサ |
DE102009043652A1 (de) * | 2009-09-29 | 2011-03-31 | Richard Wolf Gmbh | Endoskopisches Instrument |
WO2011042950A1 (ja) * | 2009-10-05 | 2011-04-14 | 富士通株式会社 | 生体情報処理装置、生体情報処理方法及び生体情報処理用コンピュータプログラム |
US8503785B2 (en) * | 2010-01-15 | 2013-08-06 | Gravic, Inc. | Dynamic response bubble attribute compensation |
KR101184876B1 (ko) * | 2010-02-11 | 2012-09-20 | 삼성전자주식회사 | 영상과 상호작용이 가능한 캐릭터의 동적 효과를 생성하는 방법 및 장치 |
US8768024B1 (en) * | 2010-06-01 | 2014-07-01 | Given Imaging Ltd. | System and method for real time detection of villi texture in an image stream of the gastrointestinal tract |
JP5622461B2 (ja) * | 2010-07-07 | 2014-11-12 | オリンパス株式会社 | 画像処理装置、画像処理方法、および画像処理プログラム |
JP5739428B2 (ja) | 2010-08-04 | 2015-06-24 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 画像分類装置、方法、プログラム、プログラムを記録する記録媒体及び集積回路 |
US8922633B1 (en) | 2010-09-27 | 2014-12-30 | Given Imaging Ltd. | Detection of gastrointestinal sections and transition of an in-vivo device there between |
US8965079B1 (en) | 2010-09-28 | 2015-02-24 | Given Imaging Ltd. | Real time detection of gastrointestinal sections and transitions of an in-vivo device therebetween |
CN102843950B (zh) * | 2010-11-08 | 2015-02-04 | 奥林巴斯医疗株式会社 | 图像显示装置以及胶囊型内窥镜系统 |
JP5576775B2 (ja) * | 2010-11-29 | 2014-08-20 | オリンパス株式会社 | 画像処理装置、画像処理方法、及び画像処理プログラム |
WO2012105141A1 (ja) | 2011-02-01 | 2012-08-09 | オリンパスメディカルシステムズ株式会社 | 診断支援装置 |
US9498136B2 (en) * | 2011-04-18 | 2016-11-22 | Koninklijke Philips N.V | Classification of tumor tissue with a personalized threshold |
JP5980490B2 (ja) * | 2011-10-18 | 2016-08-31 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム |
CN103747718B (zh) * | 2012-03-21 | 2016-03-30 | 奥林巴斯株式会社 | 图像处理装置 |
US9877633B2 (en) * | 2012-07-25 | 2018-01-30 | Intuitive Surgical Operations, Inc | Efficient and interactive bleeding detection in a surgical system |
BR112015014945A2 (pt) * | 2012-12-21 | 2017-07-11 | Koninklijke Philips Nv | sistema de monitoramento remoto de fotoplestimografia, método de monitoramento remoto de fotoplestimografia, e programa de computador |
US9324145B1 (en) | 2013-08-08 | 2016-04-26 | Given Imaging Ltd. | System and method for detection of transitions in an image stream of the gastrointestinal tract |
US10314514B2 (en) * | 2016-05-29 | 2019-06-11 | Ankon Medical Technologies (Shanghai) Co., Ltd. | System and method for using a capsule device |
CN106910184B (zh) * | 2017-01-12 | 2020-10-09 | 杭州电子科技大学 | 基于深度卷积神经网络的内窥镜图像肠道出血检测方法 |
AU2018243312B2 (en) * | 2017-03-31 | 2023-02-02 | Biora Therapeutics, Inc. | Localization systems and methods for an ingestible device |
CN107730473A (zh) * | 2017-11-03 | 2018-02-23 | 中国矿业大学 | 一种基于深度神经网络的煤矿井下图像处理方法 |
EP3821790B1 (en) * | 2018-07-09 | 2024-07-24 | FUJIFILM Corporation | Medical image processing device, medical image processing system, medical image processing method, and program |
CN110278630B (zh) * | 2019-05-22 | 2022-09-27 | 杭州极致光生物照明有限公司 | 光谱调色照明方法、装置及其水族箱灯具 |
US10799090B1 (en) * | 2019-06-13 | 2020-10-13 | Verb Surgical Inc. | Method and system for automatically turning on/off a light source for an endoscope during a surgery |
DE102019125413A1 (de) * | 2019-09-20 | 2021-03-25 | Carl Zeiss Meditec Ag | Verfahren und Vorrichtung zum Erstellen und Anzeigen einer Karte von einem Gehirnoperationsfeld |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11104072A (ja) * | 1997-10-03 | 1999-04-20 | Mitsubishi Electric Corp | 医療支援システム |
JP2002336193A (ja) * | 2001-05-17 | 2002-11-26 | Olympus Optical Co Ltd | 診断支援装置 |
JP2003079568A (ja) * | 2001-06-29 | 2003-03-18 | Fuji Photo Film Co Ltd | 蛍光画像取得方法および装置並びにプログラム |
JP2004154176A (ja) * | 2002-11-01 | 2004-06-03 | Olympus Corp | 内視鏡撮像装置 |
JP2004321603A (ja) * | 2003-04-25 | 2004-11-18 | Olympus Corp | 画像表示装置、画像表示方法および画像表示プログラム |
JP2004337596A (ja) * | 2003-04-25 | 2004-12-02 | Olympus Corp | 画像表示装置、画像表示方法および画像表示プログラム |
JP2005192880A (ja) * | 2004-01-08 | 2005-07-21 | Olympus Corp | 画像処理方法 |
JP2006122502A (ja) * | 2004-10-29 | 2006-05-18 | Olympus Corp | 画像処理方法及びカプセル型内視鏡装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4670468A (en) * | 1982-08-09 | 1987-06-02 | Regents Of The University Of California | Method for protecting and healing gastroduodenal mucosa and the liver of mammals |
JP3974946B2 (ja) * | 1994-04-08 | 2007-09-12 | オリンパス株式会社 | 画像分類装置 |
US6331116B1 (en) * | 1996-09-16 | 2001-12-18 | The Research Foundation Of State University Of New York | System and method for performing a three-dimensional virtual segmentation and examination |
US7236623B2 (en) * | 2000-04-24 | 2007-06-26 | International Remote Imaging Systems, Inc. | Analyte recognition for urinalysis diagnostic system |
US20020177779A1 (en) | 2001-03-14 | 2002-11-28 | Doron Adler | Method and system for detecting colorimetric abnormalities in vivo |
AU2002304266A1 (en) * | 2001-06-20 | 2003-01-02 | Given Imaging Ltd. | Motility analysis within a gastrointestinal tract |
CN101288582A (zh) * | 2003-04-25 | 2008-10-22 | 奥林巴斯株式会社 | 图像显示装置和图像显示方法 |
US20050075537A1 (en) * | 2003-10-06 | 2005-04-07 | Eastman Kodak Company | Method and system for real-time automatic abnormality detection for in vivo images |
CN1261910C (zh) * | 2003-12-08 | 2006-06-28 | 西安理工大学 | 彩色多窗ct图像的自动生成方法 |
JP2005115960A (ja) | 2004-11-09 | 2005-04-28 | Ziosoft Inc | 医用画像データベースシステム |
JP4225501B2 (ja) | 2004-11-15 | 2009-02-18 | 高司 澤口 | 携帯個人認証装置及び同装置によりアクセスが許可される電子システム |
-
2006
- 2006-03-14 KR KR1020077023403A patent/KR100970295B1/ko not_active IP Right Cessation
- 2006-03-14 WO PCT/JP2006/305022 patent/WO2006112227A1/ja active Application Filing
- 2006-03-14 US US11/666,556 patent/US7953261B2/en active Active
- 2006-03-14 EP EP06729056.9A patent/EP1870020B1/en not_active Not-in-force
- 2006-03-14 CN CN2010105160418A patent/CN101966071B/zh not_active Expired - Fee Related
- 2006-03-14 EP EP11008715.2A patent/EP2415389B1/en not_active Not-in-force
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11104072A (ja) * | 1997-10-03 | 1999-04-20 | Mitsubishi Electric Corp | 医療支援システム |
JP2002336193A (ja) * | 2001-05-17 | 2002-11-26 | Olympus Optical Co Ltd | 診断支援装置 |
JP2003079568A (ja) * | 2001-06-29 | 2003-03-18 | Fuji Photo Film Co Ltd | 蛍光画像取得方法および装置並びにプログラム |
JP2004154176A (ja) * | 2002-11-01 | 2004-06-03 | Olympus Corp | 内視鏡撮像装置 |
JP2004321603A (ja) * | 2003-04-25 | 2004-11-18 | Olympus Corp | 画像表示装置、画像表示方法および画像表示プログラム |
JP2004337596A (ja) * | 2003-04-25 | 2004-12-02 | Olympus Corp | 画像表示装置、画像表示方法および画像表示プログラム |
JP2005192880A (ja) * | 2004-01-08 | 2005-07-21 | Olympus Corp | 画像処理方法 |
JP2006122502A (ja) * | 2004-10-29 | 2006-05-18 | Olympus Corp | 画像処理方法及びカプセル型内視鏡装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1870020A4 * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010187756A (ja) * | 2009-02-16 | 2010-09-02 | Olympus Corp | 画像処理装置、画像処理方法および画像処理プログラム |
US8743189B2 (en) | 2009-02-16 | 2014-06-03 | Olympus Corporation | Image processing apparatus, image processing method, and computer-readable recording medium storing image processing program |
WO2016175178A1 (ja) * | 2015-04-27 | 2016-11-03 | オリンパス株式会社 | 画像解析装置、画像解析システム、画像解析装置の作動方法 |
WO2016175098A1 (ja) * | 2015-04-27 | 2016-11-03 | オリンパス株式会社 | 画像解析装置、画像解析システム、画像解析装置の作動方法 |
JP6058240B1 (ja) * | 2015-04-27 | 2017-01-11 | オリンパス株式会社 | 画像解析装置、画像解析システム、画像解析装置の作動方法 |
JP6058241B1 (ja) * | 2015-04-27 | 2017-01-11 | オリンパス株式会社 | 画像解析装置、画像解析システム、画像解析装置の作動方法 |
US10733734B2 (en) | 2015-04-27 | 2020-08-04 | Olympus Corporation | Image analysis apparatus, image analysis system, image analysis apparatus operation method to analyze brightness change of subject |
Also Published As
Publication number | Publication date |
---|---|
CN101966071B (zh) | 2012-10-17 |
KR100970295B1 (ko) | 2010-07-15 |
CN101966071A (zh) | 2011-02-09 |
EP1870020A4 (en) | 2011-05-25 |
EP2415389A2 (en) | 2012-02-08 |
EP2415389A3 (en) | 2013-02-20 |
EP2415389B1 (en) | 2016-05-04 |
EP1870020A1 (en) | 2007-12-26 |
EP1870020B1 (en) | 2015-08-05 |
US20070292011A1 (en) | 2007-12-20 |
US7953261B2 (en) | 2011-05-31 |
KR20070116634A (ko) | 2007-12-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006112227A1 (ja) | 画像処理装置及び画像処理方法 | |
JP4624841B2 (ja) | 画像処理装置および当該画像処理装置における画像処理方法 | |
US8160329B2 (en) | Medical image processing device and medical image processing method | |
JP4615963B2 (ja) | カプセル型内視鏡装置 | |
EP1969992B1 (en) | Image processing device and image processing method in the image processing device | |
JP4472631B2 (ja) | 画像処理装置および当該画像処理装置における画像処理方法 | |
EP1769729B1 (en) | System and method for in-vivo feature detection | |
WO2006035437A2 (en) | System and method to detect a transition in an image stream | |
JP4520405B2 (ja) | 画像処理装置および当該画像処理装置における画像処理方法 | |
JP4624842B2 (ja) | 画像処理方法、画像処理装置及びプログラム | |
JP2006166939A (ja) | 画像処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 200680010400.4; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 11666556; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2006729056; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 1020077023403; Country of ref document: KR |
| NENP | Non-entry into the national phase | Ref country code: DE |
| NENP | Non-entry into the national phase | Ref country code: RU |
| WWP | Wipo information: published in national office | Ref document number: 11666556; Country of ref document: US |
| WWP | Wipo information: published in national office | Ref document number: 2006729056; Country of ref document: EP |