WO2014073527A1 - Medical Image Processing Apparatus - Google Patents
Medical Image Processing Apparatus
- Publication number
- WO2014073527A1 (PCT/JP2013/079884; JP2013079884W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- region
- area
- feature amount
- mucous membrane
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0646—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements with illumination filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30092—Stomach; Gastric
Definitions
- The present invention relates to a medical image processing apparatus that performs image processing on a medical image obtained by imaging a biological mucous membrane.
- An endoscope apparatus has, for example, an elongated insertion portion that is inserted into a body cavity of a living body, and a solid-state imaging device that captures an image of the inside of the body cavity formed by an objective optical system disposed at a distal end portion of the insertion portion and outputs it as an imaging signal; based on the imaging signal, an image of the inside of the body cavity is displayed on display means such as a monitor.
- Based on the displayed image, the user observes, for example, an organ in the body cavity.
- The endoscope apparatus can directly capture an image of the digestive tract mucosa. Therefore, the user can comprehensively observe various findings such as the color tone of the mucous membrane, the shape of a lesion, and the fine structure of the mucosal surface (mucosal microstructure).
- Narrow-band light observation (NBI) is one such observation technique.
- Research and development of computer-aided diagnosis (CAD) is also underway, which provides support information such as estimation of disease states by image analysis and identification of microstructures that should be focused on, and which provides and quantifies quantitative judgment scales through image processing of medical images.
- However, the mucosal microstructure in an endoscopic image is captured as a complicated, continuous pattern, and it is difficult to extract and analyze it with high accuracy by conventional image analysis techniques. Further, the pattern of the imaged mucosal microstructure differs depending on the organ, such as the stomach or the large intestine, and even within the same organ; in the stomach, for example, the pyloric gland and the fundic gland differ.
- Moreover, although mucosal microstructures such as blood vessels and epithelial structures are imaged two-dimensionally on endoscopic images, they actually have a three-dimensional structure (Yasushi Kenshi: gastroscopic endoscope; 79-87, 2009; hereinafter, the non-patent document).
- One such mucosal microstructure is the marginal crypt epithelium (MCE).
- Japanese Patent No. 2918162, filed by the present applicant, describes a method for dividing and detecting small areas using gastric subdivisions of the gastric mucosa as unit areas.
- However, the unit there is the gastric subdivision, and analysis down to a complicated, minute structure on the scale of the biohistological unit described above is not targeted.
- Furthermore, since it is not a region division method that considers the three-dimensional structure, there is the problem that it is difficult to divide each unit area well, such as when a plurality of unit areas are imaged so as to be folded or connected.
- Japanese Patent No. 4451460 discloses a method that sets a plurality of regions of interest (abbreviated as ROI) in an endoscopic image, calculates a feature amount from each ROI, and estimates and discriminates the pit pattern classification.
- In that method, however, the ROI is set manually.
- Since the ROI is set manually as described above, and automatically setting a range that constitutes one biohistological unit as the ROI is not disclosed, it is difficult to set the ROI automatically.
- The present invention has been made in view of the above-described problems, and an object thereof is to provide a medical image processing apparatus that can support diagnosis and the like of individual structures on a medical image by making it possible to divide and set unit regions from the mucosal microstructure.
- The present invention has also been made in view of the above-described points, and another object is to provide a medical image processing apparatus that appropriately sets a discrimination target region with a size in units of unit regions and can discriminate the state of the biological mucous membrane in the discrimination target region.
- A medical image processing apparatus according to one aspect of the present invention includes an input unit to which a biological mucosal image obtained by imaging a biological mucous membrane is input, a region extraction unit for extracting, from the biological mucosal image input to the input unit, a mucosal microstructure region corresponding to a mucosal microstructure, a closed region identification unit for identifying at least one closed region considered to be surrounded by the mucosal microstructure region, and a unit region setting unit that sets a unit region based on the mucosal microstructure region extracted by the region extraction unit and the closed region identified by the closed region identification unit.
- FIG. 1 is a diagram illustrating an example of a schematic configuration of an endoscope apparatus having a medical image processing apparatus according to a first embodiment of the present invention.
- FIG. 2 is a diagram illustrating an example of a configuration of a rotary filter included in the light source device of FIG.
- FIG. 3 is a diagram illustrating an example of transmission characteristics of each filter included in the first filter group in FIG. 2.
- FIG. 4 is a diagram illustrating an example of transmission characteristics of each filter included in the second filter group in FIG. 2.
- FIG. 5 is a block diagram showing a configuration of a calculation unit in FIG.
- FIG. 6A is a flowchart showing the processing contents in the first embodiment.
- FIG. 6B is a diagram showing a semi-elliptical shape used for template matching.
- FIG. 7 is a schematic diagram of an endoscopic image.
- FIG. 8 is an enlarged schematic view of a part of FIG.
- FIG. 9 is a diagram illustrating an example of a detection result of the MCE region MA1 in the first embodiment.
- FIG. 10 is a diagram illustrating an example of a labeling result of an area including the MCE area MA1 in the first embodiment.
- FIG. 11 is a diagram illustrating an example of calculating a width Wk as a feature amount of the MCE region MAi in the first embodiment.
- FIG. 12 is a diagram showing an example of setting of the unit area UA3 in the first embodiment.
- FIG. 13 is a diagram illustrating an example in which the unit areas UA2 and UA3 overlap in the first embodiment.
- FIG. 14A is a diagram illustrating an example of the substantially closed region CAj targeted in a modification of the first embodiment.
- FIG. 14B is a flowchart illustrating an example of processing content in the modification.
- FIG. 15 is a diagram illustrating an example of a detection result of the core line CL1 in the modification.
- FIG. 16 is a diagram illustrating an example of the virtual line VL1 in the modification.
- FIG. 17 is a diagram showing an example of a labeling result using the virtual line VL1 in the modified example.
- FIG. 18 is a diagram illustrating a setting example of virtual lines VL1 and VL2 in a modified example.
- FIG. 19A is a block diagram illustrating a configuration of a calculation unit according to the second embodiment of the present invention.
- FIG. 19B is a flowchart showing an example of processing contents in the second embodiment of the present invention.
- FIG. 20 is a schematic diagram illustrating an example when the MCE region is unclear in the endoscopic image.
- FIG. 21 is a diagram showing an example of the unit area UAj when the process described in the first embodiment is applied to FIG. 20.
- FIG. 22 is a diagram illustrating an example of a detection result of the MCE region MA2 in the second embodiment.
- FIG. 23 is a diagram illustrating an example of a matched filter MF used in the second embodiment.
- FIG. 24 is a diagram illustrating an example of pixel value setting of image data according to the second embodiment.
- FIG. 25 is a diagram illustrating an example of a detection result of the approximate closed region candidate pixel CPz in the second embodiment.
- FIG. 26 is a diagram showing an example of a labeling result for the approximate closed region candidate pixel CPz in the second embodiment.
- FIG. 27 is a diagram showing an example of setting of the unit area UA2 in the second embodiment.
- FIG. 28 is a diagram showing an example of a schematic configuration of an endoscope apparatus having a medical image processing apparatus according to a third embodiment of the present invention.
- FIG. 29 is a diagram illustrating an example of a configuration of a rotary filter included in the light source device of FIG.
- FIG. 30 is a diagram illustrating an example of transmission characteristics of each filter included in the first filter group in FIG. 29.
- FIG. 31 is a diagram illustrating an example of transmission characteristics of each filter included in the second filter group in FIG. 29.
- FIG. 32 is a block diagram showing the configuration of the calculation unit of FIG.
- FIG. 33A is a flowchart showing the processing contents in the third embodiment.
- FIG. 33B is a diagram showing a semi-elliptical shape used for template matching.
- FIG. 34 is a schematic diagram of an endoscopic image.
- FIG. 35 is a schematic diagram enlarging a part of FIG.
- FIG. 36 is a diagram showing an example of the detection result of the MCE region MB1 in the third embodiment.
- FIG. 37 is a diagram showing an example of a labeling result of an area including the MCE area MB1 in the third embodiment.
- FIG. 38 is a diagram illustrating an example of calculating a width Tk as a feature amount of the MCE region MBi in the third embodiment.
- FIG. 39 is a diagram showing an example of setting of the unit area UB3 in the third embodiment.
- FIG. 40 is a diagram illustrating an example in which the unit areas UB2 and UB3 overlap in the third embodiment.
- FIG. 41A is a diagram showing a set unit area of interest.
- FIG. 41B is a diagram showing a setting example of the discrimination target area Ad set so as to include the target unit area.
- FIG. 42 is a flowchart showing detailed processing contents of steps S509 and S510 of FIG. 33A.
- FIG. 43A is a diagram schematically illustrating an image example different from the image illustrated in FIG. 35 obtained by imaging the biological mucous membrane input to the calculation unit.
- FIG. 43B is a diagram showing an example in which the target unit area is set in the case of the image of FIG. 43A.
- FIG. 43C is a diagram showing an example in which a rectangular area is further set in FIG. 43B.
- FIG. 43D is a diagram showing an example in which the rectangular area in FIG. 43C is set as the discrimination target area.
- FIG. 44 is a flowchart showing the processing contents for setting the discrimination target area based on the second feature amount calculated for the unit area of interest.
- FIG. 45A is a diagram showing a discrimination target region that is set when the area is equal to or larger than a threshold in the process of FIG. 44.
- FIG. 45B is a diagram showing a discrimination target region that is set when the area is less than the threshold in the process of FIG. 44.
- FIG. 46 is a flowchart showing the processing contents for setting a discrimination target area using two thresholds based on the second feature amount calculated for the unit area of interest.
- FIG. 47 is a flowchart showing the processing contents when the discrimination target area is set using the degree as the third feature amount for the image of FIG. 43A.
- FIG. 48 is a diagram showing an example of the discrimination target area set in the process of FIG. 47.
- FIG. 49 is an explanatory diagram showing how the recognition unit recognizes the number of unit areas in contact with a common MCE.
- FIG. 50 is a flowchart showing the processing contents in a modification of the third embodiment.
- An endoscope apparatus 1 includes an endoscope 2 that is inserted into a body cavity of a subject and outputs an image obtained by imaging a subject such as a living tissue 101 in the body cavity, a light source device 3 that emits illumination light for illuminating the living tissue 101, a processor 4 that constitutes a medical image processing apparatus performing various processes on an output signal from the endoscope 2, a display device 5 that displays an image corresponding to a video signal from the processor 4, and an external storage device 6 for storing an output signal corresponding to a processing result in the processor 4.
- The endoscope 2 includes an insertion portion 21a having an elongated shape and dimensions insertable into a body cavity of a subject, a distal end portion 21b provided on the distal end side of the insertion portion 21a, and an operation portion 21c provided on the proximal end side of the insertion portion 21a. A light guide 7 for transmitting illumination light emitted from the light source device 3 to the distal end portion 21b is inserted through the insertion portion 21a.
- One end face (light incident end face) of the light guide 7 is detachably connected to the light source device 3.
- The other end face (light emission end face) of the light guide 7 is disposed in the vicinity of an illumination optical system (not shown) provided at the distal end portion 21b of the endoscope 2.
- The illumination light emitted from the light source device 3 passes through the light guide 7 connected to the light source device 3 and the illumination optical system (not shown) provided at the distal end portion 21b, and is then emitted to the living tissue 101 in the body cavity.
- The distal end portion 21b of the endoscope 2 is provided with an objective optical system 22 that forms an optical image of the subject, and a charge-coupled device (abbreviated as CCD) 23 that is disposed at the imaging position of the objective optical system 22 and constitutes an imaging unit that captures the optical image and acquires it as an image.
- The operation portion 21c of the endoscope 2 is provided with an observation mode changeover switch 24 capable of giving an instruction to switch the observation mode to either the normal light observation mode or the narrow-band light observation mode.
- The light source device 3 includes a white light source 31 formed of a xenon lamp or the like, a rotary filter 32 that turns white light emitted from the white light source 31 into frame-sequential illumination light, a motor 33 that rotationally drives the rotary filter 32, a motor 34 that moves the rotary filter 32 and the motor 33 in a direction perpendicular to the emission light path of the white light source 31 (symbol A in FIG. 1), a rotary filter drive unit 35 that drives the motors 33 and 34 based on the control of the control unit 42 of the processor 4, and a condensing optical system 36 that condenses the illumination light that has passed through the rotary filter 32 and supplies it to the incident end face of the light guide 7.
- The rotary filter 32 has a disk shape whose center is the rotation axis, and includes a first filter group 32A including a plurality of filters provided along the circumferential direction on the inner circumference side, and a second filter group 32B including a plurality of filters provided along the circumferential direction on the outer circumference side. When the driving force of the motor 33 is transmitted to the rotation shaft, the rotary filter 32 rotates.
- Except for the portions where the filters of the first filter group 32A and the second filter group 32B are positioned, the rotary filter 32 is formed of a light-shielding member.
- The first filter group 32A is provided along the circumferential direction on the inner peripheral side of the rotary filter 32, and includes an R filter 32r that transmits light in the red wavelength band, a G filter 32g that transmits light in the green wavelength band, and a B filter 32b that transmits light in the blue wavelength band.
- As shown in FIG. 3, the R filter 32r mainly transmits light from 600 nm to 700 nm (R light).
- As shown in FIG. 3, the G filter 32g mainly transmits light from 500 nm to 600 nm (G light).
- The B filter 32b mainly transmits light from 400 nm to 500 nm (B light).
- In FIG. 3, the R filter 32r, the G filter 32g, and the B filter 32b are simply indicated by R, G, and B.
- The white light emitted from the white light source 31 passes through the first filter group 32A, thereby generating broadband light for the normal light observation mode.
- The second filter group 32B is provided along the circumferential direction on the outer peripheral side of the rotary filter 32, and includes a Bn filter 321b that transmits blue narrow-band light and a Gn filter 321g that transmits green narrow-band light.
- The Bn filter 321b has a center wavelength set near 415 nm and transmits light in a narrower band than the B light (Bn light).
- The Gn filter 321g has a center wavelength set near 540 nm and transmits light in a narrower band than the G light (Gn light).
- In FIG. 4, the Bn filter 321b and the Gn filter 321g are simply indicated by Bn and Gn.
- The white light emitted from the white light source 31 passes through the second filter group 32B, thereby generating narrow-band light of a plurality of bands for the narrow-band light observation mode.
- The processor 4 functions as the medical image processing apparatus of the present embodiment. Specifically, as shown in FIG. 1, the processor 4 includes an image processing unit 41 and a control unit 42.
- The image processing unit 41 includes an image data generation unit 41a, a calculation unit 41b, and a video signal generation unit 41c.
- The image data generation unit 41a of the image processing unit 41 generates image data corresponding to the image obtained by the CCD 23, by performing processing such as noise removal and A/D conversion on the output signal from the endoscope 2 based on the control of the control unit 42.
- The calculation unit 41b of the image processing unit 41 performs predetermined processing using the image data generated by the image data generation unit 41a, thereby extracting the mucosal microstructure of the living body from the image data obtained by imaging the living tissue 101, and further performs an operation of setting a unit region from the mucosal microstructure based on predetermined conditions.
- That is, when the image data includes a mucosal microstructure of a living body, the calculation unit 41b performs unit region setting processing that sets, as a unit region, a range constituting one unit in terms of the biohistology of the mucosal microstructure. Details of this unit region setting processing will be described later.
- The video signal generation unit 41c of the image processing unit 41 generates a video signal by performing processing such as gamma conversion and D/A conversion on the image data generated by the image data generation unit 41a, and outputs it to the display device 5 and the like.
- When the control unit 42 detects, based on an instruction from the observation mode changeover switch 24, that an instruction to switch to the normal light observation mode has been made, it controls the rotary filter drive unit 35 so as to emit broadband light for the normal light observation mode from the light source device 3. Based on the control of the control unit 42, the rotary filter drive unit 35 operates the motor 34 so as to insert the first filter group 32A into the emission light path of the white light source 31 and retract the second filter group 32B from the emission light path of the white light source 31.
- When the control unit 42 detects, based on an instruction from the observation mode changeover switch 24, that an instruction to switch to the narrow-band light observation mode has been made, it controls the rotary filter drive unit 35 so as to emit narrow-band light of a plurality of bands for the narrow-band light observation mode from the light source device 3.
- Based on the control of the control unit 42, the rotary filter drive unit 35 operates the motor 34 so as to insert the second filter group 32B into the emission light path of the white light source 31 and retract the first filter group 32A from the emission light path of the white light source 31.
- As a result, in the narrow-band light observation mode, an image in which blood vessels near the surface layer of the living tissue 101 are emphasized is obtained.
- As shown in FIG. 5, the calculation unit 41b constituting the medical image processing apparatus of FIG. 1 includes an input unit 43a to which a biological mucosal image (image data) obtained by imaging the biological mucous membrane with the CCD 23 is input from the image data generation unit 41a, a region extraction unit 43c for extracting a mucosal microstructure region corresponding to the mucosal microstructure, a closed region identification unit (or substantially closed region identification unit) 43d for identifying at least one closed region (or surrounded region) considered to be surrounded by the mucosal microstructure region, and a unit region setting unit 43e that sets a unit region based on the mucosal microstructure region extracted by the region extraction unit 43c and the closed region identified by the closed region identification unit 43d.
- Note that the calculation unit 41b is not limited to a configuration including the preprocessing unit 43b shown in FIG. 5.
- In other words, the processor 4 as the medical image processing apparatus of the present embodiment includes the calculation unit 41b having the input unit 43a, the region extraction unit 43c, the closed region identification unit 43d, and the unit region setting unit 43e.
- The region extraction unit 43c may extract the mucosal microstructure region corresponding to the mucosal microstructure from the biological mucosal image input to the input unit 43a without performing preprocessing.
- The unit region setting unit 43e shown in FIG. 5 includes a width calculation unit 44a serving as a feature amount calculation unit that calculates, as the feature amount used for setting the range of the unit region from the mucosal microstructure region or the closed region, the width of the band-shaped mucosal microstructure region, and a range setting unit 44b that sets the range of the unit region based on the feature amount of the width.
- The input unit 43a constituting the calculation unit 41b may be configured by the input end of image data to the calculation unit 41b.
- In the present embodiment, the region extraction unit 43c, the closed region identification unit 43d, the unit region setting unit 43e, the width calculation unit 44a, and the range setting unit 44b are provided in the calculation unit 41b configured by a central processing unit (CPU) or the like.
- However, the present embodiment is not limited to such a case, and each process may be performed using dedicated hardware; for example, the region extraction unit 43c, the closed region identification unit 43d, the unit region setting unit 43e, the width calculation unit 44a, and the range setting unit 44b may be configured by a region extraction circuit, a closed region identification circuit, a unit region setting circuit, a width calculation circuit, and a range setting circuit, respectively.
- The operator turns on the power of each part of the endoscope apparatus 1 and then selects the normal light observation mode with the observation mode changeover switch 24. The operator then inserts the endoscope 2 into the body cavity while viewing the image displayed on the display device 5 in the normal light observation mode, that is, an image having substantially the same color as when the object is viewed with the naked eye, thereby bringing the distal end portion 21b close to the site where the living tissue 101 to be observed exists.
- When the normal light observation mode is selected by the observation mode changeover switch 24, light of each color, R light, G light, and B light, is sequentially emitted from the light source device 3 to the living tissue 101, and images corresponding to each color are acquired by the endoscope 2.
- The image data generation unit 41a of the image processing unit 41 generates image data of the color components corresponding to each of the acquired images.
- FIG. 6A shows the main processing for performing the unit region setting processing.
- Here, the unit region setting processing will be described for the case where the mucosal microstructure included in the image data is the marginal crypt epithelium (Marginal Crypt Epithelium: MCE).
- FIG. 7 schematically shows an image obtained by imaging the pyloric gland of the stomach in the narrow-band light observation mode. Imaging in the narrow-band light observation mode has the advantage of displaying the structure in the vicinity of the surface layer more clearly than in the broadband light observation mode.
- FIG. 8 shows an enlarged image of a local part of the image in FIG. 7. In FIGS. 7 and 8, a plurality of band-shaped MCEs 51 overlap each other, and an inter-pit portion 52 exists in the inner region (usually a closed region) surrounded by each MCE 51. Further, a blood vessel 53 runs in the inter-pit portion 52.
- In the image data captured by the endoscope 2, depending on imaging conditions such as distance and angle and on the state of the mucous membrane itself, the boundary of one MCE 51 or the boundary between adjacent MCEs 51 may be clear or unclear.
- The image data generated by the image data generation unit 41a is input from the input unit 43a to the preprocessing unit 43b in the calculation unit 41b.
- The image data in this embodiment and the other embodiments consists of three color-component images (R, G, and B) of size ISX × ISY (horizontal × vertical number of pixels); each of the R, G, and B images has 8-bit gradation, taking values from 0 to 255. For example, ISX × ISY = 640 × 480.
- The preprocessing unit 43b performs preprocessing such as noise suppression and inverse gamma correction on the input image data.
- In the present embodiment, noise suppression is performed by known median filtering (rearranging the pixel values in a mask including the target pixel by size and replacing the value of the target pixel with the median value) with a mask size of 3 × 3.
- Gamma correction is a non-linear process applied to provide visually linear gradation when an image is displayed on a monitor or the like, and inverse gamma correction returns the image to its original linear gradation. For this reason, when gamma correction has not been applied to the image data input to the preprocessing unit 43b, inverse gamma correction is not necessary.
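- To make this preprocessing concrete, the following Python sketch applies a 3 × 3 median filter and an optional inverse gamma correction to 8-bit RGB image data; the gamma value of 2.2 and all function names are illustrative assumptions, not values taken from this patent.

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess(rgb, gamma=2.2, apply_inverse_gamma=True):
    """Sketch of the preprocessing in step S2.

    rgb: uint8 array of shape (ISY, ISX, 3), values 0..255.
    gamma=2.2 is an assumed display gamma; skip the inverse correction
    when the input image data was never gamma-corrected.
    """
    # Noise suppression: known median filtering with a 3x3 mask,
    # applied to each color component.
    denoised = np.stack(
        [median_filter(rgb[..., c], size=3) for c in range(3)], axis=-1
    )
    if not apply_inverse_gamma:
        return denoised
    # Inverse gamma correction: return the data to linear gradation.
    linear = 255.0 * (denoised.astype(float) / 255.0) ** gamma
    return np.clip(linear, 0, 255).astype(np.uint8)
```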
- The preprocessed image data is output to the region extraction unit 43c.
- In step S3 of FIG. 6A, the region extraction unit 43c extracts a band-shaped MCE region MAi (i ≥ 1) as the mucosal microstructure region corresponding to the mucosal microstructure.
- As the method for extracting the MCE region MAi, for example, the structural component extraction method based on template matching described in Japanese Patent No. 4409166 is used.
- FIG. 6B shows the semi-elliptical shape used for the template matching.
- As for the parameters, the values described in the examples of Japanese Patent No. 4409166 may be used as they are.
- The processing may be performed on the G image among the three RGB images.
- When such an extraction method is applied to the image data of FIG. 7 or FIG. 8, the MCE region MA1 is extracted as shown in FIG. 9, for example.
- The extracted MCE region MA1 is output to the closed region identification unit 43d.
- In FIG. 9, the extracted MCE region MA1 is indicated by hatching.
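- As a rough illustration of this extraction step, the sketch below scores each pixel of the G image by normalized cross-correlation against a semi-elliptical template swept over several orientations; the template construction, the 30-degree orientation step, and the threshold are assumptions for illustration and are not the values of Japanese Patent No. 4409166.

```python
import numpy as np
from scipy.ndimage import rotate
from skimage.feature import match_template

def semi_ellipse_template(w=11, h=7):
    """Bright semi-elliptical intensity profile (illustrative shape)."""
    y, x = np.mgrid[0:h, 0:w]
    cx = (w - 1) / 2.0
    r = ((x - cx) / (w / 2.0)) ** 2 + ((y - (h - 1)) / float(h)) ** 2
    return np.where(r <= 1.0, 1.0 - r, 0.0)

def extract_mce_region(g_image, thr=0.4):
    """Binary mask of the band-shaped MCE region from the G image.

    Rotating the template approximates band segments of arbitrary
    orientation; thr and the angle step are assumed values.
    """
    tmpl = semi_ellipse_template()
    response = np.full(g_image.shape, -1.0)
    for angle in range(0, 180, 30):
        rt = rotate(tmpl, angle, reshape=True, order=1)
        r = match_template(g_image.astype(float), rt, pad_input=True)
        response = np.maximum(response, r)
    return response >= thr  # True on pixels taken as MCE region MAi
```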
- Next, in step S4, the closed region identification unit 43d performs processing to detect or identify the substantially closed region CAj (j ≥ 1) that is roughly surrounded by the MCE region MAi detected as the mucosal microstructure region (in other words, a region considered to be surrounded by the MCE region MAi).
- In the present embodiment, the substantially closed region CAj is a completely enclosed closed region; in the second embodiment described later, it also includes a generally closed region that is not completely enclosed but is largely surrounded.
- Specifically, known labeling is performed on the pixel regions other than the region detected as the MCE region MAi.
- In the case of the MCE region MA1 in FIG. 9, the regions other than the MCE region MA1 are divided into seven labels L1 to L7 as shown in FIG. 10.
- In FIG. 10, for the MCE region MA1 in FIG. 9, a process of enclosing the open area and closing the non-closed ends has been performed.
- In FIG. 10, the MCE region MA1 is indicated by oblique lines.
- Among the labeled regions, those that do not include the outermost peripheral pixels of the image data (pixels whose X coordinate is 0 or 639, or whose Y coordinate is 0 or 479) are defined as substantially closed regions CAj.
- In the example of FIG. 10, the labels L1 to L6, excluding the label L7, become the substantially closed regions CA1 to CA6, respectively.
- Information on the substantially closed regions CA1 to CA6 detected or identified by the closed region identification unit 43d is output to the unit region setting unit 43e.
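- A minimal sketch of this identification step is given below: the non-MCE pixels are labeled, and any label that contains an outermost peripheral pixel of the image (such as the label L7 above) is discarded. The function names are illustrative.

```python
import numpy as np
from scipy.ndimage import label

def identify_closed_regions(mce_mask):
    """Sketch of step S4: known labeling of the pixels outside the MCE
    region, keeping only labels that do not touch the image border.

    mce_mask: boolean array, True on the extracted MCE region MAi.
    Returns the label image and the label ids of the substantially
    closed regions CAj.
    """
    labels, n = label(~mce_mask)
    h, w = mce_mask.shape
    closed_ids = []
    for lab in range(1, n + 1):
        ys, xs = np.nonzero(labels == lab)
        # Discard labels containing outermost peripheral pixels
        # (X coordinate 0 or w-1, Y coordinate 0 or h-1).
        if ys.min() > 0 and xs.min() > 0 and ys.max() < h - 1 and xs.max() < w - 1:
            closed_ids.append(lab)
    return labels, closed_ids
```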
- Next, in step S5, the range setting unit 44b of the unit region setting unit 43e performs processing to set the range of the unit region UAj using the width calculated as the feature amount of the MCE region MAi based on the substantially closed region CAj.
- For this purpose, the width calculation unit 44a of the unit region setting unit 43e first calculates the width Wk (k ≥ 1) as the feature amount of the MCE region MAi adjacent to the substantially closed region CAj.
- Specifically, from each boundary pixel BPk (for example, on the side of the substantially closed region CAj) between the substantially closed region CAj and the adjacent MCE region MAi, the width calculation unit 44a scans the MCE region MAi in the direction opposite to the adjacent substantially closed region CAj, and calculates the width Wk at the boundary pixel BPk.
- FIG. 11 shows an example of calculating the widths W1, W2, and W3 in three representative directions for the substantially closed region CA3, together with an enlarged view of calculating the width W2 in one direction at the boundary pixel BP2.
- In FIG. 11, the MCE region MA1 is indicated by hatching.
- Since the substantially closed region CA3 adjacent to the boundary pixel BP2 is located in the lower-left direction (the direction indicated by A in FIG. 11), the pixels are scanned from the boundary pixel BP2 in the opposite, upper-right direction (toward the substantially closed region CA4 in FIG. 11). Counting the number of pixels of the MCE region MA1 while scanning in this direction gives 8 pixels, which is the value of the width W2.
- In practice, the number of pixels is counted by scanning in each direction, and the smallest count is taken as the value of the width Wk.
- Although FIG. 11 illustrates only the three boundary pixels BP1, BP2, and BP3 shown discretely on the substantially closed region CA3, the width Wk is calculated for all the boundary pixels BPk.
- Next, the average value of the widths Wk calculated for all the boundary pixels BPk is set as the region size ASj of the MCE region MAi adjacent to the substantially closed region CAj. Then, based on the widths calculated by the width calculation unit 44a, the range setting unit 44b sets or determines, as the range of the unit region UAj, the region obtained by combining the substantially closed region CAj and the portion of the MCE region MAi located within the region size ASj from the substantially closed region CAj.
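- The following sketch summarizes step S5 in code: the width Wk is measured at each boundary pixel by scanning the MCE region, the region size ASj is taken as the mean width, and the unit region UAj is the closed region plus the MCE pixels within ASj of it. Scanning all eight discrete directions and keeping the smallest count, and approximating the range test by repeated dilation, are simplifications assumed for illustration.

```python
import numpy as np
from scipy.ndimage import binary_dilation

EIGHT_DIRS = [(0, 1), (0, -1), (1, 0), (-1, 0),
              (1, 1), (1, -1), (-1, 1), (-1, -1)]

def unit_region(mce_mask, labels, region_id):
    """Sketch of step S5 for one substantially closed region CAj."""
    h, w = mce_mask.shape
    ca = labels == region_id                    # substantially closed region CAj
    boundary = binary_dilation(ca) & mce_mask   # boundary pixels BPk (MCE side)
    widths = []
    for y, x in zip(*np.nonzero(boundary)):
        counts = []
        for dy, dx in EIGHT_DIRS:               # scan each direction and
            n, yy, xx = 0, y, x                 # keep the smallest count
            while 0 <= yy < h and 0 <= xx < w and mce_mask[yy, xx]:
                n += 1
                yy += dy
                xx += dx
            counts.append(n)
        widths.append(min(counts))              # width Wk at this BPk
    as_j = int(round(np.mean(widths))) if widths else 0   # region size ASj
    if as_j == 0:
        return ca
    # MCE pixels within the region size ASj of CAj, via repeated dilation.
    ring = binary_dilation(ca, iterations=as_j) & mce_mask
    return ca | ring                            # unit region UAj
```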
- FIG. 12 shows an example of the unit region UA3 set for the substantially closed region CA3 described in FIG. 11, using the region size AS3 of the MCE region MA1 adjacent to the substantially closed region CA3.
- The unit region UA3 set in FIG. 12 is indicated by hatching.
- The unit region UA3 in FIG. 12 is the combined region of the substantially closed region CA3 and the band-shaped MCE region MA1 located, so as to surround CA3, within the region size AS3 (determined by the average value of the widths described above) from the outer peripheral boundary of the substantially closed region CA3.
- The band-shaped MCE region MA1 located within the region size AS3 surrounding the substantially closed region CA3 may be expressed as the MCE region MA1-3.
- In this notation, the unit region UA3 in FIG. 12 consists of the substantially closed region CA3 and the band-shaped MCE region MA1-3 surrounding it.
- In this way, the width Wk as the feature amount of the band-shaped MCE region MAi surrounding the substantially closed region CAj is calculated from the substantially closed region CAj to set the range of the unit region UAj.
- As shown in FIG. 13, the range of one unit region UA3 is allowed to overlap the range of another unit region UA2.
- In FIG. 13, the unit region UA3 set from the substantially closed region CA3 described in FIG. 12 is indicated by hatching, and the unit region UA2 set from the substantially closed region CA2 by the same processing is indicated by diagonal lines.
- In this case, the unit region UA2 and the unit region UA3 overlap in the area indicated by cross-hatching.
- In other words, a part of one unit region is allowed to overlap a part of another unit region.
- That is, when unit regions are set for a plurality of closed regions, the unit region setting unit 43e sets the unit region independently for each closed region, and as shown in FIG. 13, parts of the unit regions (specifically, the band-shaped MCE region) are allowed to overlap each other.
- By performing the processing described above, a region that constitutes one unit in terms of biological histology can be set as the unit region UAj from the image data.
- Specifically, one substantially closed region CAj regarded as a closed or substantially closed region, such as the inter-pit portion 52 including the blood vessel 53, together with the one band-shaped, closed-loop MCE region MAi surrounding the substantially closed region CAj, can be set as the unit region UAj.
- In the present embodiment, the case where the width Wk of the MCE region MAi is calculated using the boundary pixels BPk of the substantially closed region CAj has been described; however, the width may also be calculated from the MCE region MAi.
- That is, a feature amount calculation unit such as the width calculation unit 44a may calculate the feature amount used for setting the range of the unit region from the MCE region MAi as the mucosal microstructure region, or from the substantially closed region CAj as the closed region.
- The content of the preprocessing in step S2 of FIG. 6A is not limited to noise suppression and inverse gamma correction, and brightness or color tone correction may be added.
- The preprocessing methods are also not limited to those described, and may be changed according to the characteristics of the target image or device. For example, although median filtering is used as the noise suppression method in the present embodiment, other methods such as a smoothing filter may be used.
- The method for detecting or extracting the MCE region MAi in step S3 is not limited to the structural component extraction method described above, and other methods may be used; for example, threshold processing on pixel values such as luminance or color tone, methods using various frequency filters such as a Gabor filter, or line-segment detection processing using known techniques such as the Hessian matrix or the vector concentration degree. A sketch of a Gabor-filter-based alternative follows.
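- The sketch below computes a maximum-magnitude response over a small Gabor filter bank as one such alternative detector; the frequencies and the number of orientations are illustrative assumptions.

```python
import numpy as np
from skimage.filters import gabor

def mce_response_gabor(g_image, frequencies=(0.1, 0.2), n_angles=6):
    """Alternative band-structure response using a Gabor filter bank.

    Returns the per-pixel maximum magnitude response over all assumed
    frequencies and orientations; thresholding it would yield a
    candidate MCE mask.
    """
    resp = np.zeros_like(g_image, dtype=float)
    for f in frequencies:
        for k in range(n_angles):
            real, imag = gabor(g_image.astype(float),
                               frequency=f,
                               theta=np.pi * k / n_angles)
            resp = np.maximum(resp, np.hypot(real, imag))
    return resp
```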
- The method for detecting or identifying the substantially closed region CAj in step S4 is likewise not limited to the labeling-based method described above, and other methods may be used.
- For example, the closed region may be searched for by scanning the boundary pixels BPk of the detected MCE region MAi, and the substantially closed region CAj detected in that way.
- In the present embodiment, the width Wk is calculated by counting the number of pixels from the boundary pixel BPk; however, the method is not limited to this, and other methods may be used, such as taking the size of the template or filter used in the extraction of the MCE region MAi in step S3 as the width Wk.
- Further, the width Wk may be calculated from pixels sampled every several pixels, for example, without using all the boundary pixels BPk.
- The calculation of the region size ASj is not limited to the average value of the widths Wk, and other statistics such as the mode, minimum, or maximum of the widths Wk may be used.
- The calculation of the region size ASj is also not limited to methods using the width Wk; it may be calculated from the size or shape of the substantially closed region CAj, or the user may set an arbitrary value. For example, the region size ASj that defines the range of the unit region may be determined from the area of the substantially closed region CAj.
- In the present embodiment, the range of the unit region UAj is set by the distance from the boundary pixels of the substantially closed region CAj, but the method is not limited to this, and the range of the unit region UAj may be set on another basis.
- Although FIG. 13 shows an example in which the unit regions UAj are allowed to overlap, it may instead be determined to which unit region UAj the overlapping portion belongs, and the overlapping region may be divided accordingly.
- In the present embodiment, the unit region UAj has been described as including the MCE region MAi.
- However, the unit region UAj may be defined so as not to include the MCE region MAi; in that case, the unit region UAj is the substantially closed region CAj itself.
- Each pixel set as the unit region UAj may be stored as image data, and an image may be displayed on the display device 5 or the like by generating a video signal with the video signal generation unit 41c.
- According to the present embodiment operating in this way, the range of one unit of the mucosal microstructure can be set so as to be separable or extractable as a unit region, so that diagnosis of individual structures on an endoscopic image can be supported. In other words, according to the present embodiment, a range of the mucosal microstructure that constitutes one unit in terms of biological histology can be set or extracted as a unit region.
- This modification describes an image processing apparatus and a processing method that can accurately detect the unit region UAj even when the MCE region MAi does not form a complete closed curve due to, for example, tissue destruction by cancer or the imaging conditions. Specifically, the MCE may become unclear due to destruction or replacement of normal mucosal cells by the growth of cancer cells.
- The description will be made mainly with reference to FIGS. 14A to 18.
- This modification corresponds to a modification of the process of detecting the substantially closed region CAj in step S4 of the first embodiment, and can also be applied to cases where the substantially closed region CAj cannot be detected with high accuracy in the first embodiment, such as when the MCE region MAi detected in step S3 does not form a complete closed curve.
- The configuration of this modification is almost the same as that of the first embodiment. Specifically, the calculation unit 41b constituting the image processing unit 41 in the first embodiment further includes, as indicated by the dotted line in FIG. 5, a virtual mucosal microstructure setting unit 43f that sets a virtual mucosal microstructure region for the biological mucosal image.
- The virtual mucosal microstructure setting unit 43f sets a virtual MCE region as the virtual mucosal microstructure region. For this reason, the virtual mucosal microstructure setting unit 43f functions as a virtual MCE region setting unit.
- The closed region identification unit 43d identifies, as the closed region (or substantially closed region) described in the first embodiment, a region surrounded by the mucosal microstructure region extracted by the region extraction unit 43c and the virtual mucosal microstructure region (specifically, the virtual MCE region) set by the virtual mucosal microstructure setting unit 43f.
- As shown in FIG. 14B, in step S11 following step S3, open ends of the MCE region MAi are detected as described later, the virtual mucosal microstructure setting unit 43f performs a process of connecting the open ends with virtual lines VLx, and the process of step S4 is performed after this connection process.
- The other processes are the same as in the first embodiment; therefore, only the differences from the first embodiment will be described.
- FIG. 14A shows an example in which the MCE region MA1 detected or extracted in step S3 does not form a complete closed curve; the non-closed region indicated by hatching in FIG. 14A will be described as a candidate region CCAj of the substantially closed region CAj.
- In this modification, the candidate region CCAj is also identified as a closed region in the same manner as the (closed) substantially closed region CAj, so that a unit region can be set in the same manner as for the substantially closed region CAj.
- When the MCE region MAi extracted as described above has two open ends that are not closed, the open region surrounded by the portion other than the two open ends becomes the candidate region CCAj of the substantially closed region CAj.
- Therefore, the core line CLi is detected as described below, and the closed region is set by connecting the open ends so that a biohistologically unitary region can be set as the unit region.
- To this end, the virtual mucosal microstructure setting unit 43f in this modification performs the following processing.
- The virtual mucosal microstructure setting unit 43f first detects the core line CLi of the MCE region MAi by a known method such as thinning the MCE region MAi, as shown in FIG. 15.
- As shown in step S11 of FIG. 14B, when there are unclosed end points (also referred to as open ends) in the detected core line CLi, the virtual mucosal microstructure setting unit 43f connects the end points with virtual lines VLx (x ≥ 1).
- FIG. 16 shows a connection example of the virtual line VLx. Subsequently, known labeling is performed as in step S4 of FIG. 6A in the first embodiment; however, the labeling targets are the pixels other than the region detected as the MCE region MAi and the virtual lines VLx (FIG. 17 shows an example of the labeling result). The subsequent processing is the same as in the first embodiment.
- After step S4, the unit region UAj is set in step S5, and the processing in FIG. 14B ends.
- In the present modification, the end points of the core line CLi are connected by straight virtual lines VLx, but they may also be connected by curves or the like.
- Alternatively, as shown in FIG. 18, virtual lines VLx (VL1 and VL2 in FIG. 18) may be drawn from the end points of each core line CLi in the extending direction of the core line at those end points (or in the extending direction of the band-shaped MCE region), and used to detect or identify the substantially closed region CAj. Compared with connecting the end points with straight lines, this may detect substantially closed regions CAj with complicated shapes more accurately. A sketch of the straight-line variant follows.
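- A minimal sketch of the straight-line variant: the MCE mask is thinned to a core line, end points (skeleton pixels with exactly one neighbor) are found, and nearest end points are joined pairwise by virtual lines before labeling. The nearest-pair heuristic is an illustrative assumption; the text above also allows connecting along the extending direction.

```python
import numpy as np
from skimage.morphology import skeletonize
from skimage.draw import line

def close_open_ends(mce_mask):
    """Sketch of step S11: connect open ends of the core line CLi
    with straight virtual lines VLx before labeling."""
    skel = skeletonize(mce_mask)
    ys, xs = np.nonzero(skel)
    # An end point (open end) has exactly one neighbor on the skeleton.
    ends = []
    for y, x in zip(ys, xs):
        nb = skel[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2].sum() - 1
        if nb == 1:
            ends.append((y, x))
    closed = mce_mask.copy()
    used = set()
    for i, (y0, x0) in enumerate(ends):
        if i in used:
            continue
        # Pair each open end with its nearest unused partner (heuristic).
        cand = [(abs(y0 - y1) + abs(x0 - x1), j)
                for j, (y1, x1) in enumerate(ends) if j != i and j not in used]
        if not cand:
            break
        _, j = min(cand)
        used.update({i, j})
        rr, cc = line(y0, x0, ends[j][0], ends[j][1])  # virtual line VLx
        closed[rr, cc] = True
    return closed
```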
- As described above, according to this modification, the unit region UAj can be set with high accuracy even when the MCE region MAi does not form a complete closed curve due to tissue destruction by cancer or the like.
- In an endoscopic image of the mucosal microstructure, the MCE 51 between adjacent inter-pit portions 52 may become unclear (indicated by dotted lines in FIG. 20) due to tissue destruction by cancer, imaging conditions, or the like.
- The second embodiment describes an image processing apparatus and a processing method that can set each inter-pit portion 52 as an individual unit region, instead of one large unit region UAj as shown by the shaded pattern in FIG. 21.
- The description will be given with reference to FIGS. 19A to 27.
- This embodiment performs the same processing as steps S1 to S3 of the first embodiment, and uses different processing for the detection of the substantially closed region CAj and the setting of the unit region UAj from step S4 onward. Therefore, only the differences from the first embodiment will be described.
- The medical image processing apparatus has the same configuration as that in FIG. 1, and the calculation unit 41b in the present embodiment includes, as shown in FIG. 19A, the input unit 43a, preprocessing unit 43b, region extraction unit 43c, closed region identification unit 43d, and unit region setting unit 43e described in the first embodiment.
- In the present embodiment, the closed region identification unit 43d includes a convolution operation unit 45a using a matched filter, and a substantially closed region pixel candidate detection/labeling unit 45b that detects and labels pixel candidates of the substantially closed region.
- The substantially closed region pixel candidate detection/labeling unit 45b may also be divided into a substantially closed region pixel candidate detection unit that detects pixel candidates of the substantially closed region and a labeling unit that labels the detected pixel candidates.
- The unit region setting unit 43e in the present embodiment sets the unit region using processing different from that of the first embodiment (specifically, using the matched filter information).
- Step S21 in FIG. 19B is equivalent to the MCE region detection in step S3 of the first embodiment.
- When the MCE detection process is applied to FIG. 20, the detection result of the MCE region MAi as shown in FIG. 22 is obtained. At this time, areas where the MCE is unclear may not be detected as the MCE region MAi.
- In FIG. 20, the MCE 51 is indicated by a shaded pattern, and the MCE region MAi in FIG. 22 is likewise shown with a shaded pattern.
- Next, the convolution operation unit 45a of the closed region identification unit 43d performs a convolution operation on the MCE region MAi using the matched filter MF.
- As the matched filter MF, for example, a filter in which a donut shape with an inner radius of 11 pixels and an outer radius of 21 pixels is designed in a 43 × 43 rectangular area, as shown in FIG. 23, is used.
- The filter coefficient of the matched filter MF is 1 for the gray donut-shaped portion and 0 for the other, white portions.
- Before the convolution, as shown in FIG. 24, the pixel values of the image data are converted so that pixels belonging to the MCE region MAi (indicated by the shaded pattern) are 1 and the other pixels are 0.
- By the convolution operation, a matched filter response value RVy (0 ≤ y < ISX × ISY) is calculated for every pixel of the image data.
- Next, the substantially closed region pixel candidate detection/labeling unit 45b of the closed region identification unit 43d detects or identifies pixels having a high matched filter response value RVy, that is, the substantially closed region candidate pixels CPz (z ≥ 1), using a threshold Thre1 according to the following formula (1): RVy ≥ Thre1 … (1).
- The substantially closed region candidate pixel CPz represents each pixel that satisfies formula (1). In FIG. 25, which shows an example of the detection result, the MCE region MAi is indicated by a shaded pattern.
- Next, the substantially closed region pixel candidate detection/labeling unit 45b performs known labeling on the substantially closed region candidate pixels CPz. As a result, they are divided into six labels (L1′ to L6′) as shown in FIG. 26.
- The unit region setting unit 43e then sets, as each unit region UAj, the range within the outer-circle radius of the matched filter MF (21 pixels in this embodiment) from the boundary pixels of each of the labels L1′ to L6′.
- FIG. 27 shows the unit region UA2 when the label L2′ is taken as an example. By applying the same processing to the label L3′ and the other labels adjacent to the unit region UA2, each unit region can be set. Note that, as in the first embodiment and its modification, the ranges of the unit regions UAj are allowed to overlap. A sketch of this matched-filter pipeline follows.
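- The sketch below strings the steps of this embodiment together: building the donut-shaped matched filter MF, convolving the binarized image, thresholding the response with Thre1 (formula (1)), labeling the candidate pixels CPz, and growing each label by the outer radius to form the unit regions UAj. The threshold value used here is an illustrative assumption.

```python
import numpy as np
from scipy.ndimage import binary_dilation, convolve, label

def donut_filter(size=43, r_in=11, r_out=21):
    """Matched filter MF: coefficient 1 on the donut between the inner
    (11 px) and outer (21 px) radii of a 43x43 area, 0 elsewhere."""
    y, x = np.mgrid[0:size, 0:size] - (size - 1) / 2.0
    r = np.hypot(y, x)
    return ((r >= r_in) & (r <= r_out)).astype(float)

def unit_regions_matched_filter(mce_mask, thre1=100.0):
    """Sketch of the subsequent processing: response RVy, candidates
    CPz, labels, and unit regions UAj. thre1 is an assumed value."""
    mf = donut_filter()
    rv = convolve(mce_mask.astype(float), mf, mode='constant')  # RVy
    cp = rv >= thre1                   # formula (1): RVy >= Thre1
    labels, n = label(cp)              # labels L1' .. Ln'
    units = []
    for lab in range(1, n + 1):
        # Range of the outer-circle radius (21 px) from each label.
        units.append(binary_dilation(labels == lab, iterations=21))
    return units
```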
- In the present embodiment, only one type of donut-shaped filter is used as the matched filter MF; however, the shape, size, and number of filters are not limited to this, and matched filters of other shapes such as circles or polygons may be used.
- A plurality of matched filters having different sizes and shapes may also be applied, in which case the filter giving the largest response value RVy may be adopted.
- Likewise, in the present embodiment the range of the unit region UAj is set based on the radius of the matched filter MF, but the method is not limited to this; the range may be calculated from the sizes of the labels L1′ to L6′, or the user may specify it in advance.
- As described above, according to the second embodiment, even when the MCE between adjacent inter-pit portions is unclear, so that a plurality of unit regions UAj would otherwise be merged, or when the end points of the core line CLi extracted using known thinning are unclear, the regions can be divided and the ranges of the individual unit regions UAj can be set with high accuracy.
- In each of the embodiments described above, processing of the G image captured in the narrow-band observation mode has been described; however, the other RGB color signals (specifically, an R image or a B image) may be used, or a calculation result obtained by combining the color signals, such as G/R or G/(R+G+B), may be used. Further, image data captured in an observation mode other than the narrow-band observation mode may be used.
- In each embodiment, the MCE of the pyloric gland mucosa of the stomach has been described as an example; however, the target organ is not limited to this, and the embodiments are also applicable to the gastric fundic gland mucosa and intestinal metaplasia, and to other digestive tract organs such as the large intestine, small intestine, and esophagus.
- Specific examples include neoplastic lesions exhibiting tubular or villous findings of the large intestine, Barrett's esophageal mucosa, and the like.
- The target mucosal microstructure is also not limited to the MCE; other mucosal microstructures such as blood vessels, pits, and surface patterns may be used.
- In that case, the threshold values and filter coefficients used when detecting the mucosal microstructure may be changed as appropriate according to the mucosal microstructure.
- Numerical values such as threshold values and filter coefficients are not limited to those described in this specification, and the values may be changed.
- According to the embodiments described above, the unit region is divided and extracted from the mucosal microstructure captured in the medical image, and as a result, the user's observation and diagnosis can be supported.
- In the third embodiment, a medical image processing apparatus is described that sets unit regions from the mucosal microstructure captured in an endoscopic image as a medical image, sets a discrimination target area including one or more of the set unit regions, and discriminates the state of the biological mucous membrane in that area. FIGS. 28 to 43D relate to the third embodiment of the present invention.
- As shown in FIG. 28, an endoscope apparatus 501 includes an endoscope 502 that is inserted into a body cavity of a subject and outputs an image obtained by imaging a subject such as a living tissue 601 in the body cavity, a light source device 503 that emits illumination light for illuminating the living tissue 601, a processor 504 that constitutes a medical image processing apparatus performing various processes on an output signal from the endoscope 502, a display device 505 for displaying an image corresponding to the video signal from the processor 504, and an external storage device 506 for storing an output signal corresponding to the processing result in the processor 504.
- The endoscope 502 includes an insertion portion 521a having an elongated shape and a size insertable into the body cavity of a subject, a distal end portion 521b provided on the distal end side of the insertion portion 521a, and an operation portion 521c provided on the proximal end side of the insertion portion 521a.
- A light guide 507 for transmitting the illumination light emitted from the light source device 503 to the distal end portion 521b is inserted through the insertion portion 521a.
- One end face (light incident end face) of the light guide 507 is detachably connected to the light source device 503, and the other end face (light emitting end face) is disposed in the vicinity of an illumination optical system (not shown) provided at the distal end portion 521b of the endoscope 502.
- The illumination light emitted from the light source device 503 therefore passes through the light guide 507 and the illumination optical system (not shown) at the distal end portion 521b, and is then emitted onto the living tissue 601 in the body cavity.
- The distal end portion 521b of the endoscope 502 is provided with an objective optical system 522 that forms an optical image of the subject, and a charge-coupled device (abbreviated as CCD) 523 that is disposed at the imaging position of the objective optical system 522 and serves as an imaging unit capturing the optical image and acquiring it as an image.
- The operation portion 521c of the endoscope 502 is provided with an observation mode switching switch 524 capable of giving an instruction to switch the observation mode to either the normal light observation mode or the narrow-band light observation mode.
- The light source device 503 includes a white light source 531 formed of a xenon lamp, a rotary filter 532 that turns the white light emitted from the white light source 531 into frame-sequential illumination light, a motor 533 that rotationally drives the rotary filter 532, a motor 534 that moves the rotary filter 532 and the motor 533 in a direction (reference numeral B in FIG. 28) perpendicular to the emission light path of the white light source 531, a rotary filter driving unit 535 that drives the motors 533 and 534 based on the control of the control unit 542 of the processor 504, and a condensing optical system 536 that condenses the illumination light having passed through the rotary filter 532 and supplies it to the incident end face of the light guide 507.
- The rotary filter 532 has a disk shape whose center is the rotation axis, and includes a first filter group 532A comprising a plurality of filters provided along the circumferential direction on the inner circumference side, and a second filter group 532B comprising a plurality of filters provided along the circumferential direction on the outer circumference side. When the driving force of the motor 533 is transmitted to the rotation shaft, the rotary filter 532 rotates. The rotary filter 532 is made of a light-shielding member except for the portions where the filters of the first filter group 532A and the second filter group 532B are arranged.
- The first filter group 532A includes an R filter 532r that transmits light in the red wavelength band, a G filter 532g that transmits light in the green wavelength band, and a B filter 532b that transmits light in the blue wavelength band, each provided along the circumferential direction on the inner peripheral side of the rotary filter 532.
- As shown in FIG. 30, the R filter 532r mainly transmits light from 600 nm to 700 nm (R light), the G filter 532g mainly transmits light from 500 nm to 600 nm (G light), and the B filter 532b mainly transmits light from 400 nm to 500 nm (B light). In FIG. 30, the R filter 532r, the G filter 532g, and the B filter 532b are simply indicated by R, G, and B.
- the white light emitted from the white light source 531 passes through the first filter group 532A to generate broadband light for the normal light observation mode.
- The second filter group 532B includes a Bn filter 821b that transmits blue narrow-band light and a Gn filter 821g that transmits green narrow-band light, each provided along the circumferential direction on the outer peripheral side of the rotary filter 532.
- The Bn filter 821b has its center wavelength set near 415 nm and transmits light in a narrower band (Bn light) than the B light, while the Gn filter 821g has its center wavelength set near 540 nm and transmits light in a narrower band (Gn light) than the G light. In the figure, the Bn filter 821b and the Gn filter 821g are simply indicated by Bn and Gn.
- The white light emitted from the white light source 531 passes through the second filter group 532B, thereby generating narrow-band light in a plurality of bands for the narrow-band light observation mode.
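- To summarize the spectral configuration described above, the following sketch collects the filter bands as a simple lookup table. This is illustrative only: the text gives the R, G, and B pass bands and the Bn/Gn center wavelengths, but not the Bn/Gn band edges, so those are deliberately left unspecified.

```python
# Rotary filter 532 band configuration, as described in the text (nm).
# Only the centre wavelengths of Bn (415 nm) and Gn (540 nm) are given;
# their band edges are not specified and are therefore omitted here.
FILTER_BANDS = {
    "R":  {"pass_band": (600, 700)},   # first filter group 532A (broadband)
    "G":  {"pass_band": (500, 600)},
    "B":  {"pass_band": (400, 500)},
    "Bn": {"center": 415},             # second filter group 532B (narrow band)
    "Gn": {"center": 540},
}
```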
- The processor 504 functions as the medical image processing apparatus of the present embodiment. Specifically, as illustrated in FIG. 28, the processor 504 includes an image processing unit 541 and a control unit 542.
- the image processing unit 541 includes an image data generation unit 541a, a calculation unit 541b, and a video signal generation unit 541c.
- Based on the control of the control unit 542, the image data generation unit 541a of the image processing unit 541 performs processing such as noise removal and A/D conversion on the output signal from the endoscope 502, thereby generating image data corresponding to the image obtained by the CCD 523.
- The calculation unit 541b of the image processing unit 541 performs predetermined processing using the image data generated by the image data generation unit 541a: it extracts the mucosal microstructure of the living body from the image data obtained by imaging the living tissue 601, sets unit regions from the mucosal microstructure based on a predetermined condition, calculates feature amounts, sets a discrimination target region that includes a unit region of interest, and thereby performs processing for discriminating the state of the mucous membrane in the discrimination target region.
- In the present embodiment, the image data is assumed to contain the mucosal microstructure of a living body, and unit region setting processing is performed to set, as a unit region, a region or range that constitutes one unit in terms of the biological histology of the mucosal microstructure. Details of this unit region setting processing will be described later.
- The video signal generation unit 541c of the image processing unit 541 generates a video signal by performing processing such as gamma conversion and D/A conversion on the image data generated by the image data generation unit 541a, and outputs it to the display device 505 and the like.
- When the control unit 542 detects that an instruction to switch to the normal light observation mode has been given by the observation mode switching switch 524, it controls the rotary filter driving unit 535 so that broadband light for the normal light observation mode is emitted from the light source device 503. Based on this control, the rotary filter driving unit 535 operates the motor 534 so that the first filter group 532A is inserted into the emission light path of the white light source 531 and the second filter group 532B is retracted from the emission light path.
- When the control unit 542 detects that an instruction to switch to the narrow-band light observation mode has been given by the observation mode switching switch 524, it controls the rotary filter driving unit 535 so that narrow-band light in a plurality of bands for the narrow-band light observation mode is emitted from the light source device 503. Based on this control, the rotary filter driving unit 535 operates the motor 534 so that the second filter group 532B is inserted into the emission light path of the white light source 531 and the first filter group 532A is retracted from the emission light path.
- With the configuration described above, when the normal light observation mode is selected, an image (normal light image) having substantially the same color as when the observation object such as the living tissue 601 is viewed with the naked eye can be displayed on the display device 505 and further stored in the external storage device 506. When the narrow-band light observation mode is selected, an image (narrow-band light image) in which blood vessels near the surface layer of the living tissue 601 are emphasized can be displayed on the display device 505 and further stored in the external storage device 506.
- As shown in FIG. 32, the calculation unit 541b constituting the medical image processing apparatus receives, from the image data generation unit 541a, a biological mucosa image (image data) as medical image information obtained by imaging the biological mucous membrane with the CCD 523.
- The calculation unit 541b further includes a unit region setting unit 543e that sets one or more unit regions based on the mucosal microstructure region extracted by the region extraction unit 543c and the closed region identified by the closed region identification unit 543d, and a feature amount calculation unit 543f that calculates feature amounts, including a first feature amount, from the unit regions set by the unit region setting unit 543e.
- The calculation unit 541b further includes a discrimination target region setting unit 543g that sets a discrimination target region from a plurality of unit regions based on the feature amount (specifically, the second feature amount) calculated by the feature amount calculation unit 543f, a determination unit 543h that determines the state of the mucous membrane of the discrimination target region based on the first feature amount calculated by the feature amount calculation unit 543f, and, for the case where a plurality of unit regions are set by the unit region setting unit 543e, a unit-region-of-interest setting unit 543i that sets, from the plurality of unit regions, a unit region of interest as one unit region that meets a predetermined condition and is noticed by a user such as a surgeon.
- The unit-region-of-interest setting unit 543i is not limited to being provided inside the calculation unit 541b, and may be provided outside it.
- The feature amount calculation unit 543f includes a first feature amount calculation unit 544a that calculates a first feature amount used by the determination unit 543h to determine the state of the mucous membrane of the discrimination target region, a second feature amount calculation unit 544b that calculates a second feature amount used for setting the discrimination target region, and a third feature amount calculation unit 544c that calculates a third feature amount relating to the relevance of the unit region of interest to the unit regions around it.
- The calculation unit 541b is not limited to a configuration including the input unit 543a; the part of the image processing unit 541 excluding the calculation unit 541b (for example, the image data generation unit 541a) may include the input unit 543a instead.
- The processor 504 as the medical image processing apparatus is characterized by having the calculation unit 541b including the input unit 543a, the region extraction unit 543c, the closed region identification unit 543d, the unit region setting unit 543e, the feature amount calculation unit 543f (the first feature amount calculation unit 544a thereof), and the determination unit 543h; constituent elements other than these characteristic ones may be provided as necessary.
- The region extraction unit 543c may be configured to extract the mucosal microstructure region corresponding to the mucosal microstructure from the biological mucosa image input to the input unit 543a without performing preprocessing.
- The unit region setting unit 543e shown in FIG. 32 includes a width calculation unit 545a serving as a feature amount calculation unit that calculates the width of the band-shaped mucosal microstructure region as a feature amount used for setting the range of the unit region from the mucosal microstructure region or from the closed region, and a range setting unit 545b that sets the range of the unit region based on the width feature amount.
- The discrimination target region setting unit 543g sets a discrimination target region from the plurality of unit regions based on the second feature amount, and the first feature amount calculation unit 544a calculates the first feature amount in the discrimination target region set by the discrimination target region setting unit 543g.
- The discrimination target region setting unit 543g includes a threshold setting unit 546a that sets a threshold used when setting the discrimination target region based on the second feature amount. The discrimination target region may then be set from the unit region of interest and a plurality of surrounding unit regions, based on the threshold set by the threshold setting unit 546a and on the third feature amount. As shown by the dotted line in FIG. 32, the threshold setting unit 546a may be provided with a threshold selection unit 546b that selects the one threshold actually used from a plurality of types of thresholds, such as a distance threshold, an area threshold, and an order threshold, so that the discrimination target region is set using the selected threshold.
- The determination unit 543h includes a determination threshold setting unit 547a that sets a determination threshold used for determining the state of the mucous membrane based on the first feature amount calculated by the first feature amount calculation unit 544a.
- the input unit 543a constituting the calculation unit 541b may be configured by an input end of image data to the calculation unit 541b.
- The region extraction unit 543c, the closed region identification unit 543d, the unit region setting unit 543e, the feature amount calculation unit 543f, the discrimination target region setting unit 543g, the determination unit 543h, the unit-region-of-interest setting unit 543i, and the like are provided in the calculation unit 541b; however, the present embodiment is not limited to such a configuration, and dedicated hardware for performing each process may be used instead.
- Next, the operation will be described. The surgeon turns on the power of each part of the endoscope apparatus 501 and then selects the normal light observation mode with the observation mode switching switch 524. The surgeon then inserts the endoscope 502 into the body cavity while viewing the image displayed on the display device 505 in the normal light observation mode, that is, an image having substantially the same color as when the object is viewed with the naked eye, and thereby brings the distal end portion 521b close to the site where the living tissue 601 to be observed exists.
- When the normal light observation mode is selected by the observation mode switching switch 524, light of each color, R light, G light, and B light, is sequentially emitted from the light source device 503 onto the living tissue 601, and images corresponding to each color are acquired by the endoscope 502. The image data generation unit 541a of the image processing unit 541 then generates image data of the color component corresponding to each image.
- Steps S501 to S505 in FIG. 33A show the main processing for performing the unit region setting process.
- FIG. 34 schematically shows an image obtained by imaging the pyloric glands of the stomach in the narrow-band light observation mode. Imaging in the narrow-band light observation mode has the advantage of displaying structures near the surface layer more clearly than the broadband light observation mode. FIG. 35 shows an image obtained by locally cutting out part of the image shown in FIG. 34, or part of a similar image.
- As shown in FIG. 35, a plurality of band-shaped MCEs 551 overlap each other, and an inter-pit portion 552 exists in the inner region (usually a closed region) surrounded by each MCE 551. A blood vessel 553 runs in the inter-pit portion 552. The boundary of one MCE 551 may be clear while the boundary between adjacent MCEs 551 is unclear; in FIG. 35, for example, the boundary of the MCE 551 indicated by symbol B1 is clear, but the boundary of the MCE 551 indicated by symbol B2 is unclear.
- The image data generated by the image data generation unit 541a is input from the input unit 543a to the preprocessing unit 543b in the calculation unit 541b.
- The image data in this embodiment and the other embodiments consists of three (color component) images, RGB, of size ISX × ISY (horizontal × vertical number of pixels), and each of the R, G, and B images has an 8-bit gradation taking values from 0 to 255. For example, ISX × ISY = 640 × 480.
- In step S502, the preprocessing unit 543b performs preprocessing such as noise suppression and inverse gamma correction on the input image data. In the present embodiment, noise suppression uses known median filtering (sorting the pixel values within a mask containing the pixel of interest and replacing the value of the pixel of interest with their median), with a mask size of 3 × 3. Gamma correction is a nonlinear process applied to give a visually linear gradation when an image is displayed on a monitor or the like, and inverse gamma correction returns the data to the original linear gradation; therefore, when gamma correction has not been applied to the image data input to the preprocessing unit 543b, inverse gamma correction is unnecessary.
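- As a concrete illustration of this preprocessing step, the following Python sketch applies 3 × 3 median filtering per color plane and an inverse gamma correction. It assumes NumPy/SciPy are available; the gamma exponent 2.2 is a hypothetical value, since the specification does not state one.

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess(rgb_image, gamma=2.2, gamma_corrected=True):
    """Noise suppression and inverse gamma correction (cf. step S502).

    rgb_image: uint8 array of shape (ISY, ISX, 3), values 0-255.
    """
    # 3x3 median filtering, applied to each color plane independently.
    denoised = np.stack(
        [median_filter(rgb_image[..., c], size=3) for c in range(3)], axis=-1
    )
    if not gamma_corrected:
        return denoised            # no inverse correction needed
    # Inverse gamma correction: return to a linear gradation.
    linear = 255.0 * (denoised / 255.0) ** gamma
    return np.clip(linear, 0, 255).astype(np.uint8)
```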
- The preprocessed image data is output to the region extraction unit 543c. In step S503, the region extraction unit 543c extracts band-shaped MCE regions MBi (i ≥ 1) as the mucosal microstructure regions corresponding to the mucosal microstructure.
- As the method of extracting the MCE regions MBi, for example, the structural component extraction method based on template matching described in Japanese Patent No. 4409166 is used. FIG. 33B shows the semi-elliptical shape used for template matching. As parameter values, those described in the examples of Japanese Patent No. 4409166 may be used as they are, and the processing may be performed on the G image among the three RGB images.
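- The actual structural component extraction method is defined in Japanese Patent No. 4409166 and is not reproduced here; as a rough, hedged stand-in, the sketch below correlates the G image with a zero-mean semi-elliptic template and thresholds the response to obtain a binary MCE mask. The template size and the response threshold are hypothetical parameters.

```python
import numpy as np
from scipy.ndimage import correlate

def semi_elliptic_template(width=9, height=6.0):
    """Small patch whose cross-section is a semi-ellipse, mimicking the
    bright band-shaped MCE profile (parameter values are illustrative)."""
    x = np.linspace(-1.0, 1.0, width)
    profile = height * np.sqrt(np.clip(1.0 - x ** 2, 0.0, None))
    patch = np.tile(profile, (width, 1))   # extend the profile into a band
    return patch - patch.mean()            # zero mean for matching

def extract_mce_mask(g_image, rel_threshold=0.5):
    """Correlate the G image with the template and threshold the response
    (a simplified stand-in for the extraction of step S503)."""
    img = g_image.astype(np.float64)
    img = (img - img.mean()) / (img.std() + 1e-9)
    tmpl = semi_elliptic_template()
    tmpl /= np.linalg.norm(tmpl) + 1e-9
    response = correlate(img, tmpl, mode="nearest")
    return response >= rel_threshold * response.max()
```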
- Applying such an extraction method to the image data shown in FIG. 34 or FIG. 35 extracts the MCE region MB1 as shown in FIG. 36, where the extracted MCE region MB1 is indicated by hatching. The extracted MCE region MB1 is output to the closed region identification unit 543d.
- In step S504, the closed region identification unit 543d performs processing for detecting or identifying approximately closed regions CBj (j ≥ 1) that are surrounded (or regarded as surrounded) by the MCE regions MBi detected as the mucosal microstructure regions. In the present embodiment the approximately closed region CBj is a fully enclosed closed region, but the processing may also be applied to approximately closed regions merely regarded as enclosed, as described above.
- In step S504, known labeling is performed on the pixel regions other than the regions detected as the MCE regions MBi. As a result, the image is divided into eight labels L11 to L18, as shown in FIG. 37, in which the MCE region MB1 is indicated by hatching.
- Among these labels, each label whose pixels do not include any outermost peripheral pixel of the image data (pixels whose X coordinate is 0 or 639, or whose Y coordinate is 0 or 479) is treated as an approximately closed region CBj. In this example, the labels L11 to L17, excluding the label L18, become the approximately closed regions CB1 to CB7, respectively. In FIG. 37, not all of CB1 to CB7 are explicitly indicated by reference numerals; for example, the closed regions of the labels L12, L13, and L14 are indicated as CB2, CB3, and CB4.
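- A minimal sketch of this labeling step, assuming SciPy: the non-MCE pixels are labeled, and every label that never touches the image border is kept as an approximately closed region CBj (the border-touching label corresponds to L18 above).

```python
import numpy as np
from scipy.ndimage import label

def find_closed_regions(mce_mask):
    """Label non-MCE pixels; labels not touching the image border are the
    approximately closed regions CBj (cf. step S504)."""
    labels, num = label(~mce_mask)                 # 4-connected labeling
    border_labels = np.unique(np.concatenate(
        [labels[0, :], labels[-1, :], labels[:, 0], labels[:, -1]]
    ))
    closed_ids = [k for k in range(1, num + 1) if k not in border_labels]
    return labels, closed_ids
```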
- Information on the approximately closed regions CB1 to CB7 detected or identified by the closed region identification unit 543d is output to the unit region setting unit 543e.
- In the next step S505, the unit region setting unit 543e calculates the width as a feature amount of the MCE region MBi based on the approximately closed region CBj, and performs processing for setting the range of the unit region UBj. Specifically, the width calculation unit 545a of the unit region setting unit 543e first calculates the width Tk (k ≥ 1) as a feature amount of the MCE region MBi adjacent to one approximately closed region CBj. For example, starting from a boundary pixel BQk between the approximately closed region CBj and the adjacent MCE region MBi (on the CBj side), the width calculation unit 545a scans pixels in the direction opposite to the approximately closed region CBj adjacent to the MCE region MBi, and counts the number of pixels belonging to the MCE region MBi, thereby calculating the width Tk at the boundary pixel BQk.
- FIG. 38 shows an enlarged view of the calculation of the width T2 in one direction, with the MCE region MB1 indicated by hatching. Since the approximately closed region CB3 adjacent to the boundary pixel BQ2 is located in the lower-left direction (the direction indicated by A in FIG. 38), the pixels are scanned in the opposite, upper-right direction (toward the approximately closed region CB4 side in FIG. 38). Counting the pixels of the MCE region MB1 along this scan gives 8 pixels, which becomes the value of the width T2.
- FIG. 38 discretely illustrates only the three boundary pixels BQ1, BQ2, and BQ3 in the approximately closed region CB3, but the width Tk is calculated for all boundary pixels BQk. The average value of the widths Tk calculated for all boundary pixels BQk is then set as the region size BSj of the MCE region MBi adjacent to the approximately closed region CBj. Based on this width calculation by the width calculation unit 545a, the range setting unit 545b sets, as the range of the unit region UBj, the region obtained by combining the approximately closed region CBj with the portion of the MCE region MBi located within the region size BSj from the approximately closed region CBj.
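- The following sketch outlines this width measurement and range setting. One simplification is assumed: each boundary pixel is scanned in a coarse outward direction away from the centroid of CBj, whereas the text scans away from the adjacent closed region itself; the dilation-based expansion of CBj by BSj pixels is likewise an illustrative reading of "within the range of the region size".

```python
import numpy as np
from scipy.ndimage import binary_dilation

def region_size(mce_mask, closed_mask):
    """Average MCE width Tk over the boundary pixels BQk of one
    approximately closed region CBj, giving the region size BSj."""
    h, w = mce_mask.shape
    boundary = binary_dilation(closed_mask) & mce_mask   # boundary pixels BQk
    ys, xs = np.nonzero(closed_mask)
    cy, cx = ys.mean(), xs.mean()                        # centroid of CBj
    widths = []
    for y, x in zip(*np.nonzero(boundary)):
        dy, dx = int(np.sign(y - cy)), int(np.sign(x - cx))
        if dy == 0 and dx == 0:
            continue
        t, yy, xx = 0, y, x
        while 0 <= yy < h and 0 <= xx < w and mce_mask[yy, xx]:
            t += 1                                       # count MCE pixels
            yy += dy
            xx += dx
        widths.append(t)
    return float(np.mean(widths)) if widths else 0.0

def unit_region(mce_mask, closed_mask):
    """UBj = CBj combined with the MCE pixels within BSj of CBj."""
    grown = closed_mask.copy()
    for _ in range(int(round(region_size(mce_mask, closed_mask)))):
        grown = binary_dilation(grown)
    return closed_mask | (grown & mce_mask)
```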
- FIG. 39 shows an example of the unit region UB3 set when the approximately closed region CB3 described in FIG. 38 is targeted, using the region size BS3 of the MCE region MB1 adjacent to CB3; the set unit region UB3 is indicated by hatching. The unit region UB3 is the combined region of the approximately closed region CB3 and the band-shaped MCE region MB1 located, so as to surround CB3, within the range of the region size BS3 (determined from the average width as above) from the outer peripheral boundary of CB3.
- Although the above illustrates the processing or method of setting the range of one unit region UBj by calculating, from the approximately closed region CBj, the width Tk as a feature amount of the band-shaped MCE region MBi surrounding it, the processing is actually performed for all approximately closed regions CBj.
- As shown in FIG. 40, the range of one unit region UB3 is allowed to overlap the range of another unit region UB2. In FIG. 40, the unit region UB3 set from the approximately closed region CB3 as described in FIG. 39 is indicated by hatching, and the unit region UB2 set from the approximately closed region CB2 by the same processing is indicated by diagonal lines; the unit regions UB2 and UB3 overlap each other in the cross-hatched area. In this way, part of a unit region is allowed to overlap part of another unit region (specifically, the band-shaped MCE region).
- In other words, when unit regions are set for a plurality of closed regions surrounded by an MCE region, the unit region setting unit 543e sets the unit region independently for each of the plurality of closed regions, and parts of the MCE regions of different unit regions are allowed to overlap each other.
- In this way, a region that constitutes one unit in terms of biological histology can be set as a unit region UBj from the image data. Specifically, one approximately closed region CBj, such as the inter-pit portion 552 containing the blood vessel 553, together with the band-shaped, closed-loop MCE region MBi surrounding it, can be set as the unit region UBj.
- After the unit regions UBj are set in this way, a discrimination target region Ad including unit regions UBj is set by calculating feature amounts of the unit regions UBj, as described below.
- In step S506 following step S505, the unit-region-of-interest setting unit 543i sets, as the unit region of interest UBin, the one unit region closest to the center of the image. FIG. 41A shows an example of the unit region of interest UBin set according to this condition C1 (that is, the one unit region closest to the center of the image); in this example, the unit region UB3 shown in FIG. 40 is set as the unit region of interest UBin.
- In the next step S507, the second feature amount calculation unit 544b calculates the maximum value of the width of the outer shape (contour) of the unit region of interest UBin as a second feature amount related to the region size or shape of the unit region of interest UBin, and outputs the calculated maximum width to the discrimination target region setting unit 543g. The feature amount calculated by the second feature amount calculation unit 544b is not limited to the maximum width of the unit region of interest UBin; the average width may be used, the area may be calculated as described later, or the perimeter of the unit region of interest UBin may be calculated instead of the area.
- In the next step S508, the discrimination target region setting unit 543g sets a circular region Ac whose diameter is, for example, three times the maximum value Tmax of the calculated width, this multiple serving as the distance threshold set by the threshold setting unit 546a, and sets the inside of the circular region Ac as the discrimination target region Ad. That is, the threshold setting unit 546a sets a distance threshold from the maximum width as the second feature amount. As shown in FIG. 41A, the circular region Ac with a radius of 1.5 times the maximum width Tmax is set around the unit region of interest UBin (its centroid), and the unit regions included inside the circular region Ac are set as the unit region group of the discrimination target region Ad.
- FIG. 41A also shows the unit regions included in the discrimination target region Ad other than the unit region of interest.
- The present invention is not limited to setting the entire inside of the circular region Ac as the discrimination target region Ad. For example, every unit region at least partially included in the circular region Ac may be set as a unit region of the discrimination target region Ad, every unit region at least half of whose area is included in the circular region Ac may be so set, or only the unit regions almost entirely included in the circular region Ac may be set as unit regions of the discrimination target region Ad.
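- As a sketch of this step, the selection below keeps every unit region whose centroid lies inside the circle of radius 1.5 × Tmax centered on the unit region of interest. Centroid membership is only one possible inclusion rule in the spirit of those the text allows (partial inclusion, half inclusion, or almost entire inclusion) and is chosen here for brevity.

```python
import numpy as np

def centroid(mask):
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

def discrimination_target_group(unit_masks, uin_index, t_max):
    """Unit-region group of the discrimination target region Ad
    (cf. step S508): regions whose centroid falls in the circle Ac."""
    cy, cx = centroid(unit_masks[uin_index])
    radius = 1.5 * t_max                 # radius = half of the 3 * Tmax diameter
    group = []
    for j, m in enumerate(unit_masks):
        y, x = centroid(m)
        if (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2:
            group.append(j)
    return group
```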
- After setting (determining) the discrimination target region Ad, the discrimination target region setting unit 543g outputs the setting (determination) information to the first feature amount calculation unit 544a of the feature amount calculation unit 543f.
- As shown in step S509 of FIG. 33A, the first feature amount calculation unit 544a calculates, as the first feature amount of the discrimination target region Ad, for example the circularity of each unit region included in the discrimination target region Ad, and then calculates their average circularity.
- FIG. 42 shows the details of the circularity calculation as the first feature amount calculation of step S509 in FIG. 33A and of the mucous membrane state determination of step S510 based on its result. In the following example, the average circularity is calculated as the statistic of the first feature amount and the mucosal state is determined based on it, but the determination is not limited to this.
- In the first step S521, the first feature amount calculation unit 544a (a circularity calculation unit included in it) numbers (labels) the unit regions included in the discrimination target region Ad with a parameter J, starting from the unit region of interest UBin, and sets the total number N of unit regions included in the discrimination target region Ad. In the next step S522, the circularity calculation unit calculates the circularity C(J) of the unit region with parameter J. The circularity calculation unit then determines whether the parameter J is equal to the total number N; if not, J is incremented by 1 in step S525 and the processing returns to step S522. If J is equal to N, the circularity calculation unit calculates the average circularity Cav as the statistic of the first feature amount in step S526: Cav is obtained by summing the circularities C(J) for J from 1 to N and dividing by N. This completes the processing of step S509 in FIG. 33A, and the process proceeds to step S510.
- In step S527, the determination unit 543h compares the average circularity Cav with the determination threshold set by the determination threshold setting unit 547a, and determines the mucosal state of the discrimination target region Ad according to whether Cav is greater than or equal to the determination threshold. When the average circularity is used as the first feature amount, the determination threshold setting unit 547a sets the determination threshold to, for example, about 0.6. When the average circularity Cav is equal to or greater than the determination threshold, the determination unit 543h determines that the mucous membrane has a regular structure (is in a regular state); when it is less than the threshold, the determination unit 543h determines in step S529 that the mucous membrane has an irregular structure (is in an irregular state). If the mucous membrane is determined to have an irregular structure, it is highly likely that the structure of the normal mucous membrane has been destroyed, resulting in an abnormal shape.
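- A compact sketch of steps S521 to S529: the circularity of each unit region in Ad is computed, averaged, and compared with the determination threshold (about 0.6). The discrete circularity formula C = 4πA/P², with the perimeter approximated by the count of boundary pixels, is an assumption, since the text does not fix a formula.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def circularity(mask):
    """C = 4*pi*A / P^2; 1.0 for an ideal disc.  A = pixel count,
    P = boundary-pixel count (a common discrete approximation)."""
    area = mask.sum()
    perimeter = (mask & ~binary_erosion(mask)).sum()
    return 4.0 * np.pi * area / (perimeter ** 2 + 1e-9)

def judge_mucosa(unit_masks_in_ad, threshold=0.6):
    """Average circularity Cav over Ad vs. the determination threshold."""
    cav = float(np.mean([circularity(m) for m in unit_masks_in_ad]))
    return ("regular" if cav >= threshold else "irregular"), cav
```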
- In step S530, the determination unit 543h outputs the determination result to the video signal generation unit 541c outside the calculation unit 541b, the determination result is displayed on the display device 505, and the processing of FIG. 42 (that is, of FIG. 33A) ends.
- From the mucosal state determined by the determination unit 543h, the surgeon can obtain information on whether the mucous membrane of the discrimination target region is highly likely to be a lesion, and by referring to the determination result can perform diagnosis efficiently. In this way, a discrimination target region can be set with the unit region as its unit, and the state of the biological mucous membrane of the discrimination target region can be discriminated.
- In the above description, the discrimination target region setting unit 543g sets, as the discrimination target region Ad, the circular region Ac whose radius is a predetermined multiple (1.5 times in the specific example) of the maximum width (contour width) of the unit region of interest UBin calculated as its second feature amount. However, the present invention is not limited to setting the circular region Ac as the discrimination target region Ad; a rectangular region Ar may be set as the discrimination target region Ad, as shown by the dotted line in FIG. 41B. The rectangular region Ar shown in FIG. 41B has a side length of three times the maximum width (contour width) of the unit region of interest UBin, but it is not limited to this size.
- In this case, the unit region setting unit 543e includes a circular/rectangular region setting unit 545c (indicated by a dotted line in FIG. 32) that sets a circular region Ac or a rectangular region Ar centered on the unit region of interest UBin based on the second feature amount, and the unit region group included in the circular region Ac or rectangular region Ar set by the circular/rectangular region setting unit 545c may be set as the discrimination target region Ad.
- The method of setting the discrimination target region Ad described above can be applied not only to an image in which the unit regions are connected as shown in FIG. 41B, but also to an image in which the unit region group is separated into, for example, an upper region A1 and a lower region A2 as shown in FIG. 43A.
- In this case too, the unit region closest to the center of the image is set as the unit region of interest UBin as shown in FIG. 43B, a rectangular region Ar is set, and this rectangular region Ar can be set as the discrimination target region Ad′ as shown in FIG. 43D. The mucosal state of the discrimination target region Ad′ can then be determined based on the first feature amount described above, calculated from the unit regions included in the discrimination target region Ad′ shown in FIG. 43D.
- In the image of FIG. 41B, each unit region is connected or adjacent to its neighboring unit regions through the MCE as a common mucosal microstructure, whereas in the image of FIG. 43A the band-shaped MCE as the mucosal microstructure is separated into the two regions A1 and A2, which are neither connected nor adjacent. In such a case, the discrimination target region may be set based on the order of adjacency to the unit region of interest UBin, as a third feature amount that places more weight on the relationship with the unit region of interest UBin.
- Here, the order may be defined as follows. The unit region of interest UBin is set as the 0th-order unit region; among the plurality of unit regions existing around the unit region of interest UBin, a unit region adjacent to the 0th-order unit region is defined as a 1st-order unit region; a unit region adjacent to a 1st-order unit region but not adjacent to the 0th-order unit region is defined as a 2nd-order unit region; a unit region adjacent to a 2nd-order unit region but not adjacent to a 1st-order unit region is defined as a 3rd-order unit region; and in general, a unit region adjacent to an Nth-order unit region but not adjacent to an (N−1)th-order unit region is defined as an (N+1)th-order unit region. Specific examples will be described later, and a sketch of this ordering is shown below.
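- This definition is exactly a breadth-first traversal of the adjacency graph of unit regions; the sketch below computes the order of every reachable unit region, assuming a hypothetical dictionary representation of adjacency (regions sharing a common MCE).

```python
from collections import deque

def adjacency_orders(adjacency, uin):
    """Order of each unit region relative to the unit region of interest:
    UBin is 0th order, its neighbours 1st order, and so on (BFS)."""
    order = {uin: 0}
    queue = deque([uin])
    while queue:
        u = queue.popleft()
        for v in adjacency.get(u, ()):
            if v not in order:          # first visit fixes the order
                order[v] = order[u] + 1
                queue.append(v)
    return order

# Example: region 3 touches only region 2, which touches UBin (= 0):
# adjacency_orders({0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}, 0)
# -> {0: 0, 1: 1, 2: 1, 3: 2}
```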
- The third feature amount calculation unit 544c calculates these orders as the third feature amount: the unit region of interest UBin as the 0th-order unit region, the 1st-order unit regions adjacent to it, the 2nd-order unit regions adjacent to those, and so on. The calculation unit 541b includes a recognition unit 543j that recognizes, among the unit region of interest UBin and the plurality of unit regions existing around it, the unit regions having a common mucosal microstructure as adjacent unit regions (see FIG. 32). The third feature amount calculation unit 544c thus calculates the order of adjacency to the unit region of interest UBin for the plurality of surrounding unit regions; in other words, it includes the function of an order calculation unit that calculates the order as the third feature amount.
- In the above, the unit-region-of-interest setting unit 543i sets the unit region of interest UBin using condition C1 (that is, the one unit region closest to the center of the image). However, the unit region of interest UBin is not limited to condition C1 and may be set, based on information on how the unit regions are connected, according to the following conditions C2 to C5:
- C2: a unit region from which the order of the connected unit regions (U-1, U-2, etc. described later) can be taken to the highest order;
- C3: the unit region closest to the center (or centroid) of the entire connected unit region group;
- C4: a unit region set randomly in the image;
- C5: a manually set unit region.
- The unit region of interest UBin is not limited to one; a plurality of unit regions of interest UBin may be set. Furthermore, the discrimination target region Ad may be set and the mucosal state determined for all unit regions. The processing order in that case may start from the unit region meeting the above conditions (C1 to C5), may be random, or may proceed in order from the upper right of the image.
- The case where the first feature amount calculation unit 544a calculates the average circularity as the first feature amount has been described, but the first feature amount is not limited to the circularity; the area, the color tone (luminance value or pixel value), the perimeter, and the like may be calculated from the unit regions included in the discrimination target region Ad. As the statistic of the first feature amount, the first feature amount calculation unit 544a may also calculate a statistic of dispersion (such as the coefficient of variation) in addition to the average.
- The second feature amount calculation unit 544b may, in step S507, calculate the area of the unit region of interest UBin as the second feature amount instead of the width described above, and the discrimination target region setting unit 543g may set the discrimination target region using an order determined from the comparison of the calculated area with an area threshold. In other words, the discrimination target region setting unit 543g sets the range of the discrimination target region Ad according to whether the area of the unit region of interest UBin calculated as the second feature amount is greater than or equal to the threshold (more specifically, the area threshold). In this manner, the discrimination target region setting unit 543g sets the discrimination target region Ad, in units of unit regions, as a unit region group including the unit region of interest UBin based on the calculated second feature amount.
- In other words, the discrimination target region setting unit 543g may set (determine) the discrimination target region Ad based on a third feature amount representing the structural relationship between the unit region of interest UBin and the surrounding unit regions UBj.
- FIG. 44 shows the process in which the discrimination target region setting unit 543g sets (determines) the discrimination target region Ad based on the area (the second feature amount) and on the order as the third feature amount. First, the area calculated by the second feature amount calculation unit 544b is input to the discrimination target region setting unit 543g. In step S512, the discrimination target region setting unit 543g determines whether the calculated area is greater than or equal to the area threshold set by the threshold setting unit 546a; the threshold setting unit 546a sets the area threshold to, for example, about 5000, more specifically 5000 pixels.
- When the area of the unit region of interest UBin is equal to or greater than the area threshold, the discrimination target region setting unit 543g sets, as shown in step S513, the unit region of interest UBin and the unit regions UBj in direct contact with it as the discrimination target region Ad. That is, with the unit region of interest UBin as the 0th-order unit region, the region consisting of the 0th-order unit region and the 1st-order unit region (group) U-1 in direct contact with it is set as the discrimination target region Ad.
- On the other hand, when the area of the unit region of interest UBin is less than the area threshold, the discrimination target region setting unit 543g sets, as shown in step S514, the unit region of interest UBin, the 1st-order unit region (group) U-1 in direct contact with it, and the 2nd-order unit region (group) U-2 in direct contact with the 1st-order unit region (group) U-1 as the discrimination target region Ad. That is, the region consisting of the 0th-order, 1st-order, and 2nd-order unit regions is set as the discrimination target region Ad, which in this case is as shown in FIG. 45B.
- The setting process of the discrimination target region Ad shown in FIG. 44, that is, the process of step S508 in FIG. 33A, then ends. When the area is less than the threshold, the range of the unit regions included in the discrimination target region Ad is expanded compared with the case where the area is equal to or greater than the threshold, so that the mucosal state can be determined over a sufficiently wide region. After setting (determining) the discrimination target region Ad, the discrimination target region setting unit 543g outputs the setting (determination) information to the first feature amount calculation unit 544a of the feature amount calculation unit 543f, and the processing from step S509 onward in FIG. 33A is performed as described above.
- The discrimination target region Ad is not limited to being set to the 1st-order unit region (group) U-1 or the 2nd-order unit region (group) U-2 according to the area of the unit region of interest UBin as shown in FIG. 44; the 3rd-order unit region (group) U-3 may also be used. For example, when the area of the unit region of interest UBin is less than the threshold (specifically, 5000) in the determination of step S512 in FIG. 44, it may further be determined, in step S515, whether the area is greater than or equal to a second threshold smaller than the first (for example, 3000). If it is, the discrimination target region Ad includes up to the 2nd-order unit region (group) U-2 as in step S514 of FIG. 44; if it is not, up to the 3rd-order unit region (group) U-3 may be included in the discrimination target region Ad. In this way, two levels of thresholds may be set and the order of the unit regions (groups) included in the discrimination target region Ad chosen according to the comparison results, or three or more levels of thresholds may be used in the same manner.
- Alternatively, the order of the unit regions to be included may be calculated from the area of the unit region of interest UBin, for example as
connection order N = 10000 / (area of the unit region of interest UBin)   (2)
where the decimal part of the calculation result of expression (2) is rounded down, and N = 1 when the result is less than 1 (that is, when the area of the unit region of interest UBin exceeds 10000).
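- Expression (2) and its rounding rule translate directly into the following small sketch.

```python
def connection_order(area_uin):
    """Expression (2): N = 10000 / area of UBin, decimals rounded down,
    clamped to N = 1 when the area exceeds 10000."""
    return max(int(10000 // max(area_uin, 1)), 1)

# connection_order(4000) -> 2;  connection_order(12000) -> 1
```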
- Next, discrimination target region setting means and a discrimination target region setting method using the order, which are effective for setting the range of the discrimination target region in an image such as that of FIG. 43A, will be described with reference to FIG. 47. In step S542, the process of setting the unit region of interest is performed.
- In this case, the recognition unit 543j recognizes unit regions having a common mucosal microstructure as mutually adjacent unit regions (or unit region groups), and the third feature amount calculation unit 544c calculates, as the third feature amount, the order of each unit region adjacent to the unit region of interest UBin based on the recognition result of the recognition unit 543j.
- The unit region of interest UBin is set, for example, as the unit region closest to the center of the image (condition C1); it may instead be set according to, for example, condition C2 or C3 described above.
- In the next step, the second feature amount calculation unit 544b calculates second feature amounts such as the area and width of the unit region of interest UBin. Based on the calculation result of the second feature amount, as shown in step S544, the discrimination target region setting unit 543g identifies the unit regions whose order is equal to or less than the order threshold and sets them as the discrimination target region Ad. For example, when the discrimination target region Ad is set by the processing of FIG. 44 and the area of the unit region of interest UBin is equal to or smaller than the threshold, the unit regions of order 2 or less are set as the discrimination target region Ad.
- FIG. 48 shows an example of the discrimination target region Ad set when, for the image of FIG. 43A, the area of the unit region of interest UBin is equal to or smaller than the threshold. With the unit region of interest UBin as the 0th-order unit region, the discrimination target region Ad is formed by the 1st-order unit region (group) U-1, consisting of four unit regions in contact with the outside of UBin via a common MCE 551, and the 2nd-order unit region (group) U-2, consisting of seven unit regions in contact with the outside of the 1st-order unit region (group) U-1 via a common MCE.
- FIG. 49 is an explanatory diagram showing that the unit regions around the unit region of interest UBin are in contact via a common MCE 551. As shown in FIG. 49, the recognition unit 543j recognizes, among the plurality of unit regions around the unit region of interest UBin, the unit regions having the MCE 551 as a common mucosal microstructure as adjacent unit regions.
- In step S545 following step S544, the determination unit 543h determines the mucosal state of the discrimination target region Ad using the calculation result of the first feature amount by the first feature amount calculation unit 544a. In the next step S546, the determination unit 543h outputs the determination result to the video signal generation unit 541c, the display device 505 displays the determination result, and the processing of FIG. 47 ends.
- When the discrimination target region Ad is set using the order as shown in FIG. 47, the discrimination target region Ad can be set so as to include the unit regions highly related to the unit region of interest UBin, not only for an image in which the unit regions are connected as in FIG. 41B but also for an image in which the unit regions are separated into the regions A1 and A2 as in FIG. 43A, so that the mucosal state can be discriminated with high accuracy.
- For example, when the calculated statistic (such as the coefficient of variation) is equal to or greater than a threshold, the mucosal state may be determined to be an irregular structure, and when it is less than the threshold, a regular structure.
- Alternatively, the circularity, area, or perimeter of the unit region of interest UBin may be compared with the corresponding average value within the discrimination target region Ad; if the ratio of the former to the latter is equal to or greater than a threshold value such as 1.3, the mucosal state may be determined to be an irregular structure, and if it is less than the threshold, a regular structure.
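- A sketch of this ratio-based variant, assuming the feature values of the unit regions in Ad are already available as a list:

```python
import numpy as np

def is_irregular_by_ratio(value_uin, values_in_ad, ratio_threshold=1.3):
    """Compare a feature of UBin (circularity, area, or perimeter) with
    the average of the same feature over Ad; a ratio of 1.3 or more is
    taken to indicate an irregular structure."""
    ratio = value_uin / (float(np.mean(values_in_ad)) + 1e-9)
    return ratio >= ratio_threshold
```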
- In the above, the various feature amounts are calculated over the entire unit regions included in the discrimination target region Ad; however, the various feature amounts may instead be calculated from the blood vessels, the MCE, and the inter-pit portions included in each unit region, and the mucosal state may be determined using threshold values corresponding to the blood vessel, the MCE, and the inter-pit portion.
- The feature amount calculation unit such as the width calculation unit 545a may calculate the feature amount used for setting the range of the unit region from the MCE region MBi as the mucosal microstructure region, or from the approximately closed region CBj as the closed region.
- The content of the preprocessing in step S502 of FIG. 33A is not limited to noise suppression and inverse gamma correction; correction of brightness and color tone may be added. Each preprocessing method is also not limited to the described one and may be changed according to the characteristics of the target image and device; for example, although median filtering is used for noise suppression in the present embodiment, other methods such as a smoothing filter may be used.
- The method of detecting or extracting the MCE regions MBi in step S503 is not limited to the structural component extraction method described above; other methods may be used, such as threshold processing on pixel values (e.g., luminance value or color tone), methods using various frequency filters such as a Gabor filter, or line segment detection processing using the Hessian matrix or the vector concentration degree, which are known techniques.
- The method of detecting or identifying the approximately closed regions CBj in step S504 is not limited to the labeling-based method described above; other methods may be used. For example, the approximately closed regions CBj may be detected by scanning from the boundary pixels BQk of the detected MCE regions MBi to search for closed regions.
- In the present embodiment, the width Tk is calculated by counting the number of pixels from the boundary pixel BQk, but the present invention is not limited to this method; other methods may be used, such as taking the size of the template or filter used for extracting the MCE regions MBi in step S503 as the width Tk.
- Although the width Tk is calculated for all boundary pixels BQk in the present embodiment, the width Tk may instead be calculated from pixels sampled, for example, every few pixels, without using all the boundary pixels BQk.
- The calculation of the region size BSj is not limited to the average value of the widths Tk; other statistics such as the mode, minimum, or maximum of the widths Tk may be used. The calculation of the region size BSj is also not limited to methods using the width Tk; it may be calculated from the size or shape of the approximately closed region CBj, or the user may set an arbitrary value. Further, the region size BSj defining the range of the unit region may be determined from the area of the approximately closed region CBj.
- In the present embodiment, the range of the unit region UBj is set by the distance from the boundary pixels of the approximately closed region CBj, but the setting is not limited to this method, and the range of the unit region UBj may be set based on other references.
- FIG. 40 shows an example in which the unit regions UBj are allowed to overlap, but it may instead be determined to which unit region UBj an overlapping portion belongs, thereby dividing the overlapping area.
- Each pixel set as a unit region UBj may be held as image data, and a video signal may be generated by the video signal generation unit 541c so that the image is displayed on the display device 505 or the like.
- According to the present embodiment operating in this way, a region that constitutes one unit in terms of biological histology can be set so as to be separable or distinguishable as a unit region. Then, with the unit region as the unit, a discrimination target region composed of a unit region group of appropriate size can be set around the unit region of interest UBin, and the mucosal state of the discrimination target region can be discriminated.
- In the third embodiment, the unit region of interest UBin alone may also be set as the discrimination target region Ad. In this case, the first feature amount is calculated for the unit region of interest UBin set by itself as the discrimination target region Ad, and the mucosal state of the unit region of interest UBin may be determined based on the calculated first feature amount. For example, the unit region of interest UBin set in FIG. 41A, or the unit region of interest UBin shown in FIG. 43B, may be set as the discrimination target region Ad.
- FIG. 50 shows the processing in this case. The processing shown in FIG. 50 is the same as that of FIG. 33A from step S501 to S506; after the processing of step S506, the processing of steps S509 and S510 is performed without performing steps S507 and S508 of FIG. 33A.
- Whereas in FIG. 33A the first feature amount is calculated for the discrimination target region Ad including a plurality of unit regions containing the unit region of interest UBin, in step S509 of FIG. 50 the first feature amount calculation unit 544a calculates the first feature amount for the unit region of interest UBin alone; for example, it calculates the circularity as the first feature amount of the unit region of interest UBin. In the next step S510, the determination unit 543h determines that the mucous membrane of the unit region of interest UBin has a regular structure when the calculated circularity is equal to or greater than a threshold of, for example, 0.6, and that it has an irregular structure otherwise.
- In the above description, the "state" of the mucous membrane is described in terms of the category of whether the structure is regular or irregular, but the determination is not limited to such a category; for example, a "pathological condition" may be discriminated, such as whether the mucosa is cancerous or non-cancerous, by comparing the calculated statistic of the first feature amount with a threshold value.
- Although the case of the stomach has been described, the present invention can also be applied to other medical images obtained by imaging a biological mucous membrane, such as that of the large intestine or esophagus.
- The determination unit 543h may include the first feature amount calculation unit 544a that calculates the first feature amount. The determination unit 543h may also include a statistic calculation unit 547b that calculates the statistic of the first feature amount, as indicated by a dotted line in FIG. 32, and may determine the mucosal state based on that statistic.
- embodiments configured by partially combining the above-described embodiments and the like also belong to the present invention.
Description
そして、ユーザは、モニタ等の表示手段に表示された体腔内の像の画像に基づき、例えば、体腔内における臓器等の観察を行う。また、内視鏡装置は、消化管粘膜の像を直接的に撮像することが可能である。そのため、ユーザは、例えば、粘膜の色調、病変の形状及び粘膜表面の微細な構造(粘膜微細構造)等の様々な所見を総合的に観察することができる。
しかし、これらの診断学の理解・実践には十分な経験等を要するため、医師によって判断が異なる場合や、経験の浅い医師が診断学を使いこなすことが難しい、などの課題があった。そこで、医療用画像に対する画像処理によって、定量的な判断尺度の提供・診断の際に着目すべき微細構造の特定・画像解析による病状の推定結果などの支援情報を提供するコンピュータ診断支援(Computer Aided Diagnosis:CAD)の研究・開発が行われている。
本発明の第1の実施形態は、医療用画像である内視鏡画像として撮像された粘膜微細構造から単位領域として分割される領域を設定する処理装置及び処理方法について説明する。図1から図13は、本発明の第1の実施形態に係るものである。
In the present embodiment, the image data are assumed to contain the mucosal microstructure of a living body, and unit region setting processing is performed as processing that sets, as a unit region, a range of the mucosal microstructure constituting one histological unit. The details of this unit region setting processing are described later.
That is, according to the configuration of the endoscope apparatus 1 described above, when the normal light observation mode is selected, an image having substantially the same color tone as when the observation target such as the living tissue 101 is viewed with the naked eye (normal light image) can be displayed on the display device 5 and further stored in the external storage device 6. Also, according to the configuration of the endoscope apparatus 1 described above, when the narrow-band light observation mode is selected, an image in which blood vessels near the surface layer of the living tissue 101 are emphasized (narrow-band light image) can be displayed on the display device 5 and further stored in the external storage device 6.
Hereinafter, in the present embodiment, the unit region setting processing is described for the example of image data obtained by imaging, in the narrow-band light observation mode, the pyloric glands of the stomach, which form an epithelial pattern (epithelial structure) on the living tissue 101, with the mucosal microstructure contained in this image data being the marginal crypt epithelium (MCE).
In the above description, the unit region UAj was described as including the MCE region MAj, but the unit region UAj may also be defined so as not to include the MCE region MAj. Under this definition, the unit region UAj is equal to the approximate closed region CAj.
The pixels set as the unit region UAj may be held as image data, and a video signal may be generated by the video signal generation unit 41c so that an image is displayed on the display device 5 or the like.
Next, a second embodiment of the present invention is described with reference to FIGS. 19A to 27. In an endoscopic image of the mucosal microstructure, as in the schematic diagram of FIG. 20, the MCE 51 between adjacent inter-crypt portions 52 may become unclear, as indicated by the dotted lines, due to, for example, destruction of tissue by cancer or the imaging conditions.
In the next step S24, the approximate closed region candidate pixel detection/labeling unit 45b performs known labeling on the approximate closed region candidate pixels CPz. As a result, the pixels are divided into, for example, six labels (L1' to L6') as shown in FIG. 26.
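As a sketch of this step, the "known labeling" could be ordinary connected-component labeling, for example with SciPy (the variable names and the default 4-connectivity are our assumptions; the patent does not specify the labeling algorithm):

```python
import numpy as np
from scipy import ndimage

# candidate_mask: boolean image marking the approximate closed region
# candidate pixels CPz (placeholder input for the sketch).
candidate_mask = np.zeros((256, 256), dtype=bool)

# Connected-component labeling: each connected blob of candidate pixels
# receives one integer label (e.g. six labels L1'..L6' as in FIG. 26).
labels, num_labels = ndimage.label(candidate_mask)
```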
The third embodiment of the present invention describes a medical image processing apparatus that sets unit regions from the mucosal microstructure captured in an endoscopic image, which is a medical image, and discriminates, from the set unit regions, the mucosal state of a region containing one or more of those unit regions. FIGS. 28 to 43D relate to the third embodiment of the present invention.
As shown in FIG. 33A, in the next step S507 the second feature amount calculation unit 544b calculates, as a second feature amount relating to the region size or the region shape of the unit region of interest UBin, the maximum width of the outer shape (contour) of the unit region of interest UBin. The second feature amount calculation unit 544b outputs the calculated maximum width to the discrimination target region setting unit 543g. The feature amount calculated by the second feature amount calculation unit 544b is not limited to the maximum width of the unit region of interest UBin; the average width may be used, or the area may be calculated as described later. Instead of the area, the perimeter of the unit region of interest UBin may also be calculated.
As shown in step S509 of FIG. 33A, the first feature amount calculation unit 544a calculates, as the first feature amount of the discrimination target region Ad, for example, the circularity of each unit region included in the discrimination target region Ad, and then calculates their average circularity.
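Continuing the earlier circularity sketch, the first feature amount of Ad could then be the mean per-region circularity (the list-of-masks representation is our assumption):

```python
import numpy as np

def mean_circularity(unit_region_masks: list[np.ndarray]) -> float:
    """First feature amount of the discrimination target region Ad:
    average circularity over its unit regions (circularity() as sketched
    above)."""
    return float(np.mean([circularity(m) for m in unit_region_masks]))
```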
- C2: the unit region from which the order of connected unit regions (U-1, U-2, and so on, described later in the specification) can be traced to the highest order;
- C3: the unit region closest to the center (or centroid) of the entire group of connected unit regions (see the sketch after this list);
- C4: a unit region set at random in the image;
- C5: a manually set unit region. The unit region of interest UBin may be set by any of these criteria.
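A minimal sketch of criterion C3, assuming per-region centroids (e.g. obtained during labeling) are already available; the dictionary representation is our assumption:

```python
import numpy as np

def pick_unit_region_of_interest(centroids: dict[int, tuple[float, float]]) -> int:
    """Criterion C3: return the label of the unit region whose centroid is
    closest to the centre of gravity of the whole connected group."""
    labels = list(centroids)
    points = np.array([centroids[k] for k in labels])
    group_center = points.mean(axis=0)
    distances = np.linalg.norm(points - group_center, axis=1)
    return labels[int(np.argmin(distances))]
```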
Connection degree P = 10000 / (area of the unit region of interest UBin)   (2)
Here, the fractional part of the calculation result of equation (2) is truncated. When the calculation result is less than 1 (that is, when the area of the unit region of interest UBin exceeds 10000), N = 1 is used.
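The truncation and clamping rule can be stated compactly (a sketch; the clamp to 1 follows the text above):

```python
def connection_degree(area_of_ubin: float) -> int:
    """Equation (2): P = 10000 / area, with the fractional part truncated
    and results below 1 (area over 10000) clamped to 1."""
    return max(int(10000 // area_of_ubin), 1)
```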
Furthermore, the calculation of the region size BSj is not limited to the average value of the widths Tk; other statistics of the widths Tk, such as the mode, the minimum, or the maximum, may be used.
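A sketch of these alternative statistics for the region size BSj (the dispatch-by-name interface is our assumption; SciPy's mode is used for the most frequent value):

```python
import numpy as np
from scipy import stats

def region_size(widths: np.ndarray, statistic: str = "mean") -> float:
    """Region size BSj from the widths Tk: mean by default, or the mode,
    minimum, or maximum as mentioned in the text."""
    if statistic == "mode":
        return float(stats.mode(widths, keepdims=False).mode)
    return float({"mean": np.mean, "min": np.min, "max": np.max}[statistic](widths))
```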
Claims (20)
- A medical image processing apparatus comprising: an input unit to which a biological mucosa image obtained by imaging a biological mucosa is input; a region extraction unit that extracts, from the biological mucosa image input to the input unit, a mucosal microstructure region corresponding to a mucosal microstructure; a closed region identification unit that identifies at least one closed region regarded as being surrounded by the mucosal microstructure region; and a unit region setting unit that sets a unit region based on the mucosal microstructure region extracted by the region extraction unit and the closed region identified by the closed region identification unit.
- The medical image processing apparatus according to claim 1, wherein the region extraction unit extracts a band-shaped structural region in the biological mucosa image as the mucosal microstructure region.
- The medical image processing apparatus according to claim 1 or 2, wherein, when the unit region setting unit sets the unit region for a plurality of closed regions, it sets the unit region independently for each of the plurality of closed regions.
- The medical image processing apparatus according to any one of claims 1 to 3, further comprising: a feature amount calculation unit that calculates, from the mucosal microstructure region or from the closed region, a feature amount used for setting the range of the unit region; and a range setting unit that sets the range of the unit region based on the feature amount.
- The medical image processing apparatus according to any one of claims 1 to 4, wherein the mucosal microstructure is a blood vessel on the biological mucosa, an epithelial pattern based on a glandular structure, or a pit pattern.
- The medical image processing apparatus according to any one of claims 1 to 5, further comprising a virtual mucosal microstructure setting unit that sets a virtual mucosal microstructure region for the biological mucosa image, wherein the closed region identification unit identifies, as the closed region, a region surrounded by the mucosal microstructure region extracted by the region extraction unit and the virtual mucosal microstructure region set by the virtual mucosal microstructure setting unit.
- The medical image processing apparatus according to claim 6, wherein the virtual mucosal microstructure setting unit sets the virtual mucosal microstructure region by connecting unclosed end points of the mucosal microstructure region extracted by the region extraction unit to one another.
- The medical image processing apparatus according to claim 6, wherein the virtual mucosal microstructure setting unit sets the virtual mucosal microstructure region by a virtual line extending from an unclosed end point of the mucosal microstructure region extracted by the region extraction unit in the extension direction of the mucosal microstructure region.
- The medical image processing apparatus according to claim 4, wherein the feature amount calculation unit comprises a width calculation unit that calculates the width of the mucosal microstructure region surrounding the closed region.
- The medical image processing apparatus according to claim 5, wherein, when the biological mucosa has an epithelial pattern, the unit region setting unit sets an inter-crypt portion region as one closed region together with a marginal crypt epithelium region that is adjacent to the outside of the periphery of the inter-crypt portion region and constitutes the closed, band-shaped mucosal microstructure region surrounding the inter-crypt portion.
- The medical image processing apparatus according to claim 1, further comprising: a first feature amount calculation unit that calculates a first feature amount from a discrimination target region set based on the unit region set by the unit region setting unit; and a discrimination unit that discriminates, based on the first feature amount calculated by the first feature amount calculation unit, the state of the mucosa having the discrimination target region.
- The medical image processing apparatus according to claim 11, further comprising, for the case where a plurality of unit regions are set by the unit region setting unit: a unit-region-of-interest setting unit that sets, from the plurality of unit regions set by the unit region setting unit, a unit region of interest as the one unit region to be focused on; a second feature amount calculation unit that calculates a second feature amount from the unit region of interest; and a discrimination target region setting unit that sets, based on the second feature amount and from the plurality of unit regions, the discrimination target region in which the first feature amount is to be calculated, wherein the first feature amount calculation unit calculates the first feature amount in the discrimination target region set by the discrimination target region setting unit.
- The medical image processing apparatus according to claim 12, further comprising a third feature amount calculation unit that calculates, for the plurality of unit regions, a third feature amount representing their relationship with the unit region of interest, wherein the discrimination target region setting unit has a threshold setting unit that sets a threshold used when setting the discrimination target region based on the second feature amount, and sets the discrimination target region from the plurality of unit regions based on the threshold and the third feature amount.
- The medical image processing apparatus according to claim 13, wherein the third feature amount calculation unit calculates, as the third feature amount, the order of adjacency of the plurality of unit regions to the unit region of interest, and the discrimination target region setting unit sets, as the discrimination target region, those of the plurality of unit regions having an order equal to or less than the threshold.
- The medical image processing apparatus according to claim 13, wherein the third feature amount calculation unit calculates, as the third feature amount, the distance of the plurality of unit regions from the unit region of interest, and the discrimination target region setting unit sets, as the discrimination target region, those of the plurality of unit regions whose distance from the unit region of interest is equal to or less than the threshold.
- The medical image processing apparatus according to claim 12, wherein the unit region setting unit has a rectangular region setting unit that sets, based on the second feature amount, the size of a rectangular region centered on the unit region of interest, and sets, as the discrimination target region, those of the plurality of unit regions included in the rectangular region set by the rectangular region setting unit.
- The medical image processing apparatus according to claim 14, wherein the third feature amount calculation unit calculates, as the third feature amount, the orders of the plurality of unit regions obtained by setting the unit region of interest as a zeroth-order unit region, recognizing a unit region adjacent to the zeroth-order unit region among the plurality of unit regions as a first-order unit region, and, with N being a natural number, setting a unit region that is adjacent to an Nth-order unit region and not adjacent to an (N-1)th-order unit region as an (N+1)th-order unit region.
- The medical image processing apparatus according to claim 14, further comprising a recognition unit that recognizes, among the plurality of unit regions, unit regions having a common mucosal microstructure as mutually adjacent unit regions, wherein the third feature amount calculation unit calculates the order of adjacency to the unit region of interest for the plurality of unit regions based on the recognition result of the recognition unit.
- The medical image processing apparatus according to any one of claims 11 to 18, wherein the region extraction unit extracts a band-shaped structure in the biological mucosa image as the mucosal microstructure region.
- The medical image processing apparatus according to any one of claims 11 to 19, wherein the discrimination unit comprises a statistic calculation unit that calculates a statistic of the first feature amount, and the discrimination unit discriminates the mucosal state based on the statistic.
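The order definition in claim 17 amounts to a breadth-first traversal of the unit-region adjacency graph. A minimal sketch, assuming an adjacency list keyed by region label (the graph representation itself is our assumption):

```python
from collections import deque

def unit_region_orders(adjacency: dict[int, list[int]], interest: int) -> dict[int, int]:
    """Orders as in claim 17: the unit region of interest is order 0, its
    neighbours are order 1, and a region adjacent to an Nth-order region
    but to no lower-order region becomes order N+1 (BFS distance)."""
    orders = {interest: 0}
    queue = deque([interest])
    while queue:
        current = queue.popleft()
        for neighbour in adjacency.get(current, []):
            if neighbour not in orders:
                orders[neighbour] = orders[current] + 1
                queue.append(neighbour)
    return orders
```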
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13853195.9A EP2815692B1 (en) | 2012-11-07 | 2013-11-05 | Medical image processing device |
CN201380018296.3A CN104203075B (zh) | 2012-11-07 | 2013-11-05 | 医疗用图像处理装置 |
JP2014523526A JP5593009B1 (ja) | 2012-11-07 | 2013-11-05 | 医療用画像処理装置 |
US14/300,406 US9129384B2 (en) | 2012-11-07 | 2014-06-10 | Medical image processing device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-245667 | 2012-11-07 | ||
JP2012245667 | 2012-11-07 | ||
JP2012266445 | 2012-12-05 | ||
JP2012-266445 | 2012-12-05 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/300,406 Continuation US9129384B2 (en) | 2012-11-07 | 2014-06-10 | Medical image processing device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014073527A1 true WO2014073527A1 (ja) | 2014-05-15 |
Family
ID=50684630
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/079884 WO2014073527A1 (ja) | 2012-11-07 | 2013-11-05 | 医療用画像処理装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9129384B2 (ja) |
EP (1) | EP2815692B1 (ja) |
JP (1) | JP5593009B1 (ja) |
CN (1) | CN104203075B (ja) |
WO (1) | WO2014073527A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2017122431A1 (ja) * | 2016-01-15 | 2017-07-20 | Olympus Corporation | Image analysis apparatus, image analysis system, and method for operating image analysis apparatus |
- WO2017199635A1 (ja) * | 2016-05-18 | 2017-11-23 | Olympus Corporation | Image analysis apparatus, image analysis system, and method for operating image analysis apparatus |
- JPWO2019012911A1 (ja) * | 2017-07-14 | 2020-06-11 | FUJIFILM Corporation | Medical image processing apparatus, endoscope system, diagnosis support apparatus, and medical service support apparatus |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP7034102B2 (ja) * | 2016-06-30 | 2022-03-11 | Given Imaging Ltd. | Systems and methods for assessment and monitoring of a mucosal disease in a subject's gastrointestinal tract |
- CN111050628B (zh) * | 2017-09-15 | 2022-09-06 | FUJIFILM Corporation | Medical image processing apparatus |
- KR102210806B1 (ko) * | 2018-10-02 | 2021-02-01 | Hallym University Industry-Academic Cooperation Foundation | Apparatus and method for diagnosing gastric lesions using deep learning of gastroscopy images |
- JP2023005896A (ja) * | 2021-06-29 | 2023-01-18 | FUJIFILM Corporation | Endoscope system, medical image processing apparatus, and operating method thereof |
- CN114494247B (zh) * | 2022-04-01 | 2022-06-21 | Wuhan University | Dentate line segmentation method and apparatus, computer device, and storage medium |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- DE69932037T2 (de) * | 1998-12-21 | 2007-06-28 | Koninklijke Philips Electronics N.V. | Size determination of an object detail |
- CN101404923B (zh) * | 2006-03-16 | 2010-12-22 | Olympus Medical Systems Corp. | Medical image processing apparatus |
- US8184888B2 (en) * | 2007-09-19 | 2012-05-22 | Siemens Medical Solutions Usa, Inc. | Method and system for polyp segmentation for 3D computed tomography colonography |
- JP5281826B2 (ja) * | 2008-06-05 | 2013-09-04 | Olympus Corporation | Image processing apparatus, image processing program, and image processing method |
- JP5395725B2 (ja) * | 2010-04-05 | 2014-01-22 | FUJIFILM Corporation | Electronic endoscope system |
- JP5011452B2 (ja) * | 2010-04-12 | 2012-08-29 | Olympus Medical Systems Corp. | Medical image processing apparatus and control method of medical image processing apparatus |
- JP5570866B2 (ja) * | 2010-04-30 | 2014-08-13 | Olympus Corporation | Image processing apparatus, operating method of image processing apparatus, and image processing program |
- 2013
- 2013-11-05 JP JP2014523526A patent/JP5593009B1/ja active Active
- 2013-11-05 CN CN201380018296.3A patent/CN104203075B/zh active Active
- 2013-11-05 EP EP13853195.9A patent/EP2815692B1/en not_active Not-in-force
- 2013-11-05 WO PCT/JP2013/079884 patent/WO2014073527A1/ja active Application Filing
- 2014
- 2014-06-10 US US14/300,406 patent/US9129384B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2918162B2 (ja) | 1988-11-02 | 1999-07-12 | Olympus Optical Co., Ltd. | Endoscope image processing apparatus |
- JP4409166B2 (ja) | 2002-12-05 | 2010-02-03 | Olympus Corporation | Image processing apparatus |
- JP4451460B2 (ja) | 2007-03-16 | 2010-04-14 | Olympus Corporation | Endoscope diagnosis support apparatus |
- WO2012002012A1 (ja) * | 2010-06-30 | 2012-01-05 | Olympus Medical Systems Corp. | Image processing apparatus and image processing method |
- JP2012045055A (ja) * | 2010-08-24 | 2012-03-08 | Olympus Corp | Image processing apparatus, image processing method, and image processing program |
- WO2012105141A1 (ja) * | 2011-02-01 | 2012-08-09 | Olympus Medical Systems Corp. | Diagnosis support apparatus |
Non-Patent Citations (2)
Title |
---|
See also references of EP2815692A4 |
TAKESHI YAO, STOMACH MAGNIFYING ENDOSCOPE, 2009, pages 79 - 87 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2017122431A1 (ja) * | 2016-01-15 | 2017-07-20 | Olympus Corporation | Image analysis apparatus, image analysis system, and method for operating image analysis apparatus |
- JP6234648B1 (ja) * | 2016-01-15 | 2017-11-22 | Olympus Corporation | Image analysis apparatus, image analysis system, and method for operating image analysis apparatus |
US10736499B2 (en) | 2016-01-15 | 2020-08-11 | Olympus Corporation | Image analysis apparatus, image analysis system, and method for operating image analysis apparatus |
- WO2017199635A1 (ja) * | 2016-05-18 | 2017-11-23 | Olympus Corporation | Image analysis apparatus, image analysis system, and method for operating image analysis apparatus |
- JPWO2017199635A1 (ja) * | 2016-05-18 | 2018-05-31 | Olympus Corporation | Image analysis apparatus, image analysis system, and method for operating image analysis apparatus |
- JPWO2019012911A1 (ja) * | 2017-07-14 | 2020-06-11 | FUJIFILM Corporation | Medical image processing apparatus, endoscope system, diagnosis support apparatus, and medical service support apparatus |
US10891737B2 (en) | 2017-07-14 | 2021-01-12 | Fujifilm Corporation | Medical image processing device, endoscope system, diagnosis support device, and medical service support device |
Also Published As
Publication number | Publication date |
---|---|
EP2815692B1 (en) | 2017-09-20 |
JPWO2014073527A1 (ja) | 2016-09-08 |
EP2815692A1 (en) | 2014-12-24 |
CN104203075B (zh) | 2017-03-01 |
CN104203075A (zh) | 2014-12-10 |
JP5593009B1 (ja) | 2014-09-17 |
US20140334698A1 (en) | 2014-11-13 |
EP2815692A4 (en) | 2016-01-13 |
US9129384B2 (en) | 2015-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5593009B1 (ja) | Medical image processing apparatus | |
JP6657480B2 (ja) | Image diagnosis support apparatus, operating method of image diagnosis support apparatus, and image diagnosis support program | |
JP5242381B2 (ja) | Medical image processing apparatus and medical image processing method | |
JP5865606B2 (ja) | Endoscope apparatus and method for operating endoscope apparatus | |
JP5855358B2 (ja) | Endoscope apparatus and method for operating endoscope apparatus | |
WO2020162275A1 (ja) | Medical image processing apparatus, endoscope system, and medical image processing method | |
WO2013140667A1 (ja) | Image processing apparatus | |
JP7048732B2 (ja) | Image processing apparatus, endoscope system, and image processing method | |
JP6132901B2 (ja) | Endoscope apparatus | |
WO2014168128A1 (ja) | Endoscope system and method for operating endoscope system | |
JP6941233B2 (ja) | Image processing apparatus, endoscope system, and image processing method | |
JPWO2019198637A1 (ja) | Image processing apparatus, endoscope system, and image processing method | |
WO2020008834A1 (ja) | Image processing apparatus and method, and endoscope system | |
WO2020188682A1 (ja) | Diagnosis support apparatus, diagnosis support method, and program | |
JPWO2019130868A1 (ja) | Image processing apparatus, processor device, endoscope system, image processing method, and program | |
WO2020184257A1 (ja) | Medical image processing apparatus and method | |
JP2023026480A (ja) | Medical image processing apparatus, endoscope system, and operating method of medical image processing apparatus | |
JP6112859B2 (ja) | Medical image processing apparatus | |
JP7387859B2 (ja) | Medical image processing apparatus, processor device, endoscope system, operating method of medical image processing apparatus, and program | |
WO2021157487A1 (ja) | Medical image processing apparatus, endoscope system, medical image processing method, and program | |
CN116322465A | Image processing apparatus, endoscope system, operating method of image processing apparatus, and program for image processing apparatus | |
Cui et al. | Detection of lymphangiectasia disease from wireless capsule endoscopy images with adaptive threshold | |
WO2022181748A1 (ja) | Medical image processing apparatus, endoscope system, medical image processing method, and medical image processing program | |
JP2022163581A (ja) | Control apparatus, control program, and control method | |
JP2023129277A (ja) | Method for detecting object images by hyperspectral imaging using frequency bands | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2014523526 Country of ref document: JP Kind code of ref document: A |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13853195 Country of ref document: EP Kind code of ref document: A1 |
REEP | Request for entry into the european phase |
Ref document number: 2013853195 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2013853195 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |