WO2015064435A1 - Image processing apparatus and method for operating the same - Google Patents
- Publication number: WO2015064435A1
- Authority: WO (WIPO (PCT))
Classifications
- G06T7/0012—Biomedical image inspection
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during use of the endoscope, extracting biological structures
- A61B1/00045—Display arrangement
- A61B1/0052—Constructional details of control elements, e.g. handles, of flexible endoscopes with controlled bending of the insertion part
- A61B1/05—Endoscopes combined with photographic or television appliances, characterised by the image sensor being in the distal end portion
- A61B1/0676—Endoscope light sources at the distal tip of an endoscope
- A61B1/273—Endoscopes for the upper alimentary canal, e.g. oesophagoscopes, gastroscopes
- A61B5/004—Features or image-related aspects of imaging apparatus adapted for image acquisition of a particular organ or body part
- A61B5/0059—Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/4238—Evaluating particular parts, e.g. particular organs: stomach
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G06T5/73—Deblurring; Sharpening
- G06T7/90—Determination of colour characteristics
- H04N7/18—Closed-circuit television [CCTV] systems
- G06T2207/10024—Color image
- G06T2207/10068—Endoscopic image
- G06T2207/30092—Stomach; Gastric
- G06T2207/30096—Tumor; Lesion
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
Definitions
- the present invention relates to an image processing apparatus that processes images used in the diagnosis of atrophic gastritis, and to a method for operating the apparatus.
- with a high-definition endoscope, the shape and size of a lesion can be clarified, so detection of the lesion is facilitated.
- a doctor finds lesions not only from their shape and size but also from slight differences in the color of the mucous membrane. For example, a portion that is slightly redder than the surrounding mucosa is detected as an early lesion. Such a slightly reddish portion may not be found simply by increasing the image resolution.
- in Patent Document 1, the border of a lesion is made noticeable by a color emphasizing process that makes reddish parts redder and whitish parts whiter. In Patent Document 2, the size of a discolored area that differs in color from the mucosa, such as a brownish area (BA), is detected, and frequency-band emphasis is applied according to that size, structurally emphasizing the discolored area. Such color emphasis and frequency-band emphasis make it possible to find lesions that cannot be detected by high resolution alone.
- Japanese Patent No. 3228627; JP 2012-71012 A
- when atrophy has progressed to a high degree (for example, atrophy classified in the C and D groups of the ABC examination), the above two features (A) and (B) can be clearly observed on the endoscopic image.
- when atrophy is not far advanced (for example, atrophy classified in the B and C groups of the ABC screening), the difference between the atrophic area and the normal area on the endoscopic image is slight, and it may be difficult to judge the progress of atrophy or to distinguish the boundary between the normal part and the gastritis part. It is therefore required to clarify the above two features (A) and (B) on the endoscopic image so that the boundary between the normal part and the gastritis part becomes clear.
- Patent Document 1 emphasizes color so that reddish areas become even redder; it does not emphasize the fading of color that accompanies atrophy of the stomach, nor the color change caused by blood vessels in the submucosa showing through as the stomach atrophies.
- the method of Patent Document 2 applies frequency-band emphasis according to the size of the discolored area, and does not perform emphasis according to the color changes accompanying atrophy of the stomach described above.
- an object of the present invention is to provide an image processing apparatus, and an operating method thereof, capable of color-difference emphasis or structure emphasis according to the changes in mucosal color that can occur when the stomach atrophies due to atrophic gastritis.
- An image processing apparatus includes an image signal input unit, a base image generation unit, a color information acquisition unit, a frequency emphasis unit, a combination ratio setting unit, and a combination unit.
- the image signal input unit inputs a color image signal.
- the base image generation unit generates a base image based on the color image signal.
- the color information acquisition unit acquires a plurality of color information from the color image signal.
- the frequency emphasizing unit generates, based on the base image, a frequency component emphasized image in which the frequency component corresponding to the abnormal part different from the normal part is emphasized.
- the combining ratio setting unit sets a combining ratio indicating a ratio of combining the frequency component emphasized image to the base image based on the plurality of pieces of color information.
- the combining unit combines the frequency component-emphasized image with the base image at the combining ratio set by the combining ratio setting unit to generate a structure-emphasized image in which the abnormal portion is emphasized.
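The synthesis described above can be sketched as a per-pixel blend. The additive form below is an illustrative assumption; the claims specify only that the frequency component emphasized image is combined with the base image at the set ratio.

```python
import numpy as np

def synthesize_structure_emphasized(base, freq_component, ratio):
    """Combine a high-frequency (structure) component with the base image.

    base, freq_component: float arrays of the same shape.
    ratio: per-pixel combining ratio in [0, 1] (0 = no emphasis, 1 = full).
    The additive blend is an assumed formulation, not the patent's exact one.
    """
    return base + ratio * freq_component

base = np.full((2, 2), 100.0)
freq = np.array([[10.0, -10.0], [0.0, 20.0]])
ratio = np.array([[0.0, 1.0], [0.5, 1.0]])
out = synthesize_structure_emphasized(base, freq, ratio)
# a ratio of 0 leaves the base pixel unchanged; 1 adds the full component
```

Pixels whose ratio is 0 (normal mucosa, per the next paragraphs) pass through untouched, so only abnormal regions gain structural contrast.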
- preferably, the color image signal is an image signal of three colors, and the color information acquisition unit includes a signal ratio calculation unit that calculates, as the plurality of color information, a first signal ratio between the image signals of two of the three colors and a second signal ratio, different from the first signal ratio, between another two of the color image signals.
- preferably, the composition ratio setting unit sets the combining ratio to 0% for a pixel whose first and second signal ratios, as calculated by the signal ratio calculation unit, fall within a first range containing most of the first and second signal ratios of the normal part, and to 100% for a pixel whose first and second signal ratios fall within a specific range different from the first range.
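As an illustration of how such a combining ratio might be set from the two signal ratios: all range bounds below are invented for the sketch; the claims fix only the 0% and 100% endpoints.

```python
def combining_ratio(bg, gr):
    """Hypothetical combining ratio g(B/G, G/R).

    Returns 0.0 when the pixel's signal ratios fall in the (assumed)
    normal-mucosa range, 1.0 (i.e. 100%) in the abnormal-specific range,
    and an intermediate value elsewhere. The rectangular bounds are
    illustrative only.
    """
    NORMAL = (0.3, 0.6, 0.6, 0.9)    # bg_lo, bg_hi, gr_lo, gr_hi (assumed)
    SPECIFIC = (0.6, 1.0, 0.3, 0.6)  # assumed abnormal range

    def inside(bg_v, gr_v, lo1, hi1, lo2, hi2):
        return lo1 <= bg_v <= hi1 and lo2 <= gr_v <= hi2

    if inside(bg, gr, *NORMAL):
        return 0.0
    if inside(bg, gr, *SPECIFIC):
        return 1.0
    return 0.5  # transition value; the patent's table g1-g4 would refine this
```

In practice the patent's table of synthesis ratios g1 to g4 would replace the crude 0.5 fallback with smoothly varying values.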
- preferably, the abnormal part is a faded (depigmented) mucous membrane, the specific range is a second range containing most of the first and second signal ratios of the faded mucous membrane, and the second range has the same hue as, but lower saturation than, the first range.
- preferably, the abnormal part is a blood vessel region in which blood vessels show through beneath the faded mucous membrane as the lesion progresses, the specific range is a third range containing most of the first and second signal ratios of the blood vessel region, and the first and third ranges have the same saturation but different hues.
- preferably, the abnormal portion is a brownish area, and the specific range is a fourth range containing most of the first and second signal ratios of the brownish area.
- preferably, the abnormal portion is a reddened portion, and the specific range is a fifth range containing most of the first and second signal ratios of the reddened portion.
- preferably, the image signals of three colors are RGB image signals, the first signal ratio is the B/G ratio between the B image signal and the G image signal, and the second signal ratio is the G/R ratio between the G image signal and the R image signal.
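Computing the two preferred signal ratios from RGB image signals is straightforward; the epsilon guard against division by zero in dark pixels is an implementation detail the patent does not discuss.

```python
import numpy as np

def signal_ratios(r, g, b, eps=1e-6):
    """Compute the per-pixel B/G and G/R ratios from RGB image signals.

    r, g, b: float arrays of the same shape (one image plane each).
    eps guards against division by zero in very dark pixels (assumed).
    """
    bg = b / np.maximum(g, eps)  # first signal ratio: B/G
    gr = g / np.maximum(r, eps)  # second signal ratio: G/R
    return bg, gr

r = np.array([[200.0]])
g = np.array([[100.0]])
b = np.array([[50.0]])
bg, gr = signal_ratios(r, g, b)
# bg -> 0.5, gr -> 0.5
```

Each pixel thus maps to a point (B/G, G/R) in the two-dimensional feature space used by the combining ratio setting unit.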
- the image processing apparatus includes an image signal input unit, a color information acquisition unit, a color difference emphasis unit, a frequency emphasis unit, a combination ratio setting unit, and a combination unit.
- the image signal input unit inputs a color image signal.
- the color information acquisition unit acquires a plurality of color information from the color image signal.
- the color difference emphasizing unit performs an expansion process that widens the difference between a first range and a specific range different from the first range in the feature space formed by the plurality of color information, thereby generating a color-difference-emphasized image in which the color difference between the normal portion and the abnormal portion is emphasized.
- the frequency emphasizing unit generates a frequency component emphasized image in which the frequency component corresponding to the abnormal part is emphasized based on the color difference emphasized image.
- the combining ratio setting unit sets a combining ratio indicating a ratio of combining the frequency component emphasized image to the color difference emphasized image based on the plurality of pieces of color information.
- the synthesizing unit synthesizes the frequency component emphasized image with the color-difference-emphasized image at the synthesizing ratio set by the synthesizing ratio setting unit, generating a color-difference-and-structure-emphasized image in which both the color difference and the structure of the abnormal portion are emphasized.
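A toy version of the expansion process in the feature space might look like the following; the real transform is presumably more selective about which points it moves, and the centre and gain here are invented for illustration.

```python
import numpy as np

def expand_color_difference(points, normal_center, gain=2.0):
    """Toy 'expansion process' in the (B/G, G/R) feature space.

    Moves each feature point away from the centre of the (assumed)
    normal-mucosa range by a factor `gain`, widening the gap between
    the normal cluster and abnormal clusters.
    """
    c = np.asarray(normal_center, dtype=float)
    return c + gain * (np.asarray(points, dtype=float) - c)

pts = np.array([[0.5, 0.7],   # at the normal centre
                [0.9, 0.4]])  # an abnormal-ish point
out = expand_color_difference(pts, normal_center=(0.5, 0.7))
# the normal point stays put; the abnormal point moves further away
```

Mapping the expanded coordinates back to pixel colors is what yields the color-difference-emphasized image.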
- preferably, the color image signal is an image signal of three colors, and the color information acquisition unit includes a signal ratio calculation unit that calculates, as the plurality of color information, a first signal ratio between the image signals of two of the three colors and a second signal ratio, different from the first signal ratio, between another two of the color image signals.
- the operation method of the image processing apparatus includes an input step, a base image generation step, a color information acquisition step, a frequency component enhanced image generation step, a combination ratio setting step, and a combination step.
- in the input step, the image signal input unit inputs a color image signal.
- in the base image generation step, the base image generation unit generates a base image based on the color image signal.
- in the color information acquisition step, the color information acquisition unit acquires a plurality of color information from the color image signal.
- in the frequency component enhanced image generation step, the frequency emphasizing unit generates, based on the base image, a frequency component emphasized image in which the frequency component corresponding to an abnormal part different from the normal part is emphasized.
- in the combination ratio setting step, the combining ratio setting unit sets, based on the plurality of color information, a combining ratio indicating the ratio at which the frequency component emphasized image is combined with the base image.
- in the combination step, the synthesizing unit synthesizes the frequency component emphasized image with the base image at the set combining ratio, generating a structure-emphasized image in which the abnormal portion is emphasized.
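The steps above can be sketched end to end. The box blur standing in for the frequency emphasizing unit's filter is an assumption; the patent does not fix the filter, only that a frequency component corresponding to the abnormal part is emphasized.

```python
import numpy as np

def box_blur(img, k=3):
    """Simple mean filter via edge padding; stands in for a proper low-pass."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def structure_emphasize(base, ratio, k=3):
    """Base image -> frequency component -> synthesis at the given ratio.

    The high-frequency component is taken as (base - low-pass), a common
    but assumed choice; `ratio` may be a scalar or a per-pixel array.
    """
    high = base - box_blur(base, k)
    return base + ratio * high
```

On a flat region the high-frequency component is zero, so the output equals the base image regardless of the ratio; only structured (abnormal) regions are sharpened.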
- the operation method of the image processing apparatus includes an input step, a color information acquisition step, a color difference emphasized image generation step, a frequency component emphasized image generation step, a synthesis ratio setting step, and a synthesis step.
- in the input step, the image signal input unit inputs a color image signal.
- in the color information acquisition step, the color information acquisition unit acquires a plurality of color information from the color image signal.
- in the color difference emphasized image generation step, the color difference emphasizing unit performs an expansion process that widens the difference between a first range and a specific range different from the first range in the feature space formed by the plurality of color information, generating a color-difference-emphasized image in which the color difference between the normal portion and the abnormal portion is emphasized.
- in the frequency component emphasized image generation step, the frequency emphasizing unit generates, based on the color-difference-emphasized image, a frequency component emphasized image in which the frequency component corresponding to the abnormal part is emphasized.
- in the synthesis ratio setting step, the combining ratio setting unit sets, based on the plurality of color information, a combining ratio indicating the ratio at which the frequency component emphasized image is combined with the color-difference-emphasized image.
- in the synthesis step, the synthesizing unit synthesizes the frequency component emphasized image with the color-difference-emphasized image at the set combining ratio, generating a color-difference-and-structure-emphasized image in which both the color difference and the structure of the abnormal portion are emphasized.
- color difference emphasis or structure emphasis can be performed according to a change in color of a mucous membrane or the like that may occur during atrophy of the stomach due to atrophic gastritis.
- It is a block diagram showing the function of a structure emphasizing unit.
- It is a table showing the relationship between the B/G ratio, the G/R ratio, and the synthesis ratios g1 to g4 (B/G ratio, G/R ratio).
- It is a flowchart showing a series of steps in the diagnosis of atrophic gastritis.
- It is a block diagram showing the function of a special image processing unit different from that of FIG.
- It is a block diagram showing the internal structure of the endoscope system of the second embodiment.
- It is a plan view of a rotation filter.
- It is a block diagram showing part of the functions of the processor device in the third embodiment.
- (A) is a cross-sectional view of the mucous membrane structure in a normal mucous membrane
- (B) is a plan view of the normal mucous membrane viewed from the surface side.
- (A) is a cross-sectional view of the mucosal structure during atrophic gastritis (reduction of gastric gland cells, or replacement of gastric tissue with intestinal or fibrous tissue);
- (B) is a plan view of the mucous membrane during atrophic gastritis viewed from the surface side.
- the endoscope system 10 of the first embodiment includes an endoscope 12, a light source device 14, a processor device 16, a monitor 18, and a console 20.
- the endoscope 12 is optically connected to the light source device 14 and electrically connected to the processor device 16 through the universal cord 13.
- the endoscope 12 has an insertion portion 21 to be inserted into the subject, an operation portion 22 provided at the proximal end of the insertion portion, and a bending portion 23 and a distal end portion 24 provided on the distal end side of the insertion portion 21.
- by operating the angle knob 22a of the operation unit 22, the bending portion 23 bends; this bending operation directs the distal end 24 in the desired direction.
- the operation unit 22 is provided with a mode switching switch 22b and a zoom operation unit 22c.
- the mode switching switch 22b is used to switch between the normal observation mode and the special observation mode.
- the normal observation mode is a mode in which white light is used to illuminate the subject.
- the special observation mode uses bluish special light to illuminate the subject, and emphasizes the changes in mucosal color and in blood vessel visibility that can occur when the stomach atrophies due to atrophic gastritis.
- the zoom operation unit 22c drives the zooming lens 47 (see FIG. 2) in the endoscope 12 and is used for zoom operations that magnify the subject. In the special observation mode, white light may be used instead of the special light.
- the processor unit 16 is electrically connected to the monitor 18 and the console 20.
- the monitor 18 outputs and displays image information and the like.
- the console 20 functions as a UI (user interface) that receives an input operation such as function setting.
- an external recording unit (not shown) that records image information and the like may be connected to the processor device 16.
- the light source device 14 includes, as light emission sources, a blue laser light source (445LD) 34 that emits blue laser light with a center wavelength of 445 nm and a blue-violet laser light source (405LD) 36 that emits blue-violet laser light with a center wavelength of 405 nm.
- light emission from the semiconductor light emitting elements of the light sources 34 and 36 is individually controlled by the light source control unit 40, so that the light quantity ratio between the light emitted by the blue laser light source 34 and that emitted by the blue-violet laser light source 36 can be changed.
- the light source control unit 40 mainly activates the blue laser light source 34; the blue-violet laser light source 36 may also be driven, but in that case its emission intensity is preferably kept low.
- in the special observation mode, both the blue laser light source 34 and the blue-violet laser light source 36 are driven, with the emission ratio of the blue laser light made larger than that of the blue-violet laser light.
- the half width of the blue laser light or the blue-violet laser light is preferably about ⁇ 10 nm.
- a broad area type InGaN-based laser diode can be used, and an InGaNAs-based laser diode or a GaNAs-based laser diode can also be used.
- a light emitter such as a light emitting diode may be used as the light source.
- Laser light emitted from each of the light sources 34 and 36 is incident on the light guide (LG) 41 via an optical member such as a condenser lens, an optical fiber, or a multiplexer (none of which is shown).
- the light guide 41 is incorporated in the endoscope 12 and the universal cord 13.
- the blue laser light having a central wavelength of 445 nm or the blue-violet laser light having a central wavelength of 405 nm is propagated to the tip portion 24 of the endoscope 12 through the light guide 41.
- as the light guide 41, a multimode fiber can be used.
- the distal end portion 24 of the endoscope 12 has an illumination optical system 24 a and an imaging optical system 24 b.
- the illumination optical system 24a includes an illumination lens 45 and a phosphor 44 on which the blue laser light (center wavelength 445 nm) or blue-violet laser light (center wavelength 405 nm) from the light guide 41 is incident.
- the phosphor 44 emits fluorescent light by being irradiated with the blue laser light. Also, part of the blue laser light passes through the phosphor 44 as it is.
- the blue-violet laser light passes through the phosphor 44 without exciting it.
- the light emitted from the phosphor 44 is irradiated into the sample through the illumination lens 45.
- in the normal observation mode, mainly blue laser light enters the phosphor 44, so the blue laser light and the fluorescence it excites from the phosphor 44 are multiplexed as shown in FIG. 3A, and white light is emitted into the subject.
- in the special observation mode, both the blue-violet laser light and the blue laser light are incident on the phosphor 44, so special light obtained by multiplexing the blue-violet laser light, the blue laser light, and the fluorescence excited from the phosphor 44, as shown in FIG. 3B, is emitted into the subject.
- because it includes the blue-violet laser light, the special light contains a large blue component, and it is broadband light whose wavelength range covers almost the entire visible region.
- the phosphor 44 is preferably composed of a plurality of phosphors (for example, YAG-based phosphors or BAM (BaMgAl10O17)) that absorb part of the blue laser light and emit green to yellow light by excitation. When a semiconductor light emitting element is used as the excitation light source of the phosphor 44, as in this configuration example, high-intensity white light is obtained with high luminous efficiency, the intensity of the white light is easily adjusted, and changes in the color temperature and chromaticity of the white light can be kept small.
- the imaging optical system 24 b of the endoscope 12 includes an imaging lens 46, a zooming lens 47, and a sensor 48. Reflected light from the subject is incident on the sensor 48 through the imaging lens 46 and the zooming lens 47. Thereby, a reflection image of the subject is formed on the sensor 48.
- the zooming lens 47 moves between the tele end and the wide end by operating the zoom operation unit 22c. When the zooming lens 47 moves toward the tele end, the reflection image of the subject is enlarged; when it moves toward the wide end, the reflection image of the subject is reduced.
- the sensor 48 is a color imaging device, which captures a reflection image of a subject and outputs an image signal.
- the sensor 48 is preferably a charge coupled device (CCD) image sensor or a complementary metal-oxide semiconductor (CMOS) image sensor.
- the sensor 48 used in the present invention is an RGB image sensor having RGB channels, in which RGB color filters are provided on the imaging surface, and photoelectric conversion is performed in each channel:
- the R image signal is output from the R pixel provided with the R (red) color filter,
- the G image signal is output from the G pixel provided with the G (green) color filter,
- the B image signal is output from the B pixel provided with the B (blue) color filter.
- the sensor 48 may be a so-called complementary color image sensor provided with C (cyan), M (magenta), Y (yellow) and G (green) color filters on the imaging surface.
- an image signal of three colors RGB can be obtained by color conversion from an image signal of four colors CMYG.
- in this case, any of the endoscope 12, the light source device 14, or the processor device 16 needs to be provided with means for color-converting the CMYG four-color image signal into the RGB three-color image signal.
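The text does not give the conversion itself; under the common idealized model that the complementary signals satisfy C = G + B, M = R + B, and Y = R + G, one such means could be sketched as follows (a hypothetical linear model for illustration, not the patent's actual conversion):

```python
# Sketch of a CMYG-to-RGB color conversion under the idealized linear model
# C = G + B, M = R + B, Y = R + G (G is also measured directly).
def cmyg_to_rgb(c, m, y, g):
    r = (m + y - c) / 2.0  # (R+B) + (R+G) - (G+B) = 2R
    b = (c + m - y) / 2.0  # (G+B) + (R+B) - (R+G) = 2B
    # Clamp to non-negative signal values in case of sensor noise.
    return tuple(max(0.0, v) for v in (r, g, b))
```

In practice a complementary-color sensor pipeline is more involved (demosaicing and matrixing), but the linear relations above show why three RGB signals are recoverable from the four CMYG signals.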
- the image signal output from the sensor 48 is transmitted to the CDS / AGC circuit 50.
- the CDS/AGC circuit 50 performs correlated double sampling (CDS) and automatic gain control (AGC) on the image signal, which is an analog signal.
- a gamma conversion unit 51 performs gamma conversion on the image signal that has passed through the CDS-AGC circuit 50. Thereby, an image signal having a gradation suitable for an output device such as the monitor 18 can be obtained.
- the image signal after gamma conversion is converted into a digital image signal by an A / D converter (A / D converter) 52.
- the A / D converted digital image signal is input to the processor unit 16.
- the processor device 16 mainly functions as an image processing device, and includes a receiving unit 54, an image processing switching unit 60, a normal image processing unit 62, a special image processing unit 64, and a video signal generation unit 66.
- the receiving unit 54 functions as an image signal input unit that receives a digital image signal from the endoscope 12 and inputs the digital image signal to the processor device 16.
- the receiving unit 54 includes a DSP (Digital Signal Processor) 56 and a noise removing unit 58.
- the DSP 56 performs gamma correction and color correction processing on the digital image signal.
- the noise removing unit 58 removes noise from the digital image signal by performing noise removal processing (for example, moving average method, median filter method, and the like) on the digital image signal subjected to gamma correction and the like by the DSP 56.
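As an illustration of the median filter method named above, a one-dimensional sketch (the actual processing operates on the 2-D image signal; the function name is ours):

```python
import statistics

# Sketch of median-filter noise removal on a 1-D signal with edge replication.
def median_filter_1d(signal, k=3):
    half = k // 2
    # Replicate the edge samples so every position has a full window.
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [statistics.median(padded[i:i + k]) for i in range(len(signal))]
```

A single impulse-noise sample is replaced by its neighborhood median, while flat regions pass through unchanged, which is why the median filter removes spike noise without blurring edges as much as a moving average.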
- the digital image signal from which the noise has been removed is transmitted to the image processing switching unit 60.
- the image processing switching unit 60 transmits the digital image signal to the normal image processing unit 62 when the normal observation mode is set by the mode switching switch 22 b, and when the special observation mode is set, the digital image signal The signal is sent to the special image processing unit 64.
- the normal image processing unit 62 performs color conversion processing, color enhancement processing, and structure enhancement processing on the RGB image signal.
- in the color conversion processing, 3×3 matrix processing, tone conversion processing, three-dimensional LUT processing, and the like are performed on the digital RGB image signals, which are converted into color-converted RGB image signals.
- various color emphasizing processing is performed on the color converted RGB image signal.
- a structure emphasizing process such as spatial frequency emphasizing is performed on the RGB image signal subjected to the color emphasizing process.
- the RGB image signal subjected to the structure emphasizing processing is input from the normal image processing unit 62 to the video signal generation unit 66.
- the special image processing unit 64 generates, based on the RGB image signal, a special image in which the color of an abnormal part whose color has changed along with a lesion, such as atrophy of the stomach due to atrophic gastritis, is emphasized and in which the abnormal part is structure-emphasized. Details of the special image processing unit 64 will be described later.
- the generated RGB image signal of the special image is input from the special image processing unit 64 to the video signal generation unit 66.
- the video signal generation unit 66 converts the RGB image signal input from the normal image processing unit 62 or the special image processing unit 64 into a video signal to be displayed on the monitor 18 as a displayable image.
- the monitor 18 displays a normal image or a special image based on the video signal.
- the special image processing unit 64 includes an inverse gamma conversion unit 70, a Log conversion unit 71, a signal ratio calculation unit 72, a color difference emphasis unit 73, an RGB conversion unit 74, and a structure emphasis unit 75. , An inverse log converter 76, and a gamma converter 77.
- the inverse gamma conversion unit 70 performs inverse gamma conversion on the input digital image signals of RGB three channels.
- the RGB image signal after the inverse gamma conversion is a reflectance-linear RGB signal, linear with respect to the reflectance from the sample;
- among the RGB image signals, the reflectance-linear signals contain a large proportion of signals related to various biological information of the sample, in the present embodiment atrophy of the stomach, such as the color changes associated with atrophic gastritis.
- the Log conversion unit 71 performs Log conversion of the reflectance-linear RGB image signals.
- the Log converted R image signal (log R), the Log converted G image signal (log G), and the Log converted B image signal (log B) are obtained.
- the “B/G ratio” denotes −log(B/G) with the “−log” omitted;
- likewise, the “G/R ratio” denotes −log(G/R) with the “−log” omitted.
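Under this convention, the processing of the signal ratio calculation unit 72 can be sketched as follows; this is a minimal illustration (the function name is ours), assuming reflectance-linear R, G, and B values as inputs:

```python
import math

# Sketch of the signal ratio calculation: per the text, the "B/G ratio"
# abbreviates -log(B/G) and the "G/R ratio" abbreviates -log(G/R).
def signal_ratios(r, g, b):
    """Return (B/G ratio, G/R ratio) for reflectance-linear R, G, B values."""
    bg_ratio = -math.log(b / g)  # grows as blue reflectance drops relative to green
    gr_ratio = -math.log(g / r)  # grows as green reflectance drops relative to red
    return bg_ratio, gr_ratio
```

For equal R, G, and B reflectances both ratios are zero, so each pixel maps to a single point in the two-dimensional feature space described next.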
- the color difference emphasizing unit 73 emphasizes, in the two-dimensional space of the B/G ratio and G/R ratio shown in FIG. 6 (corresponding to the “feature space” of the present invention), the difference between the B/G and G/R ratios in the first range and those in the second to fifth ranges, thereby emphasizing the color difference between the normal area and the abnormal areas (atrophic mucosa area, deep blood vessel area, BA (Brownish Area) area, and redness) in the observation area.
- the first range includes many of the B/G and G/R ratios of portions with normal parts such as normal mucous membranes, and is located approximately at the center of the two-dimensional space.
- the second range includes many of the B/G and G/R ratios of portions where the atrophied mucous membrane is present (in FIG. 6, the atrophied mucous membrane is indicated by “x”), and is located to the lower left of the first range in the two-dimensional space.
- when the relationship between the first range and the second range is defined in terms of hue and saturation, the two ranges have the same hue, and the second range has lower saturation than the first range.
- the third range includes many of the B/G and G/R ratios of portions where deep blood vessels are present (see FIG. 6), and is located to the lower right of the first range.
- the fourth range includes many of the B/G and G/R ratios of portions having a BA (see FIG. 6), and is located diagonally above and to the right of the first range.
- the fifth range includes many of the B/G and G/R ratios of portions where there is redness (see FIG. 6), and is located to the right of the first range.
- the color difference emphasizing unit 73 includes a two-dimensional LUT 73a and a weighting addition unit 73b.
- the two-dimensional LUT 73a stores the B/G and G/R ratios in the second to fifth ranges in association with the color difference emphasis amounts used to expand their color differences from the B/G and G/R ratios in the first range.
- the third-range color difference emphasis amounts Δ**(B/G ratio) and Δ**(G/R ratio), associated with the third-range B/G and G/R ratios, are obtained by the color difference emphasis amount calculation process for the third range based on the third-range B/G and G/R ratios.
- the fourth-range color difference emphasis amounts Δ***(B/G ratio) and Δ***(G/R ratio), associated with the fourth-range B/G and G/R ratios, are obtained by the color difference emphasis amount calculation process for the fourth range based on the fourth-range B/G and G/R ratios.
- the fifth-range color difference emphasis amounts Δ****(B/G ratio) and Δ****(G/R ratio), associated with the fifth-range B/G and G/R ratios, are obtained by the color difference emphasis amount calculation process for the fifth range based on the fifth-range B/G and G/R ratios.
- the color difference emphasis amount calculation process for the second to fifth ranges is performed in the following procedure.
- the second range color difference emphasis amount calculation process includes a first range average value calculation process, a polar coordinate conversion process, a radial difference expansion process, an orthogonal coordinate conversion process, and a difference process.
- First, an average value of B / G ratio and G / R ratio within the first range is calculated by the first range average value calculation processing.
- polar coordinate conversion processing is performed to polar-coordinate convert the average value of the B / G ratio and G / R ratio within the first range to obtain the first range average value (rm, ⁇ m) after polar coordinate conversion.
- polar coordinate conversion of the B / G ratio and G / R ratio in the second range is performed to obtain a second range signal (ri, ⁇ i) after polar coordinate conversion.
- next, the radial difference Δr between the first range average value (rm, θm) after polar coordinate conversion and the second range signal (ri, θi) is expanded by the radial difference expansion process.
- the radial difference expansion process is performed by the following equation (1-1).
- the radial difference expanded second range signal (Eri, ⁇ i) as shown in FIG. 8 is obtained.
- the radial difference expanded second range signal (Eri, ⁇ i) is converted into orthogonal coordinates by orthogonal coordinate conversion processing.
- the B / G ratio and the G / R ratio in the second range in which the radial difference has been expanded can be obtained.
- the B / G ratio and G / R ratio of the second range in which the radial difference has been expanded, and the second range before the radial difference expansion A difference process with the B / G ratio and the G / R ratio is performed to calculate a color difference expansion amount ⁇ * (B / G ratio) and ⁇ * (G / R ratio) in the second range.
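The sequence above (first range average value calculation, polar coordinate conversion, radial difference expansion, orthogonal coordinate conversion, and difference processing) can be sketched as follows. Equation (1-1) is not reproduced in this text, so a simple linear expansion Eri = rm + alpha × (ri − rm) with alpha > 1 is assumed here purely for illustration; the function name and alpha are ours:

```python
import math

# Sketch of the second-range color difference emphasis amount calculation.
# (bg, gr): a second-range signal; (bg_mean, gr_mean): the first-range average.
def second_range_emphasis(bg, gr, bg_mean, gr_mean, alpha=2.0):
    # Polar coordinate conversion of the first-range average and the signal.
    rm = math.hypot(bg_mean, gr_mean)
    ri, theta_i = math.hypot(bg, gr), math.atan2(gr, bg)
    # Radial difference expansion: the angle (hue) is kept, and the radial
    # difference (ri - rm) is grown, moving the point toward lower saturation.
    eri = rm + alpha * (ri - rm)
    # Orthogonal coordinate conversion back to the (B/G, G/R) plane.
    bg_e, gr_e = eri * math.cos(theta_i), eri * math.sin(theta_i)
    # Difference processing: the emphasis amounts Delta*(B/G), Delta*(G/R).
    return bg_e - bg, gr_e - gr
```

With the first-range average at (1, 0) and a second-range point at (0.5, 0), the expanded radius becomes 0 and the emphasis amount is (−0.5, 0): the point moves further from the average along its original hue direction.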
- the radial difference expansion process in the second range is to expand the radial difference ⁇ r in the direction in which the saturation decreases while maintaining the hue.
- the change in color due to this expansion processing is matched to the change in color of the atrophied mucous membrane, which becomes discolored as atrophic gastritis progresses, as shown in FIG. 9.
- FIG. 9 shows that atrophic gastritis is more advanced at progression stage 2 than at stage 1; at stage 1, the difference between the area of the atrophic mucosa and the area of the normal mucosa is small.
- the hue is almost the same, and it can be seen that the difference between the area of the atrophied mucous membrane and that of the normal mucous membrane is increased by decreasing only the saturation.
- the third range color difference emphasis amount calculation process includes a first range average value calculation process, a polar coordinate conversion process, an argument difference expansion process, an orthogonal coordinate conversion process, and a difference process.
- polar coordinate conversion processing is performed to polar-coordinate convert the average value of the B / G ratio and G / R ratio within the first range to obtain the first range average value (rm, ⁇ m) after polar coordinate conversion.
- polar coordinate conversion of the B / G ratio and G / R ratio in the third range is performed to obtain a third range signal (rv, ⁇ v) after polar coordinate conversion.
- next, the declination difference Δθ between the first range average value (rm, θm) after polar coordinate conversion and the third range signal (rv, θv) is expanded by the declination difference expansion process.
- the argument difference expansion process is performed by the following equation (2-1).
- a third range signal (Erv, ⁇ v) with a declination difference expanded as shown in FIG. 8 is obtained.
- the third range signal (Erv, ⁇ v) after the declination difference expansion is converted into orthogonal coordinates by orthogonal coordinate conversion processing.
- the B/G and G/R ratios in the third range in which the declination difference has been expanded can be obtained.
- difference processing between the third-range B/G and G/R ratios after declination difference expansion and those before the expansion is performed to calculate the third-range color difference expansion amounts Δ**(B/G ratio) and Δ**(G/R ratio).
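The angular counterpart can be sketched the same way. Equation (2-1) is likewise not reproduced, so a linear expansion Eθv = θm + beta × (θv − θm) with beta > 1 is assumed for illustration only:

```python
import math

# Sketch of the third-range emphasis: the radius (saturation) is maintained
# and only the angle difference from the first-range average is expanded.
def third_range_emphasis(bg, gr, bg_mean, gr_mean, beta=2.0):
    theta_m = math.atan2(gr_mean, bg_mean)
    rv, theta_v = math.hypot(bg, gr), math.atan2(gr, bg)
    # Declination difference expansion (hue direction only).
    e_theta = theta_m + beta * (theta_v - theta_m)
    # Orthogonal coordinate conversion, then difference processing.
    bg_e, gr_e = rv * math.cos(e_theta), rv * math.sin(e_theta)
    return bg_e - bg, gr_e - gr
```

Note that the moved point keeps its original distance from the origin, matching the text's statement that the saturation is maintained while the hue is changed.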
- in this process, the declination difference Δθ is expanded in the hue direction while the saturation is maintained.
- the color change due to this expansion processing is matched to the color change in which the color of deep blood vessels becomes apparent as atrophic gastritis progresses.
- atrophic gastritis is more advanced at progression stage 2 than at stage 1; at stage 1, the difference between the deep blood vessel region and the region of normal mucous membrane is small.
- the saturation is almost the same, and it can be seen that the difference between the deep blood vessel region and the region of the normal mucous membrane is enlarged by changing only the hue.
- the fourth range color difference emphasis amount calculation process includes a first range average value calculation process, a polar coordinate conversion process, a radial difference and argument difference expansion process, an orthogonal coordinate conversion process, and a difference process.
- polar coordinate conversion processing is performed to polar-coordinate convert the average value of the B / G ratio and G / R ratio within the first range to obtain the first range average value (rm, ⁇ m) after polar coordinate conversion.
- polar coordinate conversion of the B / G ratio and G / R ratio in the fourth range is performed to obtain a fourth range signal (rk, ⁇ k) after polar coordinate conversion.
- the fourth range signal (Erk, Eθk) after radial difference and declination difference expansion is converted into rectangular coordinates by orthogonal coordinate conversion processing.
- the B/G and G/R ratios of the fourth range in which the radial difference and the declination difference have been expanded can thereby be obtained.
- difference processing between the fourth-range B/G and G/R ratios after the radial and declination difference expansion and those before the expansion is performed to calculate the fourth-range color difference expansion amounts Δ***(B/G ratio) and Δ***(G/R ratio).
- the fifth range color difference emphasis amount calculation process includes a first range average value calculation process, a polar coordinate conversion process, a radial difference and declination difference expansion process, an orthogonal coordinate conversion process, and a difference process.
- polar coordinate conversion processing is performed to polar-coordinate convert the average value of the B / G ratio and G / R ratio within the first range to obtain the first range average value (rm, ⁇ m) after polar coordinate conversion.
- polar coordinate conversion of the B / G ratio and G / R ratio within the fifth range is performed to obtain the fifth range signal (rp, ⁇ p) after polar coordinate conversion.
- the fifth range signal (Erp, Eθp) after radial difference and declination difference expansion is converted into rectangular coordinates by orthogonal coordinate conversion processing.
- the B/G and G/R ratios of the fifth range in which the radial difference and the declination difference have been expanded can thereby be obtained.
- difference processing between the fifth-range B/G and G/R ratios after the radial and declination difference expansion and those before the expansion is performed to calculate the fifth-range color difference expansion amounts Δ****(B/G ratio) and Δ****(G/R ratio).
- the weighting addition unit 73b refers to the two-dimensional LUT 73a to specify the color difference emphasis amounts corresponding to the B/G and G/R ratios obtained by the signal ratio calculation unit 72.
- the weighting addition unit 73b calculates an emphasis coefficient f (R) to be multiplied by the color difference emphasis amount based on the R image signal (logR) after Log conversion.
- the R image signal includes more signals corresponding to the light reflected without being absorbed by the observation target, as compared with the B image signal and the G image signal of other colors. Therefore, the light amount of the reflected light can be grasped from the R image signal.
- the emphasis coefficient f(R) is set to “1” when the Log-converted R image signal falls within a predetermined range;
- otherwise, the emphasis coefficient f(R) is set smaller than “1”;
- the emphasis coefficient is set smaller the further the Log-converted R image signal exceeds the upper limit, or the further it falls below the lower limit. By thus reducing the enhancement coefficient for high-luminance portions where the R image signal exceeds the upper limit and for low-luminance portions below the lower limit, the color difference emphasis amount is reduced and halation and the like can be suppressed.
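A minimal sketch of such an emphasis coefficient follows. The text specifies only that f(R) is “1” inside a predetermined range and shrinks outside it; the limit values and the linear falloff shape here are our assumptions:

```python
# Sketch of the emphasis coefficient f(R) applied to the color difference
# emphasis amount, computed from the Log-converted R image signal.
def emphasis_coefficient(log_r, lower=-1.0, upper=0.0, falloff=1.0):
    if lower <= log_r <= upper:
        return 1.0  # normal-luminance pixels: full emphasis
    # The farther the signal exceeds the upper limit (halation) or falls
    # below the lower limit (dark portions), the smaller the coefficient.
    excess = (log_r - upper) if log_r > upper else (lower - log_r)
    return max(0.0, 1.0 - excess / falloff)
```

Pixels deep in halation or shadow thus receive a coefficient near 0, so their color difference emphasis is effectively disabled.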
- the weighted addition unit 73b performs weighted addition processing on the B / G ratio and the G / R ratio based on the specified color difference emphasis amount and the calculated emphasis coefficient.
- when the second-range color difference enhancement amounts Δ*(B/G ratio) and Δ*(G/R ratio) corresponding to the B/G and G/R ratios are specified from the two-dimensional LUT 73a, weighted addition processing according to the following equation (4) is performed, whereby the color-difference-enhanced B/G and G/R ratios of the second range are obtained.
- (4): color-difference-emphasized second-range B/G ratio, G/R ratio = second-range B/G ratio, G/R ratio + Δ*(B/G ratio), Δ*(G/R ratio) × f(R)
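Written per pixel, and by analogy with equation (6-1), the weighted addition step is simply the following (illustrative sketch; the function name is ours):

```python
# Sketch of the weighted addition processing: the LUT's color difference
# emphasis amounts, scaled by the emphasis coefficient f(R), are added to
# the pixel's signal ratios.
def weighted_addition(bg, gr, delta_bg, delta_gr, f_r):
    """Return the color-difference-emphasized (B/G ratio, G/R ratio)."""
    return bg + delta_bg * f_r, gr + delta_gr * f_r
```

With f_r = 0 (e.g. a halation pixel) the ratios pass through unchanged, which is how reducing the coefficient suppresses the emphasis.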
- as a result, the atrophic mucosal area is clearly represented in a color different from that of the normal mucous membrane area;
- a color-difference-emphasized image is displayed in which the color of the atrophic mucosal region is approximately equal to the faded color of the mucous membrane during atrophic gastritis. This makes it possible to reliably determine the boundary between the area of normal mucous membrane and the atrophic area.
- similarly, the deep blood vessel region is clearly displayed in a color different from that of the normal mucous membrane region;
- a color-emphasized image in which the deep-layer blood vessels can be reliably seen owing to their coloration is displayed. This makes it possible to reliably determine the boundary between the region of normal mucosa and the deep blood vessel region.
- This declination difference expansion process is particularly effective when the deep-layer blood vessel is not seen so much (for example, when atrophy is progressing and is included in the B and C groups in the ABC examination).
- when the fourth-range color difference emphasis amounts Δ***(B/G ratio) and Δ***(G/R ratio) corresponding to the B/G and G/R ratios are specified by the two-dimensional LUT 73a, weighted addition processing according to the following equation (6-1) is performed, whereby the B/G and G/R ratios of the fourth range in which the color difference is emphasized are obtained.
- (6-1): color-difference-emphasized fourth-range B/G ratio, G/R ratio = fourth-range B/G ratio, G/R ratio + Δ***(B/G ratio), Δ***(G/R ratio) × f(R)
- as a result, a color-difference-emphasized image in which the reddish area is clearly expressed in a color different from that of the normal mucous membrane area is displayed. This makes it possible to reliably determine the boundary between the normal mucous membrane area and the reddening area.
- the RGB conversion unit 74 reconverts the B/G and G/R ratios after color difference emphasis obtained by the color difference emphasis unit 73 into color-difference-emphasized RGB image signals. As a result, a color difference enhanced image composed of RGB image signals is obtained.
- the structure emphasizing unit 75 performs structure emphasis of the atrophied mucous membrane area, the deep blood vessel area, the BA area, and the red area based on the B / G ratio before color difference emphasis, the G / R ratio, and the color difference emphasized image. As shown in FIG. 11, the structure emphasizing unit 75 includes a frequency emphasizing unit 75a, a combining ratio setting unit 75b, and a combining unit 75c.
- the frequency emphasizing unit 75a obtains a plurality of frequency-weighted images by performing a plurality of frequency filtering (BPF (Band Pass Filtering)) on the RGB image signal of the color-weighted image.
- the frequency emphasizing unit 75a performs frequency filtering for the atrophied mucous membrane region that extracts a low-frequency first frequency component containing many atrophied mucous membrane regions, frequency filtering for the deep blood vessel region that extracts a middle-frequency second frequency component containing many deep blood vessel regions, frequency filtering for the BA region that extracts a low-frequency third frequency component containing many BA regions, and frequency filtering for the redness region that extracts a low-frequency fourth frequency component containing many reddish regions.
- the first frequency component-weighted image BPF1 (RGB) is obtained by performing frequency filtering for the atrophic mucosal region.
- the second frequency component enhanced image BPF2 (RGB) is obtained by performing frequency filtering for the deep blood vessel region.
- the third frequency component-weighted image BPF3 (RGB) is obtained by performing frequency filtering for the BA region.
- the fourth frequency component-weighted image BPF4 (RGB) is obtained by performing frequency filtering for the red area.
- the composition ratio setting unit 75b sets, for each pixel, composition ratios g1 (B/G ratio, G/R ratio), g2 (B/G ratio, G/R ratio), g3 (B/G ratio, G/R ratio), and g4 (B/G ratio, G/R ratio), indicating the ratios at which the first to fourth frequency component enhanced images BPF1 to 4 (RGB) are combined with the RGB image signal of the color difference enhanced image, based on the B/G and G/R ratios before color difference enhancement.
- for pixels whose B/G and G/R ratios fall within the second range, the composition ratio g1 (B/G ratio, G/R ratio) is set to “100%”, and the other composition ratios g2, g3, and g4 (B/G ratio, G/R ratio) are set to “0%”.
- for pixels within the third range, the composition ratio g2 (B/G ratio, G/R ratio) is set to “100%”, and the other composition ratios g1, g3, and g4 (B/G ratio, G/R ratio) are set to “0%”.
- for pixels within the fourth range, the composition ratio g3 (B/G ratio, G/R ratio) is set to “100%”, and the other composition ratios g1, g2, and g4 (B/G ratio, G/R ratio) are set to “0%”.
- for pixels within the fifth range, the composition ratio g4 (B/G ratio, G/R ratio) is set to “100%”, and the other composition ratios g1, g2, and g3 (B/G ratio, G/R ratio) are set to “0%”.
- on the other hand, for pixels whose B/G and G/R ratios do not fall within any of the second to fifth ranges, the composition ratios g1 to g4 (B/G ratio, G/R ratio) are all set to “0%”.
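The per-pixel rule above amounts to selecting at most one of g1 to g4. A sketch follows, where the in_second to in_fifth predicates are hypothetical placeholders for the range membership tests of the feature space of FIG. 6:

```python
# Sketch of the composition ratio setting for one pixel: the first predicate
# that matches gets a ratio of 100% (1.0); pixels matching none get all 0%.
def set_composition_ratios(bg, gr, in_second, in_third, in_fourth, in_fifth):
    g = [0.0, 0.0, 0.0, 0.0]  # g1..g4 as fractions of 1.0
    for i, pred in enumerate((in_second, in_third, in_fourth, in_fifth)):
        if pred(bg, gr):
            g[i] = 1.0
            break
    return g
```

Since the second to fifth ranges occupy distinct regions of the feature space, at most one predicate should match for any pixel; the break merely guards against ill-formed predicates.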
- the combining unit 75c combines, based on the following equation (7), the RGB image signals of the color difference enhanced image (color difference enhanced image (RGB)) with the first to fourth frequency component enhanced images BPF1 to 4 (RGB) at the combining ratios set for each pixel by the combining ratio setting unit 75b. Thereby, a color difference and structure-emphasized image (color difference and structure-emphasized image (RGB)) composed of RGB image signals is obtained.
- (7): color difference and structure-emphasized image (RGB) = color difference emphasized image (RGB) + BPF1 (RGB) × Gain1 (RGB) × g1 (B/G ratio, G/R ratio) + BPF2 (RGB) × Gain2 (RGB) × g2 (B/G ratio, G/R ratio) + BPF3 (RGB) × Gain3 (RGB) × g3 (B/G ratio, G/R ratio) + BPF4 (RGB) × Gain4 (RGB) × g4 (B/G ratio, G/R ratio)
- the Gains 1 to 4 (RGB) of equation (7) are determined in advance by the edge characteristics of the first to fourth frequency component-weighted images. For example, in the second frequency component-weighted image, which contains many deep blood vessels, and the third, which contains much BA, the deep blood vessels and BA have down-edges whose image values are less than “0”, so Gain2 (RGB) and Gain3 (RGB) are preferably set to negative values.
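For a single pixel and a single color channel, equation (7) reduces to a weighted sum of the four band-pass outputs, sketched here (function name ours):

```python
# Sketch of the synthesis of equation (7) for one pixel/channel value:
# cde  - the color-difference-emphasized value
# bpf  - the four frequency component values BPF1..BPF4 at this pixel
# gain - the predetermined Gain1..Gain4 (negative for down-edge components)
# g    - the per-pixel composition ratios g1..g4 (fractions of 1.0)
def synthesize(cde, bpf, gain, g):
    return cde + sum(b * k * w for b, k, w in zip(bpf, gain, g))
```

Because at most one of g1 to g4 is nonzero for a given pixel, only the frequency component matching that pixel's range contributes to the output.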
- when the B/G and G/R ratios of a pixel in the color difference enhanced image fall within the second range, the composition ratio setting unit 75b sets the composition ratio g1 (B/G ratio, G/R ratio) to “100%” and the composition ratios g2, g3, and g4 to “0%”, so the first frequency emphasis component is added only to the pixels within the second range.
- since the B/G and G/R ratios of the atrophied mucous membrane region are contained in large amounts in the second range, the atrophied mucous membrane can be structurally emphasized by the addition of the first frequency emphasis component.
- when the B/G and G/R ratios fall within the third range, the composition ratio setting unit 75b sets the composition ratio g2 (B/G ratio, G/R ratio) to “100%” and the composition ratios g1, g3, and g4 (B/G ratio, G/R ratio) to “0%”, so the second frequency emphasis component is added only to the pixels of the color difference enhanced image located within the third range.
- since the B/G and G/R ratios of the deep blood vessel region are included in large amounts in the third range, the deep blood vessel region can be structurally emphasized by the addition of the second frequency emphasis component.
- when the B/G and G/R ratios fall within the fourth range, the composition ratio setting unit 75b sets the composition ratio g3 (B/G ratio, G/R ratio) to “100%” and the composition ratios g1, g2, and g4 (B/G ratio, G/R ratio) to “0%”, so the third frequency emphasis component is added only to the pixels of the color difference enhanced image located within the fourth range.
- since the B/G and G/R ratios of the BA area are included in large amounts in the fourth range, the BA area can be structurally emphasized by the addition of the third frequency emphasis component.
- when the B/G and G/R ratios fall within the fifth range, the composition ratio setting unit 75b sets the composition ratio g4 (B/G ratio, G/R ratio) to “100%” and the composition ratios g1, g2, and g3 (B/G ratio, G/R ratio) to “0%”, so the fourth frequency emphasis component is added only to the pixels of the color difference enhanced image located within the fifth range.
- since the B/G and G/R ratios of the red area are included in large amounts in the fifth range, the red area can be structurally emphasized by the addition of the fourth frequency emphasis component.
- in this way, the combining ratio is set for each pixel based on the B/G and G/R ratios, and the frequency component emphasized images are combined with the color difference emphasized image at the combining ratio set for each pixel, which makes it possible to selectively emphasize the atrophic mucous membrane area, deep blood vessel area, BA, and redness. For example, if the first or third frequency component-weighted image were added to all the pixels of the color difference-weighted image regardless of the B/G or G/R ratio, both the atrophic mucosa and the BA would be emphasized, because the first and third frequency component images both emphasize low-frequency components.
- in contrast, when the first frequency component-emphasized image is added only to the pixels whose B/G and G/R ratios lie in the second range of the color difference-emphasized image, only the atrophic mucosa can be emphasized, without emphasizing the BA.
- likewise, when the third frequency component-weighted image is added only to pixels whose B/G and G/R ratios lie in the fourth range, only the BA can be emphasized, without emphasizing the atrophied mucosa.
- the inverse Log conversion unit 76 performs inverse log conversion on the RGB image signals of the color difference and structure-emphasized image. Thereby, an RGB image signal of a color difference and structure-emphasized image having true pixel values is obtained.
- the gamma conversion unit 77 performs gamma conversion on the RGB image signal of the color difference and structure-emphasized image having a true pixel value. Thereby, an RGB image signal of a color difference and structure-emphasized image having a gradation suitable for an output device such as the monitor 18 is obtained.
- the color difference and structure-emphasized image is sent to the video signal generation unit 66 as a special image.
- the normal observation mode is set, and the insertion portion 21 of the endoscope 12 is inserted into the sample.
- the tip 24 of the insertion portion 21 reaches the stomach, it is diagnosed whether atrophic gastritis has occurred.
- when a boundary (referred to as an endoscopic gland boundary) between the normal mucous membrane and the atrophied mucous membrane is observed, the doctor judges it as a pathological finding that a lesion such as gastric cancer due to atrophic gastritis is occurring (a judgment method according to the Kimura-Takemoto classification).
- Such gastric cancer is also known to occur due to atrophy of the gastric mucosa caused by H. pylori infection.
- to make the diagnosis more reliable, the mode switching SW 22b is operated to switch to the special observation mode.
- when the special observation mode is switched to, special light including both blue laser light and blue-violet laser light is emitted.
- the B / G ratio and the G / R ratio are calculated from the RGB image signal obtained at the time of the special light emission.
- the color difference emphasis amount corresponding to the calculated B / G ratio and G / R ratio is determined. Further, the enhancement coefficient f (R) is obtained from the R image signal. Then, the color difference emphasis amount obtained by multiplying the B / G ratio and G / R ratio by the emphasis coefficient f (R) is added to obtain the color difference emphasized B / G ratio and G / R ratio.
- the color difference emphasized B / G ratio and G / R ratio are converted into RGB image signals.
- by applying a plurality of frequency filters to the RGB image signals, first to fourth frequency-component-emphasized images are obtained. Further, the composition ratios g1 to g4 (B/G ratio, G/R ratio) are determined for each pixel from the B/G ratio and G/R ratio. Then, the first to fourth frequency-component-emphasized images, each multiplied by the predetermined gains Gain1 to Gain4 (RGB) and by the composition ratios g1 to g4 (B/G ratio, G/R ratio), are added to the color-difference-emphasized RGB image signals to obtain color difference and structure-emphasized RGB image signals. A special image is displayed on the monitor 18 based on these color difference and structure-emphasized RGB image signals.
- the mucous membrane is displayed in a normal color.
- the doctor concludes, as a normal finding, that no lesion such as gastric cancer due to atrophic gastritis has occurred.
- when the atrophy of the stomach has progressed somewhat, the atrophied mucous membrane is displayed in a faded color, the deep blood vessels are displayed seen through it, and both the atrophied mucous membrane and the deep blood vessels are structure-emphasized. This allows the endoscopic gland boundary to be displayed clearly.
- therefore, even when the atrophied mucous membrane is not displayed in a strongly discolored color and the deep blood vessels are only faintly visible, the doctor can judge, as a pathological finding, that a lesion such as gastric cancer due to atrophic gastritis is occurring.
- the structure emphasis process is performed on the color difference emphasized image in which the color difference between the normal part and the abnormal part is emphasized to generate the color difference and structure emphasized image, but the present invention is not limited to this.
- alternatively, structure enhancement processing may be performed directly, without performing color difference enhancement, to generate a structure-emphasized image.
- in this case, the special image processing unit 100 shown in FIG. 14 is used instead of the special image processing unit 64.
- the special image processing unit 100 is not provided with the color difference emphasizing unit 73 and the RGB conversion unit 74.
- a base image creation unit 101 is provided between the Log conversion unit 71 and the structure emphasizing unit 75.
- the base image creation unit 101 creates a base image based on the RGB image signal that has been Log-transformed by the Log transform unit 71. This base image is sent to the structure emphasizing unit 75. Further, the RGB image signal subjected to the Log conversion is sent to the signal ratio calculation unit 72. The signal ratio calculating unit 72 calculates the B / G ratio and the G / R ratio based on the RGB image signal, and sends the calculated B / G ratio and G / R ratio to the structure emphasizing unit 75.
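Since the signal ratio calculation unit 72 receives Log-transformed RGB image signals, the B/G and G/R ratios reduce to simple differences in log space. A minimal sketch, assuming linear RGB inputs and an epsilon guard against log(0) (the guard is an implementation assumption, not stated in the patent):

```python
import numpy as np

def signal_ratios_log(r, g, b, eps=1e-6):
    """Compute log-domain B/G and G/R ratios from linear RGB image signals.

    In log space a ratio becomes a difference:
    log(B/G) = log(B) - log(G), log(G/R) = log(G) - log(R).
    """
    r, g, b = (np.asarray(x, dtype=float) + eps for x in (r, g, b))
    return np.log(b) - np.log(g), np.log(g) - np.log(r)
```

This is one reason the Log conversion precedes the signal ratio calculation: multiplicative illumination changes become additive offsets that cancel in the differences.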
- the structure emphasizing unit 75 generates a structure-emphasized image in which the atrophied mucous membrane, the deep blood vessel, and the BA are selectively structure-emphasized based on the RGB image signal of the base image, the B / G ratio, and the G / R ratio.
- the B/G ratio and the G/R ratio are subjected to polar coordinate conversion, and the radial difference or the declination difference between the first-range average value after polar coordinate conversion and the signal values within the second to fifth ranges is expanded.
- by this expansion, the difference in color between the normal part and the abnormal part is emphasized.
- however, the color difference between the normal part and the abnormal part may also be emphasized using other coordinate conversion methods and color difference emphasis methods.
- with a color-difference-enhanced image obtained by the color difference emphasis method of the above embodiment, which expands the radial difference or the declination difference on polar coordinates, only the color of the abnormal part can be changed while hardly changing the color of the normal part, so the image does not cause discomfort.
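The radial/declination expansion described above can be sketched as follows, following Eqs. (1-1) and (2-1) in the Description: (rm, θm) is the first-range (normal mucosa) average in polar coordinates, and α, β ≥ 1 stretch the radial and angular distance from it. Taking the feature-space origin as the pole and the axis orientation (G/R horizontal, B/G vertical) are assumptions for illustration:

```python
import numpy as np

def expand_polar(bg, gr, rm, theta_m, alpha=1.0, beta=1.0):
    """Expand radial and angular differences from the first-range average.

    Er = (r - rm) * alpha + rm          (Eq. 1-1, alpha >= 1)
    Etheta = (theta - theta_m) * beta + theta_m   (Eq. 2-1, beta >= 1)
    """
    r = np.hypot(gr, bg)        # radius of the (G/R, B/G) feature point
    theta = np.arctan2(bg, gr)  # declination (angle from the G/R axis)
    r_e = (r - rm) * alpha + rm
    theta_e = (theta - theta_m) * beta + theta_m
    # back to Cartesian coordinates: expanded (B/G, G/R)
    return r_e * np.sin(theta_e), r_e * np.cos(theta_e)
```

With α = β = 1 the transform is the identity, and a point lying exactly at the first-range average is unchanged for any α, β; values greater than 1 push second- to fifth-range pixels away from the normal-range average.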
- moreover, since the colors of the atrophied mucous membrane region and the deep blood vessel region on the color-difference-emphasized image remain the same as the color of the mucous membrane in atrophic gastritis and the color of blood vessels seen through it, diagnosis can be performed in the same manner as established atrophic gastritis diagnosis (for example, ABC examination).
- in the above embodiment, the first-range average value is used for the color difference enhancement of the normal part and the abnormal part.
- instead, the average pixel value of the entire image signal may be used.
- the merit in this case is that the slight difference between the colors of the normal part and the abnormal part can be expanded according to the distribution of each area on the image.
- the phosphor 44 is provided at the distal end portion 24 of the endoscope 12.
- the phosphor 44 may be provided in the light source device 14. In this case, it is preferable to provide the phosphor 44 between the light guide 41 and the blue laser light source 34.
- the RGB image signals are simultaneously acquired by the color sensor, but in the second embodiment, the RGB image signals are sequentially acquired by the monochrome sensor.
- instead of the blue laser light source 34, the blue-violet laser light source 36, and the light source control unit 40, a broadband light source 202, a rotary filter 204, and a filter switching unit 205 are provided.
- the phosphor 44 is not provided.
- the imaging optical system 24b is provided with a monochrome sensor 206 that has no color filter. In other respects, the system is the same as the endoscope system 10 of the first embodiment.
- the broadband light source 202 is a xenon lamp, a white LED or the like, and emits white light whose wavelength range extends from blue to red.
- the rotary filter 204 includes a normal observation mode filter 208 provided on the inner side and a special observation mode filter 209 provided on the outer side (see FIG. 16).
- the filter switching unit 205 moves the rotary filter 204 in the radial direction, and inserts the normal observation mode filter 208 of the rotary filter 204 in the optical path of the white light when the normal observation mode is set by the mode switch SW 22 b.
- the special observation mode filter 209 of the rotation filter 204 is inserted into the optical path of the white light.
- the normal observation mode filter 208 includes, along the circumferential direction, a B filter 208a that transmits the blue component of the white light, a G filter 208b that transmits the green component, and an R filter 208c that transmits the red component. Therefore, in the normal observation mode, blue light, green light, and red light are alternately irradiated into the subject by rotating the rotary filter 204.
- the special observation mode filter 209 includes, along the circumferential direction, a Bn filter 209a that transmits blue narrow-band light with a central wavelength of 415 nm out of the white light, a G filter 209b that transmits the green component, and an R filter 209c that transmits the red component. Therefore, in the special observation mode, the blue narrow-band light, green light, and red light are alternately irradiated into the subject by rotating the rotary filter 204.
- in the endoscope system 200, in the normal observation mode, the inside of the subject is imaged by the monochrome sensor 206 each time blue light, green light, and red light are irradiated into the subject. Thereby, image signals of the three colors RGB are obtained. Then, based on the RGB image signals, a normal image is generated in the same manner as in the first embodiment.
- in the special observation mode, the inside of the subject is imaged by the monochrome sensor 206 each time the blue narrow-band light, green light, and red light are irradiated into the subject.
- thereby, a Bn image signal, a G image signal, and an R image signal are obtained.
- a special image is generated based on the Bn image signal, the G image signal, and the R image signal.
- the Bn image signal is used to generate the special image, which is otherwise generated in the same manner as in the first embodiment.
- in the first embodiment, the B image signal, a narrow-band signal containing the narrow-band wavelength information of the blue laser light and the blue-violet laser light, is used to create the special image, and in the second embodiment the Bn image signal, a narrow-band signal containing the narrow-band wavelength information of the blue narrow-band light, is used. In the third embodiment, by contrast, a blue narrow-band image signal is generated by spectral computation based on a broadband image such as a white image, and a special image is generated using this blue narrow-band image signal.
- a special image is generated in the same manner as in the first embodiment.
- as the white light, in addition to the white light obtained by the phosphor 44, broadband light emitted from a broadband light source such as a xenon lamp may be used.
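The spectral computation of the third embodiment, deriving a blue narrow-band image signal from a broadband (e.g. white-light) image, is typically a per-pixel linear estimate. A sketch under stated assumptions: the coefficient vector and the R, G, B channel ordering below are purely illustrative and are not taken from the patent.

```python
import numpy as np

def spectral_blue_narrowband(rgb, coeffs=(0.05, -0.15, 1.1)):
    """Estimate a blue narrow-band signal from broadband RGB:
    Bn ~ c0*R + c1*G + c2*B (illustrative coefficients).

    rgb: array of shape (..., 3) in R, G, B order (assumed layout)
    """
    c = np.asarray(coeffs, dtype=float)
    bn = np.asarray(rgb, dtype=float) @ c  # per-pixel dot product
    return np.clip(bn, 0.0, None)          # clip negative estimates to zero
```

In practice such coefficients would be calibrated against the sensor's spectral sensitivities; the point here is only that the "spectral operation" can be a fixed linear map applied per pixel.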
- in the above embodiments, an example in which the mucous membrane becomes discolored by atrophic gastritis and an example in which a deep blood vessel under the atrophied mucous membrane is seen through are shown.
- the mucous membrane may, however, become discolored for other reasons as well. According to the present invention, it is possible to emphasize the color difference from the normal part even for a faded mucous membrane other than such an atrophied mucous membrane.
- the present invention can likewise highlight the view of deep blood vessels below a discolored mucous membrane other than the atrophic mucous membrane.
- in the above embodiments, the B/G ratio and the G/R ratio are used as the plurality of pieces of color information, but color difference signals Cr and Cb may be used instead.
- in this case, color difference enhancement of the normal part and the abnormal part, or structure enhancement of the abnormal part, is performed in the CrCb space, which is a feature space formed from the color difference signals Cr and Cb.
- alternatively, the hue H and the saturation S may be used as the plurality of pieces of color information.
- in this case, color difference enhancement of the normal part and the abnormal part, or structure enhancement of the abnormal part, is performed in the HS space, which is a feature space formed from the hue H and the saturation S.
- the elements a* and b* of the color in the CIE Lab space may also be used.
- in this case, color difference enhancement of the normal part and the abnormal part, or structure enhancement of the abnormal part, is performed in the ab space, which is a feature space formed from the elements a* and b*.
- in the above embodiments, the present invention is carried out during endoscopic diagnosis, but the present invention is not limited to this.
- the present invention may also be carried out based on an endoscopic image recorded in the recording unit of the endoscope system, or based on a capsule endoscope image acquired by a capsule endoscope.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Gastroenterology & Hepatology (AREA)
- Quality & Reliability (AREA)
- Astronomy & Astrophysics (AREA)
- Multimedia (AREA)
- Endocrinology (AREA)
- Physiology (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
- Image Processing (AREA)
Abstract
Description
(A) The muscularis mucosae, whose color is close to white, becomes visible through the mucosa, so the color of the atrophied mucosa becomes more faded than that of the normal part.
(B) In a region with atrophied mucosa, as the mucosal layer becomes thinner with the progression of atrophy, the blood vessels of the submucosa become visible through it (see FIG. 19(B)).
Therefore, in diagnosing gastric lesions based on atrophic gastritis, the above two features (A) and (B) are used to judge the degree of progression of the atrophy and to discriminate the boundary between the normal part and the gastritis part.
As shown in FIG. 1, the endoscope system 10 of the first embodiment includes an endoscope 12, a light source device 14, a processor device 16, a monitor 18, and a console 20. The endoscope 12 is optically connected to the light source device 14 and electrically connected to the processor device 16 via a universal cord 13. The endoscope 12 has an insertion portion 21 to be inserted into a subject, an operation portion 22 provided at the proximal end of the insertion portion, and a bending portion 23 and a distal end portion 24 provided on the distal side of the insertion portion 21. By operating the angle knob 22a of the operation portion 22, the bending portion 23 bends, and with this bending operation the distal end portion 24 is directed in a desired direction.
(1-1): Eri = (ri - rm)·α + rm (where α ≥ 1)
Δ*(B/G ratio) = |B/G ratio of the second range after radial-difference expansion - B/G ratio of the second range before radial-difference expansion| ... (1-2)
Δ*(G/R ratio) = |G/R ratio of the second range after radial-difference expansion - G/R ratio of the second range before radial-difference expansion| ... (1-3)
(2-1): Eθv = (θv - θm)·β + θm (where β ≥ 1)
Δ**(B/G ratio) = |B/G ratio of the third range after declination-difference expansion - B/G ratio of the third range before declination-difference expansion| ... (2-2)
Δ**(G/R ratio) = |G/R ratio of the third range after declination-difference expansion - G/R ratio of the third range before declination-difference expansion| ... (2-3)
(3-1): Erk = (rk - rm)·α + rm (where α ≥ 1)
(3-2): Eθk = (θk - θm)·β + θm (where β ≥ 1)
Δ***(B/G ratio) = |B/G ratio of the fourth range after radial- and declination-difference expansion - B/G ratio of the fourth range before radial- and declination-difference expansion| ... (3-3)
Δ***(G/R ratio) = |G/R ratio of the fourth range after radial- and declination-difference expansion - G/R ratio of the fourth range before radial- and declination-difference expansion| ... (3-4)
(4-1): Erp = (rp - rm)·α + rm (where α ≥ 1)
(4-2): Eθp = (θp - θm)·β + θm (where β ≥ 1)
Δ*4(B/G ratio) = |B/G ratio of the fifth range after radial- and declination-difference expansion - B/G ratio of the fifth range before radial- and declination-difference expansion| ... (4-3)
Δ*4(G/R ratio) = |G/R ratio of the fifth range after radial- and declination-difference expansion - G/R ratio of the fifth range before radial- and declination-difference expansion| ... (4-4)
(4): Color-difference-emphasized B/G ratio, G/R ratio of the second range =
B/G ratio, G/R ratio of the second range + Δ*(B/G ratio), Δ*(G/R ratio) × f(R)
(5): Color-difference-emphasized B/G ratio, G/R ratio of the third range =
B/G ratio, G/R ratio of the third range + Δ**(B/G ratio), Δ**(G/R ratio) × f(R)
(6-1): Color-difference-emphasized B/G ratio, G/R ratio of the fourth range =
B/G ratio, G/R ratio of the fourth range + Δ***(B/G ratio), Δ***(G/R ratio) × f(R)
(6-2): Color-difference-emphasized B/G ratio, G/R ratio of the fifth range =
B/G ratio, G/R ratio of the fifth range + Δ*4(B/G ratio), Δ*4(G/R ratio) × f(R)
(7): Color difference and structure-emphasized image (RGB) = color-difference-emphasized image (RGB)
+ BPF1(RGB) × Gain1(RGB) × g1(B/G ratio, G/R ratio)
+ BPF2(RGB) × Gain2(RGB) × g2(B/G ratio, G/R ratio)
+ BPF3(RGB) × Gain3(RGB) × g3(B/G ratio, G/R ratio)
+ BPF4(RGB) × Gain4(RGB) × g4(B/G ratio, G/R ratio)
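Equation (7) above can be sketched directly: each band-pass-filtered image is weighted by a fixed gain and a per-pixel composition ratio, then added to the color-difference-emphasized image. Array shapes and value ranges are assumptions for illustration:

```python
import numpy as np

def synthesize_structure_emphasis(cde_img, bpf_imgs, gains, g_ratios):
    """Eq. (7): combine frequency components with the color-difference image.

    cde_img: color-difference-emphasized image, shape (H, W) or (H, W, 3)
    bpf_imgs: band-pass-filtered images BPF1..BPF4, same shape as cde_img
    gains: scalar gains Gain1..Gain4
    g_ratios: per-pixel composition ratio maps g1..g4 (values in 0..1)
    """
    out = np.asarray(cde_img, dtype=float).copy()
    for bpf_i, gain_i, g_i in zip(bpf_imgs, gains, g_ratios):
        out += np.asarray(bpf_i, dtype=float) * gain_i * g_i
    return out
```

A pixel whose composition ratios are all 0 (first range, normal mucosa) is left untouched, while a pixel with g_i = 1 receives the full gain-weighted frequency component, which is how the abnormal part is selectively structure-emphasized.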
In the first embodiment described above, the RGB image signals are acquired simultaneously with a color sensor, but in the second embodiment the RGB image signals are acquired sequentially with a monochrome sensor. As shown in FIG. 15, the light source device 14 of the endoscope system 200 of the second embodiment is provided with a broadband light source 202, a rotary filter 204, and a filter switching unit 205 instead of the blue laser light source 34, the blue-violet laser light source 36, and the light source control unit 40. The illumination optical system 24a of the endoscope 12 is not provided with the phosphor 44. The imaging optical system 24b is provided with a monochrome sensor 206 without a color filter instead of the color sensor 48. In other respects, the system is the same as the endoscope system 10 of the first embodiment.
In the endoscope system 10 of the first embodiment, the B image signal, a narrow-band signal containing the narrow-band wavelength information of the blue laser light and the blue-violet laser light, is used to create the special image, and in the endoscope system 200 of the second embodiment, the Bn image signal, a narrow-band signal containing the narrow-band wavelength information of the blue narrow-band light, is used. In the third embodiment, a blue narrow-band image signal is generated by spectral computation based on a broadband image such as a white image, and a special image is generated using this blue narrow-band image signal.
48, 206 Sensor
72 Signal ratio calculation unit
73 Color difference emphasizing unit
75a Frequency emphasizing unit
75b Composition ratio setting unit
75c Synthesizing unit
101 Base image creation unit
Claims (18)
- 1. An image processing device comprising:
an image signal input unit that inputs a color image signal;
a base image creation unit that creates a base image based on the color image signal;
a color information acquisition unit that acquires a plurality of pieces of color information from the color image signal;
a frequency emphasizing unit that generates, based on the base image, a frequency-component-emphasized image in which a frequency component corresponding to an abnormal part different from a normal part is emphasized;
a composition ratio setting unit that sets, based on the plurality of pieces of color information, a composition ratio indicating the proportion at which the frequency-component-emphasized image is combined with the base image; and
a synthesizing unit that combines the frequency-component-emphasized image with the base image at the composition ratio set by the composition ratio setting unit to generate a structure-emphasized image in which the abnormal part is structure-emphasized.
- 2. The image processing device according to claim 1, wherein the color image signal consists of image signals of three colors, and the color information acquisition unit is a signal ratio calculation unit that calculates, as the plurality of pieces of color information, a first signal ratio between the image signals of two of the three colors and a second signal ratio between the image signals of two colors different from the first signal ratio.
- 3. The image processing device according to claim 2, wherein the composition ratio setting unit sets the composition ratio to "0%" for pixels whose first and second signal ratios calculated by the signal ratio calculation unit fall within a first range containing the first and second signal ratios of the normal part, and sets the composition ratio to "100%" for pixels whose first and second signal ratios calculated by the signal ratio calculation unit fall within a specific range containing the first and second signal ratios of the abnormal part.
- 4. The image processing device according to claim 3, wherein the abnormal part is a faded mucous membrane, the specific range is a second range containing the first and second signal ratios of the faded mucous membrane, the first range and the second range have the same hue, and the second range has lower saturation than the first range.
- 5. The image processing device according to claim 3, wherein the abnormal part is a blood vessel region having blood vessels that become visible through a faded mucous membrane as a lesion progresses, the specific range is a third range containing the first and second signal ratios of the blood vessel region, and the first range and the third range have the same saturation and different hues.
- 6. The image processing device according to claim 3, wherein the abnormal part is a brownish area, and the specific range is a fourth range containing the first and second signal ratios of the brownish area.
- 7. The image processing device according to claim 3, wherein the abnormal part is redness, and the specific range is a fifth range containing the first and second signal ratios of the redness.
- 8. The image processing device according to any one of claims 2 to 7, wherein the image signals of three colors are RGB image signals, the first signal ratio is the B/G ratio between the B image signal and the G image signal, and the second signal ratio is the G/R ratio between the G image signal and the R image signal.
- 9. An image processing device comprising:
an image signal input unit that inputs a color image signal;
a color information acquisition unit that acquires a plurality of pieces of color information from the color image signal;
a color difference emphasizing unit that performs, in a feature space formed by the plurality of pieces of color information, expansion processing for expanding the difference between a first range and a specific range different from the first range, to generate a color-difference-emphasized image in which the color difference between a normal part and an abnormal part is emphasized;
a frequency emphasizing unit that generates, based on the color-difference-emphasized image, a frequency-component-emphasized image in which a frequency component corresponding to the abnormal part is emphasized;
a composition ratio setting unit that sets, based on the plurality of pieces of color information, a composition ratio indicating the proportion at which the frequency-component-emphasized image is combined with the color-difference-emphasized image; and
a synthesizing unit that combines the frequency-component-emphasized image with the color-difference-emphasized image at the composition ratio set by the composition ratio setting unit to generate a color difference and structure-emphasized image in which the color difference is emphasized and the abnormal part is structure-emphasized.
- 10. The image processing device according to claim 9, wherein the color image signal consists of image signals of three colors, and the color information acquisition unit is a signal ratio calculation unit that calculates, as the plurality of pieces of color information, a first signal ratio between the image signals of two of the three colors and a second signal ratio between the image signals of two colors different from the first signal ratio.
- 11. The image processing device according to claim 10, wherein the composition ratio setting unit sets the composition ratio to "0%" for pixels whose first and second signal ratios calculated by the signal ratio calculation unit fall within the first range, and sets the composition ratio to "100%" for pixels whose first and second signal ratios calculated by the signal ratio calculation unit fall within the specific range.
- 12. The image processing device according to claim 11, wherein the abnormal part is a faded mucous membrane, the specific range is a second range containing the first and second signal ratios of the faded mucous membrane, the first range and the second range have the same hue, and the second range has lower saturation than the first range.
- 13. The image processing device according to claim 11, wherein the abnormal part is a blood vessel region having blood vessels that become visible through a faded mucous membrane as a lesion progresses, the specific range is a third range containing the first and second signal ratios of the blood vessel region, and the first range and the third range have the same saturation and different hues.
- 14. The image processing device according to claim 11, wherein the abnormal part is a brownish area, and the specific range is a fourth range containing the first and second signal ratios of the brownish area.
- 15. The image processing device according to claim 11, wherein the abnormal part is redness, and the specific range is a fifth range containing the first and second signal ratios of the redness.
- 16. The image processing device according to any one of claims 10 to 15, wherein the image signals of three colors are RGB image signals, the first signal ratio is the B/G ratio between the B image signal and the G image signal, and the second signal ratio is the G/R ratio between the G image signal and the R image signal.
- 17. A method of operating an image processing device, comprising:
a step in which an image signal input unit inputs a color image signal;
a step in which a base image creation unit creates a base image based on the color image signal;
a step in which a color information acquisition unit acquires a plurality of pieces of color information from the color image signal;
a step in which a frequency emphasizing unit generates, based on the base image, a frequency-component-emphasized image in which a frequency component corresponding to an abnormal part different from a normal part is emphasized;
a step in which a composition ratio setting unit sets, based on the plurality of pieces of color information, a composition ratio indicating the proportion at which the frequency-component-emphasized image is combined with the base image; and
a step in which a synthesizing unit combines the frequency-component-emphasized image with the base image at the composition ratio set by the composition ratio setting unit to generate a structure-emphasized image in which the abnormal part is structure-emphasized.
- 18. A method of operating an image processing device, comprising:
a step in which an image signal input unit inputs a color image signal;
a step in which a color information acquisition unit acquires a plurality of pieces of color information from the color image signal;
a step in which a color difference emphasizing unit performs, in a feature space formed by the plurality of pieces of color information, expansion processing for expanding the difference between a first range and a specific range different from the first range, to generate a color-difference-emphasized image in which the color difference between a normal part and an abnormal part is emphasized;
a step in which a frequency emphasizing unit generates, based on the color-difference-emphasized image, a frequency-component-emphasized image in which a frequency component corresponding to the abnormal part is emphasized;
a step in which a composition ratio setting unit sets, based on the plurality of pieces of color information, a composition ratio indicating the proportion at which the frequency-component-emphasized image is combined with the color-difference-emphasized image; and
a step in which a synthesizing unit combines the frequency-component-emphasized image with the color-difference-emphasized image at the composition ratio set by the composition ratio setting unit to generate a color difference and structure-emphasized image in which the color difference is emphasized and the abnormal part is structure-emphasized.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015544939A JP6151372B2 (ja) | 2013-10-28 | 2014-10-22 | 画像処理装置及びその作動方法 |
CN201480059325.5A CN105705075B (zh) | 2013-10-28 | 2014-10-22 | 图像处理装置及其工作方法 |
EP14858260.4A EP3072439B1 (en) | 2013-10-28 | 2014-10-22 | Image processing device and operation method therefor |
US15/139,940 US9582878B2 (en) | 2013-10-28 | 2016-04-27 | Image processing device and operation method therefor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-222964 | 2013-10-28 | ||
JP2013222964 | 2013-10-28 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/139,940 Continuation US9582878B2 (en) | 2013-10-28 | 2016-04-27 | Image processing device and operation method therefor |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015064435A1 true WO2015064435A1 (ja) | 2015-05-07 |
Family
ID=53004043
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/078046 WO2015064435A1 (ja) | 2013-10-28 | 2014-10-22 | 画像処理装置及びその作動方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9582878B2 (ja) |
EP (1) | EP3072439B1 (ja) |
JP (2) | JP6151372B2 (ja) |
CN (2) | CN108109134B (ja) |
WO (1) | WO2015064435A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017023620A (ja) * | 2015-07-28 | 2017-02-02 | 富士フイルム株式会社 | 画像処理装置及び内視鏡システム並びに画像処理装置の作動方法 |
CN106687023A (zh) * | 2015-08-13 | 2017-05-17 | Hoya株式会社 | 评价值计算装置以及电子内窥镜系统 |
JP2017158929A (ja) * | 2016-03-11 | 2017-09-14 | 富士フイルム株式会社 | 画像処理装置、画像処理装置の作動方法、および画像処理プログラム |
CN109068945A (zh) * | 2016-03-29 | 2018-12-21 | 富士胶片株式会社 | 图像处理装置、图像处理装置的工作方法及图像处理程序 |
WO2019039354A1 (ja) * | 2017-08-23 | 2019-02-28 | 富士フイルム株式会社 | 光源装置及び内視鏡システム |
JPWO2018163644A1 (ja) * | 2017-03-07 | 2020-01-09 | ソニー株式会社 | 情報処理装置、支援システム及び情報処理方法 |
JPWO2018230396A1 (ja) * | 2017-06-15 | 2020-04-09 | 富士フイルム株式会社 | 医用画像処理装置及び内視鏡システム並びに医用画像処理装置の作動方法 |
JP2020078647A (ja) * | 2020-02-27 | 2020-05-28 | 富士フイルム株式会社 | 画像処理装置、画像処理装置の作動方法、および画像処理プログラム |
EP3442397B1 (en) * | 2016-04-13 | 2021-06-09 | Inspektor Research Systems B.V. | Bi-frequency dental examination |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106163367B (zh) * | 2014-03-31 | 2018-05-08 | 富士胶片株式会社 | 医用图像处理装置及其工作方法以及内窥镜系统 |
USD841159S1 (en) * | 2015-06-12 | 2019-02-19 | Fujifilm Corporation | Operating grip for endoscope |
CN107072509B (zh) * | 2015-09-30 | 2019-03-19 | Hoya株式会社 | 内窥镜系统以及评价值计算装置 |
JP6883880B2 (ja) | 2016-05-17 | 2021-06-09 | リバウンド セラピュティクス コーポレーションRebound Therapeutics Corporation | 脳内血腫の血液塊を特定する色検出のための方法および装置 |
TWI616180B (zh) * | 2016-06-29 | 2018-03-01 | 國立成功大學 | 上消化道出血偵測裝置及方法 |
US10646102B2 (en) * | 2016-08-31 | 2020-05-12 | Hoya Corporation | Processor for electronic endoscope, and electronic endoscope system |
JP6392486B1 (ja) * | 2017-02-01 | 2018-09-19 | オリンパス株式会社 | 内視鏡システム |
WO2018155560A1 (ja) * | 2017-02-24 | 2018-08-30 | 富士フイルム株式会社 | 内視鏡システム、プロセッサ装置、及び、内視鏡システムの作動方法 |
CN110381806B (zh) * | 2017-03-31 | 2021-12-10 | Hoya株式会社 | 电子内窥镜系统 |
EP3620098B1 (en) * | 2018-09-07 | 2021-11-03 | Ambu A/S | Enhancing the visibility of blood vessels in colour images |
US11109741B1 (en) | 2020-02-21 | 2021-09-07 | Ambu A/S | Video processing apparatus |
US10835106B1 (en) | 2020-02-21 | 2020-11-17 | Ambu A/S | Portable monitor |
US11166622B2 (en) | 2020-02-21 | 2021-11-09 | Ambu A/S | Video processing apparatus |
US10980397B1 (en) * | 2020-02-21 | 2021-04-20 | Ambu A/S | Video processing device |
CN111345831B (zh) * | 2020-03-06 | 2023-02-28 | 重庆金山医疗技术研究院有限公司 | 一种消化道生理数据显示方法及系统 |
US11328390B2 (en) | 2020-05-13 | 2022-05-10 | Ambu A/S | Method for adaptive denoising and sharpening and visualization systems implementing the method |
CN113344860B (zh) * | 2021-05-17 | 2022-04-29 | 武汉大学 | 胃黏膜染色放大图像微结构的异常程度量化方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06335451A (ja) * | 1993-03-19 | 1994-12-06 | Olympus Optical Co Ltd | 内視鏡用画像処理装置 |
JP2000148987A (ja) * | 1998-11-17 | 2000-05-30 | Olympus Optical Co Ltd | 画像処理装置 |
JP2003093336A (ja) | 2001-09-26 | 2003-04-02 | Toshiba Corp | 電子内視鏡装置 |
JP2012071012A (ja) | 2010-09-29 | 2012-04-12 | Fujifilm Corp | 内視鏡装置 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63173182A (ja) * | 1987-01-13 | 1988-07-16 | Olympus Optical Co Ltd | 色彩画像処理方式 |
US5550582A (en) * | 1993-03-19 | 1996-08-27 | Olympus Optical Co., Ltd. | Endoscope-image processing apparatus for performing image processing of emphasis in endoscope image by pigment concentration distribution |
JPH08262692A (ja) * | 1995-03-23 | 1996-10-11 | Dainippon Screen Mfg Co Ltd | 色分解方法および装置 |
US7328059B2 (en) * | 1996-08-23 | 2008-02-05 | The Texas A & M University System | Imaging of light scattering tissues with fluorescent contrast agents |
US6422994B1 (en) * | 1997-09-24 | 2002-07-23 | Olympus Optical Co., Ltd. | Fluorescent diagnostic system and method providing color discrimination enhancement |
US7340079B2 (en) * | 2002-09-13 | 2008-03-04 | Sony Corporation | Image recognition apparatus, image recognition processing method, and image recognition program |
JP4554647B2 (ja) * | 2003-04-25 | 2010-09-29 | オリンパス株式会社 | 画像表示装置、画像表示方法および画像表示プログラム |
JP4388318B2 (ja) * | 2003-06-27 | 2009-12-24 | オリンパス株式会社 | 画像処理装置 |
JP2007060457A (ja) * | 2005-08-26 | 2007-03-08 | Hitachi Ltd | 画像信号処理装置および画像信号処理方法 |
JP5057675B2 (ja) * | 2006-03-03 | 2012-10-24 | オリンパスメディカルシステムズ株式会社 | 生体観察装置 |
JP4855868B2 (ja) * | 2006-08-24 | 2012-01-18 | オリンパスメディカルシステムズ株式会社 | 医療用画像処理装置 |
CN100565582C (zh) * | 2006-12-31 | 2009-12-02 | 四川大学华西医院 | 一种筛选食管微小病变的图像处理方法 |
JP5100596B2 (ja) * | 2008-10-03 | 2012-12-19 | キヤノン株式会社 | 情報処理装置及び情報処理方法 |
JP5305850B2 (ja) * | 2008-11-14 | 2013-10-02 | オリンパス株式会社 | 画像処理装置、画像処理プログラムおよび画像処理方法 |
JP5541914B2 (ja) * | 2009-12-28 | 2014-07-09 | オリンパス株式会社 | 画像処理装置、電子機器、プログラム及び内視鏡装置の作動方法 |
JP5335017B2 (ja) * | 2011-02-24 | 2013-11-06 | 富士フイルム株式会社 | 内視鏡装置 |
JP5451802B2 (ja) * | 2011-04-01 | 2014-03-26 | 富士フイルム株式会社 | 電子内視鏡システム及び電子内視鏡システムの校正方法 |
JP5667917B2 (ja) * | 2011-04-01 | 2015-02-12 | 富士フイルム株式会社 | 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法 |
CN103098480B (zh) * | 2011-08-25 | 2018-08-28 | 松下电器(美国)知识产权公司 | 图像处理装置及方法、三维摄像装置 |
JP5980555B2 (ja) * | 2012-04-23 | 2016-08-31 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム |
- 2014-10-22 CN CN201711274536.2A patent/CN108109134B/zh active Active
- 2014-10-22 WO PCT/JP2014/078046 patent/WO2015064435A1/ja active Application Filing
- 2014-10-22 CN CN201480059325.5A patent/CN105705075B/zh active Active
- 2014-10-22 EP EP14858260.4A patent/EP3072439B1/en not_active Not-in-force
- 2014-10-22 JP JP2015544939A patent/JP6151372B2/ja active Active
- 2016-04-27 US US15/139,940 patent/US9582878B2/en active Active
- 2017-05-24 JP JP2017102318A patent/JP6427625B2/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06335451A (ja) * | 1993-03-19 | 1994-12-06 | Olympus Optical Co Ltd | 内視鏡用画像処理装置 |
JP3228627B2 (ja) | 1993-03-19 | 2001-11-12 | オリンパス光学工業株式会社 | 内視鏡用画像処理装置 |
JP2000148987A (ja) * | 1998-11-17 | 2000-05-30 | Olympus Optical Co Ltd | 画像処理装置 |
JP2003093336A (ja) | 2001-09-26 | 2003-04-02 | Toshiba Corp | 電子内視鏡装置 |
JP2012071012A (ja) | 2010-09-29 | 2012-04-12 | Fujifilm Corp | 内視鏡装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3072439A4 |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017023620A (ja) * | 2015-07-28 | 2017-02-02 | 富士フイルム株式会社 | 画像処理装置及び内視鏡システム並びに画像処理装置の作動方法 |
CN106388756A (zh) * | 2015-07-28 | 2017-02-15 | 富士胶片株式会社 | 图像处理装置及其工作方法以及内窥镜系统 |
CN106687023A (zh) * | 2015-08-13 | 2017-05-17 | Hoya株式会社 | 评价值计算装置以及电子内窥镜系统 |
JP2017158929A (ja) * | 2016-03-11 | 2017-09-14 | 富士フイルム株式会社 | 画像処理装置、画像処理装置の作動方法、および画像処理プログラム |
CN109068945B (zh) * | 2016-03-29 | 2020-12-01 | 富士胶片株式会社 | 图像处理装置、图像处理装置的工作方法及记录介质 |
US10779714B2 (en) | 2016-03-29 | 2020-09-22 | Fujifilm Corporation | Image processing apparatus, method for operating image processing apparatus, and image processing program |
CN109068945A (zh) * | 2016-03-29 | 2018-12-21 | 富士胶片株式会社 | 图像处理装置、图像处理装置的工作方法及图像处理程序 |
EP3442397B1 (en) * | 2016-04-13 | 2021-06-09 | Inspektor Research Systems B.V. | Bi-frequency dental examination |
US11123150B2 (en) | 2017-03-07 | 2021-09-21 | Sony Corporation | Information processing apparatus, assistance system, and information processing method |
JPWO2018163644A1 (ja) * | 2017-03-07 | 2020-01-09 | ソニー株式会社 | 情報処理装置、支援システム及び情報処理方法 |
JP7095679B2 (ja) | 2017-03-07 | 2022-07-05 | ソニーグループ株式会社 | 情報処理装置、支援システム及び情報処理方法 |
JPWO2018230396A1 (ja) * | 2017-06-15 | 2020-04-09 | 富士フイルム株式会社 | 医用画像処理装置及び内視鏡システム並びに医用画像処理装置の作動方法 |
JP7051845B2 (ja) | 2017-06-15 | 2022-04-11 | 富士フイルム株式会社 | 医用画像処理装置及び内視鏡システム並びに医用画像処理装置の作動方法 |
US11328415B2 (en) | 2017-06-15 | 2022-05-10 | Fujifilm Corporation | Medical image processing device, endoscope system, and method of operating medical image processing device |
JPWO2019039354A1 (ja) * | 2017-08-23 | 2020-07-16 | 富士フイルム株式会社 | 光源装置及び内視鏡システム |
CN111050629A (zh) * | 2017-08-23 | 2020-04-21 | 富士胶片株式会社 | 光源装置及内窥镜系统 |
WO2019039354A1 (ja) * | 2017-08-23 | 2019-02-28 | 富士フイルム株式会社 | 光源装置及び内視鏡システム |
CN111050629B (zh) * | 2017-08-23 | 2022-09-06 | 富士胶片株式会社 | 内窥镜系统 |
JP2020078647A (ja) * | 2020-02-27 | 2020-05-28 | 富士フイルム株式会社 | 画像処理装置、画像処理装置の作動方法、および画像処理プログラム |
Also Published As
Publication number | Publication date |
---|---|
EP3072439B1 (en) | 2018-09-05 |
US20160239965A1 (en) | 2016-08-18 |
CN105705075B (zh) | 2018-02-02 |
JP2017176856A (ja) | 2017-10-05 |
CN108109134B (zh) | 2021-12-03 |
JP6151372B2 (ja) | 2017-06-21 |
JP6427625B2 (ja) | 2018-11-21 |
EP3072439A4 (en) | 2017-07-05 |
EP3072439A1 (en) | 2016-09-28 |
JPWO2015064435A1 (ja) | 2017-03-09 |
CN105705075A (zh) | 2016-06-22 |
CN108109134A (zh) | 2018-06-01 |
US9582878B2 (en) | 2017-02-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6427625B2 (ja) | 画像処理装置及びその作動方法 | |
US10779714B2 (en) | Image processing apparatus, method for operating image processing apparatus, and image processing program | |
JP5647752B1 (ja) | 画像処理装置及び内視鏡システムの作動方法 | |
JP5662623B1 (ja) | 画像処理装置及び内視鏡システムの作動方法 | |
JP5789280B2 (ja) | プロセッサ装置、内視鏡システム、及び内視鏡システムの作動方法 | |
US10003774B2 (en) | Image processing device and method for operating endoscope system | |
WO2019039354A1 (ja) | 光源装置及び内視鏡システム | |
JP6054806B2 (ja) | 画像処理装置及び内視鏡システムの作動方法 | |
US20190246874A1 (en) | Processor device, endoscope system, and method of operating processor device | |
JP6669539B2 (ja) | 画像処理装置、画像処理装置の作動方法、および画像処理プログラム | |
JP6855610B2 (ja) | 画像処理装置、画像処理装置の作動方法、および画像処理プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14858260 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015544939 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2014858260 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014858260 Country of ref document: EP |