WO2015025620A1 - Endoscope system, processor device and operating method - Google Patents


Info

Publication number
WO2015025620A1
Authority
WO
WIPO (PCT)
Prior art keywords
oxygen saturation
image
image signal
frequency component
specimen
Prior art date
Application number
PCT/JP2014/067499
Other languages
English (en)
Japanese (ja)
Inventor
加來 俊彦
Original Assignee
富士フイルム株式会社
Priority date
Filing date
Publication date
Application filed by 富士フイルム株式会社
Publication of WO2015025620A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/145: Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B 5/1455: ... using optical sensors, e.g. spectral photometrical oximeters
    • A61B 5/1459: ... invasive, e.g. introduced into the body by a catheter
    • A61B 5/14551: ... for measuring blood gases
    • A61B 5/14556: ... for measuring blood gases by fluorescence
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002: Operational features of endoscopes
    • A61B 1/00004: ... characterised by electronic signal processing
    • A61B 1/00009: ... of image signals during a use of the endoscope
    • A61B 1/000094: ... extracting biological structures
    • A61B 1/00043: ... provided with output arrangements
    • A61B 1/00045: Display arrangement
    • A61B 1/0005: Display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B 1/04: ... combined with photographic or television appliances
    • A61B 1/043: ... for fluorescence imaging
    • A61B 1/06: ... with illuminating arrangements
    • A61B 1/063: ... for monochromatic or narrow-band illumination
    • A61B 1/0638: ... providing two or more wavelengths
    • A61B 1/0646: ... with illumination filters
    • A61B 1/0653: ... with wavelength conversion
    • A61B 1/0661: Endoscope light sources
    • A61B 1/0684: Endoscope light sources using light emitting diodes [LED]

Definitions

  • The present invention relates to an endoscope system, a processor device, and an operating method for acquiring biological function information relating to the oxygen saturation of blood hemoglobin from an image signal obtained by imaging the inside of a specimen.
  • Diagnosis is generally performed using an endoscope system that includes a light source device, an endoscope, and a processor device.
  • Lesions are also being diagnosed using biological function information, in particular the oxygen saturation of blood hemoglobin.
  • As a method of obtaining the oxygen saturation, a method is known in which first signal light and second signal light, which have different wavelength bands and for which the absorption coefficients of oxyhemoglobin and deoxyhemoglobin differ, are alternately applied to the blood vessels in the mucous membrane, and the reflected light of each signal light from the blood vessels is detected by a sensor at the distal end of the endoscope (Patent Document 1).
  • The ratio of the first signal light image signal, which corresponds to the reflected light of the first signal light detected by the sensor, to the second signal light image signal, which corresponds to the reflected light of the second signal light (hereinafter, the signal ratio), stays constant as long as the oxygen saturation in the blood vessel does not change, and changes when the oxygen saturation changes. The oxygen saturation can therefore be calculated from the signal ratio between the first signal light image signal and the second signal light image signal.
  • The calculation of the oxygen saturation is based on the premise that the specimen is irradiated uniformly with the first and second signal lights, so if the irradiation is non-uniform, the reliability of the calculated oxygen saturation is reduced. In an endoscope system that acquires the oxygen saturation, the irradiation range, light amount distribution, and other properties of the first and second signal lights are therefore strictly adjusted in advance so that the specimen is irradiated almost uniformly.
  • Unlike a general digital camera, an endoscope system does not face an indefinite subject: the subject (the specimen) and the distance to it are limited, so the illumination range, light intensity distribution, and the like can be precisely adjusted in advance and maintained. For this reason, it is usually rare in an endoscope system for part of the specimen to become difficult to observe because of uneven illumination, as can happen with a digital camera.
  • Nevertheless, if the specimen is not irradiated uniformly with the first and second signal lights for some reason, a large error that is not caused by the properties of the specimen (hereinafter, an artifact) may appear in the calculated oxygen saturation.
  • For example, during magnified observation, in which the distal end of the endoscope is brought very close to the specimen or the specimen is magnified by operating the zoom lens, low oxygen regions and high oxygen regions that were not present during non-magnified observation may appear. That is, at the time of magnified observation, oxygen saturation artifacts occur that do not occur at the time of non-magnified observation.
  • This is mainly because the oxygen saturation is extremely sensitive to the light amount distribution of the illumination (the first and second signal lights), and even a very small error in that distribution contributes more and more to the calculated oxygen saturation as the magnification ratio of the magnified observation increases.
  • An object of the present invention is to provide an endoscope system, a processor device, and an operation method that reduce an oxygen saturation artifact generated during magnified observation and calculate and display the oxygen saturation distribution more finely and accurately than before.
  • the endoscope system of the present invention includes a light source device, a sensor, an oxygen saturation calculation unit, an image generation unit, a low frequency component extraction unit, a high frequency component extraction unit, and a synthesis processing unit.
  • the light source device emits illumination light.
  • The sensor receives the reflected light produced by irradiating the specimen with the illumination light and outputs a first image signal obtained by imaging the specimen, and also outputs a second image signal obtained by imaging the specimen at a higher magnification than when the first image signal is output.
  • the oxygen saturation calculation unit calculates the oxygen saturation of the specimen based on the first image signal and the second image signal.
  • The image generation unit generates a first oxygen saturation image based on the first image signal and the oxygen saturation calculated from the first image signal, and generates a second oxygen saturation image based on the second image signal and the oxygen saturation calculated from the second image signal.
  • the low frequency component extraction unit extracts a low frequency component less than the cutoff frequency from the first oxygen saturation image.
  • the high frequency component extraction unit extracts a high frequency component equal to or higher than a cutoff frequency from the second oxygen saturation image.
  • the synthesis processing unit synthesizes the low frequency component and the high frequency component to generate a synthesized oxygen saturation image.
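  • The split-and-merge at the heart of this arrangement can be pictured with the minimal sketch below. It assumes the two oxygen saturation images are float NumPy arrays of the same size and uses a Gaussian blur in place of the unspecified low-pass filter; the function and parameter names are illustrative and not taken from the publication.

```python
import numpy as np
import cv2  # OpenCV; GaussianBlur stands in for the unspecified low-pass filter


def merge_oxygen_saturation_images(first_img: np.ndarray,
                                   second_img: np.ndarray,
                                   sigma: float = 8.0) -> np.ndarray:
    """Combine the low-frequency part of the first (non-magnified) oxygen
    saturation image with the high-frequency part of the second (magnified)
    one; `sigma` plays the role of the cutoff (larger sigma = lower cutoff)."""
    # Low frequency component of the first oxygen saturation image.
    low = cv2.GaussianBlur(first_img, (0, 0), sigma)
    # High frequency component of the second oxygen saturation image.
    high = second_img - cv2.GaussianBlur(second_img, (0, 0), sigma)
    # Synthesis: superimpose the two components.
    return low + high
```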
  • a zooming lens for enlarging or reducing an image of the specimen formed on the sensor, and a zoom detection unit for detecting whether or not magnification observation is performed based on an operation state of the zooming lens may be provided.
  • an artifact detection unit for detecting an oxygen saturation artifact from the first oxygen saturation image and the second oxygen saturation image may be provided.
  • It is preferable to further provide a corresponding region detection unit that detects, from the first oxygen saturation image, the region corresponding to the second oxygen saturation image, and an enlargement processing unit that enlarges the region detected by the corresponding region detection unit to the same size as the second oxygen saturation image; in that case, the low frequency component extraction unit preferably extracts the low frequency component from the image of the region enlarged to the same size as the second oxygen saturation image.
  • the corresponding region detection unit can detect a corresponding region by pattern matching between the first oxygen saturation image and the second oxygen saturation image, for example.
  • It is preferable that the display screen of the monitor indicate, together with the synthetic oxygen saturation image, that the processing by the low frequency component extraction unit, the high frequency component extraction unit, and the synthesis processing unit has been performed.
  • The processor device of the present invention is a processor device for an endoscope system that has a light source device that emits illumination light and a sensor that receives the reflected light produced by irradiating the specimen with the illumination light, outputs a first image signal obtained by imaging the specimen, and outputs a second image signal obtained by imaging the specimen at a higher magnification than when the first image signal is output. The processor device includes a receiving unit, an oxygen saturation calculation unit, an image generation unit, a low frequency component extraction unit, a high frequency component extraction unit, and a synthesis processing unit.
  • the receiving unit receives the first image signal and the second image signal from the sensor.
  • the oxygen saturation calculation unit calculates the oxygen saturation of the specimen based on the first image signal and the second image signal.
  • The image generation unit generates a first oxygen saturation image based on the first image signal and the oxygen saturation calculated from the first image signal, and generates a second oxygen saturation image based on the second image signal and the oxygen saturation calculated from the second image signal.
  • the low frequency component extraction unit extracts a low frequency component less than the cutoff frequency from the first oxygen saturation image.
  • the high frequency component extraction unit extracts a high frequency component equal to or higher than a cutoff frequency from the second oxygen saturation image.
  • the synthesis processing unit synthesizes the low frequency component and the high frequency component to generate a synthesized oxygen saturation image.
  • The operating method for an endoscope system of the present invention includes a first imaging step, a second imaging step, an oxygen saturation calculation step, a first image generation step, a second image generation step, a low frequency component extraction step, a high frequency component extraction step, and a synthesis processing step.
  • In the first imaging step, the specimen is imaged by irradiating it with illumination light emitted from the light source device and receiving the reflected light, thereby obtaining a first image signal.
  • In the second imaging step, a second image signal is obtained by imaging the specimen at a higher magnification than when the first image signal was obtained.
  • In the oxygen saturation calculation step, the oxygen saturation of the specimen is calculated based on the first image signal and the second image signal.
  • In the first image generation step, a first oxygen saturation image is generated based on the first image signal and the oxygen saturation calculated from the first image signal.
  • In the second image generation step, a second oxygen saturation image is generated based on the second image signal and the oxygen saturation calculated from the second image signal.
  • In the low frequency component extraction step, a low frequency component below the cutoff frequency is extracted from the first oxygen saturation image.
  • In the high frequency component extraction step, a high frequency component equal to or higher than the cutoff frequency is extracted from the second oxygen saturation image.
  • In the synthesis processing step, the low frequency component and the high frequency component are synthesized to generate a synthesized oxygen saturation image.
  • Another endoscope system of the present invention includes a light source device, a sensor, a low frequency component extraction unit, a high frequency component extraction unit, a synthesis processing unit, an oxygen saturation calculation unit, and an image generation unit.
  • the light source device emits illumination light.
  • The sensor receives the reflected light produced by irradiating the specimen with the illumination light and outputs a first image signal obtained by imaging the specimen, and also outputs a second image signal obtained by imaging the specimen at a higher magnification than when the first image signal is output.
  • the low frequency component extraction unit extracts a low frequency component having a frequency lower than the cutoff frequency from the first image signal.
  • the high frequency component extraction unit extracts a high frequency component equal to or higher than a cutoff frequency from the second image signal.
  • the synthesis processing unit synthesizes the low frequency component and the high frequency component to generate a synthesized image signal.
  • the oxygen saturation calculation unit calculates the oxygen saturation of the specimen based on the composite image signal.
  • the image generation unit generates an oxygen saturation image representing the oxygen saturation of the specimen based on the composite image signal and the oxygen saturation.
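  • In this arrangement the frequency components are merged at the image-signal level and the oxygen saturation is calculated afterwards, rather than the other way round. A rough sketch of that variant follows, again using a Gaussian blur as the stand-in low-pass filter and treating B1, G2, and R2 as the channels used for the signal ratios; all names and the sigma value are illustrative assumptions.

```python
import numpy as np
import cv2


def merge_image_signals(first_signals: dict, second_signals: dict,
                        sigma: float = 8.0) -> dict:
    """Blend, channel by channel, the low frequencies of the first
    (non-magnified) image signals with the high frequencies of the second
    (magnified) ones.

    The returned composite signals are then fed to the same signal-ratio based
    oxygen saturation calculation that is sketched later in this document."""
    composite = {}
    for channel in ("B1", "G2", "R2"):
        low = cv2.GaussianBlur(first_signals[channel], (0, 0), sigma)
        high = second_signals[channel] - cv2.GaussianBlur(second_signals[channel], (0, 0), sigma)
        composite[channel] = low + high
    return composite
```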
  • Another processor device of the present invention is a processor device for an endoscope system that has a light source device that emits illumination light and a sensor that receives the reflected light produced by irradiating the specimen with the illumination light, outputs a first image signal obtained by imaging the specimen, and outputs a second image signal obtained by imaging the specimen at a higher magnification than when the first image signal is output. The processor device includes a receiving unit, a low frequency component extraction unit, a high frequency component extraction unit, a synthesis processing unit, an oxygen saturation calculation unit, and an image generation unit.
  • the receiving unit receives the first image signal and the second image signal from the sensor.
  • the low frequency component extraction unit extracts a low frequency component having a frequency lower than the cutoff frequency from the first image signal.
  • the high frequency component extraction unit extracts a high frequency component equal to or higher than a cutoff frequency from the second image signal.
  • the synthesis processing unit synthesizes the low frequency component and the high frequency component to generate a synthesized image signal.
  • the oxygen saturation calculation unit calculates the oxygen saturation of the specimen based on the composite image signal.
  • the image generation unit generates an oxygen saturation image representing the oxygen saturation of the specimen based on the composite image signal and the oxygen saturation.
  • Another operating method for an endoscope system of the present invention includes a first imaging step, a second imaging step, a low frequency component extraction step, a high frequency component extraction step, a synthesis processing step, an oxygen saturation calculation step, and an image generation step.
  • In the first imaging step, the specimen is imaged by irradiating it with illumination light emitted from the light source device and receiving the reflected light, thereby obtaining a first image signal.
  • In the second imaging step, a second image signal is obtained by imaging the specimen at a higher magnification than when the first image signal was obtained.
  • In the low frequency component extraction step, a low frequency component below the cutoff frequency is extracted from the first image signal.
  • In the high frequency component extraction step, a high frequency component equal to or higher than the cutoff frequency is extracted from the second image signal.
  • In the synthesis processing step, the low frequency component and the high frequency component are synthesized to generate a composite image signal.
  • In the oxygen saturation calculation step, the oxygen saturation of the specimen is calculated based on the composite image signal.
  • In the image generation step, an oxygen saturation image representing the oxygen saturation of the specimen is generated based on the composite image signal and the oxygen saturation.
  • According to the endoscope system, the processor device, and the operating method of the present invention, oxygen saturation artifacts during magnified observation can be reduced, and the oxygen saturation distribution can be calculated and displayed more finely and accurately than before.
  • the endoscope system 10 includes an endoscope 12, a light source device 14, a processor device 16, a monitor 18, and a console 20.
  • the endoscope 12 is optically connected to the light source device 14 and electrically connected to the processor device 16.
  • The endoscope 12 has an insertion portion 21 to be inserted into the specimen, an operation unit 22 provided at the proximal end of the insertion portion 21, and a bending portion 23 and a distal end portion 24 provided on the distal end side of the insertion portion 21.
  • By operating the angle knob 22a of the operation unit 22, the bending portion 23 bends, and this bending operation directs the distal end portion 24 in a desired direction.
  • the operation unit 22 is provided with a mode switch SW (mode switch) 22b and a zoom operation unit 22c.
  • the mode switching SW 22b is used for switching operation between two types of modes, a normal observation mode and a special observation mode.
  • the normal observation mode is a mode in which a normal light image in which the inside of the specimen is converted into a full color image is displayed on the monitor 18.
  • the special observation mode is a mode in which an oxygen saturation image obtained by imaging the oxygen saturation of blood hemoglobin in the specimen is displayed on the monitor 18.
  • the zoom operation unit 22c is used for a zoom operation for driving the zooming lens 47 (see FIG. 2) in the endoscope 12 to enlarge the specimen.
  • the processor device 16 is electrically connected to the monitor 18 and the console 20.
  • the monitor 18 displays images such as normal light images and oxygen saturation images, and information related to these images (hereinafter referred to as image information and the like).
  • the console 20 functions as a UI (user interface) that receives input operations such as function settings.
  • a recording unit (not shown) for recording image information or the like may be connected to the processor device 16.
  • The light source device 14 includes, as light emission sources, a first blue laser light source (473LD (laser diode)) 34 that emits a first blue laser beam with a center wavelength of 473 nm and a second blue laser light source (445LD) 36 that emits a second blue laser beam with a center wavelength of 445 nm. Light emission from these semiconductor light emitting elements, the first blue laser light source 34 and the second blue laser light source 36, is individually controlled by the light source control unit 40, so the light quantity ratio between the light emitted from the first blue laser light source 34 and the light emitted from the second blue laser light source 36 can be changed freely.
  • In the normal observation mode, the light source control unit 40 turns on the second blue laser light source 36 so that the second blue laser light is emitted.
  • In the special observation mode, the first blue laser light source 34 and the second blue laser light source 36 are turned on alternately at one-frame intervals, so that the first blue laser light and the second blue laser light are emitted alternately.
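  • The per-frame alternation of the two sources can be pictured with the toy controller below; the class and method names are illustrative only, since the actual light source control unit 40 is hardware.

```python
class LightSourceControl:
    """Illustrative controller: in the special observation mode the 473 nm and
    445 nm sources are turned on in alternating frames; in the normal
    observation mode only the 445 nm source is used."""

    def __init__(self):
        self.frame_index = 0

    def source_for_next_frame(self, special_mode: bool) -> str:
        if not special_mode:
            return "445LD"  # second blue laser light source 36
        source = "473LD" if self.frame_index % 2 == 0 else "445LD"
        self.frame_index += 1
        return source
```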
  • The half-width of the first blue laser beam and the second blue laser beam is preferably about ±10 nm.
  • the first blue laser light source 34 and the second blue laser light source 36 can use broad area type InGaN laser diodes, and can also use InGaNAs laser diodes or GaNAs laser diodes.
  • the light source may be configured to use a light emitter such as a light emitting diode.
  • The first blue laser light and the second blue laser light emitted from the first blue laser light source 34 and the second blue laser light source 36 enter the light guide (LG) 41 through optical members such as a condensing lens, optical fiber, and multiplexer (none of which are shown).
  • the light guide 41 is built in a universal cord that connects the light source device 14 and the endoscope 12.
  • the light guide 41 propagates the first blue laser light and the second blue laser light from the first blue laser light source 34 and the second blue laser light source 36 to the distal end portion 24 of the endoscope 12.
  • a multimode fiber can be used as the light guide 41.
  • For example, a thin fiber cable having a core diameter of 105 μm, a cladding diameter of 125 μm, and an overall diameter of φ0.3 to 0.5 mm including a protective layer serving as an outer sheath can be used.
  • the distal end portion 24 of the endoscope 12 has an illumination optical system 24a and an imaging optical system 24b.
  • the illumination optical system 24a is provided with a phosphor 44 and an illumination lens 45.
  • the first blue laser light and the second blue laser light are incident on the phosphor 44 from the light guide 41.
  • the phosphor 44 emits fluorescence when irradiated with the first blue laser light or the second blue laser light. Further, a part of the first blue laser light or the second blue laser light passes through the phosphor 44 as it is. The light emitted from the phosphor 44 is irradiated into the specimen through the illumination lens 45.
  • In the normal observation mode, when the second blue laser light is incident on the phosphor 44, white light having the spectrum shown in FIG. 3 (second white light) is irradiated into the specimen.
  • the second white light is composed of second blue laser light and green to red second fluorescence excited and emitted from the phosphor 44 by the second blue laser light. Therefore, the wavelength range of the second white light extends to the entire visible light range.
  • In the special observation mode, when the first blue laser light and the second blue laser light are alternately incident on the phosphor 44, the first white light and the second white light are alternately irradiated into the specimen.
  • the first white light is composed of first blue laser light and green to red first fluorescence that is excited and emitted from the phosphor 44 by the first blue laser light. Therefore, the first white light has a wavelength range covering the entire visible light range.
  • the second white light is the same as the second white light irradiated in the normal observation mode.
  • The first fluorescence and the second fluorescence have substantially the same waveform (spectrum shape), and the ratio of the intensity of the second fluorescence I2(λ) to the intensity of the first fluorescence I1(λ) (hereinafter, the inter-frame intensity ratio) is the same at any wavelength λ; that is, I2(λ1)/I1(λ1) = I2(λ2)/I1(λ2). Since the inter-frame intensity ratio I2(λ)/I1(λ) affects the calculation accuracy of the oxygen saturation, the light source control unit 40 controls it with high accuracy so that a preset reference inter-frame intensity ratio is maintained.
  • For the phosphor 44, it is preferable to use a material composed of phosphors that absorb part of the first and second blue laser light and are excited to emit green to red light (for example, a YAG phosphor or a phosphor such as BAM (BaMgAl10O17)).
  • With such a material, high-intensity first white light and second white light can be obtained with high luminous efficiency. In addition, if a semiconductor light emitting element is used as the excitation light source of the phosphor 44, the intensity of each white light can be easily adjusted, and changes in color temperature and chromaticity can be kept small.
  • the imaging optical system 24b of the endoscope 12 includes an imaging lens 46, a zooming lens 47, and a sensor 48 (see FIG. 2). Reflected light from the specimen enters the sensor 48 via the imaging lens 46 and zooming lens 47. Thereby, a reflected image of the specimen is formed on the sensor 48.
  • the zooming lens 47 moves between the tele end and the wide end by operating the zoom operation unit 22c. When the zooming lens 47 moves to the wide end side, the reflected image of the specimen is reduced. On the other hand, when the zooming lens 47 moves to the tele end side, the reflected image of the specimen is enlarged. Note that the zoom lens 47 is disposed at the wide end when magnification observation is not performed (during non-magnification observation). When the zoom operation unit 22c is operated to perform magnified observation, the zooming lens 47 is moved from the wide end to the tele end side.
  • the sensor 48 is a color image sensor, picks up a reflected image of the specimen, and outputs an image signal.
  • the sensor 48 is, for example, a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor.
  • the sensor 48 has RGB pixels provided with RGB color filters on the imaging surface, and outputs image signals of three colors of R, G, and B by performing photoelectric conversion with pixels of each color of RGB. .
  • the B color filter has a spectral transmittance of 380 to 560 nm
  • the G color filter has a spectral transmittance of 450 to 630 nm
  • the R color filter has a spectral transmittance of 580 to 760 nm. Therefore, when the second white light is irradiated into the specimen in the normal observation mode, the second blue laser light and part of the green component of the second fluorescence are incident on the B pixels, part of the green component of the second fluorescence is incident on the G pixels, and the red component of the second fluorescence is incident on the R pixels. However, since the emission intensity of the second blue laser light is much higher than that of the second fluorescence, most of the B image signal output from the B pixels is occupied by the reflected light component of the second blue laser light.
  • In the special observation mode, when the first white light is irradiated into the specimen and the reflected light is received by the sensor 48, the first blue laser light and part of the green component of the first fluorescence are incident on the B pixels, part of the green component of the first fluorescence is incident on the G pixels, and the red component of the first fluorescence is incident on the R pixels. However, since the emission intensity of the first blue laser light is much higher than that of the first fluorescence, most of the B image signal is occupied by the reflected light component of the first blue laser light. The light components incident on the RGB pixels when the second white light is irradiated into the specimen in the special observation mode are the same as in the normal observation mode.
  • the sensor 48 may be a so-called complementary color image sensor having C (cyan), M (magenta), Y (yellow), and G (green) complementary color filters on the imaging surface.
  • In that case, a color conversion unit that converts the four-color CMYG image signals into three-color RGB image signals should be provided in any of the endoscope 12, the light source device 14, or the processor device 16. In this way, even when a complementary color image sensor is used, RGB three-color image signals can be obtained by color conversion from the CMYG four-color image signals.
  • the imaging control unit 49 performs imaging control of the sensor 48.
  • One frame period of the sensor 48 consists of an accumulation period, in which reflected light from the specimen is photoelectrically converted and charge is accumulated, followed by a readout period, in which the accumulated charge is read out and an image signal is output. In the normal observation mode, the interior of the specimen illuminated with the second white light is imaged by the sensor 48 every frame period, so RGB image signals are output from the sensor 48 for each frame.
  • the imaging control unit 49 causes the sensor 48 to perform an accumulation period and a reading period in the special observation mode as in the normal observation mode.
  • In the special observation mode, the first white light and the second white light are alternately irradiated into the specimen in synchronization with the imaging frames of the sensor 48. Therefore, as shown in the figure, in the first frame the specimen is irradiated with the first white light and the reflected light is received by the sensor 48 to image the interior of the specimen, and in the second frame the specimen is irradiated with the second white light and the reflected light is received by the sensor 48 to image the interior of the specimen.
  • The sensor 48 outputs RGB image signals in both the first frame and the second frame, but the spectrum of the white light on which they depend differs. For the sake of distinction, the RGB image signals obtained by the sensor 48 imaging the reflected light of the first white light in the first frame are hereinafter called the R1 image signal, the G1 image signal, and the B1 image signal, and the RGB image signals obtained by imaging the reflected light of the second white light in the second frame are called the R2 image signal, the G2 image signal, and the B2 image signal.
  • Each image signal at the time of non-magnification observation is referred to as a first image signal
  • each image signal at the time of magnification observation is referred to as a second image signal.
  • the image signals of the respective colors output from the sensor 48 are transmitted to a CDS (correlated double sampling) / AGC (automatic gain control) circuit 50 (see FIG. 2).
  • the CDS / AGC circuit 50 performs correlated double sampling (CDS) and automatic gain control (AGC) on the analog image signal output from the sensor 48.
  • CDS correlated double sampling
  • AGC automatic gain control
  • the image signal that has passed through the CDS / AGC circuit 50 is converted into a digital image signal by the A / D converter 52.
  • the digitized image signal is input to the processor device 16.
  • the processor device 16 includes a receiving unit 54, an image processing switching unit 60, a normal observation image processing unit 62, a special observation image processing unit 64, and an image display signal generation unit 66.
  • the receiving unit 54 receives an image signal input from the endoscope 12.
  • the reception unit 54 includes a DSP (Digital Signal Processor) 56 and a noise removal unit 58, and the DSP 56 performs digital signal processing such as color correction processing on the received image signal.
  • the noise removal unit 58 performs noise removal processing by, for example, a moving average method or a median filter method on the image signal that has been subjected to color correction processing or the like by the DSP 56.
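  • As a rough illustration of the two noise removal options named above (moving average or median filtering), assuming a single color channel stored as a NumPy array; the kernel size is an arbitrary example.

```python
import numpy as np
from scipy import ndimage


def denoise(image_signal: np.ndarray, method: str = "median", size: int = 3) -> np.ndarray:
    """Apply simple spatial noise removal to one color channel."""
    if method == "median":
        # Median filter: each pixel becomes the median of its neighbourhood.
        return ndimage.median_filter(image_signal, size=size)
    # Moving average: a uniform (box) filter over the same neighbourhood.
    return ndimage.uniform_filter(image_signal, size=size)
```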
  • the image signal from which the noise has been removed is input to the image processing switching unit 60.
  • the image processing switching unit 60 inputs an image signal to the normal observation image processing unit 62 when the mode switching SW 22b is set to the normal observation mode. On the other hand, when the mode switching SW 22 b is set to the special observation mode, the image processing switching unit 60 inputs an image signal to the special observation image processing unit 64.
  • the normal observation image processing unit 62 includes a color conversion unit 68, a color enhancement unit 70, and a structure enhancement unit 72.
  • the color conversion unit 68 generates RGB image data in which each input RGB image signal for one frame is assigned to an R pixel, a G pixel, or a B pixel.
  • the RGB image data is further subjected to color conversion processing such as 3 ⁇ 3 matrix processing, gradation conversion processing, and three-dimensional LUT processing.
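  • A hedged sketch of the 3 × 3 matrix and gradation conversion steps follows; the matrix and gamma values are placeholders rather than the ones used in the device, and the three-dimensional LUT step is omitted.

```python
import numpy as np

# Placeholder 3x3 color-correction matrix (identity) and gamma value.
COLOR_MATRIX = np.eye(3, dtype=np.float32)
GAMMA = 1.0 / 2.2


def color_convert(rgb: np.ndarray) -> np.ndarray:
    """Apply 3x3 matrix processing followed by gradation (tone curve)
    conversion to (H, W, 3) RGB image data scaled to [0, 1]."""
    mixed = rgb @ COLOR_MATRIX.T              # 3x3 matrix processing per pixel
    return np.clip(mixed, 0.0, 1.0) ** GAMMA  # simple gamma curve as the tone curve
```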
  • the color enhancement unit 70 performs various color enhancement processes on the RGB image data that has been subjected to the color conversion process.
  • the structure enhancement unit 72 performs structure enhancement processing such as spatial frequency enhancement on the RGB image data that has been subjected to color enhancement processing.
  • the RGB image data subjected to the structure enhancement process by the structure enhancement unit 72 is input to the image display signal generation unit 66 as a normal observation image.
  • the special observation image processing unit 64 includes an oxygen saturation image generation unit 76 and a structure enhancement unit 78.
  • the oxygen saturation image generation unit 76 calculates the oxygen saturation and generates an oxygen saturation image representing the calculated oxygen saturation.
  • During magnified observation, the oxygen saturation image generation unit 76 calculates the oxygen saturation, generates an oxygen saturation image, and further corrects oxygen saturation artifacts to generate a synthetic oxygen saturation image.
  • the oxygen saturation image at the time of non-enlarged observation is referred to as a non-enlarged oxygen saturation image (first oxygen saturation image).
  • An oxygen saturation image at the time of magnified observation is referred to as a magnified oxygen saturation image (second oxygen saturation image), and an oxygen saturation image in which artifacts are reduced using the magnified oxygen saturation image or the like is referred to as a synthetic oxygen saturation image.
  • the structure enhancement unit 78 performs structure enhancement processing such as spatial frequency enhancement processing on the non-enlarged oxygen saturation image or the synthetic oxygen saturation image input from the oxygen saturation image generation unit 76.
  • The oxygen saturation image that has undergone structure enhancement processing by the structure enhancement unit 78 is input to the image display signal generation unit 66.
  • The image display signal generation unit 66 converts the normal observation image or the oxygen saturation image into a display-format signal (display image signal) and inputs it to the monitor 18. As a result, the normal observation image or the oxygen saturation image is displayed on the monitor 18.
  • the oxygen saturation image generation unit 76 includes a signal ratio calculation unit 81, a correlation storage unit 82, an oxygen saturation calculation unit 83, an image generation unit 84, a zoom detection unit 86, An image storage unit 87 and an artifact correction unit 88 are provided.
  • the signal ratio calculation unit 81 receives the B1 image signal, the G2 image signal, and the R2 image signal among the image signals for two frames input to the oxygen saturation image generation unit 76.
  • the signal ratio calculation unit 81 calculates a signal ratio B1 / G2 between the B1 image signal and the G2 image signal and a signal ratio R2 / G2 between the G2 image signal and the R2 image signal for each pixel.
  • the correlation storage unit 82 stores the correlation between the signal ratio B1 / G2 and the signal ratio R2 / G2 and the oxygen saturation.
  • This correlation is stored in a two-dimensional table in which contour lines of oxygen saturation are defined on the two-dimensional space shown in FIG.
  • The positions and shapes of the contour lines with respect to the signal ratio B1/G2 and the signal ratio R2/G2 are obtained in advance by a physical simulation of light scattering, and the interval between the contour lines changes according to the blood volume (signal ratio R2/G2).
  • the correlation between the signal ratio B1 / G2 and the signal ratio R2 / G2 and the oxygen saturation is stored on a log scale.
  • the above correlation is closely related to the light absorption characteristics and light scattering characteristics of oxyhemoglobin (graph 90) and reduced hemoglobin (graph 91).
  • information on oxygen saturation is easy to handle at a wavelength where the difference in absorption coefficient between oxygenated hemoglobin and reduced hemoglobin is large, such as the center wavelength of 473 nm of the first blue laser beam.
  • the B1 image signal including a signal corresponding to 473 nm light is highly dependent not only on the oxygen saturation but also on the blood volume.
  • Therefore, by using not only the B1 image signal but also the R2 image signal, which corresponds to light that changes mainly with the blood volume, and the G2 image signal, which serves as a reference signal for the B1 and R2 image signals, that is, by using the signal ratios B1/G2 and R2/G2, the oxygen saturation can be calculated accurately without depending on the blood volume.
  • The oxygen saturation calculation unit 83 refers to the correlation stored in the correlation storage unit 82 and calculates, for each pixel, the oxygen saturation corresponding to the signal ratios B1/G2 and R2/G2 calculated by the signal ratio calculation unit 81. For example, when the signal ratios B1/G2 and R2/G2 at a given pixel are B1*/G2* and R2*/G2*, respectively, and reference to the correlation as shown in FIG. 11 indicates that the oxygen saturation corresponding to B1*/G2* and R2*/G2* is 60%, the oxygen saturation calculation unit 83 calculates the oxygen saturation of that pixel as 60%.
  • In practice, the signal ratio B1/G2 and the signal ratio R2/G2 rarely become extremely large or extremely small; that is, their values rarely exceed the lower limit line 93, which corresponds to an oxygen saturation of 0%, or conversely fall below the upper limit line 94, which corresponds to an oxygen saturation of 100%. If a value does exceed the lower limit line 93, the oxygen saturation calculation unit 83 sets the oxygen saturation to 0%, and if a value falls below the upper limit line 94, the oxygen saturation is set to 100%.
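  • A simplified sketch of this per-pixel lookup is given below, assuming the correlation is held as a 2D table indexed by log10(B1/G2) and log10(R2/G2); the table, its axis ranges, and all names are illustrative assumptions, and out-of-range values are clamped in the spirit of the limit-line handling just described.

```python
import numpy as np


def oxygen_saturation_from_signals(b1: np.ndarray, g2: np.ndarray, r2: np.ndarray,
                                   lut: np.ndarray,
                                   x_range=(-2.0, 2.0), y_range=(-2.0, 2.0)) -> np.ndarray:
    """Look up the oxygen saturation (0 to 100 %) of every pixel from the signal ratios.

    `lut` is a 2D array giving oxygen saturation as a function of log10(B1/G2)
    (rows) and log10(R2/G2) (columns); indices outside the table are clamped."""
    eps = 1e-6  # avoid division by zero and log of zero
    x = np.log10((b1 + eps) / (g2 + eps))   # log-scale signal ratio B1/G2
    y = np.log10((r2 + eps) / (g2 + eps))   # log-scale signal ratio R2/G2

    rows = (x - x_range[0]) / (x_range[1] - x_range[0]) * (lut.shape[0] - 1)
    cols = (y - y_range[0]) / (y_range[1] - y_range[0]) * (lut.shape[1] - 1)
    rows = np.clip(np.round(rows).astype(int), 0, lut.shape[0] - 1)
    cols = np.clip(np.round(cols).astype(int), 0, lut.shape[1] - 1)

    # Per-pixel oxygen saturation, clamped to the 0 to 100 % range.
    return np.clip(lut[rows, cols], 0.0, 100.0)
```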
  • The image generation unit 84 generates an oxygen saturation image, in which the oxygen saturation is imaged, using the oxygen saturation calculated by the oxygen saturation calculation unit 83 together with the B2 image signal, the G2 image signal, and the R2 image signal. Specifically, the image generation unit 84 applies, for each pixel, a gain corresponding to the oxygen saturation to the input B2, G2, and R2 image signals and generates RGB image data from the gain-processed B2, G2, and R2 image signals. For example, for a pixel with an oxygen saturation of 60% or more, the image generation unit 84 multiplies the B2, G2, and R2 image signals by the same gain of 1.
  • For a pixel with an oxygen saturation below 60%, the B2 image signal is multiplied by a gain of less than 1, and the G2 and R2 image signals are multiplied by gains of 1 or more.
  • The RGB image data generated using the B2, G2, and R2 image signals after the gain processing is the oxygen saturation image.
  • Of these images, one generated from the image signals of the respective colors obtained during non-magnified observation is the non-enlarged oxygen saturation image, and one generated from the image signals of the respective colors obtained during magnified observation is the enlarged oxygen saturation image.
  • the high oxygen region (region where the oxygen saturation is 60 to 100%) is represented by the same color as the normal observation image.
  • a low oxygen region where the oxygen saturation is below a predetermined value (region where the oxygen saturation is 0 to 60%) is represented by a color (pseudo color) different from that of the normal observation image.
  • In the example above, the image generation unit 84 applies gains so that only the low oxygen region is pseudo-colored, but a gain corresponding to the oxygen saturation may also be applied in the high oxygen region so that the entire oxygen saturation image is pseudo-colored.
  • Likewise, although the low oxygen region and the high oxygen region are separated here at an oxygen saturation of 60%, this boundary is also arbitrary.
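  • A rough sketch of the gain step under the conventions above (threshold at 60%, gains applied only below it); the exact gain curves are not given in the text, so simple linear ramps are used as placeholders.

```python
import numpy as np


def pseudo_color(b2: np.ndarray, g2: np.ndarray, r2: np.ndarray,
                 oxy_sat: np.ndarray, threshold: float = 60.0):
    """Apply oxygen-saturation-dependent gains to the B2/G2/R2 image signals.

    Pixels at or above `threshold` keep a gain of 1 in every channel (same
    color as the normal observation image); pixels below it get a blue gain
    below 1 and green/red gains of 1 or more, pseudo-coloring the low oxygen
    region."""
    low = oxy_sat < threshold
    t = np.where(low, oxy_sat / threshold, 1.0)  # 0..1 inside the low oxygen region
    gain_b = np.where(low, t, 1.0)               # < 1 below the threshold
    gain_gr = np.where(low, 2.0 - t, 1.0)        # >= 1 below the threshold
    return b2 * gain_b, g2 * gain_gr, r2 * gain_gr
```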
  • the zoom detection unit 86 monitors the operation status of the zoom operation unit 22c and detects the presence / absence of zoom (whether or not magnification observation is performed). The detection result by the zoom detection unit 86 is input to the image generation unit 84. In addition, when performing magnified observation, the zoom detection unit 86 calculates the magnification of the observation range based on the operation amount of the zoom operation unit 22c, for example. When magnification observation is not performed, the image generation unit 84 outputs the generated oxygen saturation image to the structure enhancement unit 78 and stores it in the image storage unit 87. On the other hand, at the time of magnifying observation, the image generation unit 84 inputs the generated enlarged oxygen saturation image to the artifact correction unit 88, corrects the oxygen saturation artifact, and outputs it to the structure enhancement unit 78.
  • The image storage unit 87 is a memory for storing the non-enlarged oxygen saturation image.
  • the image storage unit 87 stores the latest one of the non-enlarged oxygen saturation images. That is, every time the image generation unit 84 generates a non-enlarged oxygen saturation image, the non-enlarged oxygen saturation image stored in the image storage unit 87 is sequentially updated to the latest one. When the magnification observation is performed, the update of the non-magnified oxygen saturation image stored in the image storage unit 87 is stopped.
  • the artifact correction unit 88 includes a high frequency component extraction unit 101, an enlarged portion extraction unit 102, a low frequency component extraction unit 103, and a synthesis processing unit 104.
  • the high frequency component extraction unit 101 extracts a high frequency component equal to or higher than the cutoff frequency from the enlarged oxygen saturation image input from the image generation unit 84 at the time of magnification observation.
  • the enlarged oxygen saturation image corresponds to an enlarged part of the non-enlarged oxygen saturation image stored in the image storage unit 87.
  • An image of a high frequency component extracted from the enlarged oxygen saturation image (hereinafter referred to as a high frequency component image) is input to the synthesis processing unit 104.
  • The cutoff frequency is determined in advance according to the enlargement ratio, and it is set so as to shift toward the lower frequency side as the enlargement ratio increases. The high frequency component extraction unit 101 thereby extracts high frequency components appropriate to the magnification conditions of the specimen.
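  • One way to express this "higher magnification, lower cutoff" rule is sketched below; the base cutoff value and the inverse-proportional scaling are assumptions, since the publication only states that the cutoff is predetermined for each enlargement ratio.

```python
def cutoff_for_magnification(magnification: float,
                             base_cutoff_cycles_per_image: float = 32.0) -> float:
    """Return a cutoff (in cycles per image width) for a given zoom magnification.

    At 1x the cutoff equals the base value; as the magnification grows, the
    cutoff shifts proportionally toward the low frequency side."""
    magnification = max(magnification, 1.0)
    return base_cutoff_cycles_per_image / magnification
```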
  • the enlarged portion extraction unit 102 includes a corresponding region detection unit 102a and an enlargement processing unit 102b.
  • The corresponding region detection unit 102a acquires the enlarged oxygen saturation image from the image generation unit 84 and the non-enlarged oxygen saturation image from the image storage unit 87. By performing pattern matching between the enlarged oxygen saturation image and the non-enlarged oxygen saturation image, it extracts from the non-enlarged oxygen saturation image the region corresponding to the enlarged oxygen saturation image.
  • the enlargement processing unit 102b enlarges the image of the region extracted from the non-enlarged oxygen saturation image so as to have the same size as the enlarged oxygen saturation image.
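  • A hedged sketch of this corresponding-region detection and enlargement, using OpenCV template matching on a rescaled copy of the enlarged image as one possible pattern matching method (the publication does not specify the matching algorithm, and the magnification is assumed to be known from the zoom detection unit):

```python
import numpy as np
import cv2


def extract_and_enlarge(non_enlarged: np.ndarray, enlarged: np.ndarray,
                        magnification: float) -> np.ndarray:
    """Find the part of the non-enlarged oxygen saturation image shown in the
    enlarged image, then resize that region to the enlarged image's size."""
    h, w = enlarged.shape[:2]

    # Bring the enlarged view back to the non-enlarged scale so that normalised
    # cross-correlation template matching is meaningful.
    tmpl_w, tmpl_h = max(int(w / magnification), 1), max(int(h / magnification), 1)
    template = cv2.resize(enlarged, (tmpl_w, tmpl_h), interpolation=cv2.INTER_AREA)

    # Pattern matching: locate the best-matching (corresponding) region.
    scores = cv2.matchTemplate(non_enlarged, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, top_left = cv2.minMaxLoc(scores)
    x, y = top_left

    # Crop the corresponding region and enlarge it to the enlarged image size.
    region = non_enlarged[y:y + tmpl_h, x:x + tmpl_w]
    return cv2.resize(region, (w, h), interpolation=cv2.INTER_LINEAR)
```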
  • The low frequency component extraction unit 103 extracts the low frequency components below the cutoff frequency from the image that was extracted from the non-enlarged oxygen saturation image by the enlarged portion extraction unit 102 and enlarged by the enlargement processing unit 102b.
  • The image of the low frequency components extracted from the non-enlarged oxygen saturation image by the low frequency component extraction unit 103 (hereinafter, the low frequency component image) is input to the synthesis processing unit 104.
  • the cutoff frequency used as the threshold value by the low frequency component extraction unit 103 is the same value as that used by the high frequency component extraction unit 101.
  • The synthesis processing unit 104 aligns and synthesizes the high frequency component image input from the high frequency component extraction unit 101 and the low frequency component image input from the low frequency component extraction unit 103, thereby correcting the artifacts and generating a synthetic oxygen saturation image.
  • The synthetic oxygen saturation image generated by the synthesis processing unit 104 is input to the structure enhancement unit 78, subjected to structure enhancement processing, converted into a display image signal by the image display signal generation unit 66, and displayed on the monitor 18.
  • the flow of observation by the endoscope system 10 of this embodiment will be described along the flowchart of FIG.
  • screening is performed from the farthest view state (S10).
  • a normal observation image is displayed on the monitor 18.
  • The mode switching SW 22b is then operated to switch to the special observation mode (S12).
  • A diagnosis is then made as to whether or not the potential lesion site is in a hypoxic state.
  • The first white light and the second white light are alternately irradiated into the specimen in synchronization with the imaging frames of the sensor 48, so the sensor 48 outputs the R1, G1, and B1 image signals in the frame irradiated with the first white light and the R2, G2, and B2 image signals in the frame irradiated with the second white light. Based on the image signals of these two frames, the oxygen saturation is calculated for each pixel (S13).
  • The zoom detection unit 86 of the oxygen saturation image generation unit 76 detects whether or not magnified observation is being performed (S14). During non-magnified observation, the image generation unit 84 applies gains according to the oxygen saturation to the R2, G2, and B2 image signals and generates a non-enlarged oxygen saturation image (S15). The generated non-enlarged oxygen saturation image is stored in the image storage unit 87 (S16) and displayed on the monitor 18.
  • By viewing the displayed oxygen saturation image, the doctor confirms whether the potential lesion site is in a hypoxic state.
  • Such display of the oxygen saturation is continuously performed until the normal observation mode is switched (S25).
  • When the observation is finished, the insertion portion 21 of the endoscope 12 is withdrawn from the specimen (S26).
  • On the other hand, when the zoom operation unit 22c is operated for magnified observation, the zoom detection unit 86 detects that magnified observation is being performed (S14).
  • the image generation unit 84 applies a gain to the R2 image signal, the G2 image signal, and the B2 image signal at the time of enlarged observation according to the oxygen saturation, and generates an enlarged oxygen saturation image (S18).
  • In the enlarged oxygen saturation image, oxygen saturation artifacts that do not occur in the non-enlarged oxygen saturation image may occur, so the endoscope system 10 corrects these artifacts, which are peculiar to magnified observation.
  • Specifically, the high frequency component extraction unit 101 extracts the high frequency components equal to or higher than the cutoff frequency from the enlarged oxygen saturation image 120 (S19) and generates a high frequency component image 121 composed of the extracted high frequency components.
  • In the enlarged oxygen saturation image 120, fine structures of the enlarged low oxygen region 123 and local changes in oxygen saturation that cannot be confirmed in the non-enlarged oxygen saturation image 125 (hereinafter collectively, the high frequency structures 124) can be observed, but an artifact 122 consisting of low frequency components is also generated.
  • In the high frequency component image 121, the low frequency components, which are mainly the artifact 122, are cut, so the high frequency structures 124 appear more clearly; on the other hand, because the low frequency components are cut, structures of the specimen that consist of low frequency components and gentle changes in oxygen saturation cannot be confirmed.
  • Meanwhile, the enlarged portion extraction unit 102 acquires the enlarged oxygen saturation image 120 and also acquires the non-enlarged oxygen saturation image 125 from the image storage unit 87, and the corresponding region detection unit 102a performs pattern matching to detect, from the non-enlarged oxygen saturation image 125, the region 126 corresponding to the portion being magnified (the enlarged oxygen saturation image 120) (S20).
  • the detected image of the region 126 is size-converted by the enlargement processing unit 102b according to the enlarged oxygen saturation image 120 (S21).
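  • The following sketch of S20 and S21 uses Python with OpenCV and is not part of the patent. It assumes single-channel oxygen saturation maps and a known zoom factor greater than 1, and realizes the pattern matching as normalized cross-correlation template matching; the patent does not prescribe a particular matching algorithm.

        import cv2
        import numpy as np

        def corresponding_region(non_enlarged, enlarged, zoom_factor):
            # Shrink the enlarged image by the assumed zoom factor so it can be
            # used as a template, then locate it in the non-enlarged image.
            h, w = enlarged.shape[:2]
            th, tw = max(1, round(h / zoom_factor)), max(1, round(w / zoom_factor))
            template = cv2.resize(enlarged.astype(np.float32), (tw, th),
                                  interpolation=cv2.INTER_AREA)
            score = cv2.matchTemplate(non_enlarged.astype(np.float32), template,
                                      cv2.TM_CCOEFF_NORMED)
            _, _, _, (x, y) = cv2.minMaxLoc(score)  # top-left of the best match

            # Size conversion (S21): enlarge the matched region 126 so that it
            # overlays the enlarged oxygen saturation image pixel for pixel.
            region = non_enlarged[y:y + th, x:x + tw].astype(np.float32)
            return cv2.resize(region, (w, h), interpolation=cv2.INTER_LINEAR)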
  • Next, the low-frequency component extraction unit 103 extracts the low-frequency components below the cutoff frequency from the image 127 of the region 126 after the size conversion, and generates a low-frequency component image 128 composed of the extracted components (S22).
  • Because the low-frequency component image 128 is generated from the non-enlarged oxygen saturation image 125, low-frequency features such as the structure of the specimen and gentle changes in oxygen saturation (hereinafter collectively referred to as the low-frequency structure 129) appear in it correctly.
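  • The low-pass step of S22, sketched with the same assumed cutoff as in the high-pass sketch above (not part of the patent):

        from scipy.ndimage import gaussian_filter

        def low_frequency_component(image, sigma=8.0):
            # Same Gaussian cutoff as the high-pass sketch, so the two
            # components are complementary when recombined in S23.
            return gaussian_filter(image.astype(float), sigma=sigma)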
  • When the high-frequency component image 121 and the low-frequency component image 128 have been obtained in this way, the synthesis processing unit 104 synthesizes them to generate a synthetic oxygen saturation image 130 (S23). That is, the synthetic oxygen saturation image 130 is an image in which the high-frequency structure 124 of the high-frequency component image 121 and the low-frequency structure 129 of the low-frequency component image 128 are superimposed. For this reason, the synthetic oxygen saturation image 130 does not contain the artifact 122 seen in the enlarged oxygen saturation image 120, and both the high-frequency structure 124 and the low-frequency structure 129 can be confirmed. The synthetic oxygen saturation image 130 generated in this way is subjected to structure enhancement processing and the like and then displayed on the monitor 18 (S24).
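  • The synthesis of S23 can be sketched as below; treating "superimposed" as a simple addition of the complementary frequency bands, and clipping back to an assumed 0-100 % oxygen saturation range, are assumptions of this sketch, which reuses the functions defined in the earlier sketches.

        import numpy as np

        def synthesize(low_freq_128, high_freq_121, value_range=(0.0, 100.0)):
            combined = low_freq_128.astype(np.float64) + high_freq_121.astype(np.float64)
            return np.clip(combined, *value_range)

        # End-to-end, using the earlier sketches (variable names are illustrative):
        #   region_127    = corresponding_region(non_enlarged_125, enlarged_120, zoom)
        #   low_128       = low_frequency_component(region_127)
        #   _, high_121   = split_by_cutoff(enlarged_120)
        #   synthetic_130 = synthesize(low_128, high_121)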
  • When the endoscope system 10 performs this special image processing for correcting the artifact 122, an indication 131 that clearly shows the processing has been performed is displayed on the screen of the monitor 18 together with the synthetic oxygen saturation image 130.
  • The indication 131 is, for example, text such as "enlarged image processing".
  • As described above, when magnified observation is performed in the special observation mode in which the oxygen saturation is calculated and displayed, the endoscope system 10 generates and displays a synthetic oxygen saturation image 130 in which the low-frequency components of the non-enlarged oxygen saturation image 125 and the high-frequency components of the enlarged oxygen saturation image 120 are combined.
  • In this way, the artifact 122 of the enlarged oxygen saturation image 120 is corrected (removed) and both the high-frequency structure 124 and the low-frequency structure 129 of the specimen can be observed, so the doctor can make an accurate diagnosis by confirming the low oxygen region 123 in more detail.
  • In the example above, the low oxygen region 123 detected during non-magnified observation is then magnified; however, magnified observation may also be performed even when no low oxygen region has been detected during non-magnified observation. For example, as shown in FIG. 15, even when a possible lesion site is found during screening in the normal observation mode and the mode is switched to the special observation mode, a hypoxic region may not be recognizable in the non-enlarged oxygen saturation image 141. In this case, the doctor may magnify and observe the possible lesion site in order to confirm that it is not a hypoxic lesion. In the enlarged oxygen saturation image 142 obtained in this case, the high-frequency structure 143 can be confirmed, but the oxygen saturation artifact 122 is also generated by the magnified observation.
  • Even in such a case, the endoscope system 10 extracts the high-frequency components from the enlarged oxygen saturation image 142 and generates the high-frequency component image 144, regardless of whether or not the non-enlarged oxygen saturation image 141 contains a low oxygen region. In the high-frequency component image 144, the artifact 122 is corrected and the high-frequency structure 143 can be confirmed; however, low-frequency structures in the same frequency band as the artifact 122 are also removed.
  • Likewise, the endoscope system 10 extracts the region 145 corresponding to the enlarged oxygen saturation image 142 from the non-enlarged oxygen saturation image 141 and generates the size-converted image 146. A low-frequency component is then extracted from the image 146 to generate the low-frequency component image 147, in which the low-frequency structure 148 of the specimen appears.
  • By synthesizing the high-frequency component image 144 and the low-frequency component image 147, a synthetic oxygen saturation image 150 is obtained in which observation is not disturbed by the artifact 122 and both the high-frequency structure 143 and the low-frequency structure 148 of the specimen can be observed. If the high-frequency structure 143 is in a low oxygen state, it is displayed in pseudo color, so even when a low oxygen state cannot be confirmed in the non-enlarged oxygen saturation image 141, a hypoxic high-frequency structure 143 that becomes observable only under magnification can be found from the synthetic oxygen saturation image 150.
  • In the first embodiment, whether the non-enlarged oxygen saturation image 125 or 141 is displayed as it is, or the synthetic oxygen saturation image 130 or 150 with the artifact 122 corrected is generated and displayed, is switched according to the zoom operation. Instead of monitoring the zoom operation, however, whether or not to generate a synthetic oxygen saturation image may be switched by detecting the artifact 122 in the generated oxygen saturation image (the non-enlarged oxygen saturation image or the enlarged oxygen saturation image).
  • In the second embodiment, the oxygen saturation image generation unit 76 of the endoscope system 10 of the first embodiment is replaced with an oxygen saturation image generation unit 160 shown in FIG.
  • Other configurations are the same as those of the endoscope system 10 of the first embodiment.
  • The oxygen saturation image generation unit 160 is obtained by removing the zoom detection unit 86 from the oxygen saturation image generation unit 76 of the first embodiment and adding an artifact detection unit 161. In other respects, the oxygen saturation image generation unit 160 includes the same signal ratio calculation unit 81, correlation storage unit 82, oxygen saturation calculation unit 83, image generation unit 84, image storage unit 87, and artifact correction unit 88 as the oxygen saturation image generation unit 76 of the first embodiment.
  • In the first embodiment, the oxygen saturation images generated by the image generation unit 84 are classified into non-enlarged oxygen saturation images and enlarged oxygen saturation images according to the presence or absence of a zoom operation. In the present embodiment, the non-enlarged and enlarged oxygen saturation images are not distinguished, and all images generated by the image generation unit 84 are simply referred to as oxygen saturation images.
  • The artifact detection unit 161 acquires the oxygen saturation image from the image generation unit 84 and detects the artifact 122 in the acquired oxygen saturation image.
  • The pattern (distribution, intensity, and so on) of the artifact 122 generated by magnified observation is substantially determined by the structure of the endoscope 12, such as the arrangement of the illumination optical system 24a and the imaging optical system 24b at the distal end portion 24, and by the magnification ratio of the specimen (or the distance between the distal end portion 24 and the specimen). For this reason, the artifact detection unit 161 can detect whether or not the artifact 122 has occurred by monitoring the pixel values at one or more predetermined points in the oxygen saturation image.
  • For example, at a monitored point where the artifact appears as a low oxygen state, the B pixel value, which is reduced by the gain applied for low oxygen saturation, is compared with a first threshold value, and the occurrence of the artifact is detected when the B pixel value is equal to or lower than the first threshold value. There may be no artifact at the monitored position and the specimen itself may actually be hypoxic, but the artifact is generally stronger than any hypoxic state that can actually occur in the specimen, so if the first threshold value is set with some margin, the occurrence of the artifact can be detected without erroneous detection.
  • Conversely, a pixel at a position where an artifact that always indicates a high oxygen state occurs may be monitored instead.
  • Monitoring the pixel values at two or more points improves the detection accuracy and is therefore preferable.
  • The detection method of the artifact is arbitrary; instead of comparing pixel values with a threshold value, the occurrence of the artifact 122 may be detected by extracting the frequency components characteristic of the artifact. A sketch of the threshold-based detection follows.
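  • The sketch below (Python; not part of the patent) illustrates the threshold-based detection. The monitored positions and the first threshold value are assumptions; requiring every monitored point to fall at or below the threshold corresponds to monitoring two or more points for better accuracy.

        def artifact_detected(b_channel, monitor_points, first_threshold):
            # Watch the B pixel value at fixed positions where the artifact is
            # known to appear; declare the artifact when all monitored values
            # are at or below the first threshold.
            values = [float(b_channel[y, x]) for (y, x) in monitor_points]
            return all(v <= first_threshold for v in values)

        # Example with hypothetical coordinates and threshold:
        #   artifact_detected(oxy_image_b, [(40, 60), (400, 600)], first_threshold=50)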
  • When no artifact is detected in the oxygen saturation image acquired from the image generation unit 84, the artifact detection unit 161 outputs the acquired oxygen saturation image to the structure enhancement unit 78 so that it is displayed on the monitor 18. The artifact detection unit 161 also stores the oxygen saturation image in which no artifact was detected in the image storage unit 87. That is, an oxygen saturation image in which the artifact detection unit 161 detects no artifact corresponds to the non-enlarged oxygen saturation image of the first embodiment.
  • On the other hand, an oxygen saturation image in which an artifact is detected is input by the artifact detection unit 161 to the artifact correction unit 88, which corrects the detected artifact and outputs the corrected image to the structure enhancement unit 78. That is, an oxygen saturation image in which an artifact is detected corresponds to the enlarged oxygen saturation image of the first embodiment.
  • The configuration of the artifact correction unit 88 is the same as in the first embodiment: the high-frequency component extraction unit 101 extracts the high-frequency components from the oxygen saturation image in which the artifact was detected and generates a high-frequency component image.
  • The enlarged portion extraction unit 102 acquires a past oxygen saturation image in which no artifact was detected from the image storage unit 87 and the oxygen saturation image in which the artifact was detected from the artifact detection unit 161, and the corresponding region detection unit 102a performs pattern matching between them. The region corresponding to the oxygen saturation image in which the artifact was detected is thereby found in the oxygen saturation image in which no artifact was detected, and this region is size-converted by the enlargement processing unit 102b.
  • The low-frequency component extraction unit 103 then extracts the low-frequency components from the size-converted image and generates a low-frequency component image.
  • Finally, the synthesis processing unit 104 synthesizes the high-frequency component image and the low-frequency component image to generate a synthetic oxygen saturation image in which the artifact is corrected, and outputs it to the structure enhancement unit 78.
  • As described above, the endoscope system including the oxygen saturation image generation unit 160 of the second embodiment detects the occurrence of the artifact 122 from the oxygen saturation image instead of monitoring the zoom operation, and when an artifact is detected, generates a synthetic oxygen saturation image in which the detected artifact is corrected. In other words, this endoscope system detects magnified observation by detecting the artifact.
  • This endoscope system can therefore generate and display a synthetic oxygen saturation image with the artifact corrected even when close-up observation is performed by bringing the distal end portion 24 of the endoscope 12 near the specimen without a zoom operation.
  • The first and second embodiments may also be combined. Specifically, while monitoring the zoom operation as in the first embodiment, the artifact may be corrected according to the flow of the second embodiment when no zoom operation is detected, and a synthetic oxygen saturation image in which the artifact is forcibly corrected as in the first embodiment may be generated when a zoom operation is detected, on the assumption that the artifact is occurring.
  • In this way, an accurate oxygen saturation image (synthetic oxygen saturation image) can be obtained both when magnified observation is performed by the zoom operation and when the distal end portion 24 of the endoscope 12 is brought close to the specimen without a zoom operation.
  • In the first and second embodiments, the synthetic oxygen saturation image with the artifact corrected is generated from the oxygen saturation image (non-enlarged oxygen saturation image or enlarged oxygen saturation image) generated by the image generation unit 84, but the artifact may instead be corrected at the stage of the individual image signals.
  • In this case, the oxygen saturation image generation unit 76 of the endoscope system 10 of the first embodiment is replaced with an oxygen saturation image generation unit 170 shown in FIG.
  • Other configurations are the same as those of the endoscope system 10 of the first embodiment.
  • The oxygen saturation image generation unit 170 includes a signal ratio calculation unit 81, a correlation storage unit 82, an oxygen saturation calculation unit 83, an image generation unit 84, a zoom detection unit 86, an image storage unit 87, an artifact correction unit 171, an image signal storage unit 172, and a signal processing switching unit 173.
  • The signal ratio calculation unit 81, correlation storage unit 82, oxygen saturation calculation unit 83, image generation unit 84, and zoom detection unit 86 are the same as those of the first embodiment.
  • However, the detection result of the zoom detection unit 86 is input to the signal processing switching unit 173.
  • The signal processing switching unit 173 switches the content of the signal processing applied to the input image signals by switching the output destination of the image signal of each color input in the special observation mode. Specifically, when no zoom operation is detected by the zoom detection unit 86 (during non-magnified observation), the signal processing switching unit 173 outputs each image signal input to the oxygen saturation image generation unit 170 to the signal ratio calculation unit 81 and the image generation unit 84, so that an oxygen saturation image is generated in the same manner as in the first embodiment.
  • In addition, the image signals input while no zoom operation is detected are stored in the image signal storage unit 172. That is, the image signal storage unit 172 stores the image signals obtained during non-magnified observation (hereinafter referred to as non-enlarged image signals), which correspond to the non-enlarged oxygen saturation image of the first embodiment.
  • On the other hand, when a zoom operation is detected, the signal processing switching unit 173 outputs the image signals obtained during magnified observation (hereinafter referred to as enlarged image signals) to the artifact correction unit 171.
  • The artifact correction unit 171 includes a high-frequency component extraction unit 181, an enlarged portion extraction unit 182, a low-frequency component extraction unit 183, and a synthesis processing unit 184.
  • The enlarged portion extraction unit 182 includes a corresponding region detection unit 182a and an enlargement processing unit 182b. The basic operation of each of these units is the same as in the first embodiment, but in the artifact correction unit 171 the high-frequency component extraction unit 181, the enlarged portion extraction unit 182 (corresponding region detection unit 182a and enlargement processing unit 182b), the low-frequency component extraction unit 183, and the synthesis processing unit 184 perform their processing on the image signals rather than on an oxygen saturation image.
  • The enlarged image signal of each color output during magnified observation is input to the high-frequency component extraction unit 181, which extracts the high-frequency components from the enlarged image signal of each color and generates, for each color, a high-frequency component image signal composed of the extracted components.
  • The corresponding region detection unit 182a acquires the enlarged image signals from the signal processing switching unit 173 and the non-enlarged image signals of each color from the image signal storage unit 172. By pattern matching between the enlarged image signal and the non-enlarged image signal of each color, the portion corresponding to the image represented by the enlarged image signal is extracted from the non-enlarged image signal.
  • The enlargement processing unit 182b then performs size conversion so that the portion extracted from the non-enlarged image signal is enlarged to the size of the enlarged image signal.
  • The low-frequency component extraction unit 183 extracts the low-frequency components from the size-converted image signal and generates a low-frequency component image signal. The synthesis processing unit 184 then synthesizes the high-frequency component image signal and the low-frequency component image signal of the corresponding frame and color to generate a synthetic image signal.
  • Since the artifact correction unit 171 receives the R1, G1, and B1 image signals of the first frame and the R2, G2, and B2 image signals of the second frame, it outputs a synthetic image signal with the artifact corrected for each of them: the R1, G1, and B1 synthetic image signals of the first frame and the R2, G2, and B2 synthetic image signals of the second frame.
  • Each synthetic image signal output from the synthesis processing unit 184 is input to the signal ratio calculation unit 81 and the image generation unit 84, so that a synthetic oxygen saturation image free from artifacts is generated and displayed based on the synthetic image signals.
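  • The signal-level variant of this embodiment can be pictured as a thin loop over the six image signals (sketch, not part of the patent; the dictionary keys and the injected correct_one function, for example a per-plane version of the pipeline sketched for the first embodiment, are assumptions):

        def correct_each_signal(enlarged_signals, non_enlarged_signals, correct_one):
            # enlarged_signals / non_enlarged_signals: dicts keyed by "R1", "G1",
            # "B1", "R2", "G2", "B2". correct_one(reference, enlarged) returns a
            # single corrected (synthetic) image signal.
            return {name: correct_one(non_enlarged_signals[name], signal)
                    for name, signal in enlarged_signals.items()}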
  • In this way, the processing of the first embodiment for generating a synthetic oxygen saturation image with the artifact corrected can be performed in advance, at the stage of the image signals, before the oxygen saturation image is generated.
  • The same applies to the process of generating the synthetic oxygen saturation image of the second embodiment, or of the combination of the first and second embodiments.
  • In the third embodiment, the artifact is corrected at the stage of each image signal input to the oxygen saturation image generation unit 170, but similar artifact correction processing may instead be applied to the signal ratio B1/G2 and the signal ratio R2/G2 output by the signal ratio calculation unit 81.
  • Likewise, the same artifact correction processing may be applied to the oxygen saturation data output from the oxygen saturation calculation unit 83.
  • In the first to third embodiments, the phosphor 44 is provided at the distal end portion 24 of the endoscope 12; instead, the phosphor 44 may be provided inside the light source device 14, as in the endoscope system 300 shown in FIG.
  • In this case, the phosphor 44 is placed between the light guide 41 and the first blue laser light source (473LD) 34 and second blue laser light source (445LD) 36.
  • The first blue laser light source 34 or the second blue laser light source 36 emits the first or second blue laser light toward the phosphor 44, whereby the first white light or the second white light is generated and irradiated into the specimen through the light guide 41.
  • The rest is the same as in the endoscope systems of the first to third embodiments.
  • Although the first and second blue laser beams are made incident on the same phosphor 44 here, the first blue laser beam and the second blue laser beam may instead be made incident on separate first and second phosphors.
  • In the fourth embodiment, the light source device 14 of the endoscope system 400 includes an LED (Light Emitting Diode) light source unit 401 and an LED light source control unit 404 instead of the first blue laser light source 34, the second blue laser light source 36, and the light source control unit 40. The phosphor 44 is not provided in the illumination optical system 24a of the endoscope system 400. The rest is the same as in the endoscope systems of the first to third embodiments.
  • The LED light source unit 401 includes an R-LED 401a, a G-LED 401b, and a B-LED 401c as light sources that emit light limited to specific wavelength bands.
  • The R-LED 401a emits red band light in the red region of 600 to 720 nm (hereinafter simply referred to as red light), the G-LED 401b emits green band light in the green region of 480 to 620 nm (hereinafter simply referred to as green light), and the B-LED 401c emits blue band light in the blue region of 400 to 500 nm (hereinafter simply referred to as blue light).
  • The LED light source unit 401 also has a high-pass filter (HPF) 402 that can be inserted into and removed from the optical path of the blue light emitted from the B-LED 401c.
  • The high-pass filter 402 cuts blue light in the wavelength band of 450 nm or less and transmits light in the wavelength band longer than 450 nm.
  • The cutoff wavelength (450 nm) of the high-pass filter 402 is a wavelength at which the extinction coefficients of oxyhemoglobin and reduced hemoglobin are substantially equal (see FIG. 10), and the magnitude relationship between the two extinction coefficients is reversed at this wavelength.
  • The correlation stored in the correlation storage unit 82 assumes the case where the extinction coefficient of oxyhemoglobin is larger than that of reduced hemoglobin, so if a signal component from the wavelength band at or below the cutoff wavelength were included, the signal ratio B1/G2 would become lower than the proper value measured at 473 nm and an inaccurate oxygen saturation would be calculated. For this reason, the high-pass filter 402 prevents the specimen from being irradiated with light in the wavelength band at or below the cutoff wavelength when the B1 image signal used for calculating the oxygen saturation is acquired.
  • Therefore, the high-pass filter 402 is inserted in front of the B-LED 401c in the special observation mode and retracted to its retreat position in the normal observation mode.
  • Insertion and removal of the high-pass filter 402 are performed by the HPF insertion/extraction unit 403 under the control of the LED light source control unit 404.
  • The LED light source control unit 404 controls the turning on and off of the LEDs 401a to 401c of the LED light source unit 401 and the insertion and removal of the high-pass filter 402. Specifically, as shown in FIG. 21, in the normal observation mode the LED light source control unit 404 turns on all of the LEDs 401a to 401c and retracts the high-pass filter 402 from the optical path of the B-LED 401c.
  • In the special observation mode, the LED light source control unit 404 inserts the high-pass filter 402 in the optical path of the B-LED 401c.
  • In the first frame, the B-LED 401c is turned on and the R-LED 401a and the G-LED 401b are turned off, so that the specimen is irradiated with blue light from which the wavelength band of 450 nm or less has been cut.
  • In the second frame, the R-LED 401a, the G-LED 401b, and the B-LED 401c are all turned on, so that the specimen is irradiated with the red light, the green light, and the blue light from the B-LED 401c from which the wavelength band of 450 nm or less has been cut.
  • As a result, the sensor 48 outputs the B1 image signal in the first frame and the R2, G2, and B2 image signals in the second frame.
  • The subsequent processing can be performed in the same manner as in the endoscope systems of the first to third embodiments.
  • Therefore, the endoscope system 400 of the fourth embodiment using LEDs can also generate and display a synthetic oxygen saturation image in which the artifact is corrected.
  • In the fourth embodiment, the specimen is imaged with the high-pass filter 402 inserted in both the first frame and the second frame of the special observation mode, but the high-pass filter 402 may instead be inserted only in the first frame and retracted in the second frame.
  • Also, in the first frame of the special observation mode only the B-LED 401c is turned on and only blue light is irradiated onto the specimen, but the R-LED 401a and the G-LED 401b may also be turned on in the first frame so that the sensor 48 outputs the R1 image signal and the G1 image signal as well.
  • In the fifth embodiment, the light source device 14 of the endoscope system 500 includes a broadband light source 501, a rotary filter 502, and a rotary filter control unit 503 instead of the first and second blue laser light sources 34 and 36 and the light source control unit 40.
  • In addition, the sensor 505 of the endoscope system 500 is a monochrome imaging sensor without color filters. The rest is the same as in the endoscope systems of the first to third embodiments.
  • The broadband light source 501 is, for example, a xenon lamp, a white LED, or the like, and emits white light whose wavelength band extends from blue to red.
  • The rotary filter 502 includes a normal observation mode filter 510 and a special observation mode filter 511 (see FIG. 24), and is movable in the radial direction, on the optical path along which the white light emitted from the broadband light source 501 enters the light guide 41, between a first position for the normal observation mode where the normal observation mode filter 510 is placed on the optical path and a second position for the special observation mode where the special observation mode filter 511 is placed on the optical path. The movement of the rotary filter 502 between the first and second positions is controlled by the rotary filter control unit 503 according to the selected observation mode.
  • The rotary filter 502 rotates in accordance with the imaging frames of the sensor 505 while placed at the first or second position.
  • The rotation speed of the rotary filter 502 is controlled by the rotary filter control unit 503 according to the selected observation mode.
  • The normal observation mode filter 510 is provided on the inner peripheral portion of the rotary filter 502 and includes an R filter 510a that transmits red light, a G filter 510b that transmits green light, and a B filter 510c that transmits blue light. When the rotary filter 502 is placed at the first position for the normal observation mode, the white light from the broadband light source 501 therefore enters one of the R filter 510a, the G filter 510b, and the B filter 510c in accordance with the rotation of the rotary filter 502.
  • The specimen is thus sequentially irradiated with red light, green light, and blue light depending on which filter the light passes through, and the sensor 505 images the specimen with each reflected light, sequentially outputting an R image signal, a G image signal, and a B image signal.
  • The special observation mode filter 511 is provided on the outer peripheral portion of the rotary filter 502 and includes an R filter 511a that transmits red light, a G filter 511b that transmits green light, a B filter 511c that transmits blue light, and a narrowband filter 511d that transmits narrowband light of 473 ± 10 nm. When the rotary filter 502 is placed at the second position for the special observation mode, the white light from the broadband light source 501 therefore enters one of the R filter 511a, the G filter 511b, the B filter 511c, and the narrowband filter 511d in accordance with the rotation of the rotary filter 502.
  • The specimen is thus sequentially irradiated with red light, green light, blue light, and narrowband light (473 nm) depending on which filter the light passes through, and the sensor 505 images the specimen with each reflected light, sequentially outputting an R image signal, a G image signal, a B image signal, and a narrowband image signal.
  • the R image signal and the G image signal obtained in the special observation mode correspond to the R1 (or R2) image signal and the G1 (or G2) image signal of the first embodiment.
  • The B image signal obtained in the special observation mode corresponds to the B2 image signal of the first embodiment, and the narrowband image signal corresponds to the B1 image signal. The subsequent processing can therefore be performed in the same manner as in the endoscope systems of the first to third embodiments, so the endoscope system 500 of the fifth embodiment using the rotary filter 502 can also generate and display a synthetic oxygen saturation image in which the artifact is corrected.
  • In the above embodiments, the oxygen saturation is calculated based on the signal ratio B1/G2 and the signal ratio R2/G2, but the oxygen saturation may be calculated based only on the signal ratio B1/G2.
  • In this case, the correlation storage unit 82 may store the correlation between the signal ratio B1/G2 and the oxygen saturation.
  • In the above embodiments, an oxygen saturation image in which the oxygen saturation is imaged is generated and displayed, but in addition to or instead of this, a blood volume image in which the blood volume is imaged may be generated and displayed. Since the blood volume correlates with the signal ratio R2/G2, a blood volume image can be created by assigning different colors according to the signal ratio R2/G2.
  • Furthermore, in the above embodiments the oxygen saturation is calculated, but instead of or in addition to this, other biological function information may be calculated, such as an oxyhemoglobin index calculated by "blood volume (signal ratio R2/G2) × oxygen saturation (%)" or a reduced hemoglobin index calculated by "blood volume × (1 − oxygen saturation) (%)".
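  • A sketch of these derived quantities (Python with NumPy; not part of the patent). Using the raw signal ratio R2/G2 directly as the blood volume measure is an assumption; the text only states that the two are correlated.

        import numpy as np

        def hemoglobin_indices(r2, g2, sto2_percent):
            eps = 1e-6
            blood_volume = r2.astype(np.float64) / (g2.astype(np.float64) + eps)
            frac = np.clip(sto2_percent, 0.0, 100.0) / 100.0   # StO2 as a fraction
            oxy_index = blood_volume * frac                    # oxyhemoglobin index
            deoxy_index = blood_volume * (1.0 - frac)          # reduced hemoglobin index
            return blood_volume, oxy_index, deoxy_index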

Abstract

In the present invention, the artifacts characteristic of magnified observation are corrected in an oxygen saturation image. This endoscope system (10) comprises an image generation unit (84), a low-frequency component extraction unit (103), a high-frequency component extraction unit (101), and a synthesis processing unit (104). The image generation unit (84) generates a first oxygen saturation image based on a first image signal obtained by imaging a specimen and on the oxygen saturation calculated on the basis of the first image signal. The image generation unit (84) also generates a second oxygen saturation image based on a second image signal, obtained by imaging the specimen at a higher magnification than when the first image signal was obtained, and on the oxygen saturation calculated on the basis of the second image signal. The low-frequency component extraction unit (103) extracts the low-frequency component below a cutoff frequency from the first oxygen saturation image, and the high-frequency component extraction unit (101) extracts the high-frequency component at or above the cutoff frequency from the second oxygen saturation image. The synthesis processing unit (104) generates a combined oxygen saturation image resulting from their combination.
PCT/JP2014/067499 2013-08-21 2014-07-01 Système d'endoscopie, processeur et procédé de fonctionnement WO2015025620A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013171693A JP5990141B2 (ja) 2013-08-21 2013-08-21 内視鏡システム及びプロセッサ装置並びに作動方法
JP2013-171693 2013-08-21

Publications (1)

Publication Number Publication Date
WO2015025620A1 true WO2015025620A1 (fr) 2015-02-26

Family

ID=52483404

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/067499 WO2015025620A1 (fr) 2013-08-21 2014-07-01 Système d'endoscopie, processeur et procédé de fonctionnement

Country Status (2)

Country Link
JP (1) JP5990141B2 (fr)
WO (1) WO2015025620A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6920931B2 (ja) * 2017-09-01 2021-08-18 富士フイルム株式会社 医療画像処理装置、内視鏡装置、診断支援装置、及び、医療業務支援装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012125461A (ja) * 2010-12-16 2012-07-05 Fujifilm Corp 画像処理装置
JP2012143399A (ja) * 2011-01-12 2012-08-02 Fujifilm Corp 内視鏡システム、内視鏡システムのプロセッサ装置及び画像生成方法
JP2012213552A (ja) * 2011-04-01 2012-11-08 Fujifilm Corp 内視鏡システム、内視鏡システムのプロセッサ装置、及び画像処理方法
JP2012213551A (ja) * 2011-04-01 2012-11-08 Fujifilm Corp 生体情報取得システムおよび生体情報取得方法

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019130834A1 (fr) * 2017-12-26 2019-07-04 オリンパス株式会社 Dispositif et procédé de traitement d'image
JPWO2019130834A1 (ja) * 2017-12-26 2020-11-19 オリンパス株式会社 画像処理装置および画像処理方法
US11509834B2 (en) 2017-12-26 2022-11-22 Olympus Corporation Image processing apparatus and image processing method

Also Published As

Publication number Publication date
JP2015039502A (ja) 2015-03-02
JP5990141B2 (ja) 2016-09-07

Similar Documents

Publication Publication Date Title
JP5992936B2 (ja) 内視鏡システム、内視鏡システム用プロセッサ装置、内視鏡システムの作動方法、内視鏡システム用プロセッサ装置の作動方法
JP5887367B2 (ja) プロセッサ装置、内視鏡システム、及び内視鏡システムの作動方法
JP6039639B2 (ja) 内視鏡システム、内視鏡システム用プロセッサ装置、内視鏡システムの作動方法、及び内視鏡システム用プロセッサ装置の作動方法
JP5977772B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法
JP5654511B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法
JP6010571B2 (ja) 内視鏡システム、内視鏡システム用プロセッサ装置、内視鏡システムの作動方法、内視鏡システム用プロセッサ装置の作動方法
JP6092792B2 (ja) 内視鏡システム用プロセッサ装置、内視鏡システム、内視鏡システム用プロセッサ装置の作動方法、内視鏡システムの作動方法
JP6085648B2 (ja) 内視鏡用光源装置及び内視鏡システム
US10285631B2 (en) Light source device for endoscope and endoscope system
WO2014156604A1 (fr) Système d'endoscope, procédé opérationnel s'y rapportant et dispositif de traitement
JP6214503B2 (ja) 内視鏡用光源装置及び内視鏡システム
JP6129686B2 (ja) 内視鏡システム及びプロセッサ装置並びに作動方法並びにテーブル作成方法
JP2018051364A (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法
JP5990141B2 (ja) 内視鏡システム及びプロセッサ装置並びに作動方法
JP6099518B2 (ja) 内視鏡システム及び作動方法
JP6099529B2 (ja) 内視鏡システム、プロセッサ装置及び光源装置並びに作動方法
JP6254502B2 (ja) 内視鏡用光源装置及び内視鏡システム
JP6272956B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14837635

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14837635

Country of ref document: EP

Kind code of ref document: A1