WO2015045576A1 - Endoscope system, processor device of endoscope system, operation method of endoscope system, and operation method of processor device - Google Patents

Endoscope system, processor device of endoscope system, operation method of endoscope system, and operation method of processor device - Download PDF

Info

Publication number
WO2015045576A1
Authority
WO
WIPO (PCT)
Prior art keywords
region
oxygen saturation
score
endoscope system
disease state
Prior art date
Application number
PCT/JP2014/068764
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
泰士 白石
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社
Priority to EP14848419.9A (patent EP3050487B1)
Publication of WO2015045576A1
Priority to US15/058,391 (patent US10231658B2)


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/14551 - Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters, for measuring blood gases
    • A61B 1/000094 - Operational features of endoscopes characterised by electronic signal processing of image signals during use of the endoscope, extracting biological structures
    • A61B 1/00045 - Operational features of endoscopes provided with output arrangements: display arrangement
    • A61B 1/0005 - Display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B 1/043 - Endoscopes combined with photographic or television appliances, for fluorescence imaging
    • A61B 1/044 - Endoscopes combined with photographic or television appliances, for absorption imaging
    • A61B 1/0646 - Endoscopes with illuminating arrangements, with illumination filters
    • A61B 5/0071 - Measuring for diagnostic purposes using light, by measuring fluorescence emission
    • A61B 5/0084 - Measuring for diagnostic purposes using light, adapted for introduction into the body, e.g. by catheters
    • A61B 5/14503 - Measuring characteristics of blood in vivo, invasive, e.g. introduced into the body by a catheter or needle or using implanted sensors
    • A61B 5/14542 - Measuring characteristics of blood in vivo, for measuring blood gases
    • A61B 5/1459 - Measuring characteristics of blood in vivo using optical sensors, invasive, e.g. introduced into the body by a catheter
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H 50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices; for individual health risk assessment
    • G16Z 99/00 - Subject matter not provided for in other main groups of this subclass

Definitions

  • The present invention relates to an endoscope system for obtaining biological function information related to the oxygen saturation of blood hemoglobin from an image signal obtained by imaging an observation target in a subject, a processor device for the endoscope system, an operation method of the endoscope system, and an operation method of the processor device.
  • The ratio of the first signal light image signal (corresponding to the reflected light of the first signal light detected by the sensor) to the second signal light image signal (corresponding to the reflected light of the second signal light), hereinafter called the signal ratio, remains constant while the oxygen saturation in the blood vessel does not change, and changes when the oxygen saturation changes. Therefore, the oxygen saturation can be calculated based on the signal ratio between the first signal light image signal and the second signal light image signal.
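The mapping from signal ratio to oxygen saturation can be sketched as a calibration-table lookup. This is a minimal illustration only: the function name `ratio_to_spo2` and the calibration points are hypothetical, and a real system would use a calibrated two-dimensional table that also accounts for blood volume.

```python
import numpy as np

# Hypothetical calibration: signal ratios versus known oxygen saturation
# values. These numbers are illustrative, not from the patent.
CAL_RATIOS = np.array([0.40, 0.55, 0.70, 0.85, 1.00])   # signal ratio
CAL_SPO2   = np.array([100.0, 80.0, 60.0, 40.0, 20.0])  # oxygen saturation (%)

def ratio_to_spo2(ratio):
    """Map a signal ratio (first / second signal light image signal)
    to an oxygen saturation estimate by linear interpolation."""
    # np.interp requires ascending x values, so interpolate over the ratio axis.
    return float(np.interp(ratio, CAL_RATIOS, CAL_SPO2))

print(ratio_to_spo2(0.55))  # → 80.0
```

Applied per pixel, this yields the oxygen saturation map from which the oxygen saturation image and distribution pattern are derived.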
  • An oxygen saturation image is an image representing the oxygen saturation.
  • The accuracy of a diagnosis based on an oxygen saturation image depends largely on the knowledge and experience (skill level) of the doctor, so diagnosis results may vary.
  • Diagnosis results such as the degree of progression (for example, stage classification) may differ from doctor to doctor.
  • The degree of progression of cancer cannot be diagnosed with the various images provided by an endoscope system, such as an oxygen saturation image, alone.
  • It is an object of the present invention to provide an endoscope system that presents information to support the doctor so that a more accurate and detailed diagnosis can be performed based on the oxygen saturation of the observation target, a processor device for the endoscope system, an operation method of the endoscope system, and an operation method of the processor device.
  • The endoscope system of the present invention includes a light source device, an image sensor, an oxygen saturation calculation unit, a distribution pattern generation unit, a disease state score calculation unit, and a display unit.
  • The light source device irradiates the observation target with light.
  • The image sensor images the observation target using the reflected light of the light emitted from the light source device, and outputs an image signal.
  • The oxygen saturation calculation unit calculates the oxygen saturation of the observation target based on the image signal.
  • The distribution pattern generation unit generates a distribution pattern representing a distribution related to the oxygen saturation.
  • The disease state score calculation unit calculates a disease state score representing the disease state of the observation target based on the distribution pattern.
  • The display unit displays the disease state score or information based on the disease state score.
  • The disease state score when a high-oxygen region with oxygen saturation above a certain value exists inside a low-oxygen region with oxygen saturation below a certain value is preferably larger than the disease state score when no high-oxygen region exists in the low-oxygen region. Further, when a high-oxygen region exists in the low-oxygen region, the disease state score is preferably larger the larger the ratio of the high-oxygen region to the low-oxygen region.
  • Conversely, the disease state score when a high-oxygen region exists inside the low-oxygen region may be made smaller than the disease state score when no high-oxygen region exists within the low-oxygen region.
  • In that case, the disease state score decreases as the ratio of the high-oxygen region to the low-oxygen region increases.
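The area-ratio scoring described above can be sketched as follows. The thresholds (60% and 80%), the 0-100 linear scale, and the simplification that "inside the low-oxygen region" is approximated by co-occurrence within the analyzed patch are all illustrative assumptions, not values from the patent.

```python
import numpy as np

def condition_score(spo2_map, low_thresh=60.0, high_thresh=80.0):
    """Disease state score from an oxygen saturation map (%): larger when
    high-oxygen pixels appear alongside a low-oxygen region. Thresholds
    and the linear 0-100 scale are illustrative, not from the patent."""
    low_region = spo2_map < low_thresh      # hypoxic surroundings
    high_region = spo2_map >= high_thresh   # well-oxygenated pixels
    low_area = int(low_region.sum())
    high_area = int(high_region.sum())
    if low_area == 0:
        return 0.0
    # A real system would test spatial containment of the high-oxygen
    # region within the low-oxygen region, not just co-occurrence.
    return 100.0 * high_area / (low_area + high_area)

spo2 = np.full((8, 8), 50.0)   # low-oxygen background
spo2[3:5, 3:5] = 90.0          # small high-oxygen island
print(condition_score(spo2))   # → 6.25
```

The score grows with the proportion of high-oxygen area, matching the "larger ratio, larger score" behavior; the inverted variant would simply subtract this value from a baseline.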
  • The disease state score calculation unit compares a reference pattern having a specific distribution shape with the distribution pattern calculated by the distribution pattern generation unit, and calculates the similarity between the reference pattern and the calculated distribution pattern as the disease state score.
  • The reference pattern having a specific distribution shape is, for example, a pattern in which a high-oxygen region with oxygen saturation above a certain value exists inside a low-oxygen region with oxygen saturation below a certain value.
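A similarity-based score of this kind can be sketched by comparing the distribution pattern (here, a histogram over oxygen saturation bins) with the reference pattern. Cosine similarity is an illustrative choice; the patent does not fix the metric, and the histograms below are hypothetical.

```python
import numpy as np

def pattern_similarity(pattern, reference):
    """Similarity between an oxygen saturation distribution pattern and a
    reference pattern, as a score in [0, 1]. Cosine similarity is an
    illustrative metric, not specified by the patent."""
    p = np.asarray(pattern, dtype=float).ravel()
    r = np.asarray(reference, dtype=float).ravel()
    denom = np.linalg.norm(p) * np.linalg.norm(r)
    return float(p @ r / denom) if denom else 0.0

# Reference: mostly low-oxygen distribution with a high-oxygen spike.
reference = np.array([6, 2, 1, 1, 3])   # counts per oxygen saturation bin
observed  = np.array([5, 3, 1, 1, 4])
print(round(pattern_similarity(observed, reference), 3))
```

The closer the observed distribution is to the reference shape, the higher the score, which is then displayed as (or mapped to) the disease state score.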
  • The distribution pattern generation unit may generate a distribution pattern for a region of interest designated in advance.
  • In that case, the disease state score calculation unit calculates the disease state score for the region of interest.
  • The endoscope system may further include a similar clinical data selection unit.
  • The similar clinical data selection unit refers to a database in which a plurality of items of past clinical data are stored in advance, compares the distribution pattern with the clinical data, selects similar clinical data whose distribution pattern is similar, and displays it on the display unit.
  • The similar clinical data selection unit displays, for example, an oxygen saturation image included in the similar clinical data on the display unit.
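Selecting similar clinical data amounts to a nearest-neighbor search over stored distribution patterns. The sketch below uses Euclidean distance and a plain dictionary as the database; both are illustrative assumptions, as the patent leaves the comparison method and storage format open.

```python
import numpy as np

def select_similar_case(pattern, database):
    """Return the ID and distance of the stored clinical case whose
    distribution pattern is closest to the current one. Euclidean
    distance and the dict-based database are illustrative choices."""
    pattern = np.asarray(pattern, dtype=float)
    best_id, best_dist = None, float("inf")
    for case_id, stored in database.items():
        dist = float(np.linalg.norm(pattern - np.asarray(stored, dtype=float)))
        if dist < best_dist:
            best_id, best_dist = case_id, dist
    return best_id, best_dist

db = {
    "case-001": [6, 2, 1, 1, 3],   # hypothetical stored patterns
    "case-002": [1, 1, 2, 3, 6],
}
print(select_similar_case([5, 3, 1, 1, 4], db))
```

The selected case's stored oxygen saturation image (and any associated findings) would then be shown on the display unit alongside the live image.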
  • the endoscope system may further include a treatment effect score calculation unit.
  • The treatment effect score calculation unit calculates a treatment effect score representing the effect of a specific treatment method based on the distribution pattern, and displays the treatment effect score on the display unit.
  • the endoscope system may further include an automatic storage control unit.
  • When the disease state score reaches or exceeds a specified value, the automatic storage control unit automatically saves the oxygen saturation image, generated based on the image signal and the oxygen saturation, in association with the disease state score.
  • The display unit may display the disease state score, or information based on it, when the disease state score is equal to or higher than a specified value.
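The threshold-triggered automatic storage can be sketched in a few lines. The threshold value of 70.0 and the list-based storage are illustrative stand-ins for the specified value and recording unit described above.

```python
def autosave_if_high(score, oxygen_saturation_image, storage, threshold=70.0):
    """Automatically store the oxygen saturation image together with its
    disease state score once the score reaches a specified value.
    The threshold of 70.0 and list-based storage are illustrative."""
    if score >= threshold:
        storage.append({"score": score, "image": oxygen_saturation_image})
        return True
    return False

saved = []
autosave_if_high(85.0, "frame-0042", saved)   # stored
autosave_if_high(40.0, "frame-0043", saved)   # below threshold, skipped
print(len(saved))  # → 1
```

Gating both storage and display on the same specified value keeps the doctor's attention on frames the system considers clinically significant.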
  • The endoscope system of the present invention preferably includes a feature region extraction unit that extracts a feature region of the observation target based on the image signal.
  • In that case, the disease state score calculation unit calculates the disease state score based on the distribution pattern of the oxygen saturation in the feature region.
  • The feature region extraction unit extracts the feature region based on, for example, a blue image signal obtained from the blue pixels of the image sensor or a green image signal obtained from the green pixels.
  • The feature region is, for example, a red region.
  • The disease state score when a high-oxygen region with oxygen saturation above a certain value exists in the red region is preferably larger than the disease state score when no high-oxygen region exists in the red region.
  • In that case, the disease state score is larger the larger the ratio of the high-oxygen region to the red region.
  • Conversely, the disease state score when a high-oxygen region exists in the red region may be made smaller than the disease state score when no high-oxygen region exists in the red region.
  • In that case, the disease state score may be reduced as the ratio of the high-oxygen region to the red region increases.
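Extracting a red feature region can be sketched as a per-pixel test of whether the R image signal dominates the G and B signals. The dominance ratio of 1.5 is an illustrative threshold, not a value from the patent, which only states that the extraction uses the blue or green image signals.

```python
import numpy as np

def extract_red_region(r_signal, g_signal, b_signal, ratio=1.5):
    """Boolean mask of 'red' pixels, where the R image signal dominates
    the G and B image signals. The dominance ratio 1.5 is illustrative."""
    r = np.asarray(r_signal, dtype=float)
    g = np.asarray(g_signal, dtype=float)
    b = np.asarray(b_signal, dtype=float)
    return (r > ratio * g) & (r > ratio * b)

r = np.array([[200, 80], [180, 90]])
g = np.array([[ 60, 70], [ 50, 85]])
b = np.array([[ 40, 60], [ 45, 80]])
print(extract_red_region(r, g, b))
```

The resulting mask restricts the oxygen saturation distribution pattern, and hence the disease state score, to the reddened (e.g. inflamed) tissue.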
  • A message display control unit that monitors the disease state score and displays a message corresponding to the disease state score on the display unit may also be provided.
  • The processor device of the endoscope system of the present invention is a processor device for an endoscope system having a light source device that irradiates an observation target with light, an image sensor that images the observation target using the reflected light of the irradiated light and outputs an image signal, and a display unit; the processor device includes an oxygen saturation calculation unit, a distribution pattern generation unit, and a disease state score calculation unit.
  • The oxygen saturation calculation unit calculates the oxygen saturation of the observation target based on the image signal.
  • The distribution pattern generation unit generates a distribution pattern representing a distribution related to the oxygen saturation.
  • The disease state score calculation unit calculates a disease state score representing the disease state of the observation target based on the distribution pattern.
  • A feature region extraction unit that extracts a feature region of the observation target based on the image signal may also be provided.
  • In that case, the disease state score calculation unit calculates the disease state score based on the distribution pattern of the oxygen saturation in the feature region.
  • The operation method of the endoscope system of the present invention is an operation method of an endoscope system that includes a light source device for irradiating an observation target with light, an image sensor that images the observation target using the reflected light of the irradiated light and outputs an image signal, and a display unit; the method includes an oxygen saturation calculation step, a distribution pattern generation step, a disease state score calculation step, and a display step.
  • In the oxygen saturation calculation step, the oxygen saturation calculation unit calculates the oxygen saturation of the observation target based on the image signal.
  • In the distribution pattern generation step, the distribution pattern generation unit generates a distribution pattern representing a distribution related to the oxygen saturation.
  • In the disease state score calculation step, the disease state score calculation unit calculates a disease state score representing the disease state of the observation target based on the distribution pattern.
  • In the display step, the display unit displays the disease state score or information based on the disease state score.
  • The method may also include a feature region extraction step in which the feature region extraction unit extracts a feature region of the observation target based on the image signal.
  • In that case, the disease state score calculation unit calculates the disease state score based on the distribution pattern of the oxygen saturation in the feature region.
  • The operation method of the processor device of the present invention is an operation method of a processor device that processes an image signal obtained by imaging an observation target, and includes an oxygen saturation calculation step, a distribution pattern generation step, and a disease state score calculation step.
  • In the oxygen saturation calculation step, the oxygen saturation calculation unit calculates the oxygen saturation of the observation target based on the image signal.
  • In the distribution pattern generation step, the distribution pattern generation unit generates a distribution pattern representing a distribution related to the oxygen saturation.
  • In the disease state score calculation step, the disease state score calculation unit calculates a disease state score representing the disease state of the observation target based on the distribution pattern.
  • The method may also include a feature region extraction step in which the feature region extraction unit extracts a feature region of the observation target based on the image signal.
  • In that case, the disease state score calculation unit calculates the disease state score based on the distribution pattern of the oxygen saturation in the feature region.
  • According to the endoscope system, the processor device of the endoscope system, the operation method of the endoscope system, and the operation method of the processor device of the present invention, information can be presented that supports the doctor in performing a more accurate and detailed diagnosis based on the oxygen saturation.
  • the endoscope system 10 includes an endoscope 12, a light source device 14, a processor device 16, a monitor 18 (display unit), and a console 20.
  • the endoscope 12 is optically connected to the light source device 14 and electrically connected to the processor device 16.
  • The endoscope 12 includes an insertion portion 21 to be inserted into the subject, an operation portion 22 provided at the proximal end of the insertion portion 21, and a bending portion 23 and a distal end portion 24 provided on the distal end side of the insertion portion 21.
  • By operating the angle knob 22a of the operation unit 22, the bending portion 23 performs a bending operation that directs the distal end portion 24 in a desired direction.
  • the operation unit 22 includes a mode switch SW (mode switch) 22b, a zoom operation unit 22c, and a freeze button (not shown) for storing a still image.
  • The mode switch SW 22b is used to switch between two modes: a normal observation mode and a special observation mode.
  • The normal observation mode is a mode in which a normal light image, in which the observation target in the subject is rendered in full color, is displayed on the monitor 18.
  • the special observation mode is a mode in which an oxygen saturation image obtained by imaging the oxygen saturation of blood hemoglobin to be observed is displayed on the monitor 18.
  • the zoom operation unit 22c is used for a zoom operation for driving the zoom lens 47 (see FIG. 2) in the endoscope 12 to enlarge the observation target.
  • the processor device 16 is electrically connected to the monitor 18 and the console 20.
  • the monitor 18 displays images such as normal light images and oxygen saturation images, and information related to these images (hereinafter referred to as image information and the like).
  • the console 20 functions as a UI (user interface) that receives input operations such as function settings.
  • a recording unit (not shown) for recording image information or the like may be connected to the processor device 16.
  • The light source device 14 includes, as light emission sources, a first blue laser light source (473LD, laser diode) 34 that emits first blue laser light with a center wavelength of 473 nm and a second blue laser light source (445LD) 36 that emits second blue laser light with a center wavelength of 445 nm. The light emission of each of these semiconductor light sources 34 and 36 is controlled individually by the light source control unit 40, so the ratio of the light quantities emitted by the first blue laser light source 34 and the second blue laser light source 36 can be changed.
  • the light source control unit 40 turns on the second blue laser light source 36 in the normal observation mode.
  • In the special observation mode, the first blue laser light source 34 and the second blue laser light source 36 are turned on alternately at intervals of one frame.
  • the half width of the first and second blue laser beams is preferably about ⁇ 10 nm.
  • the first blue laser light source 34 and the second blue laser light source 36 can use broad area type InGaN laser diodes, and can also use InGaNAs laser diodes or GaNAs laser diodes.
  • the light source may be configured to use a light emitter such as a light emitting diode.
  • the first and second blue laser beams emitted from the light sources 34 and 36 are transmitted to a light guide (LG) 41 via optical members (all not shown) such as a condenser lens, an optical fiber, and a multiplexer.
  • The light guide 41 is built into the endoscope 12 and into the universal cord 17 (see FIG. 1) that connects the endoscope 12 to the light source device 14.
  • the light guide 41 propagates the first and second blue laser beams from the light sources 34 and 36 to the distal end portion 24 of the endoscope 12.
  • a multimode fiber can be used as the light guide 41.
  • For example, a thin fiber cable with a core diameter of 105 μm, a cladding diameter of 125 μm, and an overall diameter of φ0.3 to 0.5 mm including the protective outer layer can be used.
  • the distal end portion 24 of the endoscope 12 has an illumination optical system 24a and an imaging optical system 24b.
  • the illumination optical system 24a is provided with a phosphor 44 and an illumination lens 45.
  • the first and second blue laser beams are incident on the phosphor 44 from the light guide 41.
  • the phosphor 44 emits fluorescence when irradiated with the first or second blue laser light. Further, a part of the first or second blue laser light passes through the phosphor 44 as it is. The light emitted from the phosphor 44 is irradiated to the observation target through the illumination lens 45.
  • In the normal observation mode, the observation target is irradiated with white light having the spectrum shown in FIG. 3 (second white light).
  • the second white light is composed of second blue laser light and green to red second fluorescence excited and emitted from the phosphor 44 by the second blue laser light. Therefore, the wavelength range of the second white light extends to the entire visible light range.
  • In the special observation mode, when the first blue laser light and the second blue laser light are alternately incident on the phosphor 44, the observation target is alternately irradiated with the first white light and the second white light having the spectra shown in FIG. 4.
  • the first white light is composed of first blue laser light and green to red first fluorescence that is excited and emitted from the phosphor 44 by the first blue laser light. Therefore, the first white light has a wavelength range covering the entire visible light range.
  • the second white light is the same as the second white light irradiated in the normal observation mode.
  • The first fluorescence and the second fluorescence have substantially the same waveform (spectral shape), and the ratio of the intensity I1(λ) of the first fluorescence to the intensity I2(λ) of the second fluorescence (hereinafter, the inter-frame intensity ratio) is the same at any wavelength λ: I2(λ1)/I1(λ1) = I2(λ2)/I1(λ2). Since the inter-frame intensity ratio I2(λ)/I1(λ) affects the calculation accuracy of the oxygen saturation, the light source control unit 40 controls it with high accuracy so that a preset reference inter-frame intensity ratio is maintained.
  • The phosphor 44 preferably comprises a phosphor that absorbs part of the first and second blue laser light and is excited to emit green to red light (for example, a YAG phosphor or a BAM (BaMgAl10O17) phosphor).
  • high intensity first white light and second white light can be obtained with high luminous efficiency.
  • the intensity of each white light can be easily adjusted, and changes in color temperature and chromaticity can be kept small.
  • the imaging optical system 24b of the endoscope 12 includes an imaging lens 46, a zoom lens 47, and a sensor 48 (see FIG. 2). Reflected light from the observation object enters the sensor 48 via the imaging lens 46 and the zoom lens 47. As a result, a reflected image of the observation object is formed on the sensor 48.
  • the zoom lens 47 moves between the tele end and the wide end by operating the zoom operation unit 22c. When the zoom lens 47 moves to the tele end side, the reflected image to be observed is enlarged. On the other hand, when the zoom lens 47 moves to the wide end side, the reflected image to be observed is reduced. Note that when not magnifying observation (non-magnifying observation), the zoom lens 47 is disposed at the wide end. When performing magnified observation, the zoom lens 47 is moved from the wide end to the tele end side by operating the zoom operation unit 22c.
  • the sensor 48 is a color image pickup device, picks up a reflected image of an observation target, and outputs an image signal.
  • the sensor 48 is, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor.
  • The sensor 48 has RGB pixels provided with R, G, and B color filters on its imaging surface, and outputs image signals of the three colors R, G, and B by performing photoelectric conversion in the pixels of each color.
  • the B color filter has a spectral transmittance of 380 to 560 nm
  • the G color filter has a spectral transmittance of 450 to 630 nm
  • Since the emission intensity of the second blue laser light is much higher than that of the second fluorescence, most of the B image signal output from the B pixels is occupied by the reflected light component of the second blue laser light.
  • the first white light when the first white light is irradiated on the observation target in the special observation mode, the first blue laser light and a part of the green component of the first fluorescence are incident on the B pixel, and the first is applied to the G pixel. A part of the green component of the fluorescence is incident, and the red component of the first fluorescence is incident on the R pixel.
  • Since the emission intensity of the first blue laser light is much higher than that of the first fluorescence, most of the B image signal is occupied by the reflected light component of the first blue laser light.
  • the light incident component in each RGB pixel when the second white light is irradiated on the observation target in the special observation mode is the same as that in the normal observation mode.
  • the sensor 48 may be a so-called complementary color image sensor having C (cyan), M (magenta), Y (yellow), and G (green) complementary color filters on the imaging surface.
  • In this case, a color conversion unit that performs color conversion from the four-color CMYG image signals to three-color RGB image signals may be provided in any of the endoscope 12, the light source device 14, or the processor device 16. In this way, even when a complementary color image sensor is used, RGB three-color image signals can be obtained by color conversion from the four-color CMYG image signals.
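As an illustration of such a conversion, the four complementary-color signals can be mapped to RGB with a small linear matrix. The coefficients below are a generic complementary-color approximation (assuming the ideal relations C = G + B, M = R + B, Y = R + G), not values from the source; a real device would use coefficients calibrated to the sensor's filters.

```python
import numpy as np

# Illustrative CMYG -> RGB conversion matrix. Assuming C = G + B,
# M = R + B, Y = R + G, each RGB channel can be recovered as a linear
# combination of C, M, and Y (the direct G measurement is ignored here
# for simplicity).
CMYG_TO_RGB = np.array([
    [-0.5,  0.5,  0.5, 0.0],   # R = (-C + M + Y) / 2
    [ 0.5, -0.5,  0.5, 0.0],   # G = ( C - M + Y) / 2
    [ 0.5,  0.5, -0.5, 0.0],   # B = ( C + M - Y) / 2
])

def cmyg_to_rgb(cmyg):
    """Convert an (..., 4) array of CMYG signals to an (..., 3) RGB array."""
    return cmyg @ CMYG_TO_RGB.T
```

With these ideal relations, a pure red input (C=0, M=1, Y=1, G=0) maps back to R=1, G=0, B=0.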
  • the imaging control unit 49 performs imaging control of the sensor 48. As shown in FIG. 6, in the normal observation mode, the observation object illuminated with the second white light is imaged by the sensor 48 for each frame period. Thereby, RGB image signals are output from the sensor 48 for each frame.
  • the imaging control unit 49 controls the imaging of the sensor 48 in the special observation mode in the same manner as in the normal observation mode.
  • In the special observation mode, the first white light and the second white light are alternately irradiated onto the observation target in synchronization with the imaging frames of the sensor 48. Therefore, as shown in FIG., the observation object is imaged under the first white light in the first frame, and under the second white light in the next (second) frame.
  • the sensor 48 outputs image signals of each color of RGB in both the first frame and the second frame.
  • the RGB image signals obtained by imaging with the first white light in the first frame are called the R1 image signal, G1 image signal, and B1 image signal, respectively.
  • the RGB image signals obtained by imaging with the second white light in the second frame are called the R2 image signal, G2 image signal, and B2 image signal, respectively.
  • the oxygen saturation is calculated using the signal ratio B1 / G2 between the B1 image signal and the G2 image signal and the signal ratio R2 / G2 between the R2 image signal and the G2 image signal.
  • the signal ratio essential for calculating the oxygen saturation is the signal ratio B1 / G2 between the B1 image signal and the G2 image signal.
  • the component of the first white light that becomes the B1 image signal (the first blue laser light transmitted through the phosphor 44) is the first signal light, and the component of the second white light that becomes the G2 image signal is the second signal light.
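The per-pixel computation of the two signal ratios B1/G2 and R2/G2 can be sketched in NumPy as follows; this is a minimal version, and the epsilon guard against division by zero is an implementation choice not taken from the source.

```python
import numpy as np

def signal_ratios(b1, g2, r2, eps=1e-6):
    """Compute the per-pixel signal ratios B1/G2 and R2/G2.

    b1 is the B image signal captured under the first white light, while
    g2 and r2 are the G and R image signals captured under the second
    white light in the following frame.
    """
    g2_safe = np.maximum(g2, eps)  # avoid division by zero on dark pixels
    return b1 / g2_safe, r2 / g2_safe
```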
  • the image signals of the respective colors output from the sensor 48 are transmitted to a CDS (correlated double sampling) / AGC (automatic gain control) circuit 50 (see FIG. 2).
  • the CDS / AGC circuit 50 performs correlated double sampling (CDS) and automatic gain control (AGC) on the analog image signal output from the sensor 48.
  • the image signal that has passed through the CDS / AGC circuit 50 is converted into a digital image signal by the A / D converter 52.
  • the digitized image signal is input to the processor device 16.
  • the processor device 16 includes a receiving unit 54, an image processing switching unit 60, a normal observation image processing unit 62, a special observation image processing unit 64, an evaluation unit 65, and an image display signal generation unit 66.
  • the receiving unit 54 receives an image signal input from the endoscope 12.
  • the receiving unit 54 includes a DSP (Digital Signal Processor) 56 and a noise removing unit 58, and the DSP 56 performs digital signal processing such as color correction processing on the received image signal.
  • the noise removal unit 58 performs noise removal processing by, for example, a moving average method or a median filter method on the image signal that has been subjected to color correction processing or the like by the DSP 56.
  • the image signal from which the noise has been removed is input to the image processing switching unit 60.
  • the image processing switching unit 60 inputs an image signal to the normal observation image processing unit 62 when the mode switching SW 22b is set to the normal observation mode. On the other hand, when the mode switching SW 22b is set to the special observation mode, the image processing switching unit 60 inputs the image signal to the special observation image processing unit 64.
  • the normal observation image processing unit 62 includes a color conversion unit 68, a color enhancement unit 70, and a structure enhancement unit 72.
  • the color conversion unit 68 generates RGB image data in which the input RGB image signals for one frame are assigned to R pixels, G pixels, and B pixels, respectively.
  • the RGB image data is further subjected to color conversion processing such as 3×3 matrix processing, gradation conversion processing, and three-dimensional LUT processing.
  • the color enhancement unit 70 performs various color enhancement processes on the RGB image data that has been subjected to the color conversion process.
  • the structure enhancement unit 72 performs structure enhancement processing such as spatial frequency enhancement on the RGB image data that has been subjected to color enhancement processing.
  • the RGB image data subjected to the structure enhancement process by the structure enhancement unit 72 is input to the image display signal generation unit 66 as a normal observation image.
  • the special observation image processing unit 64 includes an oxygen saturation image generation unit 76 and a structure enhancement unit 78.
  • the oxygen saturation image generation unit 76 calculates the oxygen saturation and generates an oxygen saturation image representing the calculated oxygen saturation.
  • the structure enhancement unit 78 performs structure enhancement processing such as spatial frequency enhancement processing on the oxygen saturation image input from the oxygen saturation image generation unit 76.
  • the oxygen saturation image that has undergone the structure enhancement processing by the structure enhancement unit 78 is input to the image display signal generation unit 66.
  • the evaluation unit 65 acquires the oxygen saturation data calculated by the oxygen saturation image generation unit 76 and generates a distribution pattern representing the distribution of the oxygen saturation. Based on the generated distribution pattern, it calculates a disease state score representing the disease state of the observation target.
  • the disease state is, for example, the degree of progression of cancer.
  • the image display signal generation unit 66 converts the normal observation image or the oxygen saturation image into a display format signal (display image signal) and inputs it to the monitor 18. As a result, the normal observation image or the oxygen saturation image is displayed on the monitor 18.
  • A disease state score is input from the evaluation unit 65 to the image display signal generation unit 66. For this reason, the image display signal generation unit 66 displays the disease state score, or information based on the disease state score (a warning message, etc.), on the monitor 18 together with the oxygen saturation image.
  • By referring not only to the oxygen saturation image but also to the displayed disease state score or the information based on it, doctors can diagnose tissue that may contain a lesion more objectively, accurately, and in detail.
  • the oxygen saturation image generation unit 76 includes a signal ratio calculation unit 81, a correlation storage unit 82, an oxygen saturation calculation unit 83, and an image generation unit 84.
  • the signal ratio calculation unit 81 receives the B1 image signal, the G2 image signal, and the R2 image signal among the image signals for two frames input to the oxygen saturation image generation unit 76.
  • the signal ratio calculation unit 81 calculates a signal ratio B1 / G2 between the B1 image signal and the G2 image signal and a signal ratio R2 / G2 between the G2 image signal and the R2 image signal for each pixel.
  • the correlation storage unit 82 stores the correlation between the signal ratios B1/G2 and R2/G2 and the oxygen saturation. This correlation is stored in a two-dimensional table in which isolines of oxygen saturation are defined on the two-dimensional space shown in FIG. The positions and shapes of the isolines with respect to the signal ratio B1/G2 and the signal ratio R2/G2 are obtained in advance by a physical simulation of light scattering, and the interval between the isolines changes according to the blood volume (signal ratio R2/G2). The correlation between the signal ratios and the oxygen saturation is stored on a log scale.
  • the above correlation is closely related to the light absorption characteristics, light scattering characteristics, and the like of oxyhemoglobin (graph 90) and reduced hemoglobin (graph 91).
  • information on oxygen saturation is easy to handle at wavelengths where the difference in extinction coefficient between oxyhemoglobin and reduced hemoglobin is large, such as the center wavelength of 473 nm of the first blue laser beam.
  • the B1 image signal including a signal corresponding to 473 nm light is highly dependent not only on the oxygen saturation but also on the blood volume.
  • By using the two signal ratios B1/G2 and R2/G2 together, the oxygen saturation can therefore be obtained accurately without depending on the blood volume.
  • the oxygen saturation calculation unit 83 refers to the correlation stored in the correlation storage unit 82 and calculates, for each pixel, the oxygen saturation corresponding to the signal ratios B1/G2 and R2/G2 calculated by the signal ratio calculation unit 81. For example, when the signal ratio B1/G2 and the signal ratio R2/G2 for a specific pixel are B1*/G2* and R2*/G2*, respectively, referring to the correlation as shown in FIG. 11, the oxygen saturation corresponding to B1*/G2* and R2*/G2* is “60%”. Therefore, the oxygen saturation calculation unit 83 calculates the oxygen saturation of this pixel as “60%”.
  • Under normal circumstances, the signal ratio B1/G2 and the signal ratio R2/G2 rarely become extremely large or extremely small. That is, their values rarely exceed the lower limit line 93 corresponding to an oxygen saturation of 0%, or conversely fall below the upper limit line 94 corresponding to an oxygen saturation of 100%. If they do exceed the lower limit line 93, the oxygen saturation calculation unit 83 sets the oxygen saturation to 0%; if they fall below the upper limit line 94, it sets the oxygen saturation to 100%.
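The calculation in the oxygen saturation calculation unit 83, including the clamping to 0% and 100%, can be sketched as a table lookup. The table below is a synthetic placeholder (the real table comes from a light-scattering simulation), and the nearest-neighbour lookup on log-scaled axes is an assumption made for illustration.

```python
import numpy as np

N = 64
log_b1g2 = np.linspace(-1.0, 1.0, N)  # log10(B1/G2) axis
log_r2g2 = np.linspace(-1.0, 1.0, N)  # log10(R2/G2) axis

# Placeholder correlation: saturation falls as log10(B1/G2) rises; values
# are clipped so out-of-range ratios saturate at 0% or 100%, mirroring the
# lower limit line 93 and upper limit line 94 described above.
table = np.clip(50.0 - 50.0 * log_b1g2[:, None] + 0.0 * log_r2g2[None, :],
                0.0, 100.0)

def lookup_oxygen_saturation(b1g2, r2g2):
    """Nearest-neighbour lookup of oxygen saturation for one pixel."""
    i = int(np.clip(np.searchsorted(log_b1g2, np.log10(b1g2)), 0, N - 1))
    j = int(np.clip(np.searchsorted(log_r2g2, np.log10(r2g2)), 0, N - 1))
    return float(table[i, j])
```

Ratios outside the tabulated range clamp to the 0% or 100% edge of the table, just as the text describes for the limit lines.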
  • the image generation unit 84 generates an oxygen saturation image, in which the oxygen saturation is visualized, from the oxygen saturation data calculated by the oxygen saturation calculation unit 83 and the B2 image signal, G2 image signal, and R2 image signal. Specifically, the image generation unit 84 applies a gain corresponding to the oxygen saturation to the input original B2, G2, and R2 image signals for each pixel, and generates RGB image data using the gain-processed signals. For example, for a pixel whose oxygen saturation is 60% or more, the image generation unit 84 multiplies the B2, G2, and R2 image signals all by the same gain of “1”.
  • For a pixel whose oxygen saturation is below 60%, the B2 image signal is multiplied by a gain less than “1”, and the G2 and R2 image signals are multiplied by a gain of “1” or more.
  • the RGB image data generated using the B2, G2, and R2 image signals after this gain processing is the oxygen saturation image.
  • the high oxygen region (region where the oxygen saturation is 60 to 100%) is expressed in the same color as the normal observation image.
  • a low oxygen region where the oxygen saturation is below a specific value (region where the oxygen saturation is 0 to 60%) is represented by a color (pseudo color) different from that of the normal observation image.
  • In the present embodiment, the image generation unit 84 applies the pseudo-coloring gain only to the low oxygen region, but a gain corresponding to the oxygen saturation may also be applied in the high oxygen region so that the entire oxygen saturation image is pseudo-colored.
  • Although the low oxygen region and the high oxygen region are separated here at an oxygen saturation of 60%, this boundary is also arbitrary.
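A sketch of the gain-based pseudo-colouring described above: pixels at or above the 60% boundary keep their natural colour (gain 1 on every channel), while hypoxic pixels have the B signal suppressed and the G/R signals boosted. The linear gain ramps are illustrative; the source does not give the exact gain curves.

```python
import numpy as np

def pseudo_color(b2, g2, r2, sat, threshold=60.0):
    """Build an RGB oxygen saturation image from the B2/G2/R2 signals.

    High oxygen pixels (sat >= threshold) are left in natural colour;
    low oxygen pixels get a B gain below 1 and G/R gains above 1, so the
    hypoxic region stands out as a pseudo colour.
    """
    low = sat < threshold
    t = np.where(low, sat / threshold, 1.0)  # 0 at 0% saturation, 1 at the boundary
    b_gain = np.where(low, t, 1.0)           # < 1 inside the low oxygen region
    gr_gain = np.where(low, 2.0 - t, 1.0)    # > 1 inside the low oxygen region
    return np.stack([r2 * gr_gain, g2 * gr_gain, b2 * b_gain], axis=-1)
```

Changing the ramps (or applying a gain above the threshold as well) would pseudo-colour the whole image, as the text notes is also possible.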
  • the evaluation unit 65 includes a distribution pattern generation unit 96, a disease state score calculation unit 97, and a reference pattern storage unit 98.
  • the distribution pattern generation unit 96 acquires oxygen saturation data calculated for each pixel by the oxygen saturation calculation unit 83, and generates a distribution pattern representing a distribution related to oxygen saturation.
  • the distribution pattern is, for example, a two-dimensional distribution pattern in which the oxygen saturation of each pixel is arranged like an image, a distribution pattern related to the gradient of the oxygen saturation obtained by differentiating it in a predetermined direction, or a spatial frequency spectrum obtained by Fourier-transforming the two-dimensional distribution pattern.
  • In the present embodiment, the distribution pattern generation unit 96 calculates the oxygen saturation distribution as the distribution pattern, but it may instead generate one of the other distribution patterns, or a plurality of distribution patterns.
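The three kinds of distribution pattern named above (the two-dimensional distribution itself, its gradient, and its spatial frequency spectrum) can all be derived from a 2-D oxygen saturation map, for example:

```python
import numpy as np

def distribution_patterns(sat_map):
    """Return the oxygen saturation map, its gradient magnitude, and its
    spatial frequency (amplitude) spectrum."""
    gy, gx = np.gradient(sat_map)        # derivatives in the two image directions
    gradient = np.hypot(gx, gy)          # gradient magnitude per pixel
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(sat_map)))
    return sat_map, gradient, spectrum
```

A flat (lesion-free) map has zero gradient and only a DC component in its spectrum; the high frequency components described below for cancerous tissue would appear away from the spectrum's centre.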
  • In the case of early cancer, in which the cancer tissue 103 remains in the mucosal layer 104 without infiltrating the muscularis mucosae 105, new blood vessels 103a are constructed around the cancer tissue 103, but these new blood vessels 103a do not reach the thick blood vessel 106a of the submucosal layer 106. For this reason, as shown in the oxygen saturation distribution 115 and the distribution 116 at its center, the cancer tissue 103 forms a low oxygen region whose oxygen saturation is lower than that of the normal tissue.
  • Oxygen consumption increases and oxygen saturation decreases where more cancerous tissue is present, so in early cancer the vicinity of the center of the cancer tissue 103 is the most hypoxic; viewed along the central X1-X2 cross section, the oxygen saturation distribution is U-shaped (or V-shaped).
  • In the case of advanced cancer, by contrast, the periphery of the cancer tissue is in a hypoxic state with lower oxygen saturation than the surrounding normal tissue, but the oxygen supply is abundant in the central portion where the new blood vessel 113a reaches the thick blood vessel 106a of the submucosal layer 106, so the center is, for example, in a high oxygen state equivalent to that of normal tissue. For this reason, in advanced cancer, a high oxygen region forms in the central portion, surrounded by an annular low oxygen region. Viewed along the central X1-X2 cross section, the distribution of oxygen saturation is close to a W shape with a convex center.
  • the distribution pattern generation unit 96 calculates the oxygen saturation distributions 115 and 125 as distribution patterns.
  • the gradient of oxygen saturation and the spatial frequency spectrum can be obtained by differentiating the oxygen saturation distributions 115 and 125 or by performing Fourier transform.
  • As shown in FIGS. 13 and 14, when cancer occurs, high frequency components appear in the spatial frequency spectrum of the oxygen saturation, and when the cancer progresses and infiltrates the muscularis mucosae 105, still higher frequency components arise.
  • the disease state score calculation unit 97 calculates a disease state score representing the disease state of the observation target based on the distribution pattern generated by the distribution pattern generation unit 96. Specifically, the generated distribution pattern is compared with a reference pattern stored in advance in the reference pattern storage unit 98, and the similarity between them is calculated as the disease state score. In the present embodiment, the distribution pattern generation unit 96 calculates the oxygen saturation distribution as the distribution pattern; accordingly, a template of the oxygen saturation distribution, generated based on past clinical data and the like, is stored in advance in the reference pattern storage unit 98 as the reference pattern. The disease state score calculation unit 97 calculates the disease state score (similarity) by matching the distribution pattern obtained from the distribution pattern generation unit 96 against the reference pattern.
  • the reference pattern 130 has a specific oxygen saturation distribution shape in which, for example, a high oxygen region 132 with an oxygen saturation equal to or higher than a certain value lies inside a low oxygen region 131 with an oxygen saturation lower than that value. That is, the reference pattern 130 simulates the distribution of oxygen saturation of advanced cancer.
  • the constant value of oxygen saturation that distinguishes the low oxygen region 131 and the high oxygen region 132 is determined based on clinical data.
  • the oxygen saturation of the outer peripheral portion 133 of the low oxygen region 131 is a value imitating the oxygen saturation of normal tissue.
  • the disease state score calculation unit 97 enlarges or reduces the reference pattern 130 and matches it against the distribution pattern calculated by the distribution pattern generation unit 96.
  • the distribution pattern 125 (see FIG. 14) of advanced cancer, which has a high oxygen region within a low oxygen region, has a higher similarity to the reference pattern 130 than the distribution pattern 115 (see FIG. 13) of early cancer, which has no high oxygen region within the low oxygen region. For this reason, the disease state score is large when observing advanced cancer. When there is no cancer, the distribution pattern calculated by the distribution pattern generation unit 96 contains no hypoxic region at all, so the disease state score is even smaller than when observing early cancer.
  • As described above, the disease state score is high when there is a high oxygen region within the low oxygen region, and it is preferable for the disease state score to become larger as the proportion of the high oxygen region within the low oxygen region increases. Since this proportion corresponds to the extent to which the muscularis mucosae 105 has been infiltrated, scoring the disease state in this way allows the progression of advanced cancer to be evaluated in more detail and more objectively.
  • In order to make the disease state score larger according to the proportion of the high oxygen region within the low oxygen region, for example, a plurality of reference patterns in which the area of the high oxygen region 132 is varied may be prepared in advance, matching may be performed against each of them, and the sum of the scores obtained by matching against each reference pattern may be used as the final disease state score.
  • Alternatively, a reference pattern in which the proportion of the high oxygen region within the low oxygen region differs from that of the reference pattern 130 may be generated and used, or a value obtained by correcting the similarity obtained by matching according to that proportion may be used as the disease state score.
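A minimal sketch of the matching step: the reference pattern is rescaled to several sizes and compared with the observed oxygen saturation map, the similarity being a normalised cross-correlation. The scale set, the nearest-neighbour rescaling, and matching only at one position (a full implementation would slide the template across the map) are all simplifications, not details from the source.

```python
import numpy as np

def pattern_score(sat_map, reference, scales=(0.5, 1.0, 2.0)):
    """Best normalised cross-correlation between the map and scaled
    copies of the reference pattern; returns -1.0 if no scale fits."""
    best = -1.0
    h, w = sat_map.shape
    for s in scales:
        rh = max(2, int(reference.shape[0] * s))
        rw = max(2, int(reference.shape[1] * s))
        if rh > h or rw > w:
            continue  # scaled reference no longer fits in the map
        # nearest-neighbour rescale of the reference to (rh, rw)
        yi = np.arange(rh) * reference.shape[0] // rh
        xi = np.arange(rw) * reference.shape[1] // rw
        ref = reference[np.ix_(yi, xi)]
        patch = sat_map[:rh, :rw]  # simplification: match at one position only
        a, b = patch - patch.mean(), ref - ref.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        if denom > 0:
            best = max(best, float((a * b).sum() / denom))
    return best
```

Summing the per-scale scores instead of taking the maximum would give the "total of matches against each reference pattern" variant mentioned above.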
  • the disease state score calculated by the disease state score calculation unit 97 is input to the image display signal generation unit 66 and displayed on the monitor 18 together with the oxygen saturation image.
  • the flow of observation by the endoscope system 10 of the present embodiment will be described along the flowchart of FIG.
  • screening is performed from the farthest view state (S10).
  • a normal observation image is displayed on the monitor 18.
  • When a site that may contain a lesion is found, the mode switching SW 22b is operated to switch to the special observation mode (S12), and the potentially diseased site is diagnosed.
  • In the special observation mode, the first and second white lights are alternately irradiated onto the observation target in synchronization with the imaging frames of the sensor 48. The sensor 48 therefore outputs the R1, G1, and B1 image signals in the frame irradiated with the first white light, and the R2, G2, and B2 image signals in the frame irradiated with the second white light. Based on these two frames of image signals, the signal ratio calculation unit 81 calculates the signal ratio B1/G2 and the signal ratio R2/G2 for each pixel (S13).
  • the oxygen saturation calculation unit 83 calculates the oxygen saturation for each pixel based on the signal ratio B1 / G2 and the signal ratio R2 / G2 (S14).
  • the image generation unit 84 generates an oxygen saturation image by applying a gain corresponding to the oxygen saturation to the B2, G2, and R2 image signals (S15).
  • the distribution pattern generation unit 96 calculates a distribution pattern representing a distribution related to oxygen saturation based on the oxygen saturation data calculated by the oxygen saturation calculation unit 83 (S16), and the disease state score calculation unit 97 further calculates the distribution pattern. Based on this distribution pattern, a medical condition score is calculated (S17).
  • the oxygen saturation image and the disease state score thus generated and calculated are displayed on the monitor 18 (S18).
  • For example, as shown in FIG. 17, when the diseased tissue is an early cancer, an oxygen saturation image 141 in which almost the entire cancer tissue 103 is pseudo-colored and the disease state score “12” are displayed on the monitor 18 (S18).
  • As shown in FIG. 18, when the diseased tissue is an advanced cancer, an oxygen saturation image 142 showing the cancer tissue 113 with a normal-colored high oxygen region inside a pseudo-colored low oxygen region, and the disease state score “95”, are displayed on the monitor 18.
  • By observing the oxygen saturation image displayed on the monitor 18, the doctor can determine whether the diseased tissue is an early cancer or an advanced cancer (or not cancer at all).
  • Moreover, since the disease state score, which objectively evaluates the possibility of infiltration, is displayed on the monitor 18, the possibility of infiltration can be grasped more easily by looking at this score, and a detailed diagnosis including the degree of cancer progression can be made accurately and easily.
  • the endoscope system 10 not only displays the oxygen saturation image on the monitor 18 but also calculates the disease state score and displays it on the monitor 18, and can thereby support the doctor's diagnosis.
  • Moreover, the endoscope system 10 can provide this support by calculating and displaying the disease state score in real time while the observation target is being observed.
  • the endoscope system 10 can observe the surface of the observation target and score the degree of progression of the lesion into the observation target (in the depth direction of the lesion).
  • Information that supports diagnosis can therefore be presented more quickly (for example, without lowering the observation frame rate) than when observation is performed separately for each depth of the observation target.
  • the endoscope system 10 generates and displays an oxygen saturation image in which the hypoxic region is pseudo-colored with an oxygen saturation of 60% as the boundary, but it may further divide the oxygen saturation into finer steps and pseudo-color each step with a different color. For example, by changing the gain by which the image signals are multiplied according to the oxygen saturation, a different color can be displayed at each step of oxygen saturation. In this case, as shown in FIGS. 19 and 20, oxygen saturation images 143 and 144 that show in detail the distribution shape of the oxygen saturation inside the cancer tissue 103 of an early cancer and the cancer tissue 113 of an advanced cancer can be displayed.
  • the endoscope system 10 calculates and displays a disease state score that supports diagnosis of the degree of progression of cancer, but it can likewise calculate and display a disease state score that supports diagnosis of the degree of progression of lesions other than cancer (such as inflammation and ulcers).
  • the endoscope system 10 calculates and displays a medical condition score that increases according to the degree of progression of cancer. Conversely, a medical condition score that decreases according to the degree of progression of cancer may be calculated.
  • In this case, the disease state score calculation unit 97 may calculate the reciprocal of the similarity between the distribution pattern obtained from the distribution pattern generation unit 96 and the reference pattern as the disease state score.
  • In this case, the disease state score is largest when there is no high oxygen region within the low oxygen region, becomes smaller when a high oxygen region is present, and becomes smaller still as the proportion of the high oxygen region within the low oxygen region increases. A small disease state score then represents an advanced degree of cancer progression.
  • In the above embodiment, the reference pattern storage unit 98 storing the reference pattern 130 in advance is provided in the evaluation unit 65, but as shown in FIG. 21, the reference pattern 130 may instead be acquired from an external database 151 connected to the processor device 16 via a network.
  • In this way, even when the reference pattern 130 is updated based on the latest cases, the disease state score can always be calculated using the optimum reference pattern without performing maintenance to update the data in the reference pattern storage unit 98.
  • In the above embodiment, the distribution pattern generation unit 96 and the disease state score calculation unit 97 calculate the distribution pattern and the disease state score without designating a region, but as shown in FIG. 22, they may calculate the distribution pattern and the disease state score for a region of interest 161 designated in advance. In this way, the disease state score can be calculated quickly.
  • the region of interest 161 is, for example, a region designated by a doctor while observing the normal observation image 110 or the like (may be an oxygen saturation image).
  • Alternatively, the region of interest 161 can be designated on the processor device 16. In this case, the distribution pattern and the disease state score are calculated for the entire range being observed, and when the region of interest 161 is designated, the distribution pattern and the disease state score are calculated for the designated region 161.
  • Designation of the region of interest 161 may also be performed automatically by the endoscope system 10. For example, the evaluation unit 65 may be provided with a region-of-interest extraction unit that extracts, from the distribution patterns 115 and 125 generated by the distribution pattern generation unit 96, a region of predetermined size and predetermined shape (for example, a rectangle) containing a hypoxic region as the region of interest, and the disease state score may then be calculated for the extracted region of interest.
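Such an automatic extraction can be sketched by thresholding the oxygen saturation map and cutting out a fixed-size square around the hypoxic pixels; the 60% threshold reuses the boundary from the earlier description, and the square size is arbitrary.

```python
import numpy as np

def extract_roi(sat_map, hypoxia_threshold=60.0, size=32):
    """Return a size x size region of interest centred on the hypoxic
    area, or None when the map contains no hypoxic pixel."""
    ys, xs = np.nonzero(sat_map < hypoxia_threshold)
    if ys.size == 0:
        return None  # nothing hypoxic: no region of interest
    cy, cx = int(ys.mean()), int(xs.mean())
    half = size // 2
    # clamp so the square stays inside the map
    y0 = int(np.clip(cy - half, 0, max(0, sat_map.shape[0] - size)))
    x0 = int(np.clip(cx - half, 0, max(0, sat_map.shape[1] - size)))
    return sat_map[y0:y0 + size, x0:x0 + size]
```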
  • In the above embodiment, one distribution pattern (the distribution shape of the oxygen saturation) is calculated in the distribution pattern generation unit 96 and a corresponding disease state score is calculated in the disease state score calculation unit 97, but when a plurality of distribution patterns are generated, the final score to be calculated and displayed may be derived from the scores calculated for the individual distribution patterns, for example as their sum or as a weighted sum.
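The combination of per-pattern scores can be as simple as a sum or a weighted sum; the weights here are a design choice, not values from the source.

```python
def combine_scores(scores, weights=None):
    """Combine the disease state scores calculated for individual
    distribution patterns into one final score."""
    if weights is None:
        return sum(scores)                                  # plain sum
    return sum(w * s for w, s in zip(weights, scores))      # weighted sum
```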
  • the endoscope system of the second embodiment further includes a similar clinical data selection unit 201 in the evaluation unit 65.
  • Other configurations are the same as those of the endoscope system 10 of the first embodiment.
  • the similar clinical data selection unit 201 acquires the disease state score from the disease state score calculation unit 97 and selects, from a clinical database 202 in which a plurality of clinical data 203 are stored, the clinical data of a past case whose disease state score is close to (or coincides with) the acquired score.
  • the selected clinical data (hereinafter referred to as similar clinical data) is input to the image display signal generation unit 66.
  • the oxygen saturation image 206 included in the similar clinical data is displayed on the monitor 18 along with the oxygen saturation image 142 to be observed.
  • the oxygen saturation image 206 of similar clinical data can be presented to the doctor in real time.
  • In addition to the oxygen saturation image 206, other types of images included in the similar clinical data, such as a normal observation image or a narrow-band light image observed with narrow-band light, may be displayed on the monitor 18. Further, when the similar clinical data includes, besides the various images, the results of diagnosis such as disease names, the treatments performed, and their effects, these may also be displayed on the monitor 18. These records of similar cases are likewise helpful for the doctor's diagnosis.
  • the clinical database 202 may be an external database connected to the endoscope system via a network, or may be a database built in the endoscope system (processor device 16).
  • When a plurality of pieces of similar clinical data exist, the latest one among them may be selected and displayed, or the one referenced most often may be selected and displayed.
  • clinical data to be displayed as similar clinical data may be set in advance for each disease state score. Further, similar clinical data may be selected by narrowing down according to the commonality with the observation object other than the disease state score such as the age and sex of the subject and the observation site.
  • one example of similar clinical data is selected, but a plurality of similar clinical data may be selected and displayed on the monitor 18.
  • In that case, a list of the similar clinical data may be displayed on the monitor 18, and an oxygen saturation image or the like of the similar clinical data selected by the doctor from this list may then be displayed on the monitor 18.
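Selecting the case whose stored score is nearest to the current disease state score is a one-line search. The list-of-dicts schema with a "score" key is invented for this sketch; the actual structure of the clinical data 203 is not specified in the source.

```python
def select_similar_case(score, clinical_db):
    """Return the past case whose disease state score is closest to
    `score`, or None when the database is empty."""
    if not clinical_db:
        return None
    return min(clinical_db, key=lambda case: abs(case["score"] - score))
```

Extending the key to also penalise mismatches in age, sex, or observation site would give the narrowed-down selection mentioned above.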
  • the endoscope system of the third embodiment is obtained by adding a treatment effect score calculation unit 301 to the evaluation unit 65 with respect to the endoscope system 10 of the first embodiment.
  • Other configurations are the same as those of the endoscope system 10 of the first embodiment.
  • the treatment effect score calculation unit 301 acquires a distribution pattern from the distribution pattern generation unit 96 and calculates a treatment effect score representing the effect of a specific treatment method based on the acquired distribution pattern. Specifically, the therapeutic effect score is calculated based on the distribution pattern (the distribution shape of the oxygen saturation) and information derived from it, such as the area of the hypoxic region and the proportion of high oxygen regions within the hypoxic region.
  • the therapeutic effect score calculation unit 301 inputs the calculated therapeutic effect score to the image display signal generation unit 66, so that the therapeutic effect score 302 is displayed on the monitor 18 along with the oxygen saturation image 142 to be observed.
  • When the lesion is cancer, specific treatment methods and their therapeutic effects include, for example, anticancer drugs and their effects, radiation therapy and its effects, and surgery and its prognosis (survival rate, likelihood of recurrence and metastasis).
  • For example, the therapeutic effect score calculation unit 301 scores the therapeutic effect of an anticancer agent low when there are many hypoxic regions, and scores it high when there are many high oxygen regions (in particular, when the proportion of high oxygen regions within the low oxygen region is large).
  • Similarly, the therapeutic effect score calculation unit 301 scores the therapeutic effect of radiotherapy low when there are many hypoxic regions, and scores it high when there are many high oxygen regions (when the proportion of high oxygen regions within the low oxygen region is large).
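As a rough illustration of the scoring tendency described above (low therapeutic effect scores for anticancer agents and radiotherapy when hypoxic regions dominate, high scores when high oxygen regions dominate), the following sketch computes a score from a per-pixel oxygen saturation map. The thresholds, the linear mapping, and the function name are illustrative assumptions, not values from the patent.

```python
def treatment_effect_scores(oxy_map, low_thresh=0.2, high_thresh=0.6):
    """Score the expected effect of chemo/radiotherapy from an oxygen
    saturation map.

    oxy_map: 2D list of per-pixel oxygen saturation values in [0, 1].
    Thresholds and the linear 0..1 mapping are illustrative assumptions.
    """
    pixels = [v for row in oxy_map for v in row]
    n = len(pixels)
    low_frac = sum(1 for v in pixels if v < low_thresh) / n    # hypoxic share
    high_frac = sum(1 for v in pixels if v > high_thresh) / n  # high-oxygen share
    # Many hypoxic regions -> low score; many high-oxygen regions -> high score.
    score = max(0.0, min(1.0, high_frac - low_frac + 0.5))
    # The text describes the same monotonic trend for both treatments.
    return score, score  # (anticancer-agent score, radiotherapy score)
```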
  • the endoscope system of the fourth embodiment is obtained by adding a message display control unit 401 to the evaluation unit 65 of the endoscope system 10 of the first embodiment.
  • the configuration is the same as that of the endoscope system 10 of the first embodiment.
  • the message display control unit 401 acquires a disease state score from the disease state score calculation unit 97 and monitors its value. For example, the message display control unit 401 inputs information corresponding to the disease state score, such as a warning, to the image display signal generation unit 66. As a result, as shown in FIG. 28, a message 402 corresponding to the disease state score is displayed on the monitor 18 along with the oxygen saturation image 142 to be observed. For example, when the disease state score is large, there is a high possibility that cancer has infiltrated the muscularis mucosae 105 or the submucosal tissue layer 106, so when the disease state score is equal to or higher than a predetermined value, it is preferable to display a message 402 warning of the possibility of infiltration.
  • the message display control unit 401 always monitors the disease state score, but the message 402 need not always be displayed. The message may be displayed only for specific disease state scores, such as when the disease state score is greater than or equal to (or less than) a specific value, or within a predetermined range.
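The gating behavior described for the message display control unit 401 (show a message only when the disease state score is at or above a value, or inside a range) can be sketched as follows; the threshold values and message texts are hypothetical.

```python
def message_for_score(score, warn_at=70, warn_range=None):
    """Return a warning message only for specific disease state scores.

    warn_at and warn_range are illustrative assumptions; the text only says
    the message may appear when the score is >= (or <=) a specific value,
    or within a predetermined range.
    """
    if warn_range is not None:
        lo, hi = warn_range
        # Range-gated variant: warn only inside the watch range.
        return "Caution: score in watch range" if lo <= score <= hi else None
    # Threshold-gated variant: warn of possible infiltration above warn_at.
    return "Warning: possible infiltration" if score >= warn_at else None
```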
  • the endoscope system of the fifth embodiment is obtained by adding an automatic storage control unit 501 to the evaluation unit 65 of the endoscope system 10 of the first embodiment.
  • the configuration is the same as that of the endoscope system 10 of the first embodiment.
  • the automatic storage control unit 501 acquires a disease state score from the disease state score calculation unit 97 and monitors its value. When the disease state score is equal to or greater than a predetermined value, it acquires, for example, the oxygen saturation image output from the structure enhancement unit 78 and automatically stores it in the storage unit 502. This automatic storage by the automatic storage control unit 501 is executed even when the doctor operating the endoscope system performs no operation to save a still image.
  • the oxygen saturation image that is automatically stored is stored in association with a disease state score.
  • the disease state score is recorded as incidental information, for example in the header of the oxygen saturation image.
  • when the disease state score is high, there is a high possibility of advanced cancer, and the doctor would normally save the image as a still image. However, even if the doctor forgets to save it as a still image, the oxygen saturation image is automatically stored by the automatic storage control unit 501, so there is no need to repeat the examination and the burden on the doctor and the subject is reduced.
  • automatic storage by the automatic storage control unit 501 may also be executed at regular frame intervals.
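The automatic storage logic of the fifth embodiment (store when the disease state score reaches a specified value, optionally also at regular frame intervals, attaching the score as incidental information) might look like this minimal sketch; the class and field names are invented for illustration.

```python
class AutoStore:
    """Automatically store oxygen saturation images when the score is high.

    Illustrative sketch: the threshold and frame interval are assumptions;
    images are kept in a list with the score attached as metadata, standing
    in for the storage unit 502 with the score in the image header.
    """

    def __init__(self, threshold=50, every_n_frames=None):
        self.threshold = threshold
        self.every_n_frames = every_n_frames
        self.frame_count = 0
        self.stored = []  # list of {"image": ..., "score": ...} records

    def on_frame(self, image, score):
        """Called once per frame; stores without any doctor action."""
        self.frame_count += 1
        periodic = (self.every_n_frames is not None
                    and self.frame_count % self.every_n_frames == 0)
        if score >= self.threshold or periodic:
            # Store the image together with its score (header-like metadata).
            self.stored.append({"image": image, "score": score})
```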
  • the similar clinical data selection unit 201 of the second embodiment, the treatment effect score calculation unit 301 of the third embodiment, the message display control unit 401 of the fourth embodiment, and the automatic storage control unit 501 of the fifth embodiment can be used in any combination.
  • the endoscope system includes a feature region extraction unit 601 in the evaluation unit 65.
  • the medical condition score calculation unit 602 calculates a medical condition score based on the oxygen saturation distribution pattern in the feature region extracted by the feature region extraction unit 601.
  • Other configurations are the same as those of the endoscope system 10 of the first embodiment.
  • the feature region extraction unit 601 acquires an image signal and extracts a portion suspected of being a lesion as the feature region of the observation target. For example, as illustrated in FIG. 31, the feature region extraction unit 601 extracts a region with strong redness (hereinafter referred to as a red region) 605 as the feature region based on the acquired image signal.
  • the disease state score calculation unit 602 calculates a disease state score representing the disease state of the observation target based on the oxygen saturation distribution pattern in the red region 605 extracted by the feature region extraction unit 601. For example, if, as in the oxygen saturation distribution pattern 603, a distribution pattern having a high oxygen region 607 within a low oxygen region 606 overlaps the red region 605, the red region 605 is likely due to new blood vessels formed by cancer tissue. For this reason, by calculating the disease state score based on the oxygen saturation distribution pattern in the red region 605, the presence or absence of cancer tissue and its degree of progression can be expressed particularly accurately by the disease state score.
  • the disease state score calculation unit 602 makes the disease state score when the red region 605 contains a high oxygen region (with oxygen saturation equal to or higher than a certain value) larger than the disease state score when the red region 605 contains no high oxygen region. Further, when the red region 605 contains a high oxygen region, the disease state score is increased as the ratio of the high oxygen region to the red region 605 increases. In this way, the degree of cancer progression can be expressed more easily by the disease state score.
  • Conversely, the disease state score calculation unit 602 may make the disease state score when the red region 605 contains a high oxygen region (with oxygen saturation equal to or higher than a certain value) smaller than the disease state score when it contains no high oxygen region. In this case, when there is a high oxygen region in the red region 605, the disease state score decreases as the ratio of the high oxygen region to the red region 605 increases. Even in this case, the degree of progression of cancer can be expressed more easily by the disease state score.
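Both scoring variants above (the score growing or shrinking with the proportion of high oxygen pixels inside the red region) can be condensed into one hypothetical function; the saturation threshold and the linear 0-100 mapping are assumptions for illustration.

```python
def disease_state_score(red_mask, oxy_map, high_thresh=0.6, invert=False):
    """Score based on the share of high-oxygen pixels inside the red region.

    red_mask: 2D booleans marking the red region; oxy_map: per-pixel oxygen
    saturation in [0, 1]. By default the score grows with the high-oxygen
    share (first variant); invert=True gives the alternative variant where
    the score shrinks instead.
    """
    red = [(i, j) for i, row in enumerate(red_mask)
           for j, inside in enumerate(row) if inside]
    if not red:
        return 0.0  # no red region extracted
    high = sum(1 for i, j in red if oxy_map[i][j] >= high_thresh)
    ratio = high / len(red)  # share of high-oxygen pixels in the red region
    return (1.0 - ratio) * 100 if invert else ratio * 100
```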
  • the feature region extraction unit 601 can extract the red region 605 using the B2 image signal or the G2 image signal.
  • This is because the contrast due to hemoglobin absorption appears strongly in the B2 image signal and the G2 image signal, which include information on this wavelength band, making it easy to determine the presence or absence of blood vessels.
  • a B1 image signal or a G2 image signal may be used.
  • the red region 605 may be extracted based on a plurality of image signals including the R1 (R2) image signal.
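One simple stand-in for the red region extraction, assuming redness is judged from a high red-to-green signal ratio (hemoglobin absorbs green strongly, so red-dominant pixels suggest strong redness); the ratio threshold is illustrative and not taken from the patent.

```python
def extract_red_region(r_signal, g_signal, ratio_thresh=1.5):
    """Mark pixels whose red-to-green signal ratio is high as the red region.

    r_signal, g_signal: 2D lists of red and green pixel values. Returns a
    2D boolean mask of the extracted red region. The threshold is a
    hypothetical stand-in for the feature region extraction criterion.
    """
    mask = []
    for r_row, g_row in zip(r_signal, g_signal):
        # Guard against division by zero, then threshold the R/G ratio.
        mask.append([g > 0 and r / g >= ratio_thresh
                     for r, g in zip(r_row, g_row)])
    return mask
```

A mask produced this way can feed the disease state score calculation directly, mirroring how the feature region extraction unit 601 hands the red region to the score calculation unit 602.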
  • In the sixth embodiment, the red region 605 is extracted as the feature region.
  • the disease state score is input to the image display signal generation unit 66 and displayed on the monitor 18 as in the first embodiment. Therefore, the second to fifth embodiments can be used in combination with the sixth embodiment.
  • the light source device 14 of the endoscope system 700 includes an LED (Light Emitting Diode) light source unit 701 and an LED light source control unit 704 instead of the first and second blue laser light sources 34 and 36 and the light source control unit 40.
  • the phosphor 44 is not provided in the illumination optical system 24 a of the endoscope system 700. Other than that, it is the same as the endoscope system 10 of the first embodiment.
  • the LED light source unit 701 includes an R-LED 701a, a G-LED 701b, and a B-LED 701c as light sources that emit light limited to a specific wavelength band.
  • the R-LED 701a emits red band light in the red region of 600 to 720 nm (hereinafter simply referred to as red light).
  • the G-LED 701b emits green band light in the green region of 480 to 620 nm (hereinafter simply referred to as green light).
  • the B-LED 701c emits blue band light in the blue region of 400 to 500 nm (hereinafter simply referred to as blue light).
  • the LED light source unit 701 has a high-pass filter (HPF) 702 that is inserted into and removed from the optical path of blue light emitted from the B-LED 701c.
  • the high pass filter 702 cuts blue light having a wavelength band of 450 nm or less and transmits light having a wavelength band longer than 450 nm.
  • the cut-off wavelength (450 nm) of the high-pass filter 702 is a wavelength at which the absorption coefficients of oxyhemoglobin and reduced hemoglobin are substantially equal (see FIG. 10), and the magnitude relationship between the absorption coefficients of oxyhemoglobin and reduced hemoglobin reverses across this wavelength.
  • the correlation stored in the correlation storage unit 82 assumes the case where the extinction coefficient of oxyhemoglobin is larger than that of reduced hemoglobin. Therefore, if a signal includes a component from the wavelength band at or below the cutoff wavelength, the signal ratio B1/G2 becomes lower than the original value measured at 473 nm, causing an inaccurate oxygen saturation to be calculated. For this reason, the high-pass filter 702 prevents the observation target from being irradiated with light in the wavelength band at or below the cutoff wavelength when acquiring the B1 image signal used to calculate the oxygen saturation.
  • the high pass filter 702 is inserted in front of the B-LED 701c in the special observation mode, and is retracted to the retreat position in the normal observation mode.
  • the high-pass filter 702 is inserted / removed by the HPF insertion / extraction unit 703 under the control of the LED light source control unit 704.
  • the LED light source control unit 704 controls turning on / off of the LEDs 701a to 701c of the LED light source unit 701 and insertion / extraction of the high-pass filter 702. Specifically, as shown in FIG. 34, in the normal observation mode, the LED light source control unit 704 turns on all the LEDs 701a to 701c, and the high-pass filter 702 retracts from the optical path of the B-LED 701c.
  • the LED light source control unit 704 inserts the high-pass filter 702 on the optical path of the B-LED 701c.
  • In the first frame, the B-LED 701c is turned on and the R-LED 701a and the G-LED 701b are turned off, so that the observation target is irradiated with blue light from which the wavelength band of 450 nm or less has been cut.
  • In the second frame, the R-LED 701a, the G-LED 701b, and the B-LED 701c are all turned on, so that the observation target is irradiated with white light composed of the blue light emitted from the B-LED 701c from which the wavelength band of 450 nm or less has been cut, the red light emitted from the R-LED 701a, and the green light emitted from the G-LED 701b. Accordingly, the sensor 48 outputs a B1 image signal in the first frame, and an R2 image signal, a G2 image signal, and a B2 image signal in the second frame. Therefore, the subsequent processing can be performed in the same manner as in the endoscope system 10 of the first embodiment.
  • In the above example, the observation target is imaged with the high-pass filter 702 inserted in both the first frame and the second frame in the special observation mode, but the high-pass filter 702 may be inserted only in the first frame and retracted in the second frame. Further, in the first frame in the special observation mode, only the B-LED 701c is turned on so that only blue light irradiates the observation target; however, the R-LED 701a and the G-LED 701b may also be turned on in the first frame so that the sensor 48 outputs an R1 image signal and a G1 image signal as well.
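The two-frame lighting sequence of the special observation mode described above can be summarized as data; the dictionary layout is purely an illustrative assumption.

```python
def special_mode_frame_plan(first_frame_full_lighting=False):
    """Per-frame LED and high-pass-filter plan for the special observation mode.

    Encodes the sequence described in the text: frame 1 yields the B1 signal
    under filtered blue light; frame 2 yields R2/G2/B2 under white light. If
    first_frame_full_lighting is True, the R- and G-LEDs are also lit in
    frame 1 so that R1 and G1 signals are output as well.
    """
    frame1 = {
        "R-LED": first_frame_full_lighting,
        "G-LED": first_frame_full_lighting,
        "B-LED": True,
        "HPF": True,  # cut blue light at and below 450 nm
        "outputs": ["B1"] + (["R1", "G1"] if first_frame_full_lighting else []),
    }
    frame2 = {
        "R-LED": True, "G-LED": True, "B-LED": True,
        "HPF": True,  # filter kept inserted in both frames in the base example
        "outputs": ["R2", "G2", "B2"],
    }
    return [frame1, frame2]
```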
  • the light source device 14 of the endoscope system 800 includes a broadband light source 801, a rotary filter 802, and a rotary filter control unit 803 instead of the first and second blue laser light sources 34 and 36 and the light source control unit 40.
  • the sensor 805 of the endoscope system 800 is a monochrome image sensor without a color filter. Otherwise, it is the same as the endoscope system 10 of the first embodiment.
  • the broadband light source 801 includes, for example, a xenon lamp, a white LED, and the like, and emits white light whose wavelength band ranges from blue to red.
  • the rotary filter 802 includes a normal observation mode filter 810 and a special observation mode filter 811 (see FIG. 37), and is movable in the radial direction between a first position for the normal observation mode, in which the normal observation mode filter 810 is placed on the optical path along which the white light emitted from the broadband light source 801 enters the light guide 41, and a second position for the special observation mode, in which the special observation mode filter 811 is placed on that optical path.
  • movement of the rotary filter 802 between the first position and the second position is controlled by the rotary filter control unit 803 according to the selected observation mode. Further, the rotary filter 802 rotates in synchronization with the imaging frames of the sensor 805 while disposed at the first position or the second position. The rotation speed of the rotary filter 802 is also controlled by the rotary filter control unit 803 according to the selected observation mode.
  • the normal observation mode filter 810 is provided on the inner periphery of the rotary filter 802.
  • the normal observation mode filter 810 includes an R filter 810a that transmits red light, a G filter 810b that transmits green light, and a B filter 810c that transmits blue light. Therefore, when the rotary filter 802 is arranged at the first position for the normal observation mode, white light from the broadband light source 801 enters one of the R filter 810a, the G filter 810b, and the B filter 810c according to the rotation of the rotary filter 802.
  • the observation target is thus sequentially irradiated with red light, green light, and blue light according to the transmitting filter, and the sensor 805 images the observation target with each of these reflected lights, thereby sequentially outputting an R image signal, a G image signal, and a B image signal.
  • the special observation mode filter 811 is provided on the outer peripheral portion of the rotary filter 802.
  • the special observation mode filter 811 includes an R filter 811a that transmits red light, a G filter 811b that transmits green light, a B filter 811c that transmits blue light, and a narrowband filter 811d that transmits narrowband light of 473 ± 10 nm. Therefore, when the rotary filter 802 is disposed at the second position for the special observation mode, white light from the broadband light source 801 enters one of the R filter 811a, the G filter 811b, the B filter 811c, and the narrowband filter 811d according to the rotation of the rotary filter 802.
  • the observation target is thus sequentially irradiated with red light, green light, blue light, and narrowband light (473 nm) according to the transmitting filter, and the sensor 805 images the observation target with each of these reflected lights, thereby sequentially outputting an R image signal, a G image signal, a B image signal, and a narrowband image signal.
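The rotation-synchronized filter sequencing of the two modes can be sketched as cyclic sequences; the filter names are shorthand for the filters described above.

```python
from itertools import cycle

def filter_sequence(mode):
    """Cycle of filters the white light passes through, one per imaging frame.

    Mirrors the two filter sets described in the text: R/G/B for the normal
    observation mode, plus a 473 nm narrowband filter in the special
    observation mode. The string names are illustrative shorthand.
    """
    if mode == "normal":
        return cycle(["R", "G", "B"])  # -> R, G, B image signals in turn
    # special observation mode: adds the narrowband (473 +/- 10 nm) filter
    return cycle(["R", "G", "B", "narrowband473"])
```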
  • the R image signal and the G image signal obtained in the special observation mode correspond to the R1 (or R2) image signal and the G1 (or G2) image signal of the first embodiment.
  • the B image signal obtained in the special observation mode corresponds to the B2 image signal of the first embodiment, and the narrowband image signal corresponds to the B1 image signal. Therefore, the subsequent processing can be performed in the same manner as the endoscope system 10 of the first embodiment.
  • In the above embodiments, the oxygen saturation is calculated based on the signal ratio B1/G2 and the signal ratio R2/G2, but the oxygen saturation may instead be calculated based only on the signal ratio B1/G2.
  • the correlation storage unit 82 may store the correlation between the signal ratio B1 / G2 and the oxygen saturation.
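A piecewise-linear lookup is one plausible way to represent a stored correlation between the signal ratio B1/G2 and oxygen saturation; the sample points below are hypothetical, since the actual correlation in the patent is empirical.

```python
def build_lut(points):
    """Piecewise-linear correlation between a signal ratio and oxygen saturation.

    'points' is a sorted list of (signal_ratio, saturation) pairs standing in
    for the correlation held by the correlation storage unit; values outside
    the table are clamped to the end points.
    """
    def lookup(ratio):
        if ratio <= points[0][0]:
            return points[0][1]
        if ratio >= points[-1][0]:
            return points[-1][1]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x0 <= ratio <= x1:
                # linear interpolation between neighboring table entries
                return y0 + (y1 - y0) * (ratio - x0) / (x1 - x0)
    return lookup

# Hypothetical monotonic correlation: lower B1/G2 -> lower saturation.
sat = build_lut([(0.3, 0.0), (0.6, 0.5), (0.9, 1.0)])
```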
  • an oxygen saturation image obtained by imaging oxygen saturation is generated and displayed.
  • a blood volume image obtained by imaging the blood volume may be generated and displayed instead or in addition. Since the blood volume correlates with the signal ratio R2/G2, a blood volume image can be created by assigning different colors according to the signal ratio R2/G2.
  • In the above embodiments, the oxygen saturation is calculated, but instead of or in addition to this, other biological function information may be calculated, such as an oxyhemoglobin index obtained from "blood volume (signal ratio R2/G2) × oxygen saturation (%)" or a reduced hemoglobin index obtained from "blood volume × (1 − oxygen saturation) (%)".
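The index relations quoted above translate directly into code, treating the signal ratio R2/G2 as the blood volume and the saturation as a fraction in [0, 1]; the function name is illustrative.

```python
def hemoglobin_indices(r2, g2, oxygen_saturation):
    """Oxy- and reduced-hemoglobin indices from blood volume and saturation.

    Follows the relations quoted in the text: blood volume is taken as the
    signal ratio R2/G2, the oxyhemoglobin index as blood volume x saturation,
    and the reduced hemoglobin index as blood volume x (1 - saturation).
    """
    blood_volume = r2 / g2
    oxy_index = blood_volume * oxygen_saturation
    reduced_index = blood_volume * (1.0 - oxygen_saturation)
    return blood_volume, oxy_index, reduced_index
```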

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Optics & Photonics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Endoscopes (AREA)
PCT/JP2014/068764 2013-09-26 2014-07-15 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法及びプロセッサ装置の作動方法 WO2015045576A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP14848419.9A EP3050487B1 (de) 2013-09-26 2014-07-15 Endoskopsystem und prozessorvorrichtung für ein endoskopsystem
US15/058,391 US10231658B2 (en) 2013-09-26 2016-03-02 Endoscope system, processor device for endoscope system, operation method for endoscope system, and operation method for processor device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2013-200653 2013-09-26
JP2013200653 2013-09-26
JP2013235460A JP6140056B2 (ja) 2013-09-26 2013-11-13 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法
JP2013-235460 2013-11-13

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/058,391 Continuation US10231658B2 (en) 2013-09-26 2016-03-02 Endoscope system, processor device for endoscope system, operation method for endoscope system, and operation method for processor device

Publications (1)

Publication Number Publication Date
WO2015045576A1 true WO2015045576A1 (ja) 2015-04-02

Family

ID=52742733

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/068764 WO2015045576A1 (ja) 2013-09-26 2014-07-15 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法及びプロセッサ装置の作動方法

Country Status (4)

Country Link
US (1) US10231658B2 (de)
EP (1) EP3050487B1 (de)
JP (1) JP6140056B2 (de)
WO (1) WO2015045576A1 (de)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019042157A (ja) * 2017-09-01 2019-03-22 富士フイルム株式会社 医療画像処理装置、内視鏡装置、診断支援装置、及び、医療業務支援装置
WO2019087971A1 (ja) * 2017-10-30 2019-05-09 富士フイルム株式会社 医療画像処理装置、及び、内視鏡装置
WO2019159273A1 (ja) * 2018-02-15 2019-08-22 株式会社日立製作所 放射線治療装置
JPWO2018180631A1 (ja) * 2017-03-30 2020-01-09 富士フイルム株式会社 医療用画像処理装置及び内視鏡システム並びに医療用画像処理装置の作動方法
WO2020012872A1 (ja) * 2018-07-09 2020-01-16 富士フイルム株式会社 医用画像処理装置、医用画像処理システム、医用画像処理方法、及びプログラム
JPWO2019008941A1 (ja) * 2017-07-03 2020-05-07 富士フイルム株式会社 医療画像処理装置、内視鏡装置、診断支援装置、医療業務支援装置、及び、レポート作成支援装置

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016076905A1 (en) * 2014-11-14 2016-05-19 The General Hospital Corporation Dba Massachusetts General Hospital Prediction of tumor recurrence by measuring oxygen saturation
JP6420358B2 (ja) * 2015-06-25 2018-11-07 Hoya株式会社 内視鏡システム及び評価値計算装置
WO2017057680A1 (ja) * 2015-09-30 2017-04-06 Hoya株式会社 内視鏡システム及び評価値計算装置
US20200116731A1 (en) * 2016-06-30 2020-04-16 Ndsu Research Foundation Method for improving the quality and quantity of offspring in mammals
WO2018159083A1 (ja) * 2017-03-03 2018-09-07 富士フイルム株式会社 内視鏡システム、プロセッサ装置、及び、内視鏡システムの作動方法
WO2018165620A1 (en) * 2017-03-09 2018-09-13 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for clinical image classification
US20180289263A1 (en) * 2017-03-30 2018-10-11 Nan M. Jokerst Devices and methods for endoscopic optical assessment of tissue histology
CN110913749B (zh) 2017-07-03 2022-06-24 富士胶片株式会社 医疗图像处理装置、内窥镜装置、诊断支持装置、医疗业务支持装置及报告书制作支持装置
JP7033146B2 (ja) * 2017-10-17 2022-03-09 富士フイルム株式会社 医療画像処理装置、及び、内視鏡装置
WO2019138773A1 (ja) * 2018-01-10 2019-07-18 富士フイルム株式会社 医療画像処理装置、内視鏡システム、医療画像処理方法及びプログラム
JP7090699B2 (ja) * 2018-05-17 2022-06-24 オリンパス株式会社 内視鏡装置及び内視鏡装置の作動方法
WO2020008834A1 (ja) * 2018-07-05 2020-01-09 富士フイルム株式会社 画像処理装置、方法及び内視鏡システム
KR102168485B1 (ko) * 2018-10-02 2020-10-21 한림대학교 산학협력단 실시간으로 획득되는 위 내시경 이미지를 기반으로 위 병변을 진단하는 내시경 장치 및 방법
CN113038868A (zh) * 2018-11-14 2021-06-25 富士胶片株式会社 医疗图像处理系统
CN113040707A (zh) * 2020-12-02 2021-06-29 泰州国安医疗用品有限公司 人体组织病变参数解析平台及方法

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001037718A (ja) * 1999-05-26 2001-02-13 Olympus Optical Co Ltd 画像診断装置及び内視鏡装置
JP2001046388A (ja) * 1999-08-12 2001-02-20 Terumo Corp 加熱治療装置
JP2010068865A (ja) * 2008-09-16 2010-04-02 Fujifilm Corp 画像診断装置
JP2010167045A (ja) * 2009-01-21 2010-08-05 Olympus Medical Systems Corp 画像表示装置
JP2010184057A (ja) * 2009-02-13 2010-08-26 Fujifilm Corp 画像処理方法および装置
JP2012125402A (ja) 2010-12-15 2012-07-05 Fujifilm Corp 内視鏡システム、内視鏡システムのプロセッサ装置及び機能情報取得方法
JP2012213550A (ja) 2011-04-01 2012-11-08 Fujifilm Corp 生体情報取得システムおよび生体情報取得方法
JP2012213551A (ja) * 2011-04-01 2012-11-08 Fujifilm Corp 生体情報取得システムおよび生体情報取得方法
JP2012239816A (ja) * 2011-05-24 2012-12-10 Fujifilm Corp 内視鏡システム及び内視鏡診断支援方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63311937A (ja) * 1987-06-16 1988-12-20 Toshiba Corp 内視鏡装置
US5795295A (en) * 1996-06-25 1998-08-18 Carl Zeiss, Inc. OCT-assisted surgical microscope with multi-coordinate manipulator
US6516209B2 (en) * 2000-08-04 2003-02-04 Photonify Technologies, Inc. Self-calibrating optical imaging system
JP5452300B2 (ja) * 2010-03-19 2014-03-26 富士フイルム株式会社 電子内視鏡システム、電子内視鏡用のプロセッサ装置、電子内視鏡システムの作動方法、病理観察装置および病理顕微鏡装置
JP5616303B2 (ja) * 2010-08-24 2014-10-29 富士フイルム株式会社 電子内視鏡システム及び電子内視鏡システムの作動方法
JP5222934B2 (ja) * 2010-12-21 2013-06-26 富士フイルム株式会社 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法
JP5502812B2 (ja) * 2011-07-14 2014-05-28 富士フイルム株式会社 生体情報取得システムおよび生体情報取得システムの作動方法


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3050487A4

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11412917B2 (en) 2017-03-30 2022-08-16 Fujifilm Corporation Medical image processor, endoscope system, and method of operating medical image processor
JPWO2018180631A1 (ja) * 2017-03-30 2020-01-09 富士フイルム株式会社 医療用画像処理装置及び内視鏡システム並びに医療用画像処理装置の作動方法
US11450425B2 (en) 2017-07-03 2022-09-20 Fujifilm Corporation Medical image processing apparatus, endoscope apparatus, diagnostic support apparatus, medical service support apparatus, and report creation support apparatus
JPWO2019008941A1 (ja) * 2017-07-03 2020-05-07 富士フイルム株式会社 医療画像処理装置、内視鏡装置、診断支援装置、医療業務支援装置、及び、レポート作成支援装置
JP2019042157A (ja) * 2017-09-01 2019-03-22 富士フイルム株式会社 医療画像処理装置、内視鏡装置、診断支援装置、及び、医療業務支援装置
US11010891B2 (en) 2017-09-01 2021-05-18 Fujifilm Corporation Medical image processing apparatus, endoscope apparatus, diagnostic support apparatus, and medical service support apparatus
JPWO2019087971A1 (ja) * 2017-10-30 2020-10-22 富士フイルム株式会社 医療画像処理装置、及び、内視鏡装置
WO2019087971A1 (ja) * 2017-10-30 2019-05-09 富士フイルム株式会社 医療画像処理装置、及び、内視鏡装置
WO2019159273A1 (ja) * 2018-02-15 2019-08-22 株式会社日立製作所 放射線治療装置
CN112367896A (zh) * 2018-07-09 2021-02-12 富士胶片株式会社 医用图像处理装置、医用图像处理系统、医用图像处理方法及程序
WO2020012872A1 (ja) * 2018-07-09 2020-01-16 富士フイルム株式会社 医用画像処理装置、医用画像処理システム、医用画像処理方法、及びプログラム
JPWO2020012872A1 (ja) * 2018-07-09 2021-08-02 富士フイルム株式会社 医用画像処理装置、医用画像処理システム、医用画像処理方法、及びプログラム
JP7270626B2 (ja) 2018-07-09 2023-05-10 富士フイルム株式会社 医用画像処理装置、医用画像処理システム、医用画像処理装置の作動方法、プログラム、及び記憶媒体
US11991478B2 (en) 2018-07-09 2024-05-21 Fujifilm Corporation Medical image processing apparatus, medical image processing system, medical image processing method, and program

Also Published As

Publication number Publication date
EP3050487B1 (de) 2018-03-07
JP6140056B2 (ja) 2017-05-31
US20160174886A1 (en) 2016-06-23
JP2015085152A (ja) 2015-05-07
EP3050487A4 (de) 2017-01-18
EP3050487A1 (de) 2016-08-03
US10231658B2 (en) 2019-03-19

Similar Documents

Publication Publication Date Title
JP6140056B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法
JP6092792B2 (ja) 内視鏡システム用プロセッサ装置、内視鏡システム、内視鏡システム用プロセッサ装置の作動方法、内視鏡システムの作動方法
JP6785948B2 (ja) 医療用画像処理装置及び内視鏡システム並びに医療用画像処理装置の作動方法
JP5992936B2 (ja) 内視鏡システム、内視鏡システム用プロセッサ装置、内視鏡システムの作動方法、内視鏡システム用プロセッサ装置の作動方法
WO2018159363A1 (ja) 内視鏡システム及びその作動方法
JP5887367B2 (ja) プロセッサ装置、内視鏡システム、及び内視鏡システムの作動方法
JP5977772B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法
JP5789280B2 (ja) プロセッサ装置、内視鏡システム、及び内視鏡システムの作動方法
JP6640865B2 (ja) 画像処理装置、内視鏡システム、及び画像処理方法
JP2019081044A (ja) 画像処理装置、画像処理装置の作動方法、および画像処理プログラム
JP6109695B2 (ja) 内視鏡システム及びプロセッサ装置並びに作動方法並びに距離測定装置
JP6866531B2 (ja) 医療画像処理システム及び内視鏡システム
JP2009039510A (ja) 撮像装置
WO2017057573A1 (ja) 画像処理装置、内視鏡システム、及び画像処理方法
JPWO2013115323A1 (ja) 生体観察装置
JP2020065685A (ja) 内視鏡システム
JP6203088B2 (ja) 生体観察システム
JP6389299B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、プロセッサ装置の作動方法
JP6129686B2 (ja) 内視鏡システム及びプロセッサ装置並びに作動方法並びにテーブル作成方法
US20230237659A1 (en) Image processing apparatus, endoscope system, operation method of image processing apparatus, and non-transitory computer readable medium
JP6099518B2 (ja) 内視鏡システム及び作動方法
JP2016067781A (ja) 内視鏡用のプロセッサ装置、内視鏡用のプロセッサ装置の作動方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14848419

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2014848419

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014848419

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE