WO2017158692A1 - Endoscope device, image processing device, image processing method, and program - Google Patents

Endoscope device, image processing device, image processing method, and program

Info

Publication number
WO2017158692A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
image
filter
wavelength band
light source
Prior art date
Application number
PCT/JP2016/058002
Other languages
English (en)
Japanese (ja)
Inventor
直 菊地
Original Assignee
オリンパス株式会社
Priority date
Filing date
Publication date
Application filed by オリンパス株式会社
Priority to PCT/JP2016/058002 (WO2017158692A1)
Priority to PCT/JP2017/005175 (WO2017159165A1)
Priority to JP2018505351A (JPWO2017159165A1)
Priority to CN201780017384.XA (CN108778091B)
Priority to DE112017001293.7T (DE112017001293T5)
Publication of WO2017158692A1
Priority to US16/039,802 (US11045079B2)

Classifications

    • A61B 1/000094 — operational features of endoscopes: electronic signal processing of image signals during use, extracting biological structures
    • A61B 1/000095 — operational features of endoscopes: electronic signal processing of image signals during use, for image enhancement
    • A61B 1/0005 — display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B 1/00096 — constructional details of the endoscope body: optical elements at the distal tip of the insertion part
    • A61B 1/043 — combined with photographic or television appliances, for fluorescence imaging
    • A61B 1/045 — combined with photographic or television appliances: control thereof
    • A61B 1/0638 — illuminating arrangements providing two or more wavelengths
    • A61B 1/0646 — illuminating arrangements with illumination filters
    • A61B 1/0655 — illuminating arrangements: control therefor
    • A61B 1/0661, A61B 1/0676 — endoscope light sources, including light sources at the distal tip of the endoscope
    • A61B 1/07 — illuminating arrangements using light-conductive means, e.g. optical fibres
    • G02B 23/24 — instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/2461 — optical details: illumination
    • G02B 23/2484 — non-optical details: arrangements in relation to a camera or imaging device

Definitions

  • The present invention relates to an endoscope apparatus, an image processing apparatus, an image processing method, and a program for acquiring images inside a subject into which the endoscope is introduced.
  • A medical endoscope apparatus captures in-vivo images by inserting a flexible, elongated insertion portion, having an imaging element with a plurality of pixels at its tip, into a body cavity of a subject such as a patient. Because in-vivo images can be acquired without incising the subject, the burden on the subject is small, and such apparatuses are increasingly widespread.
  • In this field, two observation methods are widely known: the white light observation method (WLI: White Light Imaging), which uses white illumination light (white light), and the narrow-band light observation method (NBI: Narrow Band Imaging), which uses illumination light (narrow-band light) composed of two narrow-band lights included in the blue and green wavelength bands, respectively.
  • The narrow-band light observation method yields images that highlight the capillaries and fine mucosal patterns on the mucosal surface of a living body, so lesions in the mucosal surface layer can be found more accurately.
  • To obtain a captured image with a single-plate image sensor, a color filter generally called a Bayer array is provided on the light receiving surface of the image sensor.
  • In a Bayer array, filters that transmit light in the red (R), green (G), and blue (B) wavelength bands (hereinafter "R filter", "G filter", and "B filter") are arranged one per pixel, with four filters forming one unit.
  • Each pixel receives light in the wavelength band transmitted by its filter, and the image sensor generates an electrical signal of the color component corresponding to that wavelength band.
  • Interpolation processing is then performed to estimate, at each pixel, the signal values of the missing color components whose light did not pass through that pixel's filter. Such interpolation processing is called demosaicing.
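As an illustration of the demosaicing just described, the sketch below fills the missing color components at every pixel of a Bayer mosaic by averaging the same-color samples in each pixel's 3x3 neighborhood. It is a minimal bilinear-style example under simplifying assumptions, not the interpolation method of this disclosure; the `BAYER` layout and function names are hypothetical.

```python
import numpy as np

# Hypothetical 2x2 Bayer unit: R at (0,0), G at (0,1)/(1,0), B at (1,1).
BAYER = [["R", "G"], ["G", "B"]]

def demosaic_bilinear(mosaic):
    """Recover full R, G, and B planes from a single-plane Bayer mosaic by
    averaging, at each pixel, the same-color samples in its 3x3 neighborhood
    (a bilinear-style estimate; borders are handled by clamping)."""
    h, w = mosaic.shape
    sums = {c: np.zeros((h, w)) for c in "RGB"}
    counts = {c: np.zeros((h, w)) for c in "RGB"}
    for y in range(h):
        for x in range(w):
            c = BAYER[y % 2][x % 2]
            # Spread this sample over its 3x3 neighborhood; dividing the
            # accumulated sum by the sample count gives the local average.
            y0, y1 = max(y - 1, 0), min(y + 2, h)
            x0, x1 = max(x - 1, 0), min(x + 2, w)
            sums[c][y0:y1, x0:x1] += mosaic[y, x]
            counts[c][y0:y1, x0:x1] += 1
    return {c: sums[c] / np.maximum(counts[c], 1) for c in "RGB"}
```

On a mosaic whose R, G, and B samples are each constant, the recovered planes are constant as well, which is a quick sanity check on the averaging.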
  • In the white light observation method, high resolution can be obtained by using the signal values of the pixels that receive light transmitted through the G filters.
  • In the narrow-band light observation method, by contrast, the G filters and the B filters are used.
  • However, the number of B filters is only 1/4 of the total, so even if the same processing as in the white light observation method is performed, a high-resolution image cannot be obtained.
  • To address this, a technique is known in which the positions of the G filters and the B filters in the Bayer array are interchanged so that the B filters are the most numerous in one filter unit (see Patent Document 1).
  • With Patent Document 1, in the narrow-band light observation method an image with higher resolution than a Bayer-array image can be obtained; however, because the number of G pixels is smaller than in the conventional Bayer array, the resolution in the white light observation method is lower than that of a conventional Bayer-array image.
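The resolution trade-off between the two arrays can be made concrete by counting filters per unit. The snippet below uses hypothetical layout lists (not taken from the documents) to show that the Bayer unit devotes half its pixels to G, while a G/B-swapped unit in the manner of Patent Document 1 devotes half to B at the expense of G.

```python
from collections import Counter

# One 2x2 filter unit of the conventional Bayer array, and a hypothetical
# G/B-swapped unit in which B filters are arranged most (Patent Document 1).
BAYER_UNIT = ["R", "G", "G", "B"]
SWAPPED_UNIT = ["R", "B", "B", "G"]

def filter_fractions(unit):
    """Fraction of pixels in the unit covered by each primary filter color."""
    counts = Counter(unit)
    return {c: counts[c] / len(unit) for c in "RGB"}
```

Half the Bayer pixels sample G (favoring WLI resolution); the swapped unit halves the G count to double the B count (favoring NBI resolution).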
  • The present invention has been made in view of the above, and an object thereof is to provide an endoscope apparatus, an image processing apparatus, an image processing method, and a program capable of obtaining a high-resolution image in both the white light observation method and the narrow-band light observation method.
  • To solve the problem described above, an endoscope apparatus according to the present invention includes: a light source unit that emits first illumination light including light in each of the red, green, and blue wavelength bands, or second illumination light including light in the green wavelength band and light in one wavelength band of blue and red; an imaging element in which a plurality of pixels arranged in a two-dimensional lattice each receive and photoelectrically convert light to generate an imaging signal; and a color filter in which filter units are arranged corresponding to the plurality of pixels, each filter unit comprising a plurality of filters that include at least one type of first filter transmitting light in any one of the red, green, and blue wavelength bands and second filters transmitting light in the green wavelength band and light in the one wavelength band of red and blue, the number of second filters being equal to or greater than the number of first filters of the most numerous type.
  • In the endoscope apparatus, the resolution of the first image when the light source unit emits the first illumination light is higher than the resolution of the first image when it emits the second illumination light, and the resolution of the second image when the light source unit emits the second illumination light is higher than the resolution of the second image when it emits the first illumination light.
  • In the endoscope apparatus, when the light source unit emits the first illumination light, it emits light in the green wavelength band more strongly than light in the one wavelength band, and when it emits the second illumination light, it emits light in the one wavelength band more strongly than light in the green wavelength band.
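The intensity relationship just stated can be captured in a small table. The numeric values below are illustrative only (the disclosure specifies a relative ordering, not magnitudes), and the "one wavelength band" is assumed here to be blue.

```python
# Illustrative relative intensities; only the ordering follows the text:
# green dominates under the first (white) illumination, while the blue
# narrow band dominates under the second (narrow-band) illumination.
ILLUMINATION = {
    "first": {"R": 0.8, "G": 1.0, "B": 0.6},
    "second": {"G": 0.4, "B": 1.0},
}

def dominant_band(mode):
    """Wavelength band emitted most strongly in the given illumination mode."""
    bands = ILLUMINATION[mode]
    return max(bands, key=bands.get)
```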
  • In the endoscope apparatus according to the above invention, the number of second filters in the filter unit is equal to or greater than the total number of the other filters arranged in the filter unit.
  • In the endoscope apparatus, the filter unit further has a third filter that transmits light in the green wavelength band and light in the other wavelength band of red and blue, different from the one wavelength band.
  • In the endoscope apparatus, the second filter is a cyan filter, and the third filter is a magenta filter or a yellow filter.
  • Alternatively, in the endoscope apparatus, the second filter is a yellow filter, and the third filter is a cyan filter or a magenta filter.
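The primary and complementary filters above can be summarized with idealized pass-bands: a cyan filter passes green and blue, magenta passes red and blue, yellow passes red and green. The sketch below encodes this as sets; it is a deliberate simplification that ignores the real transmission curves shown in the figures.

```python
# Idealized pass-bands: each filter transmits the listed primary bands and
# blocks the rest (actual transmittance curves appear in the figures).
PASS_BANDS = {
    "R": {"R"}, "G": {"G"}, "B": {"B"},
    "Cy": {"G", "B"},   # cyan    = green + blue
    "Mg": {"R", "B"},   # magenta = red + blue
    "Ye": {"R", "G"},   # yellow  = red + green
}

def transmitted(filter_name, illumination_bands):
    """Bands of the illumination that reach the pixel under this filter."""
    return PASS_BANDS[filter_name] & illumination_bands
```

Under white light a Cy pixel sees both G and B; under green-plus-blue narrow-band illumination it still sees both narrow bands, which is why Cy pixels remain useful in both observation methods.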
  • In the endoscope apparatus, the image processing unit generates a first interpolated image by interpolating, at the pixels where filters other than the second filter are arranged, the imaging signals generated by the imaging element from light received via the second filters, and then, based on the first interpolated image, generates a second interpolated image by interpolating, at the pixels where the second filters are arranged, the imaging signals generated by the imaging element from light received via the filters other than the second filter.
  • In the endoscope apparatus, the image processing unit generates the first image by treating the electrical signals generated by the imaging element from light received by the pixels via the second filters as if that light had been received via the first filters.
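The two-pass interpolation described above can be sketched in one dimension: the densely sampled second-filter (Cy) plane is completed first, then used to guide interpolation of the sparse color samples via their difference from the completed Cy plane. This is an illustrative scheme under simplifying assumptions (1-D signals, linear interpolation, smooth color-minus-Cy differences), not the exact processing of the disclosure.

```python
import numpy as np

def interp_missing(values, known_mask):
    """Linearly interpolate a 1-D signal at positions where known_mask is False."""
    x = np.arange(len(values))
    return np.interp(x, x[known_mask], values[known_mask])

def two_stage(cy, cy_mask, color, color_mask):
    """First pass: complete the densely sampled Cy plane at non-Cy pixels.
    Second pass: interpolate the sparse color samples guided by the completed
    Cy plane, by interpolating the (color - Cy) difference, which is assumed
    to vary more smoothly than the color signal itself."""
    cy_full = interp_missing(cy, cy_mask)
    diff = color - cy_full            # meaningful only where color_mask holds
    diff_full = interp_missing(diff, color_mask)
    return cy_full, cy_full + diff_full
```

With a linear Cy ramp sampled at every other pixel and a color signal offset from it by a constant, both planes are recovered exactly, showing how the dense plane propagates detail to the sparse one.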
  • An image processing apparatus according to the present invention is an image processing apparatus to which an endoscope having a light source unit, an imaging element, and a color filter is connected. The light source unit emits first illumination light including light in each of the red, green, and blue wavelength bands, or second illumination light composed of light in the green wavelength band and light in the blue or red wavelength band. In the imaging element, a plurality of pixels arranged in a two-dimensional lattice each receive and photoelectrically convert light to generate an imaging signal. In the color filter, filter units, each comprising at least one type of first filter that transmits light in any one of the red, green, and blue wavelength bands and second filters, are arranged corresponding to the plurality of pixels. The resolution of the first image when the light source unit emits the first illumination light is higher than the resolution of the first image when it emits the second illumination light, and the resolution of the second image when it emits the second illumination light is higher than the resolution of the second image when it emits the first illumination light.
  • An image processing method according to the present invention is executed by an image processing apparatus to which an endoscope having a light source unit, an imaging element, and a color filter is connected. The light source unit emits first illumination light including light in each of the red, green, and blue wavelength bands, or second illumination light composed of light in the green wavelength band and light in the blue or red wavelength band. In the imaging element, a plurality of pixels arranged in a two-dimensional lattice each receive and photoelectrically convert light to generate an imaging signal. In the color filter, filter units, each comprising a plurality of filters that include at least one type of first filter transmitting light in any one of the red, green, and blue wavelength bands and second filters transmitting light in the green wavelength band and light in one wavelength band of red and blue, are arranged corresponding to the plurality of pixels, the number of second filters being equal to or greater than the number of first filters of the most numerous type. The resolution of the first image when the light source unit emits the first illumination light is higher than the resolution of the first image when it emits the second illumination light, and the resolution of the second image when it emits the second illumination light is higher than the resolution of the second image when it emits the first illumination light.
  • A program according to the present invention runs on an image processing apparatus to which an endoscope having a light source unit, an imaging element, and a color filter is connected. The light source unit emits first illumination light including light in each of the red, green, and blue wavelength bands, or second illumination light composed of light in the green wavelength band and light in the blue or red wavelength band. In the imaging element, a plurality of pixels arranged in a two-dimensional lattice each receive and photoelectrically convert light to generate an imaging signal. In the color filter, filter units in which the number of second filters is equal to or greater than the number of first filters of the most numerous type are arranged corresponding to the plurality of pixels. The resolution of the first image when the light source unit emits the first illumination light is higher than the resolution of the first image when it emits the second illumination light, and the resolution of the second image when it emits the second illumination light is higher than the resolution of the second image when it emits the first illumination light.
  • FIG. 1 is a diagram showing a schematic configuration of an endoscope apparatus according to Embodiment 1 of the present invention.
  • FIG. 2 is a schematic diagram showing a schematic configuration of the endoscope apparatus according to Embodiment 1 of the present invention.
  • FIG. 3 is a schematic diagram illustrating a configuration of a pixel of the image sensor according to Embodiment 1 of the present invention.
  • FIG. 4 is a schematic diagram showing an example of the configuration of the color filter according to Embodiment 1 of the present invention.
  • FIG. 5 is a diagram showing an example of transmission characteristics of each filter constituting the color filter according to Embodiment 1 of the present invention.
  • FIG. 6 is a diagram illustrating an example of spectral characteristics of white light emitted from the light source unit according to Embodiment 1 of the present invention.
  • FIG. 7 is a diagram illustrating an example of spectral characteristics of narrowband light emitted from the light source unit according to Embodiment 1 of the present invention.
  • FIG. 8 is a diagram illustrating an example of transmission characteristics of the Cy filter when white light is emitted to the Cy filter by the light source unit according to Embodiment 1 of the present invention.
  • FIG. 9 is a diagram illustrating an example of transmission characteristics of the Cy filter when narrowband light is emitted to the Cy filter by the light source unit according to Embodiment 1 of the present invention.
  • FIG. 10A is a diagram illustrating an example of an image when a Cy filter is arranged on the light receiving surface of each pixel P.
  • FIG. 10B is a diagram illustrating an example of an image when the Cy filter is arranged on the light receiving surface of each pixel P.
  • FIG. 10C is a diagram illustrating an example of an image when the Cy filter is arranged on the light receiving surface of each pixel P.
  • FIG. 10D is a diagram illustrating an example of an image when the Cy filter is disposed on the light receiving surface of each pixel P.
  • FIG. 10E is a diagram illustrating an example of an image when the Cy filter is arranged on the light receiving surface of each pixel P.
  • FIG. 11 is a flowchart showing an outline of processing executed by the endoscope apparatus according to Embodiment 1 of the present invention.
  • FIG. 12 is a flowchart showing an overview of the image generation processing according to Embodiment 1 of the present invention.
  • FIG. 13 is a schematic diagram for explaining an overview of image generation processing executed by the image processing unit according to Embodiment 1 of the present invention.
  • FIG. 14 is a diagram illustrating an example of a color image generated by the image processing unit according to the first modification of the first embodiment of the present invention when the narrow-band light observation method is used.
  • FIG. 15 is a schematic diagram illustrating an example of a configuration of a color filter according to the second modification of the first embodiment of the present invention.
  • FIG. 16 is a schematic diagram illustrating an example of a configuration of a color filter according to Embodiment 2 of the present invention.
  • FIG. 17 is a diagram illustrating an example of transmission characteristics of each filter constituting the color filter according to Embodiment 2 of the present invention.
  • FIG. 18 is a schematic diagram for explaining an overview of image generation processing executed by the image processing unit according to Embodiment 2 of the present invention.
  • FIG. 19 is a schematic diagram illustrating an example of the configuration of a color filter according to Modification 1 of Embodiment 2 of the present invention.
  • FIG. 20 is a schematic diagram showing another example of the configuration of the color filter according to Embodiment 3 of the present invention.
  • FIG. 21 is a diagram showing an example of transmission characteristics of each filter constituting the color filter according to Embodiment 3 of the present invention.
  • FIG. 22 is a schematic diagram for explaining an overview of image generation processing executed by the image processing unit according to Embodiment 3 of the present invention.
  • FIG. 23 is a schematic diagram showing an example of the configuration of a color filter according to Embodiment 4 of the present invention.
  • FIG. 24 is a diagram showing an example of transmission characteristics of each filter constituting the color filter according to Embodiment 4 of the present invention.
  • FIG. 25 is a diagram illustrating an example of spectral characteristics of white light emitted from the light source unit according to Embodiment 4 of the present invention.
  • FIG. 26 is a diagram illustrating an example of spectral characteristics of narrowband light emitted from the light source unit according to Embodiment 4 of the present invention.
  • FIG. 27 is a schematic diagram for explaining an overview of image generation processing executed by the image processing unit according to Embodiment 4 of the present invention.
  • FIG. 28 is a diagram showing a list in which variations of color filters according to other embodiments of the present invention are associated with wavelength bands of illumination light emitted from the light source unit and effects.
  • FIG. 29 is a diagram showing an example of the configuration of a color filter according to another embodiment of the present invention.
  • FIG. 30 is a diagram illustrating an example of the configuration of a color filter according to another embodiment of the present invention.
  • FIG. 31 is a diagram showing an example of the configuration of a color filter according to another embodiment of the present invention.
  • FIG. 32 is a diagram illustrating an example of a configuration of a color filter according to another embodiment of the present invention.
  • FIG. 33 is a diagram illustrating an example of transmission characteristics of a Cy filter according to another embodiment of the present invention.
  • The endoscope apparatus 1 shown in FIGS. 1 and 2 is inserted into a subject such as a patient, captures images inside the subject, and displays the resulting in-vivo images.
  • A user such as a doctor observes the displayed in-vivo images to examine for the presence or absence of the detection target sites: bleeding sites, tumor sites (lesion site S), and abnormal sites.
  • The endoscope apparatus 1 includes: an endoscope 2 that is inserted into the subject, captures an in-vivo image of the observation site, and generates an electrical signal; a light source unit 3 that generates the illumination light emitted from the distal end of the endoscope 2; a processor unit 4 that performs predetermined image processing on the electrical signal generated by the endoscope 2 and comprehensively controls the operation of the entire endoscope apparatus 1; and a display unit 5 that displays the in-vivo image processed by the processor unit 4.
  • The endoscope 2 includes: an insertion portion 21 having an elongated, flexible shape; an operation portion 22 connected to the proximal end of the insertion portion 21 that receives input of various operation signals; and a universal cord 23 extending from the operation portion 22 in a direction different from that of the insertion portion 21 and containing various cables connected to the light source unit 3 and the processor unit 4.
  • The insertion portion 21 includes an image sensor 201 in which pixels (photodiodes) that receive light are arranged in a two-dimensional matrix, and which generates an image signal by photoelectrically converting the light received at each pixel.
  • The operation portion 22 includes: a bending knob 221 that bends the bending portion 25 in the up-down and left-right directions; a treatment instrument insertion portion 222 through which treatment instruments such as biopsy forceps, an electric knife, or an examination probe are inserted into the body cavity of the subject; and a plurality of switches 223 that receive input of instruction signals, such as a signal causing the light source unit 3 to switch the illumination light.
  • The universal cord 23 contains at least the light guide 203 and an aggregate cable bundling one or more signal lines. The aggregate cable carries signals between the endoscope 2 and the light source unit 3 and processor unit 4, and includes signal lines for transmitting and receiving setting data, image signals, and the drive timing signal for driving the image sensor 201.
• The endoscope 2 includes an imaging optical system 200, an image sensor 201, a color filter 202, a light guide 203, an illumination lens 204, an A/D conversion unit 205, and an imaging information storage unit 206.
  • the imaging optical system 200 is provided at the distal end portion 24 of the insertion portion 21 and collects light from at least an observation site.
  • the imaging optical system 200 is configured using one or a plurality of lenses.
  • the imaging optical system 200 may be provided with an optical zoom mechanism that changes the angle of view and a focus mechanism that changes the focus.
  • the imaging element 201 is provided perpendicular to the optical axis of the imaging optical system 200, and receives an object image formed by the imaging optical system 200 and performs photoelectric conversion to generate an electrical signal (image signal).
  • the electrical signal is output to the A / D converter 205.
  • the image sensor 201 is realized by using an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). The detailed configuration of the image sensor 201 will be described later.
  • the color filter 202 is disposed on the light receiving surface of the image sensor 201, and has a plurality of filters that each transmit light in a wavelength band set individually. The detailed configuration of the color filter 202 will be described later.
  • the light guide 203 is configured using glass fiber or the like, and serves as a light guide path for light emitted from the light source unit 3.
  • the illumination lens 204 is provided at the tip of the light guide 203, diffuses the light guided by the light guide 203, and emits the light to the outside of the tip 24.
  • the illumination lens 204 is configured using one or a plurality of lenses.
  • the A / D conversion unit 205 performs A / D conversion on the analog electrical signal generated by the image sensor 201 and outputs the converted digital electrical signal to the processor unit 4.
  • the imaging information storage unit 206 stores data including various programs for operating the endoscope 2, various parameters necessary for the operation of the endoscope 2, and identification information of the endoscope 2. Further, the imaging information storage unit 206 includes an identification information storage unit 261 that records identification information.
  • the identification information includes unique information (ID) of the endoscope 2, year type, specification information, transmission method, filter arrangement information in the color filter 202, and the like.
  • the imaging information storage unit 206 is realized using a flash memory or the like.
  • the light source unit 3 includes an illumination unit 31 and an illumination control unit 32.
  • the illumination unit 31 switches a plurality of illumination lights and emits them to the light guide 203 under the control of the illumination control unit 32.
  • the illumination unit 31 includes a light source 31a, a light source driver 31b, a switching filter 31c, a drive unit 31d, and a drive driver 31e.
• The light source 31a emits white light including light in the red, green, and blue wavelength bands H R, H G, and H B in accordance with the current supplied from the light source driver 31b.
  • the light source 31a is realized using a white LED, a xenon lamp, or the like.
• The light source driver 31b causes the light source 31a to emit white light by supplying a current to the light source 31a under the control of the illumination control unit 32.
  • the switching filter 31c is detachably disposed on the optical path of white light emitted from the light source 31a, and transmits light in a predetermined wavelength band among the white light emitted from the light source 31a.
  • the switching filter 31c transmits blue narrow band light and green narrow band light. That is, in the first embodiment, the switching filter 31c transmits two narrowband lights when arranged on the optical path of white light.
• Specifically, the switching filter 31c transmits light in a narrow band T B (for example, 390 nm to 445 nm) included in the wavelength band H B and light in a narrow band T G (for example, 530 nm to 550 nm) included in the wavelength band H G.
• The light transmitted through the switching filter 31c is narrow-band illumination light consisting of the narrow bands T B and T G.
• These narrow bands T B and T G are wavelength bands of blue light and green light that are easily absorbed by hemoglobin in blood.
  • This image observation with narrow-band illumination light is called a narrow-band light observation method (NBI).
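• The band limits quoted above can be captured in a few lines. The following sketch (illustrative code only; the numeric band limits are taken directly from the text, everything else is hypothetical) checks whether a wavelength passes the NBI switching filter, i.e. falls inside T B or T G.

```python
# Wavelength bands from the description (nm); T_B and T_G are the narrow
# sub-bands of H_B and H_G passed by the switching filter 31c.
H_B, H_G, H_R = (390, 500), (500, 600), (600, 700)
T_B, T_G = (390, 445), (530, 550)

def in_band(wl, band):
    lo, hi = band
    return lo <= wl <= hi

def nbi_transmits(wl):
    # The NBI switching filter passes only light inside T_B or T_G.
    return in_band(wl, T_B) or in_band(wl, T_G)

assert nbi_transmits(415) and nbi_transmits(540)   # inside the narrow bands
assert not nbi_transmits(480) and not nbi_transmits(620)
```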
  • the driving unit 31d is configured using a stepping motor, a DC motor, or the like, and arranges or retracts the switching filter 31c on the optical path of white light emitted from the light source 31a under the control of the illumination control unit 32.
• When the endoscope apparatus 1 performs the narrow-band light observation method (NBI method) using the second illumination light, the drive unit 31d places the switching filter 31c on the optical path of the white light emitted from the light source 31a.
  • the drive driver 31e supplies a predetermined current to the drive unit 31d under the control of the illumination control unit 32.
  • the condensing lens 31 f condenses the white light emitted from the light source 31 a and emits it to the light guide 203.
  • the condensing lens 31 f condenses the light transmitted through the switching filter 31 c and emits it to the light guide 203.
  • the condenser lens 31f is configured using one or a plurality of lenses.
  • the illumination control unit 32 is configured using a CPU or the like.
  • the illumination control unit 32 controls the light source driver 31b based on the instruction signal input from the processor unit 4 to turn on and off the light source 31a.
  • the illumination control unit 32 controls the drive driver 31e based on the instruction signal input from the processor unit 4, and arranges or retracts the switching filter 31c on the optical path of white light emitted from the light source 31a. By doing so, the type of illumination light emitted by the illumination unit 31 is controlled.
  • the processor unit 4 includes an image processing unit 41, an input unit 42, a storage unit 43, and a control unit 44.
• The image processing unit 41 performs predetermined image processing on the electrical signal input from the endoscope 2 to generate a display image to be displayed by the display unit 5. Specifically, when the light source unit 3 emits white light (first illumination light) or narrow-band light (second illumination light), the image processing unit 41 generates, based on the imaging signal generated by the imaging element 201, a first image corresponding to light in the green wavelength band and a second image corresponding to light in another wavelength band (in the first embodiment, light in the blue wavelength band).
  • the resolution of the first image when white light is emitted by the light source unit 3 is higher than the resolution of the first image when narrow-band light is emitted by the light source unit 3.
  • the image processing unit 41 includes a guide image generation unit 411, an interpolation image generation unit 412, a color image generation unit 413, and a display image generation unit 414.
• Based on the electrical signal input from the endoscope 2, the guide image generation unit 411 generates a guide image that serves as a reference when the interpolation image generation unit 412 interpolates the electrical signals of the other pixels in its interpolation processing.
• The generated guide image is output to the interpolation image generation unit 412 and the color image generation unit 413.
  • the guide image functions as the first interpolation image.
• The interpolation image generation unit 412 performs interpolation processing on the electrical signal input from the endoscope 2 based on the guide image input from the guide image generation unit 411 to generate an interpolation image (second interpolation image), and outputs it to the color image generation unit 413.
• The color image generation unit 413 generates a color image from the interpolation image input from the interpolation image generation unit 412, based on the guide image input from the guide image generation unit 411, and outputs the color image to the display image generation unit 414.
• The display image generation unit 414 performs predetermined processing, such as gradation conversion, enlargement processing, and structure enhancement of structures such as capillaries and fine mucosal patterns in the mucosal surface layer, on the electrical signal generated by the color image generation unit 413, and then outputs the result to the display unit 5 as a display image signal.
• The input unit 42 is an interface for user input to the processor unit 4, and includes a power switch for turning the power on and off, a mode switching button for switching among shooting modes and other various modes, and an illumination light switching button for switching the illumination light of the light source unit 3.
  • the storage unit 43 records various programs for operating the endoscope apparatus 1 and data including various parameters necessary for the operation of the endoscope apparatus 1.
  • the storage unit 43 may store information related to the endoscope 2, for example, a relationship table between unique information (ID) of the endoscope 2 and information related to the filter arrangement of the color filter 202a.
  • the storage unit 43 is realized by using a semiconductor memory such as a flash memory or a DRAM (Dynamic Random Access Memory).
  • the control unit 44 is configured using a CPU or the like, and performs drive control of each component including the endoscope 2 and the light source unit 3, input / output control of information with respect to each component, and the like.
• The control unit 44 transmits setting data (for example, pixels to be read) recorded in the storage unit 43, timing signals related to imaging timing, and the like to the endoscope 2 via predetermined signal lines.
• The control unit 44 outputs the color filter information (identification information) acquired via the imaging information storage unit 206 to the image processing unit 41, and outputs information related to the arrangement of the switching filter 31c, determined based on the color filter information, to the light source unit 3.
  • the display unit 5 receives the display image signal generated by the processor unit 4 via the video cable, and displays an in-vivo image corresponding to the display image signal.
  • the display unit 5 is configured using liquid crystal or organic EL (Electro Luminescence).
  • FIG. 3 is a schematic diagram illustrating the configuration of the pixels of the image sensor 201.
• The image sensor 201 includes a plurality of pixels P, arranged in a two-dimensional lattice (two-dimensional matrix), that receive light from the imaging optical system 200. Each pixel P photoelectrically converts the light incident from the imaging optical system 200 to generate an electrical signal. This electrical signal includes the luminance value (pixel value) of each pixel, pixel position information, and the like.
  • a pixel arranged in the i-th row and the j-th column is expressed as a pixel P ij .
  • FIG. 4 is a schematic diagram illustrating an example of the configuration of the color filter 202.
• The color filter 202 is composed of filter units U1, each consisting of 25 filters arranged in a 5 × 5 two-dimensional grid, arranged according to the arrangement of the pixels P ij.
  • the pixel P ij provided with the filter receives light in the wavelength band that has passed through the filter.
  • the pixel P ij provided with a filter that transmits light in the red wavelength band receives light in the red wavelength band.
  • the pixel Pij that receives light in the red wavelength band is referred to as an R pixel.
• Similarly, a pixel P ij that receives light in the green wavelength band is called a G pixel, a pixel P ij that receives light in the blue wavelength band is called a B pixel, and a pixel P ij that receives both light in the green wavelength band and light in the blue wavelength band is called a Cy pixel.
• As shown in FIG. 4, the filter unit U1 transmits light in the blue (B) wavelength band H B, the green (G) wavelength band H G, and the red (R) wavelength band H R. Specifically, the wavelength band H B is 390 nm to 500 nm, the wavelength band H G is 500 nm to 600 nm, and the wavelength band H R is 600 nm to 700 nm.
• The filter unit U1 includes one R filter, four B filters, and 20 Cy filters. In the filter unit U1, the number of Cy filters is larger than the number of B filters, which are the most numerous among the plurality of types of first filters in the color filter 202. Specifically, the ratio of the number of Cy filters to the number of B filters is 5:1.
• Cy filters are arranged at all positions adjacent to the R filter. Hereinafter, when a B filter is arranged at the position corresponding to the pixel P ij, this B filter is referred to as B ij; when an R filter is arranged at the position corresponding to the pixel P ij, this R filter is referred to as R ij; and when a Cy filter is arranged at the position corresponding to the pixel P ij, this Cy filter is referred to as Cy ij.
  • the B filter and the R filter function as a first filter, and the Cy filter functions as a second filter.
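• One possible 5 × 5 layout consistent with the counts stated above (one R filter, four B filters, 20 Cy filters, Cy:B = 5:1, and every neighbour of the R filter being a Cy filter) can be sketched as follows. The placement chosen here is hypothetical; the actual arrangement is the one fixed by FIG. 4.

```python
# Hypothetical 5x5 filter unit consistent with the stated counts:
# 1 R, 4 B, 20 Cy (Cy:B = 5:1), with all 8-neighbours of the R filter Cy.
unit = [["Cy"] * 5 for _ in range(5)]
unit[2][2] = "R"                       # single R filter at the center
for (i, j) in [(0, 2), (2, 0), (2, 4), (4, 2)]:
    unit[i][j] = "B"                   # four B filters, none adjacent to R

counts = {f: sum(row.count(f) for row in unit) for f in ("R", "B", "Cy")}
assert counts == {"R": 1, "B": 4, "Cy": 20}
assert counts["Cy"] // counts["B"] == 5    # Cy:B ratio is 5:1

# every 8-neighbour of the R filter is a Cy filter
neighbours = [unit[2 + di][2 + dj]
              for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
assert all(f == "Cy" for f in neighbours)
```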
  • FIG. 5 is a diagram illustrating an example of the transmission characteristics of each filter constituting the color filter 202.
• In FIG. 5, the transmittance curves are normalized so that the maximum transmittances of the respective filters are equal.
• The horizontal axis indicates the wavelength and the vertical axis indicates the transmittance. The curve L B represents the transmittance curve of the B filter, the curve L R represents the transmittance curve of the R filter, and the curve L Cy represents the transmittance curve of the Cy filter.
  • the B filter transmits light in the wavelength band H B.
• The Cy filter transmits light in each of the wavelength bands H B and H G, and absorbs (blocks) light in the wavelength band H R. That is, the Cy filter transmits light in the cyan wavelength band, which is a complementary color.
  • R filter transmits light in the wavelength band H R.
• Here, a complementary color refers to a color composed of light including at least two of the wavelength bands H B, H G, and H R.
  • FIG. 6 is a diagram illustrating an example of spectral characteristics of white light emitted from the light source unit 3.
• FIG. 7 is a diagram illustrating an example of the spectral characteristics of the narrow-band light emitted from the light source unit 3. In FIGS. 6 and 7, the horizontal axis indicates the wavelength and the vertical axis indicates the intensity. In FIG. 6, the curve L W indicates the spectral characteristics of the white light emitted by the light source unit 3. In FIG. 7, the curves L 1 and L 2 indicate the spectral characteristics of the narrow-band light emitted by the light source unit 3.
• The white light emitted by the light source unit 3 has a larger intensity in the green wavelength band H G than in the blue wavelength band H B. Conversely, the narrow-band light emitted by the light source unit 3 has a larger intensity in the blue wavelength band H B than in the green wavelength band H G.
• The light received by each pixel P ij reflects the product of the spectral characteristics of the light emitted by the light source unit 3 and the transmission characteristics of the corresponding filter.
• In other words, when the light source unit 3 emits white light as the first illumination light, it emits light in the green wavelength band H G more strongly than light in the blue wavelength band H B; when it emits narrow-band light as the second illumination light, it emits light in the blue wavelength band H B more strongly than light in the green wavelength band H G.
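• The multiplication of illumination spectrum and filter transmittance described above can be illustrated with made-up intensities: under white light the Cy pixel's signal is dominated by the green band, and under narrow-band light by the blue band. All numeric values below are illustrative, not measured characteristics of the device.

```python
# Toy illustration: per-band Cy-pixel signal = illumination intensity x
# filter transmittance, so the dominant band flips between illuminations.
cy_transmittance = {"H_B": 0.9, "H_G": 0.9}    # Cy passes blue and green
white      = {"H_B": 0.6, "H_G": 1.0}          # white light: green stronger
narrowband = {"H_B": 1.0, "H_G": 0.4}          # narrow-band: blue stronger

def band_signals(illum):
    return {b: illum[b] * cy_transmittance[b] for b in illum}

w = band_signals(white)
n = band_signals(narrowband)
assert w["H_G"] > w["H_B"]   # under white light, Cy info is mostly green
assert n["H_B"] > n["H_G"]   # under narrow-band light, mostly blue
```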
  • FIG. 8 is a diagram illustrating an example of transmission characteristics of the Cy filter when white light is emitted to the Cy filter by the light source unit 3.
  • FIG. 9 is a diagram illustrating an example of the transmission characteristics of the Cy filter when narrowband light is emitted to the Cy filter by the light source unit 3. 8 and 9, the horizontal axis indicates the wavelength, and the vertical axis indicates the intensity.
  • a curve L CyW indicates the transmission characteristics of the Cy filter when white light is emitted to the Cy filter by the light source unit 3.
  • a curve L CyB and a curve L CyG indicate the transmission characteristics of the Cy filter when narrowband light is emitted to the Cy filter by the light source unit 3.
• When white light is emitted to the Cy filter by the light source unit 3, the transmittance in the green wavelength band H G is larger than the transmittance in the blue wavelength band H B, so relatively more light in the green wavelength band H G is transmitted. That is, the Cy ij pixel acquires more information on the green wavelength band H G than on the blue wavelength band H B. Conversely, when the light source unit 3 emits narrow-band light to the Cy filter, the transmittance in the blue wavelength band H B is larger than the transmittance in the green wavelength band H G, so relatively more light in the blue wavelength band H B is transmitted. That is, the Cy ij pixel acquires more information on the blue wavelength band H B than on the green wavelength band H G.
  • 10A to 10E are diagrams illustrating examples of images when the Cy filter is disposed on the light receiving surface of each pixel P ij .
• The Cy pixel has sensitivity in each of the blue wavelength band H B and the green wavelength band H G. Therefore, the Cy pixel can obtain information on the capillaries B1 in the surface layer from light in the blue wavelength band H B (see FIG. 10A), and information on the thick blood vessels B2 from light in the green wavelength band H G (see FIG. 10E). With the sensitivity characteristics, including the light emitted from the light source unit 3, illustrated in FIGS. 8 and 9, the Cy pixel has a blue-to-green (H B : H G) sensitivity ratio of 1:1. In this case, as illustrated in FIG. 10C, the Cy pixel obtains the image W13, which corresponds to the average of the image W11 illustrated in FIG. 10A and the image W15 illustrated in FIG. 10E.
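• With the 1:1 sensitivity ratio just described, the Cy image W13 can be modelled as the per-pixel average of the blue-band image W11 and the green-band image W15. The luminance values below are illustrative only.

```python
# Under a 1:1 blue:green sensitivity ratio, the Cy image is modelled as the
# per-pixel average of the pure-blue and pure-green images (toy luminances).
w11 = [[10.0, 200.0], [10.0, 200.0]]   # blue band: surface capillaries
w15 = [[100.0, 100.0], [20.0, 20.0]]   # green band: thick vessels
w13 = [[(b + g) / 2 for b, g in zip(rb, rg)] for rb, rg in zip(w11, w15)]
assert w13 == [[55.0, 150.0], [15.0, 110.0]]
```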
• When the second illumination light is emitted by the light source unit 3, the second image generated by the image processing unit 41 is the result of adding the blue wavelength band H B information obtained from the B pixels and the blue wavelength band H B information obtained from the Cy pixels.
  • the resolution of the second image when the second illumination light (narrowband light) is emitted by the light source unit 3 is the second image when the first illumination light (white light) is emitted by the light source unit 3. Higher than the resolution.
• As shown in FIG. 10B, the second image is therefore an image W12 whose information is close to that of the image W11 in FIG. 10A.
• When the first illumination light is emitted by the light source unit 3, much information on the green wavelength band H G can be obtained from the Cy pixels. In this case, the first image generated by the image processing unit 41 is the result of adding the green wavelength band H G information obtained from the Cy pixels and that obtained from the G pixels.
  • the resolution of the first image when the first illumination light is emitted by the light source unit 3 is higher than the resolution of the first image when the second illumination light is emitted by the light source unit 3.
• As illustrated in FIG. 10D, the first image is therefore an image W14 whose information is close to that of the image W15 in FIG. 10E.
• Thus, the resolution of the first image when the first illumination light (white light) is emitted by the light source unit 3 is higher than the resolution of the first image when the second illumination light (narrow-band light) is emitted. Furthermore, the resolution of the second image when the second illumination light (narrow-band light) is emitted by the light source unit 3 is higher than the resolution of the second image when the first illumination light (white light) is emitted.
  • FIG. 11 is a flowchart illustrating an outline of processing executed by the endoscope apparatus 1.
• First, the illumination control unit 32 determines, based on information from the processor unit 4, whether or not the observation method is the white light observation method (step S101). In the case of the white light observation method (step S101: Yes), the illumination control unit 32 retracts the switching filter 31c from the optical path of the white light emitted by the light source 31a by driving the drive unit 31d (step S102), and causes the light source unit 3 to emit white light (step S103). After step S103, the endoscope apparatus 1 proceeds to step S106 described later.
• When the observation method is not the white light observation method (step S101: No), the illumination control unit 32 inserts the switching filter 31c into the optical path of the white light emitted by the light source 31a by driving the drive unit 31d (step S104), and causes the light source unit 3 to emit narrow-band light (step S105). After step S105, the endoscope apparatus 1 proceeds to step S106 described later.
  • step S106 the endoscope 2 images the subject.
  • the endoscope 2 outputs the electrical signal generated by the image sensor 201 to the processor unit 4.
  • the processor unit 4 executes image generation processing for generating an image to be displayed on the display unit 5 by performing image processing on the electrical signal input from the endoscope 2 (step S107). Details of the image generation process will be described later.
• When an instruction signal to end the observation is input (step S108: Yes), the endoscope apparatus 1 ends this process. Otherwise (step S108: No), the endoscope apparatus 1 returns to step S101 described above.
  • FIG. 12 is a flowchart showing an outline of the image generation process.
  • FIG. 13 is a schematic diagram for explaining an outline of the image generation processing executed by the image processing unit 41.
• First, the image processing unit 41 acquires image data from the endoscope 2 (step S201). Specifically, as illustrated in FIG. 13, the image processing unit 41 acquires an image F1 corresponding to the image data from the endoscope 2.
• Next, the guide image generation unit 411 generates, as a guide image, an interpolation image of the Cy pixels, which are arranged with the highest density in the image sensor 201 (step S202). Specifically, as illustrated in FIG. 13, based on the luminance values (pixel values) of the Cy pixels in the separated image F Cy1 obtained by separating the luminance values of the Cy pixels from the image F1, the guide image generation unit 411 generates, by interpolation, an image having the luminance value of a Cy pixel at every pixel position, and uses it as the guide image (hereinafter referred to as the "guide image F Cy2"). At each pixel position to be interpolated, all eight adjacent directions (horizontal, vertical, and diagonal) are occupied by Cy pixels.
  • the guide image generation unit 411 generates the guide image F Cy2 using well-known bilinear interpolation, cubic interpolation, direction discrimination interpolation, or the like. Accordingly, the guide image generation unit 411 can generate a highly accurate guide image FCy2 .
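• As an illustrative sketch only (not the patent's exact method), the guide-image generation step can be modelled as filling each non-Cy position from its known Cy neighbours; here a simple neighbour average stands in for the bilinear, cubic, or direction-discriminating interpolation named above. All names and data values are hypothetical.

```python
def interpolate_guide(img, known):
    """img: 2D luminance grid; known[i][j] is True where a Cy sample exists.
    Fill each unknown position with the mean of its known 8-neighbours."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if known[i][j]:
                continue
            vals = [img[ni][nj]
                    for ni in range(max(0, i - 1), min(h, i + 2))
                    for nj in range(max(0, j - 1), min(w, j + 2))
                    if known[ni][nj]]
            out[i][j] = sum(vals) / len(vals) if vals else 0.0
    return out

# toy separated image F_Cy1: Cy samples at the corners, zeros elsewhere
f_cy1 = [[10.0, 0.0, 30.0],
         [0.0, 0.0, 0.0],
         [50.0, 0.0, 70.0]]
mask = [[True, False, True],
        [False, False, False],
        [True, False, True]]
guide = interpolate_guide(f_cy1, mask)
assert guide[1][1] == 40.0   # mean of the four corner Cy samples
```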
• Subsequently, based on the guide image F Cy2 generated by the guide image generation unit 411 in step S202, the interpolation image generation unit 412 generates an interpolation image for each of the other color pixels, that is, the R pixels and the B pixels in the first embodiment (step S203). Specifically, as illustrated in FIG. 13, the interpolation image generation unit 412 calculates, by interpolation processing based on the guide image F Cy2, the luminance value of the B pixel at each pixel position where a Cy pixel or an R pixel is arranged in the separated image F B1, thereby generating an interpolated image F B2 (second image) having the luminance value of the B pixel at every pixel position.
• Similarly, based on the guide image F Cy2 generated by the guide image generation unit 411, the interpolation image generation unit 412 calculates, by interpolation processing, the luminance value of the R pixel at each pixel position where a Cy pixel or a B pixel is arranged in the separated image F R1, thereby generating an interpolated image F R2 having the luminance value of the R pixel at every pixel position.
  • the interpolation method based on the guide image FCy2 by the interpolation image generation unit 412 is a known joint bilateral interpolation process or a guided filter interpolation process.
• Accordingly, the interpolation image generation unit 412 can generate, with high accuracy, the interpolated image F B2 and the interpolated image F R2 for the B pixels and the R pixels, which are arranged at low density in the image sensor 201.
  • the interpolation image generation unit 412 can perform the interpolation processing based on the guide image F Cy2 with high accuracy.
  • high-frequency components in white light have a high correlation between the R pixel, the G pixel, and the B pixel.
  • the interpolation image generation unit 412 can perform the interpolation process with high accuracy even when the R pixel is interpolated using the Cy image as the guide image FCy2 .
• In the narrow-band light observation method, a color image is generated using the B image and the G image, so the interpolation image generation unit 412 does not need to generate an R interpolation image.
• In this manner, based on the imaging signal generated by the image sensor 201, which has sensitivity to light in the green wavelength band and light in the blue wavelength band, the image processing unit 41 can generate the interpolated image F B2 (in FIG. 13, an image obtained from the guide image F Cy2 and the separated image F B1) as a second image having a higher resolution than in the case of the white light observation method. Furthermore, the image processing unit 41 can generate the interpolated image F B2 as a second image having a higher resolution than the interpolated image F G2, which is the first image, in the narrow-band light observation method.
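• The joint bilateral interpolation mentioned above (one of the two named options, alongside guided-filter interpolation) can be sketched as follows. Parameter values and the toy data are illustrative only, not taken from the patent.

```python
import math

def joint_bilateral_interp(guide, samples, sigma_s=1.5, sigma_r=10.0, radius=2):
    """guide: dense 2D guide image (e.g. F_Cy2); samples: {(i, j): known value}.
    Each output pixel is a weighted mean of nearby known samples, weighted by
    spatial distance and by similarity of guide-image values."""
    h, w = len(guide), len(guide[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            num = den = 0.0
            for (ki, kj), v in samples.items():
                if abs(ki - i) > radius or abs(kj - j) > radius:
                    continue
                ws = math.exp(-((ki - i) ** 2 + (kj - j) ** 2) / (2 * sigma_s ** 2))
                wr = math.exp(-((guide[ki][kj] - guide[i][j]) ** 2) / (2 * sigma_r ** 2))
                num += ws * wr * v
                den += ws * wr
            out[i][j] = num / den if den > 0 else 0.0
    return out

# flat guide: the two samples get equal weight at the center pixel
flat_guide = [[100.0] * 3 for _ in range(3)]
b = joint_bilateral_interp(flat_guide, {(0, 0): 1.0, (2, 2): 3.0})
assert abs(b[1][1] - 2.0) < 1e-9
```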
• Thereafter, the color image generation unit 413 uses the guide image F Cy2 generated by the guide image generation unit 411 in step S202 and the interpolation image F B2 generated by the interpolation image generation unit 412 in step S203 to generate an interpolated image F G2 (first image) having the luminance value of the G pixel at every pixel position.
• Specifically, the color image generation unit 413 separates the G component from the guide image F Cy2 by subtracting, for each pixel, the luminance value of the B-pixel interpolation image F B2 from the Cy-pixel guide image F Cy2, thereby generating the interpolation image F G2 of the G pixels. More specifically, the color image generation unit 413 generates the interpolation image F G2 of the G pixels by the following equation (1).
• G(i, j) = Cy(i, j) − α × B(i, j) … (1)
• Here, G(i, j) represents the luminance value (pixel value) of each G pixel in the interpolated image F G2, Cy(i, j) represents the luminance value (pixel value) of each Cy pixel in the guide image F Cy2, and B(i, j) represents the luminance value (pixel value) of each B pixel in the interpolated image F B2. α is a G correction coefficient, a parameter calculated in advance from the ratio of the blue wavelength band H B to the green wavelength band H G in the spectral characteristics of the light source 31a and the Cy pixel.
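• Equation (1) translates directly into code. The value of the correction coefficient α below is illustrative only; the actual value is precomputed from the spectral characteristics as described above.

```python
# Per-pixel G separation per equation (1): G(i, j) = Cy(i, j) - alpha * B(i, j).
# alpha = 0.5 is an illustrative value, not the device's actual coefficient.
def separate_g(cy_img, b_img, alpha=0.5):
    return [[cy - alpha * b for cy, b in zip(cy_row, b_row)]
            for cy_row, b_row in zip(cy_img, b_img)]

g = separate_g([[100.0, 80.0]], [[40.0, 20.0]], alpha=0.5)
assert g == [[80.0, 70.0]]
```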
• When the endoscope apparatus 1 performs the white light observation method, the color image generation unit 413 generates a color image F W using the interpolation image F G2, the interpolation image F B2, and the interpolation image F R2. In contrast, when the endoscope apparatus 1 performs the narrow-band light observation method, the color image generation unit 413 generates a color image using the interpolation image F G2 and the interpolation image F B2 (step S204).
• Subsequently, the display image generation unit 414 generates a display image using the color image F W generated by the color image generation unit 413 (step S205). Specifically, the display image generation unit 414 performs gradation conversion, enlargement processing, denoising processing, structure enhancement of structures such as capillaries and fine mucosal patterns in the mucosal surface layer, and the like on the color image F W to generate a display image for display. In this case, the display image generation unit 414 may perform the structure enhancement processing using information of the guide image F Cy2 generated in step S202, for example, edge information or luminance information. Since the guide image F Cy2 has a high resolution regardless of the observation method, various processes such as structure enhancement can be performed with high accuracy. After step S205, the endoscope apparatus 1 returns to the main routine of FIG. 11.
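• A minimal stand-in for the structure enhancement step is unsharp masking, which boosts each pixel by its deviation from the local mean. The gain value is illustrative, and this sketch is not necessarily the processing the display image generation unit 414 actually uses.

```python
def enhance_structure(img, gain=0.5):
    """Boost each pixel by its difference from the local 3x3 mean
    (unsharp masking); flat regions are left unchanged."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            nb = [img[ni][nj]
                  for ni in range(max(0, i - 1), min(h, i + 2))
                  for nj in range(max(0, j - 1), min(w, j + 2))]
            local_mean = sum(nb) / len(nb)
            out[i][j] = img[i][j] + gain * (img[i][j] - local_mean)
    return out

out = enhance_structure([[5.0, 5.0], [5.0, 5.0]])
assert out == [[5.0, 5.0], [5.0, 5.0]]   # flat image is unchanged
```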
• As described above, according to the first embodiment of the present invention, the color filter 202 is provided with the Cy filters, the B filters, and the R filters, and the number of Cy filters, which are the most numerous, is larger than the number of B filters. Therefore, a high-resolution image can be obtained with both the white light observation method and the narrow-band light observation method.
• Furthermore, according to Embodiment 1 of the present invention, when the light source unit 3 emits white light as the first illumination light, light in the green wavelength band H G is emitted more strongly than light in the blue wavelength band H B, and when it emits narrow-band light as the second illumination light, light in the blue wavelength band H B is emitted more strongly than light in the green wavelength band H G. Therefore, an image with high resolution can be obtained with either the white light observation method or the narrow-band light observation method.
• Furthermore, according to Embodiment 1 of the present invention, based on the guide image F Cy2 generated by the guide image generation unit 411, the interpolation image generation unit 412 generates the interpolated image F B2 having the B-pixel luminance value at every pixel position by interpolating at the positions where the Cy pixels and R pixels are arranged in the separated image F B1, and generates the interpolated image F R2 having the R-pixel luminance value at every pixel position by interpolating at the positions where the Cy pixels and B pixels are arranged in the separated image F R1, so that the interpolation processing can be performed with high accuracy.
  • FIG. 14 is a diagram illustrating an example of a color image generated by the image processing unit according to the first modification of the first embodiment of the present invention when the narrow-band light observation method is used.
• Depending on the illumination light, the Cy pixel acquires information very close to that of the G pixel or the B pixel. In the narrow-band light observation method, therefore, the interpolation image generation unit 412 may generate the interpolation image F B2 by regarding the Cy pixels as B pixels.
  • the color image generation unit 413 may generate a color image of narrowband light using the interpolation image F B2 and the interpolation image F G2 generated by the interpolation image generation unit 412.
  • the interpolation image generation unit 412 may generate the interpolation image F G2 by regarding the Cy pixel as a G pixel as in the narrow-band light observation method. .
  • the image processing by the image processing unit 41 can be simplified as compared with the first embodiment described above.
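As a rough sketch of this simplification, under narrow-band light the Cy sites can simply be merged into the B plane before filling the few remaining positions. The function name, label strings, and neighbour-averaging fill below are our own assumptions, not the patent's implementation:

```python
import numpy as np

def demosaic_narrowband(raw, cfa):
    """Treat Cy pixels as B pixels under narrow-band illumination and fill
    the remaining (R) positions with the mean of valid neighbouring samples.
    `raw` is the sensor image; `cfa` holds per-pixel filter labels.
    Hypothetical helper, a sketch rather than the patent's exact method."""
    b_plane = np.where(np.isin(cfa, ['Cy', 'B']), raw, np.nan)
    out = b_plane.copy()
    for r, c in zip(*np.where(np.isnan(out))):
        patch = b_plane[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        out[r, c] = np.nanmean(patch)  # mean of the surrounding Cy/B samples
    return out

cfa = np.array([['Cy', 'B'], ['R', 'Cy']])
raw = np.array([[1.0, 2.0], [9.0, 3.0]])
b_full = demosaic_narrowband(raw, cfa)  # R site filled from Cy/B neighbours
```

Because the Cy sites already carry near-B information under narrow-band light, only the sparse R sites need filling, which is why the processing is simpler than in Embodiment 1.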
  • FIG. 15 is a schematic diagram illustrating an example of a configuration of a color filter according to the second modification of the first embodiment of the present invention.
  • The color filter 202a shown in FIG. 15 is formed by arranging filter units U2, each composed of 16 filters in a 4 × 4 two-dimensional grid, in accordance with the arrangement of the pixels P ij.
  • The filter unit U2 is a conventional Bayer arrangement in which a Cy filter is disposed at each position where a G filter would otherwise be disposed.
  • The filter unit U2 includes four R filters, four B filters, and eight Cy filters.
  • In the filter unit U2, the Cy filters, the most numerous filter type in the color filter 202a, outnumber the B filters and the R filters. The Cy filters are arranged in a checkered pattern, and the ratio of the number of Cy filters to the number of B filters is 2:1.
  • Accordingly, a high-resolution image can be obtained with both the white light observation method and the narrow-band light observation method.
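The 4 × 4 unit described above (Cy on the checkerboard, B and R on the remaining sites) can be written out to verify the stated counts and the 2:1 Cy-to-B ratio. The exact placement of the B and R filters among the non-Cy sites is our assumption, since FIG. 15 is not reproduced here:

```python
import numpy as np

# One 4x4 filter unit: Cy on the checkerboard, B and R alternating on the
# remaining sites (the B/R placement is illustrative, not taken from FIG. 15).
U2 = np.array([
    ['Cy', 'B',  'Cy', 'R' ],
    ['B',  'Cy', 'R',  'Cy'],
    ['Cy', 'B',  'Cy', 'R' ],
    ['B',  'Cy', 'R',  'Cy'],
])

counts = {f: int((U2 == f).sum()) for f in ('Cy', 'B', 'R')}
print(counts)  # {'Cy': 8, 'B': 4, 'R': 4}  ->  Cy : B = 2 : 1
```

The checkerboard placement guarantees that every B or R site has four Cy neighbours, which is what makes the Cy guide image accurate at every pixel position.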
  • FIG. 16 is a schematic diagram illustrating an example of a configuration of a color filter according to Embodiment 2 of the present invention.
  • The color filter 202b is configured by filter units U3 in which a magenta filter (hereinafter referred to as an "Mg filter") is disposed in place of the R filter described above.
  • The Mg filter functions as the third filter.
  • FIG. 17 is a diagram illustrating an example of transmission characteristics of each filter constituting the color filter 202b.
  • the horizontal axis indicates the wavelength
  • the vertical axis indicates the transmittance.
  • the curve L B represents the transmittance curve of the B filter
  • the curve L Cy indicates the transmittance curve of the Cy filter,
  • and the curve L Mg indicates the transmittance curve of the Mg filter.
  • The Mg filter transmits light in each of the red wavelength band H R and the blue wavelength band H B. That is, the Mg filter transmits light in the magenta wavelength band, which is a complementary color.
  • When the light source unit 3 emits narrow-band light, the pixel Mg ij cannot obtain information in the red wavelength band H R and can therefore be regarded as a pixel that obtains information similar to that of the B pixel. Thus, when the endoscope apparatus 1 performs the narrow-band light observation method, an image with a higher resolution can be obtained because the amount of B-pixel information increases.
  • FIG. 18 is a schematic diagram for explaining an overview of the image generation processing executed by the image processing unit 41.
  • As in Embodiment 1, the guide image generation unit 411, based on the luminance value (pixel value) of each Cy pixel in the separated image F Cy1 obtained by separating the Cy-pixel luminance values from the image F2, calculates by interpolation the luminance value of the Cy pixel at the pixel positions where the B pixels and the Mg pixels of the image sensor 201 are arranged, thereby generating a guide image F Cy2 having a Cy-pixel luminance value at all pixel positions.
  • Based on the guide image F Cy2 generated by the guide image generation unit 411, the interpolation image generation unit 412 calculates by interpolation the luminance value of the B pixel at the pixel positions where the Cy pixels and the Mg pixels are arranged in the separated image F B1,
  • thereby generating an interpolated image F B2 (second image) having a B-pixel luminance value at all pixel positions.
  • Based on the imaging signal generated by the image sensor 201, the image processing unit 41 can generate the interpolated image F B2 as a second image having a higher resolution than in the white light observation method.
  • In the narrow-band light observation method, the image processing unit 41 can also generate the interpolated image F B2 as a second image having a higher resolution than the interpolated image F G2, which is the first image.
  • Based on the interpolated image F B2, the interpolation image generation unit 412 calculates by interpolation the luminance value of the Mg pixel at the pixel positions where the Cy pixels and the B pixels are arranged in the separated image F Mg1,
  • thereby generating an interpolated image F Mg2 having an Mg-pixel luminance value at all pixel positions.
  • The color image generation unit 413 subtracts, pixel by pixel, the luminance value of the B-pixel interpolated image F B2 from the Mg-pixel interpolated image F Mg2 generated by the interpolation image generation unit 412,
  • thereby separating the R component from the interpolated image F Mg2 and generating an R-pixel interpolated image F R2.
  • Because the interpolation image generation unit 412 generates the interpolated image F Mg2 using the interpolated image F B2, the interpolation already uses the color information that is subtracted during the color image generation process, so the increase in noise caused by the subtraction processing can be reduced.
  • As in Embodiment 1, a high-resolution image can be obtained with both the white light observation method and the narrow-band light observation method.
  • Furthermore, because the Mg filter is arranged in the filter unit U3, the interpolation uses the color information that is subtracted during the color image generation process,
  • so the increase in noise during the subtraction processing can be reduced.
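Because the Mg filter passes both red and blue light, the R plane can be recovered by the per-pixel subtraction described above. A minimal sketch, assuming aligned floating-point planes and clipping noise-induced negative values to zero (the function name and the clipping choice are our own):

```python
import numpy as np

def separate_r_from_mg(mg_plane, b_plane):
    """Recover an R plane as Mg - B per pixel (the Mg filter transmits
    red + blue).  Negative results caused by noise are clipped to zero."""
    return np.clip(mg_plane - b_plane, 0.0, None)

# Illustrative interpolated planes at four pixel positions.
mg = np.array([[0.9, 0.5], [0.7, 0.4]])
b  = np.array([[0.3, 0.2], [0.8, 0.1]])
r = separate_r_from_mg(mg, b)  # [[0.6, 0.3], [0.0, 0.3]]
```

Note the clipped value at position (1, 0): subtraction amplifies noise wherever B is overestimated, which is exactly why the text interpolates F Mg2 using F B2 before subtracting.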
  • FIG. 19 is a schematic diagram illustrating an example of the configuration of a color filter according to Modification 1 of Embodiment 2 of the present invention.
  • The color filter 202c shown in FIG. 19 is formed by arranging filter units U4, each composed of 16 filters in a 4 × 4 two-dimensional lattice, in accordance with the arrangement of the pixels P ij.
  • In the filter unit U4, a Cy filter is disposed in place of each G filter, and an Mg filter is disposed in place of each R filter.
  • the filter unit U4 includes four Mg filters, four B filters, and eight Cy filters.
  • In the filter unit U4, the Cy filters, the most numerous filter type in the color filter 202c, outnumber the B filters and the Mg filters.
  • The Cy filters are arranged in a checkered pattern. Furthermore, the ratio of the number of Cy filters, the number of B filters, and the number of Mg filters is 2:1:1.
  • a high-resolution image can be obtained in both the white light observation method and the narrow-band light observation method, as in the second embodiment.
  • Embodiment 3: Next, a third embodiment of the present invention will be described.
  • In Embodiments 1 and 2 described above, the filter unit is configured using three types of filters.
  • In Embodiment 3, the filter unit is configured using four types of filters.
  • Below, the image processing executed by the image processing unit according to Embodiment 3 will be described.
  • The same components as in the embodiments described above are denoted by the same reference symbols.
  • FIG. 20 is a schematic diagram showing another example of the configuration of the color filter according to Embodiment 3 of the present invention.
  • The color filter 202d shown in FIG. 20 is formed by arranging filter units U5, each composed of 25 filters in a 5 × 5 two-dimensional grid, in accordance with the arrangement of the pixels P ij.
  • The filter unit U5 includes one R filter, four B filters, four G filters, and 16 Cy filters. The Cy filters, the most numerous filter type in the color filter 202d, are provided in a number equal to or greater than the number of B filters and the number of G filters.
  • FIG. 21 is a diagram illustrating an example of the transmission characteristics of each filter constituting the color filter 202d.
  • the curve L B represents the transmittance curve of the B filter
  • curve L R represents the transmittance curve of the R filter
  • the curve L G indicates the transmittance curve of the G filter,
  • and the curve L Cy indicates the transmittance curve of the Cy filter.
  • the horizontal axis indicates the wavelength
  • the vertical axis indicates the transmittance.
  • The G filter transmits light in the green wavelength band H G.
  • FIG. 22 is a schematic diagram for explaining an outline of the image generation processing executed by the image processing unit 41.
  • The interpolation image generation unit 412 generates interpolated images of the other color pixels, that is, the R, G, and B pixels in Embodiment 3, based on the guide image generated by the guide image generation unit 411. Specifically, based on the guide image F Cy2 generated by the guide image generation unit 411, the interpolation image generation unit 412 generates the interpolated image F B2 (second image), the interpolated image F R2, and the interpolated image F G2 (first image). The color image generation unit 413 can therefore omit the subtraction processing that subtracts, pixel by pixel, the luminance value of the B-pixel interpolated image F B2 from the guide image F Cy2.
  • Based on the imaging signal generated by the image sensor 201, the image processing unit 41 can generate the interpolated image F B2 as a second image having a higher resolution than in the white light observation method. Further, in the narrow-band light observation method, it can generate the interpolated image F B2 as a second image having a higher resolution than the interpolated image F G2, which is the first image.
  • As in Embodiment 1, a high-resolution image can be obtained with both the white light observation method and the narrow-band light observation method.
  • In addition, the color image generation unit 413 can omit the subtraction processing that subtracts, pixel by pixel, the luminance value of the B-pixel interpolated image F B2 from the guide image F Cy2.
  • When the endoscope apparatus 1 performs the white light observation method, high color reproducibility can be ensured.
  • FIG. 23 is a schematic diagram showing an example of the configuration of a color filter according to Embodiment 4 of the present invention.
  • The color filter 202e shown in FIG. 23 is formed by arranging filter units U6, each composed of 25 filters in a 5 × 5 two-dimensional grid, in accordance with the arrangement of the pixels P ij.
  • The filter unit U6 includes four R filters, one B filter, and 20 Ye filters. The Ye filters, the most numerous filter type in the color filter 202e, are provided in a number equal to or greater than the number of R filters.
  • FIG. 24 is a diagram illustrating an example of transmission characteristics of each filter constituting the color filter 202e.
  • the curve L B represents the transmittance curve of the B filter
  • curve L R represents the transmittance curve of the R filter
  • and the curve L Ye indicates the transmittance curve of the Ye filter.
  • the horizontal axis indicates the wavelength
  • the vertical axis indicates the transmittance.
  • The Ye filter transmits light in each of the red wavelength band H R and the green wavelength band H G. That is, the Ye filter transmits light in the yellow wavelength band, which is a complementary color.
  • FIG. 25 is a diagram illustrating an example of spectral characteristics of white light emitted from the light source unit 3.
  • FIG. 26 is a diagram illustrating an example of spectral characteristics of narrow-band light emitted from the light source unit 3. In FIGS. 25 and 26, the horizontal axis indicates the wavelength and the vertical axis indicates the intensity. In FIG. 25, the curve L W indicates the spectral characteristic of the white light emitted by the light source unit 3. In FIG. 26, the two curves L 2 and L 11 indicate the spectral characteristics of the narrow-band light emitted by the light source unit 3.
  • As indicated by the curve L W in FIG. 25, in the white light emitted by the light source unit 3, the intensity in the blue wavelength band H B is larger than the intensity in the green wavelength band H G. Further, as indicated by the curves L 2 and L 11 in FIG. 26, in the narrow-band light emitted by the light source unit 3, the intensity in the red wavelength band H R is larger than the intensity in the green wavelength band H G.
  • FIG. 27 is a schematic diagram for explaining an overview of the image generation processing executed by the image processing unit 41.
  • The guide image generation unit 411 generates an interpolated image of the Ye pixels as the guide image. Specifically, as illustrated in FIG. 27, based on the luminance value of each Ye pixel in the separated image F Ye1 obtained by separating the Ye-pixel luminance values from the image F4, the guide image generation unit 411 calculates by interpolation the luminance value of the Ye pixel at the pixel positions where the B pixels and the R pixels are arranged, thereby generating a guide image F Ye2 having a Ye-pixel luminance value at all pixel positions.
  • The guide image generation unit 411 generates the guide image F Ye2 using well-known bilinear interpolation, cubic interpolation, direction-discriminating interpolation, or the like, and can thereby generate a highly accurate guide image F Ye2.
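As a rough illustration of the simplest of the named methods, a bilinear-style fill that averages the valid 4-neighbours of each empty guide position might look as follows. The function name and the boolean-mask representation of the Ye sites are our own assumptions:

```python
import numpy as np

def guide_bilinear_fill(plane, mask):
    """Fill positions where mask is False with the mean of the valid
    4-neighbours: a minimal stand-in for the bilinear / cubic /
    direction-discriminating interpolation named in the text."""
    h, w = plane.shape
    out = plane.astype(float).copy()
    for r in range(h):
        for c in range(w):
            if mask[r, c]:
                continue  # a real guide sample, keep it
            vals = [plane[rr, cc]
                    for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                    if 0 <= rr < h and 0 <= cc < w and mask[rr, cc]]
            out[r, c] = np.mean(vals)
    return out

# Guide pixels on a checkerboard, all with luminance 3.0: every empty site
# is surrounded by guide samples, so the filled guide image is flat.
mask = (np.add.outer(np.arange(3), np.arange(3)) % 2) == 0
guide = guide_bilinear_fill(np.where(mask, 3.0, 0.0), mask)
```

Direction-discriminating interpolation would replace the plain 4-neighbour mean with a mean along the locally detected edge direction; the overall structure stays the same.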
  • Based on the guide image F Ye2 generated by the guide image generation unit 411, the interpolation image generation unit 412 calculates by interpolation the luminance value of the R pixel at the pixel positions where the Ye pixels and the B pixels are arranged in the separated image F R1, thereby generating an interpolated image F R2 (second image) having an R-pixel luminance value at all pixel positions. Furthermore, it calculates by interpolation the luminance value of the B pixel at the pixel positions where the Ye pixels and the R pixels are arranged in the separated image F B1,
  • thereby generating an interpolated image F B2 having a B-pixel luminance value at all pixel positions.
  • For the interpolation based on the guide image F Ye2, the interpolation image generation unit 412 uses a known joint bilateral interpolation process or a known guided-filter interpolation process.
  • The interpolation image generation unit 412 can thereby generate, with high accuracy, the interpolated image F R2 and the interpolated image F B2 for the R pixels and the B pixels, which are arranged at low density in the image sensor 201.
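A simplified version of such guide-driven interpolation can be sketched as a joint bilateral fill, where each empty position is estimated from nearby samples weighted by both spatial distance and similarity in the guide image. The function name and the parameter values are illustrative, not taken from the patent:

```python
import numpy as np

def joint_bilateral_fill(sparse, mask, guide, radius=2, sigma_s=1.0, sigma_r=0.1):
    """Fill empty positions of a sparse colour plane with weights combining
    spatial distance and guide-image similarity (a simplified joint
    bilateral interpolation sketch)."""
    h, w = sparse.shape
    out = sparse.astype(float).copy()
    for r in range(h):
        for c in range(w):
            if mask[r, c]:
                continue  # a real colour sample, keep it
            wsum = vsum = 0.0
            for rr in range(max(r - radius, 0), min(r + radius + 1, h)):
                for cc in range(max(c - radius, 0), min(c + radius + 1, w)):
                    if not mask[rr, cc]:
                        continue
                    ds = (rr - r) ** 2 + (cc - c) ** 2          # spatial distance
                    dg = (guide[rr, cc] - guide[r, c]) ** 2     # guide similarity
                    wgt = np.exp(-ds / (2 * sigma_s ** 2) - dg / (2 * sigma_r ** 2))
                    wsum += wgt
                    vsum += wgt * sparse[rr, cc]
            if wsum > 0:
                out[r, c] = vsum / wsum
    return out

# A guide edge between columns 1 and 2 steers the fill toward the sample
# lying on the same side of the edge.
sparse = np.array([[10.0, 0.0, 20.0]])
mask = np.array([[True, False, True]])
guide = np.array([[0.0, 0.0, 1.0]])
filled = joint_bilateral_fill(sparse, mask, guide)
```

This is why the dense guide pixels help the sparse R and B pixels: the guide contributes the edge structure that a purely spatial interpolation of low-density samples would blur across.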
  • Based on the imaging signal generated by the image sensor 201, the image processing unit 41 can generate the interpolated image F R2 (in FIG. 27, the image obtained by adding the guide image F Ye2 and the separated image F R1) as a second image having a higher resolution than in the white light observation method. Further, in the narrow-band light observation method, it can generate the interpolated image F R2 as a second image having a higher resolution than the interpolated image F G2, which is the first image.
  • The color image generation unit 413 subtracts, pixel by pixel, the luminance value of the R-pixel interpolated image F R2 from the Ye-pixel guide image F Ye2, thereby separating the R component from the guide image F Ye2
  • and generating a G-pixel interpolated image F G2.
  • When the endoscope apparatus 1 performs the white light observation method, the color image generation unit 413 generates a color image F W using the interpolated image F G2, the interpolated image F B2, and the interpolated image F R2.
  • When the endoscope apparatus 1 performs fluorescence observation, the color image generation unit 413 generates a color image using the interpolated image F G2 and the interpolated image F R2.
  • FIG. 28 is a diagram showing a list in which variations of color filters according to other embodiments of the present invention are associated with wavelength bands of illumination light emitted from the light source unit 3 and effects.
  • a Ye filter may be disposed instead of the Mg filter at the position where the Mg filter of FIG. 16 described above is disposed.
  • a Cy filter may be disposed in place of the B filter at the position where the B filter of FIG. 23 is disposed.
  • an Mg filter may be disposed in place of the B filter at the position where the B filter of FIG. 23 is disposed.
  • a color filter 202f shown in FIG. 29 may be used.
  • The color filter 202f is formed by arranging filter units U7, each composed of 25 filters in a 5 × 5 two-dimensional grid, in accordance with the arrangement of the pixels P ij.
  • The filter unit U7 includes four G filters, four B filters, one Mg filter, and 16 Cy filters. The Cy filters, the most numerous filter type in the filter unit U7, outnumber the G filters and the B filters.
  • a color filter 202g shown in FIG. 30 may be used.
  • The color filter 202g is formed by arranging filter units U8, each composed of 36 filters in a 6 × 6 two-dimensional lattice, in accordance with the arrangement of the pixels P ij.
  • The filter unit U8 includes nine G filters, five B filters, four Mg filters, and 18 Cy filters. The Cy filters outnumber the G filters, the most numerous of the other filter types in the filter unit U8.
  • a color filter 202h shown in FIG. 31 may be used.
  • The color filter 202h is formed by arranging filter units U9, each composed of 16 filters in a 4 × 4 two-dimensional grid, in accordance with the arrangement of the pixels P ij.
  • The filter unit U9 includes four G filters, two B filters, two Mg filters, and eight Cy filters. The Cy filters outnumber the G filters, the most numerous of the other filter types in the filter unit U9.
  • a color filter 202i shown in FIG. 32 may be used.
  • The color filter 202i is formed by arranging filter units U10, each composed of 16 filters in a 4 × 4 two-dimensional grid, in accordance with the arrangement of the pixels P ij.
  • The filter unit U10 includes four G filters, two B filters, two R filters, and eight Cy filters. The Cy filters outnumber the G filters, the most numerous of the other filter types in the filter unit U10.
  • The Cy filter has been described as having transmission characteristics that pass light over the entire blue wavelength band H B and green wavelength band H G; however, as indicated by the curve L Cy2 in the drawing, the Cy filter may instead have a bimodal transmission characteristic.
  • Since the image processing unit 41 can then perform the color image generation process with high accuracy, it can generate a color image with less noise.
  • In the embodiments described above, the illumination light emitted from the light source unit 3 is switched between white light and narrow-band light by inserting the switching filter 31c into, or removing it from, the optical path of the white light emitted from the single light source 31a.
  • Alternatively, a light source that emits white light and a light source that emits narrow-band light may be provided, and white light or narrow-band light may be emitted by switching which of the two light sources is lit.
  • the present invention can be applied to a capsule endoscope that can be introduced into a subject.
  • In the embodiments described above, the image processing unit generates interpolated images of the other color pixels using the Cy interpolated image or the Ye interpolated image as the guide image.
  • Alternatively, when generating a B interpolated image, the direction of an edge in the Cy interpolated image
  • may be determined for each pixel position, and the B interpolated image may be generated from the B-pixel information alone based on the determination result.
  • Although the endoscope apparatus 1 has been described as having the A/D conversion unit 205 provided in the distal end portion 24, the A/D conversion unit 205 may instead be provided in the processor unit 4. Further, the configuration relating to image processing may be provided in the endoscope 2, in the connector that connects the endoscope 2 and the processor unit 4, in the operation unit 22, or the like. In the endoscope apparatus 1 described above, the endoscope 2 connected to the processor unit 4 is identified using the identification information stored in the identification information storage unit 261; however, identification means may instead be provided at the connection portion (connector) between the processor unit 4 and the endoscope 2. For example, an identification pin (identification means) provided on the endoscope 2 side may identify the endoscope 2 connected to the processor unit 4.
  • In this description, the terms "section", "module", and "unit" can be read as "means" or "circuit".
  • For example, the control unit can be read as control means or a control circuit.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Signal Processing (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The present invention relates to: an endoscope device capable of obtaining high-resolution images in both a white light observation mode and a narrow-band light observation mode; an image processing device; an image processing method; and a program. This endoscope device 1 is characterized by being provided with an image processing unit 41 which, in cases where a light source unit 3 emits first illumination light or second illumination light, generates, on the basis of an imaging signal generated by an imaging element 201, a first image corresponding to light in the green wavelength band and a second image corresponding to light in a single wavelength band. The endoscope device is further characterized in that: the resolution of the first image when the first illumination light is emitted by the light source unit is higher than that of the first image when the second illumination light is emitted by the light source unit; and the resolution of the second image when the second illumination light is emitted by the light source unit is higher than that of the second image when the first illumination light is emitted by the light source unit.
PCT/JP2016/058002 2016-03-14 2016-03-14 Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'image et programme WO2017158692A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
PCT/JP2016/058002 WO2017158692A1 (fr) 2016-03-14 2016-03-14 Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'image et programme
PCT/JP2017/005175 WO2017159165A1 (fr) 2016-03-14 2017-02-13 Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'images, et programme
JP2018505351A JPWO2017159165A1 (ja) 2016-03-14 2017-02-13 内視鏡装置、画像処理装置、画像処理方法およびプログラム
CN201780017384.XA CN108778091B (zh) 2016-03-14 2017-02-13 内窥镜装置、图像处理装置、图像处理方法和记录介质
DE112017001293.7T DE112017001293T5 (de) 2016-03-14 2017-02-13 Endoskop, vorrichtung zur bildverarbeitung, verfahren zur bildverarbeitung und programm
US16/039,802 US11045079B2 (en) 2016-03-14 2018-07-19 Endoscope device, image processing apparatus, image processing method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/058002 WO2017158692A1 (fr) 2016-03-14 2016-03-14 Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'image et programme

Publications (1)

Publication Number Publication Date
WO2017158692A1 true WO2017158692A1 (fr) 2017-09-21

Family

ID=59850808

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2016/058002 WO2017158692A1 (fr) 2016-03-14 2016-03-14 Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'image et programme
PCT/JP2017/005175 WO2017159165A1 (fr) 2016-03-14 2017-02-13 Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'images, et programme

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/005175 WO2017159165A1 (fr) 2016-03-14 2017-02-13 Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'images, et programme

Country Status (5)

Country Link
US (1) US11045079B2 (fr)
JP (1) JPWO2017159165A1 (fr)
CN (1) CN108778091B (fr)
DE (1) DE112017001293T5 (fr)
WO (2) WO2017158692A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019155815A1 (fr) * 2018-02-06 2019-08-15 富士フイルム株式会社 Système endoscopique
WO2021152704A1 (fr) * 2020-01-28 2021-08-05 オリンパス株式会社 Dispositif d'endoscope, procédé de fonctionnement pour dispositif d'endoscope, et programme de traitement d'image

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7048628B2 (ja) 2016-11-28 2022-04-05 アダプティブエンドウ エルエルシー 分離可能使い捨てシャフト付き内視鏡
CN111093459A (zh) 2017-10-04 2020-05-01 奥林巴斯株式会社 内窥镜装置、图像处理方法以及程序
JP7176855B2 (ja) * 2018-04-23 2022-11-22 株式会社エビデント 内視鏡装置、内視鏡装置の作動方法、プログラム、および記録媒体
JP2019191321A (ja) * 2018-04-23 2019-10-31 オリンパス株式会社 内視鏡装置、内視鏡装置の作動方法、プログラム、および記録媒体
CN112823373A (zh) * 2018-10-10 2021-05-18 奥林巴斯株式会社 图像信号处理装置、图像信号处理方法、程序
USD1018844S1 (en) 2020-01-09 2024-03-19 Adaptivendo Llc Endoscope handle
USD1031035S1 (en) 2021-04-29 2024-06-11 Adaptivendo Llc Endoscope handle

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015116328A (ja) * 2013-12-18 2015-06-25 オリンパス株式会社 内視鏡装置

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003010114A (ja) * 2001-06-29 2003-01-14 Fuji Photo Film Co Ltd 電子内視鏡装置およびその制御方法
CA2581668A1 (fr) * 2003-09-26 2005-04-07 Tidal Photonics, Inc Appareil et procedes relatifs a des systemes endoscopes a imagerie ayant une plage dynamique elargie
US20060232668A1 (en) 2005-04-18 2006-10-19 Given Imaging Ltd. Color filter array with blue elements
US8498695B2 (en) 2006-12-22 2013-07-30 Novadaq Technologies Inc. Imaging system with a single color image sensor for simultaneous fluorescence and color video endoscopy
WO2010143692A1 (fr) * 2009-06-10 2010-12-16 オリンパスメディカルシステムズ株式会社 Dispositif d'endoscope de type capsule
WO2011162111A1 (fr) * 2010-06-25 2011-12-29 オリンパスメディカルシステムズ株式会社 Dispositif endoscope
JP2012081048A (ja) * 2010-10-12 2012-04-26 Fujifilm Corp 電子内視鏡システム、電子内視鏡、及び励起光照射方法
JP6230409B2 (ja) * 2013-12-20 2017-11-15 オリンパス株式会社 内視鏡装置

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015116328A (ja) * 2013-12-18 2015-06-25 オリンパス株式会社 内視鏡装置

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019155815A1 (fr) * 2018-02-06 2019-08-15 富士フイルム株式会社 Système endoscopique
JPWO2019155815A1 (ja) * 2018-02-06 2021-01-14 富士フイルム株式会社 内視鏡システム
JP7057381B2 (ja) 2018-02-06 2022-04-19 富士フイルム株式会社 内視鏡システム
WO2021152704A1 (fr) * 2020-01-28 2021-08-05 オリンパス株式会社 Dispositif d'endoscope, procédé de fonctionnement pour dispositif d'endoscope, et programme de traitement d'image

Also Published As

Publication number Publication date
CN108778091B (zh) 2020-10-02
US11045079B2 (en) 2021-06-29
CN108778091A (zh) 2018-11-09
JPWO2017159165A1 (ja) 2019-01-24
DE112017001293T5 (de) 2018-12-13
WO2017159165A1 (fr) 2017-09-21
US20180344136A1 (en) 2018-12-06

Similar Documents

Publication Publication Date Title
WO2017158692A1 (fr) Dispositif d'endoscope, dispositif de traitement d'image, procédé de traitement d'image et programme
US10159404B2 (en) Endoscope apparatus
JP6196900B2 (ja) 内視鏡装置
JP5670264B2 (ja) 内視鏡システム、及び内視鏡システムの作動方法
JP6588043B2 (ja) 画像処理装置、内視鏡システム、画像処理装置の作動方法およびプログラム
JP6471173B2 (ja) 画像処理装置、内視鏡装置の作動方法、画像処理プログラムおよび内視鏡装置
US10070771B2 (en) Image processing apparatus, method for operating image processing apparatus, computer-readable recording medium, and endoscope device
JP6401800B2 (ja) 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置
JP6230409B2 (ja) 内視鏡装置
WO2016084257A1 (fr) Appareil d'endoscopie
US11571111B2 (en) Endoscope scope, endoscope processor, and endoscope adaptor
CN111093459A (zh) 内窥镜装置、图像处理方法以及程序
JPWO2019175991A1 (ja) 画像処理装置、内視鏡システム、画像処理方法およびプログラム

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16894316

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16894316

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP