WO2019180983A1 - Endoscope system, image processing method, and program - Google Patents

Endoscope system, image processing method, and program

Info

Publication number
WO2019180983A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
wavelength band
light source
image data
image
Prior art date
Application number
PCT/JP2018/029313
Other languages
English (en)
Japanese (ja)
Inventor
理 足立
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Publication of WO2019180983A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/06 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
    • A61B1/0655 Control therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/26 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes using light guides

Definitions

  • the present disclosure relates to an endoscope system, an image processing method, and a program that are introduced into a living body of a subject and capture an image in the living body.
  • Conventionally, a white light observation method (WLI: White Light Imaging) using white illumination light (white light), and special light observation methods such as a narrow band light observation method (NBI: Narrow Band Imaging) using illumination light (narrow band light) composed of two narrow bands included in the blue and green wavelength bands, respectively, are widely known.
  • In the white light observation method, a color image can be obtained by irradiating white light.
  • In the narrow band light observation method, by irradiating narrow band light, it is possible to obtain a special image that highlights the capillaries and fine mucosal patterns present on the surface of the mucosa of the living body.
  • the present disclosure has been made in view of the above, and an object of the present disclosure is to provide an endoscope system, an image processing method, and a program capable of obtaining each of a color image and a special image with high image quality and no color misregistration.
  • In order to solve the above problem and achieve the object, an endoscope system according to the present disclosure includes: an image sensor on which a color filter, configured using two types of filters that each transmit light of two different wavelength bands among light of the red wavelength band, light of the green wavelength band, and light of the blue wavelength band, is stacked on the light receiving surface of each pixel, and that can generate image data by imaging a subject; a light source unit that can emit light of a first wavelength band including light of at least one of the two wavelength bands and light of a second wavelength band including light of at least the other of the two wavelength bands; an illumination control unit that causes the light source unit to alternately emit the light of the first wavelength band and the light of the second wavelength band; and an image processing unit that synthesizes first image data generated by the image sensor when the light source unit emits the light of the first wavelength band and second image data generated by the image sensor when the light source unit emits the light of the second wavelength band.
  • In the endoscope system according to the present disclosure, the two types of filters are a cyan filter that transmits light in the green wavelength band and light in the blue wavelength band, and a yellow filter that transmits light in the red wavelength band and light in the green wavelength band.
  • In the endoscope system according to the present disclosure, the light source unit includes a first light source capable of emitting light in the red wavelength band, a second light source capable of emitting light in the green wavelength band, and a third light source capable of emitting light in the blue wavelength band. The illumination control unit causes the second light source to emit light in the green wavelength band as the light of the first wavelength band, and causes the first light source to emit light in the red wavelength band and the third light source to emit light in the blue wavelength band simultaneously as the light of the second wavelength band. The image processing unit includes a separation unit that generates red image data and blue image data by separating the pixel values of the light in the red wavelength band and the pixel values of the light in the blue wavelength band from the second image corresponding to the second image data, and a combining unit that generates white image data by combining the red image data and the blue image data generated by the separation unit with the first image data.
  • In the endoscope system according to the present disclosure, the combining unit generates special image data by combining the blue image data generated by the separation unit with the first image data.
  • In the endoscope system according to the present disclosure, the two types of filters are a cyan filter that transmits light in the green wavelength band and light in the blue wavelength band, and a magenta filter that transmits light in the red wavelength band and light in the blue wavelength band. The light source unit includes a first light source capable of emitting light in the red wavelength band, a second light source capable of emitting light in the green wavelength band, and a third light source capable of emitting light in the blue wavelength band. The illumination control unit causes the third light source to emit light in the blue wavelength band as the light of the first wavelength band, and causes the first light source to emit light in the red wavelength band and the second light source to emit light in the green wavelength band simultaneously as the light of the second wavelength band. The image processing unit includes a separation unit that generates red image data and green image data by separating the pixel values of the light in the red wavelength band and the pixel values of the light in the green wavelength band from the second image corresponding to the second image data, and a combining unit that generates white image data by combining the red image data and the green image data generated by the separation unit with the first image data.
  • In the endoscope system according to the present disclosure, the combining unit generates special image data by combining the green image data separated by the separation unit with the first image data.
  • The endoscope system according to the present disclosure further includes an imaging control unit that adds and outputs the pixel value of each pixel constituting the image sensor for each predetermined number of pixels, and that, when the light source unit emits the light of the second wavelength band, adds and outputs the pixel values of pixels capable of receiving light of the same wavelength band, among the pixels constituting the image sensor, for each predetermined number of pixels.
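As a rough illustration of the pixel addition above, same-color pixels in a checkered two-color mosaic can be summed per 2 × 2 block. The checkerboard phase and the NumPy sketch below are illustrative assumptions, not details taken from this disclosure.

```python
import numpy as np

def bin_checkerboard(frame):
    """Add pixel values of same-color pixels per 2x2 block.

    Assumes one filter color sits where (row + col) is even and the other
    where it is odd (an illustrative phase). Each 2x2 block then holds two
    pixels of each color on opposite diagonals, so same-color values can
    be accumulated blockwise.
    """
    cy = frame[0::2, 0::2] + frame[1::2, 1::2]  # main-diagonal pair
    ye = frame[0::2, 1::2] + frame[1::2, 0::2]  # anti-diagonal pair
    return cy, ye
```

Each output plane has half the resolution of the input in each dimension, which is the usual trade-off of this kind of binning.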
  • the light source unit includes a fourth light source capable of emitting light in a purple wavelength band, and a fifth light source capable of emitting light in an orange wavelength band.
  • In the endoscope system according to the present disclosure, after causing the light source unit to emit the light of the second wavelength band, the illumination control unit causes the fourth light source to emit light in the purple wavelength band and the fifth light source to emit light in the orange wavelength band simultaneously.
  • An image processing method according to the present disclosure is executed by an endoscope system including: an image sensor on which a color filter, configured using two types of filters that each transmit light of two different wavelength bands among light of the red wavelength band, light of the green wavelength band, and light of the blue wavelength band, is stacked on the light receiving surface of each pixel, and that can generate image data by imaging a subject; and a light source unit capable of emitting light of a first wavelength band including light of at least one of the two wavelength bands and light of a second wavelength band including light of at least the other of the two wavelength bands. The method includes an illumination control step of causing the light source unit to alternately emit the light of the first wavelength band and the light of the second wavelength band, and an image processing step of synthesizing first image data generated by the image sensor when the light source unit emits the light of the first wavelength band and second image data generated by the image sensor when the light source unit emits the light of the second wavelength band.
  • A program according to the present disclosure causes an endoscope system, which includes an image sensor on which a color filter configured using two types of filters that each transmit light of two different wavelength bands among light of the red wavelength band, light of the green wavelength band, and light of the blue wavelength band is stacked on the light receiving surface of each pixel and that can generate image data by imaging a subject, and a light source unit capable of emitting light of a first wavelength band including light of at least one of the two wavelength bands and light of a second wavelength band including light of at least the other of the two wavelength bands, to execute an illumination control step of causing the light source unit to alternately emit the light of the first wavelength band and the light of the second wavelength band, and an image processing step of synthesizing first image data generated by the image sensor when the light source unit emits the light of the first wavelength band and second image data generated by the image sensor when the light source unit emits the light of the second wavelength band.
  • FIG. 1 is a schematic configuration diagram of an endoscope system according to the first embodiment.
  • FIG. 2 is a block diagram illustrating a functional configuration of a main part of the endoscope system according to the first embodiment.
  • FIG. 3 is a diagram schematically showing the configuration of the color filter.
  • FIG. 4 is a diagram schematically illustrating spectral characteristics of the color filter and each light emitted from the light source device.
  • FIG. 5 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in the green wavelength band.
  • FIG. 6 is a diagram illustrating an example of an image generated by the imaging device when the light source device simultaneously emits light in the red wavelength band and light in the blue wavelength band.
  • FIG. 7 is a flowchart illustrating an outline of processing executed by the endoscope system according to the first embodiment.
  • FIG. 8 is a diagram schematically illustrating an image generated by the image processing unit.
  • FIG. 9 is a diagram schematically illustrating a configuration of a color filter according to a modification of the first embodiment of the present disclosure.
  • FIG. 10 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in the green wavelength band.
  • FIG. 11 is a diagram illustrating an example of an image generated by the imaging device when the light source device simultaneously emits light in the red wavelength band and light in the blue wavelength band.
  • FIG. 12 is a circuit diagram schematically showing the configuration of the image sensor according to the second embodiment.
  • FIG. 13 is a diagram schematically illustrating an image sensor addition method.
  • FIG. 14 is a diagram schematically illustrating an image sensor addition method.
  • FIG. 15 is a diagram schematically illustrating an image sensor addition method.
  • FIG. 16 is a diagram schematically illustrating a configuration of a color filter according to the third embodiment.
  • FIG. 17 is a diagram illustrating an example of an image generated by the imaging device when the light source device simultaneously emits light in the green wavelength band and light in the red wavelength band.
  • FIG. 18 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in a blue wavelength band.
  • FIG. 19 is a block diagram illustrating a functional configuration of a main part of the endoscope system according to the fourth embodiment.
  • FIG. 20 is a diagram schematically illustrating spectral characteristics of the color filter 2442 and each light emitted from the light source device 3C.
  • FIG. 21 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in the green wavelength band.
  • FIG. 22 is a diagram illustrating an example of an image generated by the imaging device when the light source device simultaneously emits light in the blue wavelength band and light in the red wavelength band.
  • FIG. 23 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in a purple wavelength band and light in an orange wavelength band.
  • FIG. 24 is a timing chart illustrating an outline of processing executed by the endoscope system according to the fourth embodiment.
  • An endoscope system 1 shown in FIGS. 1 and 2 inserts an endoscope into a subject such as a patient, images the inside of the subject, and outputs the captured image data to an external display device.
  • a user such as a doctor examines the presence or absence of each of a bleeding site, a tumor site, and an abnormal site, which are detection target sites, by observing the in-vivo images displayed on the display device.
  • the endoscope system 1 includes an endoscope 2, a light source device 3, a display device 4, and a processing device 5 (processor).
  • the endoscope 2 captures an image of the inside of the subject, generates image data (RAW data), and outputs the generated image data to the processing device 5.
  • the endoscope 2 includes an insertion unit 21, an operation unit 22, and a universal cord 23.
  • the insertion part 21 has an elongated shape having flexibility.
  • The insertion portion 21 includes a distal end portion 24 incorporating an imaging element 244 described later, a bendable bending portion 25 constituted by a plurality of bending pieces, and an elongated flexible tube portion 26 that is connected to the proximal end side of the bending portion 25 and has flexibility.
  • The distal end portion 24 includes: a light guide 241 that is configured using glass fiber or the like and forms a light guide path for the light emitted from the light source device 3; an illumination lens 242 provided at the distal end of the light guide 241; an optical system 243 that condenses light; an imaging element 244, provided at the image forming position of the optical system 243, in which a plurality of pixels that receive the light condensed by the optical system 243 and photoelectrically convert it into an electrical signal are arranged two-dimensionally; an endoscope recording unit 245 that records various types of information related to the endoscope 2; and an imaging control unit 246 that controls the imaging element 244.
  • The image sensor 244 is configured using an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor. Specifically, in the imaging element 244, a plurality of pixels that receive light, perform photoelectric conversion, and output an electrical signal are arranged in a two-dimensional array; the imaging element 244 images the subject (body cavity) at a predetermined frame rate and outputs image data (RAW data).
  • the imaging element 244 includes a pixel portion 2441 and a color filter 2442.
  • the pixel portion 2441 is formed by arranging a plurality of pixels each having a photodiode for accumulating charges according to the amount of light and an amplifier for amplifying the charges accumulated by the photodiodes in a two-dimensional matrix.
  • The color filter 2442 includes a cyan filter (hereinafter simply referred to as a "Cy filter") that transmits light in the green wavelength band (500 nm to 600 nm) and light in the blue wavelength band (390 nm to 500 nm), and a yellow filter (hereinafter simply referred to as a "Ye filter") that transmits light in the green wavelength band and light in the red wavelength band (600 nm to 700 nm).
  • The color filter 2442 is formed by repeating a filter unit U1 composed of two Cy filters and two Ye filters arranged in two rows and two columns (2 × 2).
  • The color filter 2442 is formed by arranging Ye filters and Cy filters in a checkered pattern, with a Cy filter or a Ye filter stacked on the light receiving surface of each pixel of the pixel portion 2441. The spectral characteristics of the Cy filter and the Ye filter will be described later.
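The checkered arrangement of the filter unit U1 can be sketched as follows; the "Cy"/"Ye" labels and the phase choice (Cy where row + col is even) are illustrative assumptions, not specified here.

```python
import numpy as np

def build_color_filter(rows, cols):
    """Return a rows x cols mosaic of 'Cy'/'Ye' labels in a checkered
    pattern, mirroring a 2x2 filter unit holding two Cy and two Ye
    filters. The phase is an illustrative choice."""
    mosaic = np.empty((rows, cols), dtype=object)
    for r in range(rows):
        for c in range(cols):
            mosaic[r, c] = "Cy" if (r + c) % 2 == 0 else "Ye"
    return mosaic
```

Tiling this 2 × 2 unit over the sensor reproduces the checkered layout described above.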
  • the endoscope recording unit 245 records various information related to the endoscope 2. For example, the endoscope recording unit 245 records identification information for identifying the endoscope 2, identification information for the imaging element 244, and the like.
  • the endoscope recording unit 245 is configured using a nonvolatile memory or the like.
  • the imaging control unit 246 controls the operation of the imaging element 244 based on the instruction information input from the processing device 5. Specifically, the imaging control unit 246 controls the frame rate and imaging timing of the imaging element 244 based on the instruction information input from the processing device 5. For example, the imaging control unit 246 causes the imaging element 244 to generate and output image data at 120 fps.
  • The operation unit 22 includes a bending knob 221 that bends the bending portion 25 in the vertical and left-right directions, a treatment tool insertion portion 222 through which a treatment tool such as biological forceps, a laser knife, or an inspection probe is inserted into the body cavity, and a plurality of switches 223 serving as operation input units for inputting operation instruction signals for the light source device 3 and for peripheral devices such as an air supply means, a water supply means, and a gas supply means, as well as a pre-freeze signal that instructs the imaging element 244 to capture a still image.
  • the treatment tool inserted from the treatment tool insertion portion 222 is exposed from the opening (not shown) via the treatment tool channel (not shown) of the distal end portion 24.
  • The universal cord 23 incorporates at least the light guide 241 and a collective cable in which one or a plurality of cables are bundled.
  • The collective cable includes signal lines for transmitting and receiving signals between the endoscope 2 and the light source device 3 and the processing device 5: a signal line for transmitting and receiving setting data, a signal line for transmitting and receiving image data, and a signal line for transmitting and receiving a drive timing signal for driving the image sensor 244.
  • the universal cord 23 has a connector portion 27 that can be attached to and detached from the light source device 3.
  • The connector portion 27 includes a coil cable 27a extending in a coiled shape and a connector portion 28 that is provided at the extended end of the coil cable 27a and can be attached to and detached from the processing device 5.
  • the light source device 3 supplies illumination light for irradiating the subject from the distal end portion 24 of the endoscope 2.
  • the light source device 3 includes a light source unit 31, a light source driver 32, and an illumination control unit 33.
  • the light source unit 31 emits illumination light that irradiates the subject.
  • The light source unit 31 can emit light of a first wavelength band including light of at least one of two different wavelength bands among the light of the red wavelength band, the light of the green wavelength band, and the light of the blue wavelength band, and light of a second wavelength band including light of at least the other of the two wavelength bands.
  • the light source unit 31 includes a condenser lens 311, a first light source 312, a second light source 313, and a third light source 314.
  • the condensing lens 311 is configured using one or a plurality of lenses.
  • the condensing lens 311 condenses the illumination light emitted from each of the first light source 312, the second light source 313, and the third light source 314 and emits it to the light guide 241.
  • the first light source 312 is configured using a red LED (Light Emitting Diode) lamp.
  • the first light source 312 emits light in the red wavelength band (hereinafter simply referred to as “R light”) based on the current supplied from the light source driver 32.
  • the second light source 313 is configured using a green LED lamp.
  • the second light source 313 emits light in the green wavelength band (hereinafter simply referred to as “G light”) based on the current supplied from the light source driver 32.
  • the third light source 314 is configured using a blue LED lamp.
  • the third light source 314 emits light in the blue wavelength band (hereinafter simply referred to as “B light”) based on the current supplied from the light source driver 32.
  • Under the control of the illumination control unit 33, the light source driver 32 supplies current to the first light source 312, the second light source 313, and the third light source 314. When only the second light source 313 is driven, G light is emitted; when the first light source 312 and the third light source 314 are driven simultaneously, magenta light (hereinafter simply referred to as "Mg light") is emitted. The G light corresponds to the light of the first wavelength band, and the Mg light corresponds to the light of the second wavelength band.
  • The illumination control unit 33 controls the lighting timing of the light source unit 31 based on the instruction signal received from the processing device 5. Specifically, the illumination control unit 33 supplies G light to the endoscope 2 by causing the second light source 313 to emit G light at a predetermined cycle, and supplies Mg light to the endoscope 2 by causing the first light source 312 to emit R light and the third light source 314 to emit B light simultaneously. In this way, the illumination control unit 33 causes the light source unit 31 to emit G light and Mg light alternately and intermittently.
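The alternating drive pattern can be sketched as a simple frame-indexed schedule; the even/odd frame convention below is an assumption made for illustration only.

```python
def illumination_for_frame(frame_index):
    """Return the set of light sources driven for a given frame.

    Even frames: only the second light source (G light, the first
    wavelength band). Odd frames: the first and third light sources
    simultaneously (R + B, i.e. Mg light, the second wavelength band).
    """
    if frame_index % 2 == 0:
        return {"G"}
    return {"R", "B"}
```

Stepping the index frame by frame yields the alternating, intermittent G / Mg emission described above.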
  • the illumination control unit 33 is configured using a CPU (Central Processing Unit) or the like.
  • the display device 4 displays an image corresponding to the image data generated by the endoscope 2 received from the processing device 5.
  • the display device 4 displays various information related to the endoscope system 1.
  • the display device 4 is configured using a display panel such as liquid crystal or organic EL (Electro Luminescence).
  • the processing device 5 receives the image data generated by the endoscope 2, performs predetermined image processing on the received image data, and outputs it to the display device 4. Further, the processing device 5 comprehensively controls the operation of the entire endoscope system 1.
  • the processing device 5 includes an image processing unit 51, an input unit 52, a recording unit 53, and a processing control unit 54.
  • the image processing unit 51 receives the image data generated by the endoscope 2 under the control of the processing control unit 54, performs predetermined image processing on the received image data, and outputs the image data to the display device 4.
  • The image processing unit 51 synthesizes the first image data generated by the image sensor 244 when the light source unit 31 emits G light and the second image data generated by the image sensor 244 when the light source unit 31 emits Mg light.
  • The predetermined image processing includes separation processing, synthesis processing, interpolation processing, OB clamp processing, gain adjustment processing, format conversion processing, and the like.
  • The image processing unit 51 is configured using a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), or an FPGA (Field Programmable Gate Array).
  • the image processing unit 51 includes at least a separation unit 511, an interpolation unit 512, and a synthesis unit 513.
  • the separation unit 511 performs color separation on the image corresponding to the image data generated by the endoscope 2. Specifically, the separation unit 511 separates the pixel value of the R pixel and the pixel value of the B pixel from the image corresponding to the image data generated by the image sensor 244 when the light source device 3 emits Mg light.
  • the interpolation unit 512 performs known interpolation processing using the pixel values of the R pixels separated from the RB image by the separation unit 511, thereby generating R image data in which the pixel values of the R pixels are interpolated for all the pixels.
  • the interpolation unit 512 generates B image data having pixel values of B pixels in all pixels by performing a known interpolation process using the pixel values of B pixels separated from the RB image by the separation unit 511.
  • the interpolation processing includes bilinear interpolation processing, direction discrimination interpolation processing, and the like.
  • the interpolation unit 512 may generate the R image data and the B image data using another interpolation process.
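A minimal stand-in for such an interpolation process, averaging the available 4-neighbours of each missing sample, is sketched below. Variable names and boundary handling are illustrative; an actual implementation would use bilinear or direction-discriminating interpolation as noted above.

```python
import numpy as np

def interpolate_missing(img, mask):
    """Fill pixels where mask is False with the mean of their valid
    4-neighbours; pixels where mask is True keep their sampled value.
    A simplified sketch of interpolating the sparse R and B planes."""
    out = img.astype(float).copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                continue
            vals = [img[ny, nx]
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]]
            out[y, x] = sum(vals) / len(vals) if vals else 0.0
    return out
```

On a checkerboard of samples, every missing pixel has at least two valid 4-neighbours in the interior, so this fills all pixels in one pass.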
  • The synthesizing unit 513 generates color image data by synthesizing the R image data and the B image data generated by the interpolation unit 512 with the G image data generated by the imaging element 244 when the light source device 3 emits G light, and outputs the generated color image data to the display device 4.
  • Further, the synthesizing unit 513 generates NBI image data by synthesizing the G image data generated by the imaging element 244 when the light source device 3 emits G light with the B image data generated by the interpolation unit 512, and outputs the generated NBI image data to the display device 4.
  • the input unit 52 receives an input of an instruction signal for instructing the operation of the endoscope system 1 and outputs the received instruction signal to the processing control unit 54.
  • the input unit 52 receives an input of an instruction signal instructing either the white light observation method or the NBI observation method, and outputs the received instruction signal to the processing control unit 54.
  • the input unit 52 is configured using switches, buttons, a touch panel, and the like.
  • the recording unit 53 records various programs executed by the endoscope system 1, data being executed by the endoscope system 1, and image data generated by the endoscope 2.
  • the recording unit 53 is configured using a volatile memory, a nonvolatile memory, a memory card, and the like.
  • the recording unit 53 includes a program recording unit 531 that records various programs executed by the endoscope system 1.
  • The processing control unit 54 is configured using a CPU.
  • The processing control unit 54 controls each unit constituting the endoscope system 1. For example, when an instruction signal for switching the illumination light emitted from the light source device 3 is input from the input unit 52, the processing control unit 54 controls the illumination control unit 33 to switch the illumination light emitted from the light source device 3.
  • FIG. 4 is a diagram schematically showing the spectral characteristics of the color filter 2442 and each light emitted from the light source device 3.
  • the horizontal axis indicates the wavelength
  • the right vertical axis indicates the sensitivity of each pixel (filter transmittance)
  • the left vertical axis indicates the intensity of each light.
  • A curve LCy indicates the spectral sensitivity characteristic of the Cy pixel, and a curve LYe indicates the spectral sensitivity characteristic of the Ye pixel.
  • A curve LB indicates the wavelength characteristic of the B light, a curve LG indicates the wavelength characteristic of the G light, and a curve LR indicates the wavelength characteristic of the R light.
  • the Ye pixel has sensitivity to R light and G light (the transmittance of the filter is high).
  • The Cy pixel has sensitivity to B light and G light (the transmittance of the filter is high). Therefore, as shown in FIG. 5, when the light source device 3 emits G light, the imaging element 244 can generate an image PG in which all pixels are G pixels.
  • Further, as shown in FIG. 6, when the light source device 3 emits Mg light, the imaging element 244 can generate an image PMg in which R pixels and B pixels are arranged in a checkered pattern.
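The separation of such an Mg frame can be sketched as splitting the checkered frame into sparse R and B planes; the checkerboard phase (Ye pixels where row + col is odd) is an illustrative assumption.

```python
import numpy as np

def separate_mg_frame(frame):
    """Split a checkered R/B frame captured under Mg illumination.

    Under Mg light (R + B), Ye pixels respond only to R and Cy pixels
    only to B, so each plane keeps the values at its own pixel sites and
    zeros elsewhere. The phase choice is illustrative."""
    h, w = frame.shape
    ye_mask = (np.add.outer(np.arange(h), np.arange(w)) % 2) == 1
    r_plane = np.where(ye_mask, frame, 0)   # R values at Ye sites
    b_plane = np.where(~ye_mask, frame, 0)  # B values at Cy sites
    return r_plane, b_plane
```

Each output plane is half-populated; the missing sites are what the subsequent interpolation step fills in.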
  • In this way, the light source device 3 supplies G light or Mg light to the endoscope 2, and the imaging element 244 sequentially images the subject irradiated with the G light or the Mg light.
  • FIG. 7 is a flowchart showing an outline of processing executed by the endoscope system 1.
  • FIG. 8 is a diagram schematically illustrating an image generated by the image processing unit 51.
  • the process control unit 54 causes the light source device 3 to emit G light (step S101), and causes the image sensor 244 to image the subject irradiated with the G light (step S102).
  • The image sensor 244 generates an image P G corresponding to image data by imaging the subject irradiated with the G light.
  • the process control unit 54 causes the light source device 3 to emit Mg light (step S103), and causes the image sensor 244 to image the subject irradiated with the Mg light (step S104).
  • the imaging element 244 generates an image PMg corresponding to the image data by imaging the subject irradiated with the Mg light.
  • the separation unit 511 performs color separation of the pixel value of the R pixel and the pixel value of the B pixel on the image PMg (Step S105).
  • Specifically, the separation unit 511 generates, from the image P Mg corresponding to the image data generated by the image sensor 244, an image P R1 obtained by separating the pixel values of the R pixels and an image P B1 obtained by separating the pixel values of the B pixels.
  • The interpolation unit 512 performs interpolation processing on each of the image P R1 and the image P B1 (step S106). Specifically, as illustrated in FIG. 8, the interpolation unit 512 generates an image P R2 in which every pixel has an R pixel value by interpolating the missing R pixel values from the pixel values of the image P R1. Similarly, the interpolation unit 512 generates an image P B2 in which every pixel has a B pixel value by interpolating the missing B pixel values from the pixel values of the image P B1.
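The interpolation of a checkered plane (step S106) can be sketched as follows. The patent does not specify the interpolation kernel, so the simple 4-neighbour average and the function name below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of step S106: fill the missing sites of a checkered colour
# plane with the mean of the known 4-neighbours (an assumed kernel).
def interpolate_checker(plane, known):
    """Fill sites where `known` is False with the mean of known 4-neighbours."""
    out = plane.astype(float).copy()
    h, w = plane.shape
    for y in range(h):
        for x in range(w):
            if known[y, x]:
                continue
            vals = [plane[ny, nx]
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= ny < h and 0 <= nx < w and known[ny, nx]]
            out[y, x] = sum(vals) / len(vals)
    return out

# Interpolating a checkered R plane P_R1 into a full-resolution P_R2:
known_r = (np.add.outer(np.arange(4), np.arange(4)) % 2 == 0)
p_r1 = np.where(known_r, 10.0, 0.0)      # R samples only at Ye sites
p_r2 = interpolate_checker(p_r1, known_r)
print(p_r2)
```

In a checkerboard every missing site has four known neighbours (fewer at the border), so the loop always has samples to average.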
  • When the white light observation method is set in the endoscope system 1 (step S107: Yes), the combining unit 513 generates a white image (step S108). Specifically, as illustrated in FIG. 8, the combining unit 513 generates a white image P W (color image) by combining the image P R2, the image P B2, and the image P G. As a result, the endoscope system 1 can generate a white image in two frames (two fields) with two irradiations of G light and Mg light.
  • When an instruction signal instructing termination is input from the operation unit 22 or the input unit 52 (step S109: Yes), the endoscope system 1 ends this process. On the other hand, when no such instruction signal is input (step S109: No), the endoscope system 1 returns to step S101.
  • When the white light observation method is not set in the endoscope system 1 (step S107: No), the combining unit 513 generates an NBI image (step S110). Specifically, as shown in FIG. 8, the combining unit 513 generates an NBI image P NBI by combining the image P G and the image P B2. As a result, the endoscope system 1 can generate an NBI image P NBI in two frames (two fields) with two irradiations of G light and Mg light. In the first embodiment, either the white image or the NBI image is generated according to the setting of the endoscope system 1; however, the present invention is not limited to this, and the white image and the NBI image may be generated simultaneously.
  • The white image and the NBI image may be displayed simultaneously on the display device 4, a reduced NBI image may be superimposed on the white image and displayed on the display device 4, or a reduced white image may be superimposed on the NBI image and displayed on the display device 4.
  • the endoscope system 1 can generate a white image and an NBI image in two frames (two fields) by two irradiations of G light and Mg light.
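The combining step above can be sketched as a simple plane-stacking operation on the interpolated planes. The NBI channel mapping below (narrow-band G to the display R channel, narrow-band B to the G and B channels) is a common display convention and an assumption; the text only states that P G and P B2 are combined.

```python
import numpy as np

# Sketch of steps S108/S110, assuming full-resolution planes already exist.
def combine(p_g, p_r2, p_b2, mode="white"):
    if mode == "white":
        return np.stack([p_r2, p_g, p_b2], axis=-1)   # white image P_W (RGB)
    if mode == "nbi":
        # Assumed pseudo-colour mapping: G -> R channel, B -> G and B channels.
        return np.stack([p_g, p_b2, p_b2], axis=-1)   # NBI image P_NBI
    raise ValueError(mode)

g = np.full((2, 2), 0.5)
r2 = np.full((2, 2), 0.2)
b2 = np.full((2, 2), 0.8)
print(combine(g, r2, b2, "white").shape)  # (2, 2, 3)
```

Both output images are built from the same two captured frames, which is what allows a colour image every two fields.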
  • Modification of Embodiment 1 Next, a modification of the first embodiment will be described.
  • The modification of the first embodiment differs from the first embodiment described above in the configuration of the color filter 2442.
  • a configuration of the color filter according to the modification of the first embodiment will be described.
  • The same components as those of the endoscope system 1 according to the first embodiment described above are denoted by the same reference symbols, and redundant description is omitted.
  • FIG. 9 is a diagram schematically illustrating a configuration of a color filter according to a modification of the first embodiment.
  • The color filter 2442A shown in FIG. 9 is formed by a filter unit U2 in which one Ye filter and three Cy filters are arranged in two rows and two columns (2 × 2).
  • the color filter 2442A is arranged such that the number of Ye filters is smaller than the number of Cy filters.
  • each of the Ye filter and the Cy filter is stacked on the light receiving surface of each pixel of the pixel portion 2441.
  • The light source device 3 supplies G light or Mg light to the endoscope 2 under the control of the processing device 5, and the imaging element 244 sequentially images the subject irradiated with the G light or the Mg light.
  • the Ye filter and the Cy filter transmit G light. Therefore, as shown in FIG. 10, the image pickup device 244 can generate an image P G corresponding to the image data.
  • the Ye filter transmits R light
  • the Cy filter transmits B light.
  • the image sensor 244 can generate an image P Mg2 corresponding to image data composed of R pixels and B pixels.
  • In general, the R component of an image contains few high-frequency components in the spatial frequency domain.
  • Since Ye filters account for 1/4 of the color filter 2442A as a whole, the resolution of the color signal can be optimized. Furthermore, since Cy filters account for 3/4 of the color filter 2442A as a whole, a high-resolution NBI image can be obtained when the narrow-band light observation method is performed.
  • FIG. 12 is a circuit diagram schematically showing the configuration of the image sensor according to the second embodiment.
  • An imaging element 244B illustrated in FIG. 12 includes a pixel portion 2441B and a color filter 2442.
  • the color filter 2442 is indicated by a dotted line, and is stacked on the light receiving surface of each pixel.
  • In the pixel portion 2441B shown in FIG. 12, eight pixels share the charge conversion unit, the reset transistor, the amplification transistor, and the selection transistor.
  • The pixel portion 2441B illustrated in FIG. 12 includes eight photoelectric conversion elements 301 (photodiodes), a transfer transistor 302 provided for each of the eight photoelectric conversion elements 301, a charge conversion unit 303, a reset transistor 304, a pixel source follower transistor 305, a selection transistor 306, and a vertical transfer line 307.
  • the photoelectric conversion element 301 photoelectrically converts incident light into a signal charge amount corresponding to the amount of light and accumulates it.
  • the photoelectric conversion element 301 has a cathode side connected to one end side of the transfer transistor 302 and an anode side connected to the ground.
  • the transfer transistor 302 transfers charges from the photoelectric conversion element 301 to the charge conversion unit 303.
  • The other end side of the transfer transistor 302 is connected to the charge conversion unit 303.
  • the transfer transistor 302 is turned on when a drive signal ⁇ T is supplied from a vertical scanning unit (not shown) via a signal line, and transfers charge from the photoelectric conversion element 301 to the charge conversion unit 303.
  • the charge conversion unit 303 includes a floating diffusion capacitor (FD), and converts the charge accumulated in the photoelectric conversion element 301 into a voltage.
  • the reset transistor 304 resets the charge conversion unit 303 to a predetermined potential.
  • the reset transistor 304 has one end connected to the power supply voltage VDD, the other end connected to the charge converter 303, and a gate connected to a signal line to which the drive signal ⁇ R is supplied.
  • The reset transistor 304 is turned on when the drive signal ΦR is supplied from a vertical scanning unit (not shown) via a signal line, and discharges the signal charge accumulated in the charge conversion unit 303, thereby resetting the charge conversion unit 303 to a predetermined potential.
  • The pixel source follower transistor 305 has one end connected to the selection transistor 306, the other end connected to the vertical transfer line 307, and a gate connected to a signal line that transmits the signal (image signal) converted by the charge conversion unit 303.
  • the selection transistor 306 has one end connected to the power supply voltage VDD, the other end connected to the pixel source follower transistor 305, and a gate connected to a signal line to which a row selection signal ⁇ X is supplied.
  • the selection transistor 306 is turned on when the row selection signal ⁇ X is supplied, and transfers the signal (image signal) output from the pixel source follower transistor 305 to the vertical transfer line 307.
  • the imaging control unit 246 inputs a transfer pulse to the signal line ⁇ T1 and reads the signal charges of the Cy pixels in the first row. Subsequently, the imaging control unit 246 inputs a transfer pulse to the signal line ⁇ T2 and reads the signal charge of the Ye pixels in the first row. In this manner, the imaging control unit 246 sequentially reads out the signal charges for each row of the imaging element 244B.
  • Since the light source device 3 emits G light, the imaging element 244B can generate an image P G in which all pixels are G pixels.
  • the combining unit 513 combines four G pixels as one G pixel PGA1 . As a result, a G image with increased sensitivity can be generated.
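The four-to-one G-pixel mixing can be sketched as 2 × 2 block summation. An even-sized full-G frame and summation (rather than averaging) are assumptions; the patent only states that four G pixels are combined into one.

```python
import numpy as np

# Sketch of the four-to-one G-pixel mixing: sum each 2x2 block of the
# full-G frame, trading spatial resolution for sensitivity.
def bin_2x2(img):
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

p_g = np.ones((4, 4))          # full-G frame from the G-light irradiation
p_ga = bin_2x2(p_g)            # each output pixel mixes four G pixels
print(p_ga)                    # 2x2 array of 4.0
```

Summing four samples roughly quadruples the collected signal per output pixel, which is the sensitivity gain described above.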
  • the case of pixel mixture readout of the image sensor 244B will be described. Note that, when the image sensor 244B performs pixel mixture readout, the light source device 3 emits Mg light. For this reason, the image sensor 244B images the subject irradiated with the Mg light.
  • The imaging control unit 246 inputs a transfer pulse to the signal line ΦT1 and the signal line ΦT4, and reads the signal charges of the Cy pixels in the first and second rows. Subsequently, the imaging control unit 246 inputs a transfer pulse to the signal line ΦT2 and the signal line ΦT3, and reads the signal charges of the Ye pixels in the first and second rows. In this way, the imaging control unit 246 reads out signal charges from pixels of the same color every two rows of the imaging element 244B. In this case, since the light source device 3 emits Mg light, the imaging element 244B can generate an image P Mg1 in which R pixels and B pixels form a checkered pattern, as shown in FIGS. 14 and 15.
  • the combining unit 513 combines two pixels of the same color that are diagonally adjacent to each other as one pixel. For example, as illustrated in FIG. 14, the combining unit 513 combines two diagonally adjacent R pixels as one R pixel PRA1 . Further, as illustrated in FIG. 15, the combining unit 513 combines two diagonally adjacent B pixels as one B pixel P BA1 . Thereby, it is possible to generate an R image and a B image with increased sensitivity.
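The diagonal two-pixel mixing can be sketched per 2 × 2 block. The site layout (R on Ye sites, B on Cy sites, each colour on one diagonal of the block) follows the checkerboard described in the text, but the exact block positions and the function name are assumptions.

```python
import numpy as np

# Sketch of mixing two diagonally adjacent same-colour pixels: within each
# 2x2 block of the checkered Mg frame, sum the two R samples and the two
# B samples separately.
def mix_diagonal(mosaic, ye_sites):
    h, w = mosaic.shape
    blocks = mosaic.reshape(h // 2, 2, w // 2, 2)
    sites = ye_sites.reshape(h // 2, 2, w // 2, 2)
    r_sum = (blocks * sites).sum(axis=(1, 3))    # two R pixels per block
    b_sum = (blocks * ~sites).sum(axis=(1, 3))   # two B pixels per block
    return r_sum, b_sum

ye_sites = (np.add.outer(np.arange(4), np.arange(4)) % 2 == 0)
p_mg = np.where(ye_sites, 3.0, 5.0)              # R samples = 3, B samples = 5
r_img, b_img = mix_diagonal(p_mg, ye_sites)
print(r_img[0, 0], b_img[0, 0])                  # 6.0 10.0
```

Each output pixel pools two same-colour samples, giving the doubled-sensitivity R and B images described above.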
  • In this way, when the light source device 3 irradiates G light, the imaging control unit 246 adds the pixel values of the G pixels constituting the imaging element 244B for each predetermined number of pixels and outputs the result.
  • Similarly, when the light source device 3 irradiates Mg light, the imaging control unit 246 adds, for each predetermined number of pixels, the pixel values of the B pixels or the R pixels that receive light in the same wavelength band among the pixels constituting the imaging element 244B, and outputs the result. As a result, a white image and an NBI image with increased sensitivity can be generated.
  • Note that the imaging control unit 246 may add the pixel values of pixels of the imaging element 244B that receive light in the same wavelength band in the charge conversion unit 303 for each predetermined number of pixels and output the converted pixel values; in that case, the combining process by the combining unit 513 may be omitted.
  • Embodiment 3 Next, Embodiment 3 will be described.
  • The third embodiment differs from the first embodiment described above in the configuration of the color filter 2442 and in the type of illumination light emitted from the light source device 3.
  • the configuration of the color filter according to Embodiment 3 will be described.
  • The same components as those of the endoscope system 1 according to the first embodiment described above are denoted by the same reference symbols, and redundant description is omitted.
  • FIG. 16 is a diagram schematically illustrating a configuration of a color filter according to the third embodiment.
  • The color filter 2442B shown in FIG. 16 is configured using a magenta filter (hereinafter simply referred to as an "Mg filter") that transmits light in the red wavelength band and light in the blue wavelength band, and a Cy filter.
  • The color filter 2442B is formed by a filter unit U3 in which two Mg filters and two Cy filters are arranged in two rows and two columns (2 × 2). In other words, the color filter 2442B is formed by arranging the Mg filters and the Cy filters in a checkered pattern, and an Mg filter or a Cy filter is stacked on the light receiving surface of each pixel of the pixel portion 2441.
  • Under the control of the processing device 5, the light source device 3 emits G light and R light simultaneously to supply light including the red wavelength band and the green wavelength band (hereinafter simply referred to as "Ye light"), or supplies B light, to the endoscope 2, and the imaging element 244 sequentially images the subject irradiated with the Ye light or the B light.
  • B light corresponds to light in the first wavelength band
  • Ye light corresponds to light in the second wavelength band.
  • the Mg filter transmits R light
  • the Cy filter transmits G light.
  • the image sensor 244 can generate an image P Ye in which the R pixel and the G pixel are in a checkered pattern.
  • the Mg filter and the Cy filter transmit B light.
  • Therefore, the image sensor 244 can generate an image P B in which all pixels are B pixels.
  • The image processing unit 51 performs the same processing as in the first embodiment. Thereby, when the narrow-band light observation method is performed, the resolution of the NBI image can be improved because B pixel values are obtained at every pixel.
  • the endoscope system 1 can generate a white image and an NBI image in two frames (two fields) by two irradiations of B light and Ye light.
  • The fourth embodiment differs from the first embodiment described above in the configuration of the light source device 3.
  • the configuration of the endoscope system according to the fourth embodiment will be described.
  • The same components as those of the endoscope system 1 according to the first embodiment described above are denoted by the same reference symbols, and redundant description is omitted.
  • FIG. 19 is a block diagram illustrating a functional configuration of a main part of the endoscope system according to the fourth embodiment.
  • An endoscope system 1C shown in FIG. 19 includes a light source device 3C instead of the light source device 3 according to the first embodiment described above.
  • the light source device 3C includes a light source unit 31C instead of the light source unit 31 of the light source device 3 according to Embodiment 1 described above.
  • the light source unit 31C includes a fourth light source 315 and a fifth light source 316 in addition to the configuration of the light source unit 31 according to Embodiment 1 described above.
  • the fourth light source 315 is configured using a purple LED lamp.
  • the fourth light source 315 emits light in the violet wavelength band (400 nm to 435 nm) (hereinafter simply referred to as “V light”) based on the current supplied from the light source driver 32.
  • the fifth light source 316 is configured using an orange LED lamp.
  • the fifth light source 316 emits light in the orange (Amber) wavelength band (585 nm to 620 nm) (hereinafter simply referred to as “A light”) based on the current supplied from the light source driver 32.
  • FIG. 20 is a diagram schematically illustrating spectral characteristics of the color filter 2442 and each light emitted from the light source device 3C.
  • the horizontal axis indicates the wavelength
  • the right vertical axis indicates the sensitivity of each pixel
  • the left vertical axis indicates the intensity of each light.
  • a curve L Cy indicates the spectral sensitivity characteristic of the Cy pixel
  • a curve L Ye indicates the spectral sensitivity characteristic of the Ye pixel.
  • the curve L V represents a wavelength characteristic of the V light
  • curve L B represents a wavelength characteristic of the B light
  • the curve L G represents a wavelength characteristic of the G light
  • the curve L A represents a wavelength characteristic of the A light
  • curve L R represents the wavelength characteristic of the R light.
  • the Ye filter transmits R light, A light, and G light.
  • The Cy filter transmits V light, B light, and G light. Therefore, as shown in FIG. 21, when the light source device 3C emits G light, the image sensor 244 can generate an image P G in which all pixels are G pixels.
  • When the light source device 3C emits B light and R light, the imaging element 244 can generate an image P BR in which R pixels and B pixels form a checkered pattern. Furthermore, when the light source device 3C emits A light and V light, the imaging element 244 can generate an image P AV in which A pixels and V pixels form a checkered pattern. That is, according to the fourth embodiment, a plurality of images having different uses, for example three or more types of images, can be generated in three frames.
  • FIG. 24 is a timing chart illustrating an outline of processing executed by the endoscope system 1C.
  • The light source device 3C performs three patterns of irradiation frames: a G light irradiation frame that irradiates G light, a B + R light irradiation frame that simultaneously irradiates B light and R light, and a V + A light irradiation frame that simultaneously irradiates V light and A light. In FIG. 24, the horizontal axis indicates time. Further, in FIG. 24, the time obtained by adding the irradiation time and the signal readout time is one cycle, and this one cycle is 1/90 seconds (90 fps).
  • The time of one cycle can be changed as appropriate, for example to 1/120 seconds or 1/240 seconds, depending on the subject, the examination content, the site, and the like.
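The timing arithmetic above can be checked directly. The 30 fps composite rate below is derived from the stated 1/90-second cycle and the three-frame pattern; it is a consequence, not a figure stated in the text.

```python
# Timing arithmetic for the three-frame illumination cycle.
frame_rate_fps = 90            # one irradiation + readout cycle = 1/90 s (from the text)
frames_per_pattern = 3         # G, B+R, and V+A irradiation frames
composite_rate_fps = frame_rate_fps / frames_per_pattern
print(composite_rate_fps)      # 30.0
```

At the alternative cycle times mentioned (1/120 s or 1/240 s), the same division gives the corresponding composite-image update rate.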
  • the light source device 3C supplies G light to the endoscope 2, so that the endoscope 2 irradiates the subject with G light.
  • the image sensor 244 images the subject irradiated with the G light.
  • the imaging control unit 246 reads a signal from the imaging element 244 and outputs the signal to the image processing unit 51.
  • the light source device 3C supplies the B light + R light to the endoscope 2 so that the endoscope 2 irradiates the subject with the B light + R light.
  • the image sensor 244 images the subject irradiated with the B light + R light.
  • the imaging control unit 246 reads a signal from the imaging element 244 and outputs the signal to the image processing unit 51.
  • The image processing unit 51 performs the same separation processing and interpolation processing as in the first embodiment described above on the image P BR to generate the image P B2 and the image P R2, and generates a white image by performing a combining process that combines the image P G, the image P B2, and the image P R2.
  • the light source device 3C supplies V light + A light to the endoscope 2 so that the endoscope 2 irradiates the subject with V light + A light.
  • the image sensor 244 images the subject irradiated with the V light + A light.
  • the imaging control unit 246 reads a signal from the imaging element 244 and outputs the signal to the image processing unit 51.
  • The light source device 3C supplies G light to the endoscope 2, so that the endoscope 2 irradiates the subject with G light.
  • the image sensor 244 images the subject irradiated with the G light.
  • the imaging control unit 246 reads a signal from the imaging element 244 and outputs the signal to the image processing unit 51.
  • The image processing unit 51 generates the image P V and the image P A by performing separation processing and interpolation processing on the image P VA, and generates an MBI (Multiband Imaging) image by combining the generated image P V and image P A with the image P G generated upon irradiation of the G light and with the image P B2 and the image P R2 generated upon irradiation of the B light + R light.
  • The image P A is used when a learning device (not shown) extracts an abnormal area having a predetermined feature quantity.
  • Examples of the abnormal area include a cancer area, a bleeding area, and a lesion area.
  • In this way, the illumination control unit 33 causes the light source device 3C to repeat the one-cycle pattern of the three irradiation frames of G light, B light + R light, and V light + A light, so that the endoscope system 1C can generate a plurality of types of images, including a white image and an MBI image.
  • a learning device or AI that has learned the feature amount of an abnormal region (for example, cancer, bleeding, etc.) may extract the abnormal region.
  • the feature part or feature region extracted by the learning device or AI may be superimposed on the white image and displayed on the display device 4.
  • color information necessary for diagnosis can be generated from a small number of frames while achieving both high image quality and downsizing, and color misregistration can be suppressed.
  • The frame rate can be increased by sequentially irradiating V light, B light, A light, and R light, or by simultaneously irradiating only V light and B light.
  • The sensitivity may be increased by adding the pixel values of pixels of the same color, as in the second embodiment.
  • In the embodiments described above, the control device and the light source device are separate, but they may be integrally formed.
  • In the embodiments described above, each light is emitted by an LED lamp; however, the present invention is not limited to this. For example, each light may be emitted using a laser light source, or using a white light source and filters that transmit the wavelength band of each light.
  • In the embodiments described above, an endoscope system is used; however, the present invention can also be applied to, for example, a capsule endoscope, a video microscope that images a subject, a mobile phone having an imaging function, or a tablet terminal having an imaging function.
  • In the embodiments described above, the endoscope system includes a flexible endoscope; however, the present invention can also be applied to an endoscope system including a rigid endoscope or an industrial endoscope.
  • In the embodiments described above, the endoscope system includes an endoscope inserted into a subject; however, the present invention can also be applied to a system using, for example, a sinus endoscope, an electric knife, or a test probe.
  • the “unit” described above can be read as “means”, “circuit”, and the like.
  • the control unit can be read as control means or a control circuit.
  • The program executed by the endoscope system according to the first to fourth embodiments of the present disclosure is provided as file data in an installable or executable format recorded on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, a DVD (Digital Versatile Disk), a USB medium, or a flash memory.
  • The program executed by the endoscope system according to Embodiments 1 to 4 of the present disclosure may instead be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network.
  • the program to be executed by the endoscope system according to the first to fourth embodiments of the present disclosure may be provided or distributed via a network such as the Internet.
  • In the embodiments described above, signals are transmitted from the endoscope to the processing device using a transmission cable; however, the transmission does not have to be wired and may be wireless.
  • an image signal or the like may be transmitted from the endoscope to the processing device according to a predetermined wireless communication standard (for example, Wi-Fi (registered trademark) or Bluetooth (registered trademark)).
  • wireless communication may be performed according to other wireless communication standards.

Abstract

The invention provides an endoscope system, an image processing method, and a program capable of obtaining high-quality color images without color misregistration, as well as special images. The endoscope system (1) comprises: an imaging element (244) having a color filter (2442) laminated on the light receiving surface of each pixel and capable of generating image data by imaging a subject; a light source (31) capable of emitting light in a first wavelength band and in a second wavelength band; an illumination control unit (33) that causes the light source (31) to alternately emit light in the first wavelength band and light in the second wavelength band; and an image processing unit (51) that combines first image data generated by the imaging element (244) when the light source (31) emits light in the first wavelength band and second image data generated by the imaging element (244) when the light source (31) emits light in the second wavelength band.
PCT/JP2018/029313 2018-03-23 2018-08-03 Endoscope system, image processing method, and program WO2019180983A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018055586 2018-03-23
JP2018-055586 2018-03-23

Publications (1)

Publication Number Publication Date
WO2019180983A1 true WO2019180983A1 (fr) 2019-09-26

Family

ID=67986818

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/029313 WO2019180983A1 (fr) Endoscope system, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2019180983A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021149357A1 (fr) * 2020-01-23 2021-07-29 FUJIFILM Corporation Endoscope system and method for operating same

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5846935A (ja) * 1981-09-12 1983-03-18 Fuji Photo Optical Co., Ltd. Endoscope apparatus using a solid-state image sensor
JPS58103432A (ja) * 1981-12-14 1983-06-20 Fuji Photo Film Co., Ltd. Endoscope apparatus using a solid-state image sensor
JPH03101397A (ja) * 1989-09-13 1991-04-26 Olympus Optical Co., Ltd. Electronic endoscope apparatus
JPH06225316A (ja) * 1993-01-28 1994-08-12 Canon Inc. Color photoelectric conversion device
JP2014128423A (ja) * 2012-12-28 2014-07-10 Olympus Medical Systems Corp. Endoscope system
JP2015066062A (ja) * 2013-09-27 2015-04-13 FUJIFILM Corporation Endoscope system, operating method therefor, and light source device for endoscope


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021149357A1 (fr) * 2020-01-23 2021-07-29 FUJIFILM Corporation Endoscope system and method for operating same
JP7386266B2 (ja) 2020-01-23 2023-11-24 FUJIFILM Corporation Endoscope system and operating method therefor


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18910852

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18910852

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP