WO2019180983A1 - Endoscope system, image processing method, and program - Google Patents

Endoscope system, image processing method, and program

Info

Publication number
WO2019180983A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
wavelength band
light source
image data
image
Prior art date
Application number
PCT/JP2018/029313
Other languages
French (fr)
Japanese (ja)
Inventor
理 足立
Original Assignee
オリンパス株式会社
Priority date
Filing date
Publication date
Application filed by オリンパス株式会社
Publication of WO2019180983A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/06: Instruments for performing such examinations with illuminating arrangements
    • A61B 1/0655: Control therefor
    • A61B 1/04: Instruments for performing such examinations combined with photographic or television appliances
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/26: Instruments or systems for viewing the inside of hollow bodies using light guides

Definitions

  • The present disclosure relates to an endoscope system that is introduced into the living body of a subject to capture in-vivo images, and to a corresponding image processing method and program.
  • In the field of endoscopy, a white light observation method (WLI: White Light Imaging) using white illumination light (white light), and special light observation methods such as a narrow band light observation method (NBI: Narrow Band Imaging) using illumination light (narrow band light) composed of two narrow bands included in the blue and green wavelength bands, respectively, are widely known.
  • In the white light observation method, a color image can be obtained by irradiating white light.
  • In the narrow band light observation method, by irradiating narrow band light, it is possible to obtain a special image that highlights the capillaries and fine mucosal patterns present on the surface of the mucosa of the living body.
  • the present disclosure has been made in view of the above, and an object of the present disclosure is to provide an endoscope system, an image processing method, and a program capable of obtaining each of a color image and a special image with high image quality and no color misregistration.
  • In order to solve the above problems and achieve the object, an endoscope system according to the present disclosure includes: an image sensor in which a color filter, configured using two types of filters that transmit light of two different wavelength bands among light in the red wavelength band, light in the green wavelength band, and light in the blue wavelength band, is stacked on the light receiving surface of each pixel, the image sensor being capable of generating image data by imaging a subject; a light source unit capable of irradiating light of a first wavelength band including light of at least one of the two wavelength bands and light of a second wavelength band including light of at least the other of the two wavelength bands; an illumination control unit that causes the light source unit to alternately irradiate the light of the first wavelength band and the light of the second wavelength band; and an image processing unit that synthesizes the first image data generated by the image sensor when the light source unit irradiates the light of the first wavelength band and the second image data generated by the image sensor when the light source unit irradiates the light of the second wavelength band.
  • In the endoscope system, the two types of filters include a cyan filter that transmits light in the green wavelength band and light in the blue wavelength band, and a yellow filter that transmits light in the red wavelength band and light in the green wavelength band.
  • In the endoscope system, the light source unit includes a first light source capable of irradiating light in the red wavelength band, a second light source capable of irradiating light in the green wavelength band, and a third light source capable of irradiating light in the blue wavelength band. The illumination control unit causes the second light source to irradiate light in the green wavelength band as the light of the first wavelength band, and causes the first light source to irradiate light in the red wavelength band and the third light source to irradiate light in the blue wavelength band simultaneously as the light of the second wavelength band.
  • The image processing unit includes a separation unit that generates red image data and blue image data by separating the pixel values of the light in the red wavelength band and the pixel values of the light in the blue wavelength band from the second image corresponding to the second image data,
  • and a combining unit that generates white image data by combining the red image data and the blue image data generated by the separation unit with the first image data.
  • In the endoscope system, the combining unit generates special image data by combining the blue image data generated by the separation unit and the first image data.
  • In the endoscope system, the two types of filters may instead be a cyan filter that transmits light in the green wavelength band and light in the blue wavelength band, and a magenta filter that transmits light in the red wavelength band and light in the blue wavelength band. The light source unit includes a first light source capable of irradiating light in the red wavelength band, a second light source capable of irradiating light in the green wavelength band, and a third light source capable of irradiating light in the blue wavelength band. The illumination control unit causes the third light source to irradiate light in the blue wavelength band as the light of the first wavelength band, and causes the first light source to irradiate light in the red wavelength band and the second light source to irradiate light in the green wavelength band simultaneously as the light of the second wavelength band.
  • The image processing unit includes a separation unit that generates red image data and green image data by separating the pixel values of the light in the red wavelength band and the pixel values of the light in the green wavelength band from the second image corresponding to the second image data, and a combining unit that generates white image data by combining the red image data and the green image data generated by the separation unit with the first image data.
  • In this case, the combining unit generates special image data by combining the green image data separated by the separation unit and the first image data.
  • The endoscope system further includes an imaging control unit that, when the light source unit irradiates the light of the first wavelength band, adds and outputs the pixel values of the pixels constituting the image sensor for every predetermined number of pixels, and that, when the light source unit irradiates the light of the second wavelength band, adds and outputs the pixel values of pixels capable of receiving light of the same wavelength band among the pixels constituting the image sensor for every predetermined number of pixels.
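As a sketch (not from the patent itself, whose addition patterns appear in FIGS. 13 to 15), the pixel-addition readout described above can be modeled as block-wise summation over non-overlapping k × k blocks; the function name and the choice of square blocks are assumptions for illustration:

```python
import numpy as np

def bin_pixels(img, k):
    """Sum pixel values over non-overlapping k x k blocks.

    A simplified stand-in for the pixel-addition readout of the
    imaging control unit; real sensors add only same-color pixels.
    """
    rows, cols = img.shape
    # Crop to a multiple of k, then reshape so each k x k block
    # occupies its own pair of axes, and sum those axes.
    cropped = img[:rows - rows % k, :cols - cols % k]
    return cropped.reshape(rows // k, k, cols // k, k).sum(axis=(1, 3))
```

Adding pixel values before readout trades spatial resolution for sensitivity, which is why the patent applies it per wavelength band.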
  • the light source unit includes a fourth light source capable of emitting light in a purple wavelength band, and a fifth light source capable of emitting light in an orange wavelength band.
  • In the endoscope system, after causing the light source unit to irradiate the light of the second wavelength band, the illumination control unit causes the fourth light source to emit light in the purple wavelength band and the fifth light source to emit light in the orange wavelength band simultaneously.
  • The image processing method according to the present disclosure is executed by an endoscope system including an image sensor in which a color filter configured using two types of filters that transmit light of two different wavelength bands among light in the red wavelength band, light in the green wavelength band, and light in the blue wavelength band is stacked on the light receiving surface of each pixel, the image sensor being capable of generating image data by imaging a subject, and a light source unit capable of irradiating light of a first wavelength band including light of at least one of the two wavelength bands and light of a second wavelength band including light of at least the other of the two wavelength bands. The method includes an illumination control step of causing the light source unit to alternately irradiate the light of the first wavelength band and the light of the second wavelength band, and an image processing step of synthesizing the first image data generated by the image sensor when the light source unit irradiates the light of the first wavelength band and the second image data generated by the image sensor when the light source unit irradiates the light of the second wavelength band.
  • The program according to the present disclosure causes an endoscope system, which includes an image sensor in which a color filter configured using two types of filters that transmit light of two different wavelength bands among light in the red wavelength band, light in the green wavelength band, and light in the blue wavelength band is stacked on the light receiving surface of each pixel, the image sensor being capable of generating image data by imaging a subject, and a light source unit capable of irradiating light of a first wavelength band including light of at least one of the two wavelength bands and light of a second wavelength band including light of at least the other of the two wavelength bands, to execute an illumination control step of causing the light source unit to alternately irradiate the light of the first wavelength band and the light of the second wavelength band, and an image processing step of synthesizing the first image data generated by the image sensor when the light source unit irradiates the light of the first wavelength band and the second image data generated by the image sensor when the light source unit irradiates the light of the second wavelength band.
  • FIG. 1 is a schematic configuration diagram of an endoscope system according to the first embodiment.
  • FIG. 2 is a block diagram illustrating a functional configuration of a main part of the endoscope system according to the first embodiment.
  • FIG. 3 is a diagram schematically showing the configuration of the color filter.
  • FIG. 4 is a diagram schematically illustrating spectral characteristics of the color filter and each light emitted from the light source device.
  • FIG. 5 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in the green wavelength band.
  • FIG. 6 is a diagram illustrating an example of an image generated by the imaging device when the light source device simultaneously emits light in the red wavelength band and light in the blue wavelength band.
  • FIG. 7 is a flowchart illustrating an outline of processing executed by the endoscope system according to the first embodiment.
  • FIG. 8 is a diagram schematically illustrating an image generated by the image processing unit.
  • FIG. 9 is a diagram schematically illustrating a configuration of a color filter according to a modification of the first embodiment of the present disclosure.
  • FIG. 10 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in the green wavelength band.
  • FIG. 11 is a diagram illustrating an example of an image generated by the imaging device when the light source device simultaneously emits light in the red wavelength band and light in the blue wavelength band.
  • FIG. 12 is a circuit diagram schematically showing the configuration of the image sensor according to the second embodiment.
  • FIG. 13 is a diagram schematically illustrating an image sensor addition method.
  • FIG. 14 is a diagram schematically illustrating an image sensor addition method.
  • FIG. 15 is a diagram schematically illustrating an image sensor addition method.
  • FIG. 16 is a diagram schematically illustrating a configuration of a color filter according to the third embodiment.
  • FIG. 17 is a diagram illustrating an example of an image generated by the imaging device when the light source device simultaneously emits light in the green wavelength band and light in the red wavelength band.
  • FIG. 18 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in a blue wavelength band.
  • FIG. 19 is a block diagram illustrating a functional configuration of a main part of the endoscope system according to the fourth embodiment.
  • FIG. 20 is a diagram schematically illustrating spectral characteristics of the color filter 2442 and each light emitted from the light source device 3C.
  • FIG. 21 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in the green wavelength band.
  • FIG. 22 is a diagram illustrating an example of an image generated by the imaging device when the light source device simultaneously emits light in the blue wavelength band and light in the red wavelength band.
  • FIG. 23 is a diagram illustrating an example of an image generated by the imaging device when the light source device emits light in a purple wavelength band and light in an orange wavelength band.
  • FIG. 24 is a timing chart illustrating an outline of processing executed by the endoscope system according to the fourth embodiment.
  • FIG. 1 is a schematic configuration diagram of an endoscope system according to the first embodiment.
  • FIG. 2 is a block diagram illustrating a functional configuration of a main part of the endoscope system according to the first embodiment.
  • In the endoscope system 1 shown in FIGS. 1 and 2, an endoscope is inserted into a subject such as a patient, the inside of the subject is imaged, and the captured image data is output to an external display device.
  • a user such as a doctor examines the presence or absence of each of a bleeding site, a tumor site, and an abnormal site, which are detection target sites, by observing the in-vivo images displayed on the display device.
  • the endoscope system 1 includes an endoscope 2, a light source device 3, a display device 4, and a processing device 5 (processor).
  • the endoscope 2 captures an image of the inside of the subject, generates image data (RAW data), and outputs the generated image data to the processing device 5.
  • the endoscope 2 includes an insertion unit 21, an operation unit 22, and a universal cord 23.
  • the insertion part 21 has an elongated shape having flexibility.
  • The insertion portion 21 has a distal end portion 24 incorporating an image sensor 244 described later, a bendable bending portion 25 constituted by a plurality of bending pieces, and a long flexible tube portion 26 connected to the proximal end side of the bending portion 25 and having flexibility.
  • The distal end portion 24 includes a light guide 241 that is configured using glass fiber or the like and forms a light guide path for the light emitted from the light source device 3, an illumination lens 242 provided at the distal end of the light guide 241, an optical system 243 that condenses light, an image sensor 244 provided at the image forming position of the optical system 243 in which a plurality of pixels that receive light and photoelectrically convert it into electrical signals are arranged two-dimensionally, an endoscope recording unit 245 that records various types of information related to the endoscope 2, and an imaging control unit 246 that controls the image sensor 244.
  • The image sensor 244 is configured using an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. Specifically, the image sensor 244 has a plurality of pixels arranged in a two-dimensional array that receive light and photoelectrically convert it into electrical signals, and outputs image data (RAW data) obtained by imaging a subject (body cavity) at a predetermined frame rate.
  • the imaging element 244 includes a pixel portion 2441 and a color filter 2442.
  • the pixel portion 2441 is formed by arranging a plurality of pixels each having a photodiode for accumulating charges according to the amount of light and an amplifier for amplifying the charges accumulated by the photodiodes in a two-dimensional matrix.
  • The color filter 2442 includes a cyan filter (hereinafter simply referred to as a "Cy filter") that transmits light in the green wavelength band (500 nm to 600 nm) and light in the blue wavelength band (390 nm to 500 nm), and a yellow filter (hereinafter simply referred to as a "Ye filter") that transmits light in the green wavelength band and light in the red wavelength band (600 nm to 700 nm).
  • The color filter 2442 is formed by tiling a filter unit U1 that uses two Cy filters and two Ye filters in a 2 × 2 arrangement (two vertical by two horizontal).
  • As a result, the color filter 2442 has Ye filters and Cy filters arranged in a checkered pattern, and either a Cy filter or a Ye filter is stacked on the light receiving surface of each pixel of the pixel portion 2441. The spectral characteristics of the Cy filter and the Ye filter will be described later.
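As a minimal NumPy sketch (not part of the patent), the 2 × 2 filter unit U1 and its tiling can be written as follows; which corner of the unit holds which filter is an assumption for illustration:

```python
import numpy as np

# Hypothetical 2x2 filter unit U1: two Ye and two Cy filters.
UNIT_U1 = np.array([["Ye", "Cy"],
                    ["Cy", "Ye"]])

def build_color_filter(rows, cols):
    """Tile the 2x2 unit U1 over a sensor of rows x cols pixels,
    so each pixel carries exactly one filter (checkered pattern)."""
    reps = (rows // 2 + 1, cols // 2 + 1)
    return np.tile(UNIT_U1, reps)[:rows, :cols]

mosaic = build_color_filter(4, 4)
```

Tiling the unit guarantees that Ye and Cy alternate in both directions, which is what produces the checkered R/B pattern under Mg illumination described later.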
  • the endoscope recording unit 245 records various information related to the endoscope 2. For example, the endoscope recording unit 245 records identification information for identifying the endoscope 2, identification information for the imaging element 244, and the like.
  • the endoscope recording unit 245 is configured using a nonvolatile memory or the like.
  • the imaging control unit 246 controls the operation of the imaging element 244 based on the instruction information input from the processing device 5. Specifically, the imaging control unit 246 controls the frame rate and imaging timing of the imaging element 244 based on the instruction information input from the processing device 5. For example, the imaging control unit 246 causes the imaging element 244 to generate and output image data at 120 fps.
  • The operation unit 22 includes a bending knob 221 that bends the bending portion 25 in the vertical and left-right directions, a treatment tool insertion portion 222 through which a treatment tool such as biological forceps, a laser knife, or an inspection probe is inserted into the body cavity, and a plurality of switches 223 serving as operation input units for inputting operation instruction signals for the light source device 3 and peripheral devices such as air supply means, water supply means, and gas supply means, as well as a pre-freeze signal instructing the image sensor 244 to capture a still image.
  • The treatment tool inserted from the treatment tool insertion portion 222 emerges from an opening (not shown) at the distal end portion 24 via a treatment tool channel (not shown).
  • The universal cord 23 incorporates at least the light guide 241 and a collective cable in which one or more signal lines are bundled.
  • The collective cable carries signals between the endoscope 2 and both the light source device 3 and the processing device 5, and includes a signal line for transmitting and receiving setting data, a signal line for transmitting and receiving image data, and a signal line for transmitting and receiving a drive timing signal for driving the image sensor 244.
  • the universal cord 23 has a connector portion 27 that can be attached to and detached from the light source device 3.
  • A coiled coil cable 27a extends from the connector portion 27, and a connector 28 that can be attached to and detached from the processing device 5 is provided at the extended end of the coil cable 27a.
  • the light source device 3 supplies illumination light for irradiating the subject from the distal end portion 24 of the endoscope 2.
  • the light source device 3 includes a light source unit 31, a light source driver 32, and an illumination control unit 33.
  • the light source unit 31 emits illumination light that irradiates the subject.
  • Specifically, the light source unit 31 can irradiate light of a first wavelength band including light of at least one of two different wavelength bands among the light in the red wavelength band, the light in the green wavelength band, and the light in the blue wavelength band, and light of a second wavelength band including light of at least the other of the two wavelength bands.
  • the light source unit 31 includes a condenser lens 311, a first light source 312, a second light source 313, and a third light source 314.
  • the condensing lens 311 is configured using one or a plurality of lenses.
  • the condensing lens 311 condenses the illumination light emitted from each of the first light source 312, the second light source 313, and the third light source 314 and emits it to the light guide 241.
  • the first light source 312 is configured using a red LED (Light Emitting Diode) lamp.
  • the first light source 312 emits light in the red wavelength band (hereinafter simply referred to as “R light”) based on the current supplied from the light source driver 32.
  • the second light source 313 is configured using a green LED lamp.
  • the second light source 313 emits light in the green wavelength band (hereinafter simply referred to as “G light”) based on the current supplied from the light source driver 32.
  • the third light source 314 is configured using a blue LED lamp.
  • the third light source 314 emits light in the blue wavelength band (hereinafter simply referred to as “B light”) based on the current supplied from the light source driver 32.
  • Under the control of the illumination control unit 33, the light source driver 32 supplies current to the first light source 312, the second light source 313, and the third light source 314 so that either only the second light source 313 emits G light, or the first light source 312 and the third light source 314 emit R light and B light simultaneously, producing magenta light (hereinafter simply referred to as "Mg light").
  • In the first embodiment, the G light corresponds to the light of the first wavelength band,
  • and the Mg light corresponds to the light of the second wavelength band.
  • The illumination control unit 33 controls the lighting timing of the light source unit 31 based on the instruction signal received from the processing device 5. Specifically, the illumination control unit 33 supplies G light to the endoscope 2 by causing the second light source 313 to emit G light at a predetermined cycle, and supplies Mg light to the endoscope 2 by causing the first light source 312 to emit R light and the third light source 314 to emit B light simultaneously. In this way, the illumination control unit 33 causes the light source unit 31 to emit G light and Mg light alternately and intermittently.
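The alternating drive of G light and Mg light can be sketched as a simple frame schedule (a hypothetical illustration; in the actual device the light source driver 32 supplies the LED currents, and the parity assignment of frames is an assumption):

```python
def illumination_schedule(num_frames):
    """Return, per frame, which LED light sources are driven.

    Even frames: only the green LED (G light, first wavelength band).
    Odd frames: red and blue LEDs simultaneously, mixing to
    magenta (Mg light, second wavelength band).
    """
    schedule = []
    for frame in range(num_frames):
        if frame % 2 == 0:
            schedule.append({"green"})          # G light
        else:
            schedule.append({"red", "blue"})    # R + B = Mg light
    return schedule
```

Each consecutive pair of frames (one G, one Mg) supplies everything needed for one white image, which is why the system completes a color frame in two fields.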
  • the illumination control unit 33 is configured using a CPU (Central Processing Unit) or the like.
  • the display device 4 displays an image corresponding to the image data generated by the endoscope 2 received from the processing device 5.
  • the display device 4 displays various information related to the endoscope system 1.
  • the display device 4 is configured using a display panel such as liquid crystal or organic EL (Electro Luminescence).
  • the processing device 5 receives the image data generated by the endoscope 2, performs predetermined image processing on the received image data, and outputs it to the display device 4. Further, the processing device 5 comprehensively controls the operation of the entire endoscope system 1.
  • the processing device 5 includes an image processing unit 51, an input unit 52, a recording unit 53, and a processing control unit 54.
  • the image processing unit 51 receives the image data generated by the endoscope 2 under the control of the processing control unit 54, performs predetermined image processing on the received image data, and outputs the image data to the display device 4.
  • Specifically, the image processing unit 51 synthesizes the first image data generated by the image sensor 244 when the light source unit 31 emits G light and the second image data generated by the image sensor 244 when the light source unit 31 emits Mg light.
  • The predetermined image processing includes separation processing, synthesis processing, interpolation processing, OB clamp processing, gain adjustment processing, format conversion processing, and the like.
  • The image processing unit 51 is configured using a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), or an FPGA (Field Programmable Gate Array).
  • the image processing unit 51 includes at least a separation unit 511, an interpolation unit 512, and a synthesis unit 513.
  • the separation unit 511 performs color separation on the image corresponding to the image data generated by the endoscope 2. Specifically, the separation unit 511 separates the pixel value of the R pixel and the pixel value of the B pixel from the image corresponding to the image data generated by the image sensor 244 when the light source device 3 emits Mg light.
  • the interpolation unit 512 performs known interpolation processing using the pixel values of the R pixels separated from the RB image by the separation unit 511, thereby generating R image data in which the pixel values of the R pixels are interpolated for all the pixels.
  • the interpolation unit 512 generates B image data having pixel values of B pixels in all pixels by performing a known interpolation process using the pixel values of B pixels separated from the RB image by the separation unit 511.
  • the interpolation processing includes bilinear interpolation processing, direction discrimination interpolation processing, and the like.
  • the interpolation unit 512 may generate the R image data and the B image data using another interpolation process.
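As a sketch of the interpolation step (simplified, not the patent's implementation; border handling and the choice of 4-neighbour averaging are assumptions), a checkerboard of known pixels can be filled by averaging valid neighbours, which is essentially bilinear interpolation on this pattern:

```python
import numpy as np

def interpolate_checkerboard(img, mask):
    """Fill positions where mask is False with the mean of their
    valid 4-neighbours (a simplified bilinear interpolation).

    img:  2-D array; values are meaningful only where mask is True.
    mask: boolean array, True at pixels that carry a real sample
          (e.g. the R positions separated from the Mg frame).
    """
    out = img.astype(float).copy()
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c]:
                continue  # already a measured pixel
            vals = [img[rr, cc]
                    for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                    if 0 <= rr < rows and 0 <= cc < cols and mask[rr, cc]]
            out[r, c] = sum(vals) / len(vals)
    return out
```

On a checkerboard every missing pixel has only valid 4-neighbours, so the average is always well defined; direction-discriminating interpolation would instead weight the neighbours by local gradients.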
  • The synthesis unit 513 generates color image data by synthesizing the R image data and the B image data generated by the interpolation unit 512 with the G image data generated by the image sensor 244 when the light source device 3 emits G light, and outputs the generated color image data to the display device 4.
  • The synthesis unit 513 also generates NBI image data by synthesizing the G image data generated by the image sensor 244 when the light source device 3 emits G light with the B image data generated by the interpolation unit 512, and outputs the generated NBI image data to the display device 4.
  • the input unit 52 receives an input of an instruction signal for instructing the operation of the endoscope system 1 and outputs the received instruction signal to the processing control unit 54.
  • the input unit 52 receives an input of an instruction signal instructing either the white light observation method or the NBI observation method, and outputs the received instruction signal to the processing control unit 54.
  • the input unit 52 is configured using switches, buttons, a touch panel, and the like.
  • the recording unit 53 records various programs executed by the endoscope system 1, data being executed by the endoscope system 1, and image data generated by the endoscope 2.
  • the recording unit 53 is configured using a volatile memory, a nonvolatile memory, a memory card, and the like.
  • the recording unit 53 includes a program recording unit 531 that records various programs executed by the endoscope system 1.
  • the process control unit 54 is configured using a CPU.
  • The processing control unit 54 controls each unit constituting the endoscope system 1. For example, when an instruction signal for switching the illumination light emitted from the light source device 3 is input from the input unit 52, the processing control unit 54 controls the illumination control unit 33 to switch the illumination light emitted from the light source device 3.
  • FIG. 4 is a diagram schematically showing the spectral characteristics of the color filter 2442 and each light emitted from the light source device 3.
  • the horizontal axis indicates the wavelength
  • the right vertical axis indicates the sensitivity of each pixel (filter transmittance)
  • the left vertical axis indicates the intensity of each light.
  • a curve L Cy indicates the spectral sensitivity characteristic of the Cy pixel
  • a curve L Ye indicates the spectral sensitivity characteristic of the Ye pixel.
  • the curve L B represents a wavelength characteristic of the B light
  • the curve L G represents a wavelength characteristic of the G light
  • the curve L R represents the wavelength characteristic of the R light.
  • the Ye pixel has sensitivity to R light and G light (the transmittance of the filter is high).
  • The Cy pixel has sensitivity to B light and G light (filter transmittance is high). Therefore, as shown in FIG. 5, when the light source device 3 emits G light, the image sensor 244 can generate an image PG in which all pixels are G pixels.
  • As shown in FIG. 6, when the light source device 3 emits Mg light, the image sensor 244 can generate an image PMg in which R pixels and B pixels are arranged in a checkered pattern.
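Why the Mg frame comes out as an R/B checkerboard can be sketched with idealized filter responses (the 0/1 transmittances below are an idealization for illustration; the real spectral curves are shown in FIG. 4):

```python
# Idealized transmittance of each filter per wavelength band.
FILTER_RESPONSE = {
    "Ye": {"R": 1, "G": 1, "B": 0},  # yellow passes red and green
    "Cy": {"R": 0, "G": 1, "B": 1},  # cyan passes green and blue
}

def pixel_signal(filter_name, illumination):
    """Return which bands a pixel records under the given illumination."""
    resp = FILTER_RESPONSE[filter_name]
    return {band for band in illumination if resp[band]}
```

Under G light both filter types pass G, so every pixel is a G pixel (image PG); under Mg light (R plus B) the Ye pixels record only R and the Cy pixels only B, yielding the checkered image PMg.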
  • In this way, the light source device 3 alternately supplies G light and Mg light to the endoscope 2, and the image sensor 244 sequentially images the subject irradiated with the G light or the Mg light.
  • FIG. 7 is a flowchart showing an outline of processing executed by the endoscope system 1.
  • FIG. 8 is a diagram schematically illustrating an image generated by the image processing unit 51.
  • the process control unit 54 causes the light source device 3 to emit G light (step S101), and causes the image sensor 244 to image the subject irradiated with the G light (step S102).
  • The image sensor 244 generates an image PG corresponding to the image data by imaging the subject irradiated with the G light.
  • the process control unit 54 causes the light source device 3 to emit Mg light (step S103), and causes the image sensor 244 to image the subject irradiated with the Mg light (step S104).
  • the imaging element 244 generates an image PMg corresponding to the image data by imaging the subject irradiated with the Mg light.
  • the separation unit 511 performs color separation of the pixel value of the R pixel and the pixel value of the B pixel on the image PMg (Step S105).
  • the separation unit 511 generates, from the image P Mg corresponding to the image data generated by the image sensor 244, an image P R1 of R pixels obtained by separating the pixel values of the R pixels and an image P B1 of B pixels obtained by separating the pixel values of the B pixels.
  • the interpolation unit 512 performs interpolation processing on each of the image P R1 and the image P B1 (step S106). Specifically, as illustrated in FIG. 8, the interpolation unit 512 generates an image P R2 in which the pixel values of R pixels are interpolated at all pixels by performing an interpolation process that interpolates the pixel values of the other R pixels using the pixel values of the image P R1. Similarly, the interpolation unit 512 generates an image P B2 in which the pixel values of B pixels are interpolated at all pixels by performing an interpolation process that interpolates the pixel values of the other B pixels using the pixel values of the image P B1.
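The separation and interpolation steps above can be sketched in NumPy. This is a simplified illustration, not the patent's implementation: the checkerboard phase (R where row + column is even) and the four-neighbour bilinear kernel are assumptions; the actual phase follows the color filter 2442 layout.

```python
import numpy as np

def separate_and_interpolate(p_mg):
    """Split the Mg-frame checkerboard into R and B planes and fill the gaps.

    Assumes R samples sit where (row + col) is even and B samples where it
    is odd; each missing value is the mean of its valid 4-neighbours.
    """
    h, w = p_mg.shape
    rows, cols = np.indices((h, w))
    r_mask = (rows + cols) % 2 == 0
    p_r1 = np.where(r_mask, p_mg, 0.0)      # image P_R1: R samples only
    p_b1 = np.where(~r_mask, p_mg, 0.0)     # image P_B1: B samples only

    def fill(plane, mask):
        # Mean of the in-bounds sampled 4-neighbours of each missing pixel.
        padded = np.pad(plane, 1)
        pmask = np.pad(mask.astype(float), 1)
        s = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
             padded[1:-1, :-2] + padded[1:-1, 2:])
        n = (pmask[:-2, 1:-1] + pmask[2:, 1:-1] +
             pmask[1:-1, :-2] + pmask[1:-1, 2:])
        out = plane.copy()
        missing = ~mask
        out[missing] = s[missing] / np.maximum(n[missing], 1)
        return out

    p_r2 = fill(p_r1, r_mask)               # image P_R2: R at every pixel
    p_b2 = fill(p_b1, ~r_mask)              # image P_B2: B at every pixel
    return p_r2, p_b2
```

On a checkerboard lattice every missing pixel has sampled pixels of the wanted color as its direct neighbours, which is why a simple neighbour average already yields full-resolution planes.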
  • when the white light observation method is set in the endoscope system 1 (step S107: Yes), the synthesis unit 513 generates a white image (step S108). Specifically, as illustrated in FIG. 8, the combining unit 513 generates a white image P W (color image) by combining the image P R2, the image P B2, and the image P G. As a result, the endoscope system 1 can generate a white image in two frames (two fields) by two irradiations of G light and Mg light.
  • when an instruction signal for instructing termination is input from the operation unit 22 or the input unit 52 (step S109: Yes), the endoscope system 1 ends this process. On the other hand, when the instruction signal for instructing termination is not input from the operation unit 22 or the input unit 52 (step S109: No), the endoscope system 1 returns to step S101.
  • when the white light observation method is not set in the endoscope system 1 (step S107: No), the synthesis unit 513 generates an NBI image (step S110). Specifically, as shown in FIG. 8, the synthesis unit 513 generates an NBI image P NBI by combining the image P G and the image P B2. As a result, the endoscope system 1 can generate an NBI image P NBI in two frames (two fields) by two irradiations of G light and Mg light. In the first embodiment, either the white image or the NBI image is generated according to the setting of the endoscope system 1; however, the present invention is not limited to this, and the white image and the NBI image may be generated simultaneously.
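Given the full-resolution planes, the two outputs differ only in how the planes are assigned to display channels. A minimal sketch follows; the NBI channel mapping (green-band frame driving the red display channel, blue-band frame driving green and blue) is an assumption based on common NBI display practice, not something the patent specifies.

```python
import numpy as np

def compose_white(p_r2, p_g, p_b2):
    """White image P_W: straight RGB stack of the R, G, and B planes."""
    return np.stack([p_r2, p_g, p_b2], axis=-1)

def compose_nbi(p_g, p_b2):
    """NBI image P_NBI built from the G frame and interpolated-B plane only.

    Assumed mapping: green-band image -> red channel, blue-band image ->
    green and blue channels.
    """
    return np.stack([p_g, p_b2, p_b2], axis=-1)
```

Because both outputs consume the same two captured frames, switching the observation mode changes only this composition step, not the illumination or readout sequence.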
  • the white image and the NBI image may be displayed simultaneously on the display device 4, a reduced NBI image may be superimposed on the white image and displayed on the display device 4, or a reduced white image may be superimposed on the NBI image and displayed on the display device 4.
  • the endoscope system 1 can generate a white image and an NBI image in two frames (two fields) by two irradiations of G light and Mg light.
  • Modification of Embodiment 1 Next, a modification of the first embodiment will be described.
  • the modification of the first embodiment is different in configuration from the color filter 2442 according to the first embodiment described above.
  • a configuration of the color filter according to the modification of the first embodiment will be described.
  • in the modification of the first embodiment, the same components as those of the first embodiment described above are denoted by the same reference symbols, and description thereof is omitted.
  • FIG. 9 is a diagram schematically illustrating a configuration of a color filter according to a modification of the first embodiment.
  • the color filter 2442A shown in FIG. 9 is formed by a filter unit U2 using one Ye filter and three Cy filters, and this filter unit U2 is arranged in two rows and two columns (2 × 2).
  • the color filter 2442A is arranged such that the number of Ye filters is smaller than the number of Cy filters.
  • each of the Ye filter and the Cy filter is stacked on the light receiving surface of each pixel of the pixel portion 2441.
  • the light source device 3 supplies G light or Mg light to the endoscope 2 under the control of the processing device 5, and the imaging element 244 sequentially images the subject irradiated with the G light or the Mg light.
  • the Ye filter and the Cy filter transmit G light. Therefore, as shown in FIG. 10, the image pickup device 244 can generate an image P G corresponding to the image data.
  • the Ye filter transmits R light
  • the Cy filter transmits B light.
  • the image sensor 244 can generate an image P Mg2 corresponding to image data composed of R pixels and B pixels.
  • R light contains few high-frequency components in the spatial frequency domain.
  • since one quarter of the color filter 2442A as a whole consists of Ye filters, the resolution of the color signal can be optimized. Furthermore, since three quarters of the color filter 2442A as a whole consists of Cy filters, a high-resolution NBI image can be obtained when the narrow-band light observation method is performed.
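The 1:3 Ye:Cy arrangement can be expressed by tiling the 2 × 2 filter unit U2; three quarters of the pixels then carry Cy filters, which is what raises the density of B-sensitive pixels for NBI. A small sketch (the position of the Ye filter inside the unit is an assumption for illustration):

```python
import numpy as np

# 2 x 2 filter unit U2: one Ye filter and three Cy filters
# (the Ye position within the unit is assumed).
UNIT_U2 = np.array([["Ye", "Cy"],
                    ["Cy", "Cy"]])

def tile_filter_unit(unit, height, width):
    """Tile the 2 x 2 unit over a height x width pixel array (both even)."""
    return np.tile(unit, (height // 2, width // 2))

mosaic = tile_filter_unit(UNIT_U2, 4, 8)
cy_fraction = float(np.mean(mosaic == "Cy"))   # three quarters are Cy
```

Under B illumination only the Cy sites respond, so a higher Cy fraction directly translates into more measured (rather than interpolated) samples in the NBI image.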
  • FIG. 12 is a circuit diagram schematically showing the configuration of the image sensor according to the second embodiment.
  • An imaging element 244B illustrated in FIG. 12 includes a pixel portion 2441B and a color filter 2442.
  • the color filter 2442 is indicated by a dotted line, and is stacked on the light receiving surface of each pixel.
  • in the pixel portion 2441B shown in FIG. 12, eight pixels share the charge conversion unit, the reset transistor, the amplification transistor, and the selection transistor.
  • the pixel portion 2441B illustrated in FIG. 12 includes eight photoelectric conversion elements 301 (photodiodes), a transfer transistor 302 provided for each of the eight photoelectric conversion elements 301, a charge conversion unit 303, a reset transistor 304, a pixel source follower transistor 305, a selection transistor 306, and a vertical transfer line 307.
  • the photoelectric conversion element 301 photoelectrically converts incident light into a signal charge amount corresponding to the amount of light and accumulates it.
  • the photoelectric conversion element 301 has a cathode side connected to one end side of the transfer transistor 302 and an anode side connected to the ground.
  • the transfer transistor 302 transfers charges from the photoelectric conversion element 301 to the charge conversion unit 303.
  • the other end side of the transfer transistor 302 is connected to the charge conversion unit 303.
  • the transfer transistor 302 is turned on when a drive signal ⁇ T is supplied from a vertical scanning unit (not shown) via a signal line, and transfers charge from the photoelectric conversion element 301 to the charge conversion unit 303.
  • the charge conversion unit 303 includes a floating diffusion capacitor (FD), and converts the charge accumulated in the photoelectric conversion element 301 into a voltage.
  • the reset transistor 304 resets the charge conversion unit 303 to a predetermined potential.
  • the reset transistor 304 has one end connected to the power supply voltage VDD, the other end connected to the charge converter 303, and a gate connected to a signal line to which the drive signal ⁇ R is supplied.
  • the reset transistor 304 is turned on when the drive signal ΦR is supplied from a vertical scanning unit (not shown) via the signal line, and releases the signal charge accumulated in the charge conversion unit 303, thereby resetting the charge conversion unit 303 to a predetermined potential.
  • the pixel source follower transistor 305 has one end connected to the selection transistor 306, the other end connected to the vertical transfer line 307, and a gate connected to a signal line that transmits the signal (image signal) converted by the charge conversion unit 303.
  • the selection transistor 306 has one end connected to the power supply voltage VDD, the other end connected to the pixel source follower transistor 305, and a gate connected to a signal line to which a row selection signal ⁇ X is supplied.
  • the selection transistor 306 is turned on when the row selection signal ⁇ X is supplied, and transfers the signal (image signal) output from the pixel source follower transistor 305 to the vertical transfer line 307.
  • the imaging control unit 246 inputs a transfer pulse to the signal line ⁇ T1 and reads the signal charges of the Cy pixels in the first row. Subsequently, the imaging control unit 246 inputs a transfer pulse to the signal line ⁇ T2 and reads the signal charge of the Ye pixels in the first row. In this manner, the imaging control unit 246 sequentially reads out the signal charges for each row of the imaging element 244B.
  • since the light source device 3 emits G light, the imaging element 244B can generate an image P G in which all pixels are G pixels.
  • the combining unit 513 combines four G pixels as one G pixel PGA1 . As a result, a G image with increased sensitivity can be generated.
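The four-into-one G-pixel mixing can be modelled as 2 × 2 block summation. This is a simplified digital-domain sketch of what the readout achieves; the actual shared-pixel circuit can mix charges in the charge conversion unit 303 before A/D conversion.

```python
import numpy as np

def bin_2x2(p_g):
    """Combine each 2 x 2 block of G pixels into one G pixel (P_GA1 in the
    text) by summation, trading resolution for roughly four times the
    signal per output pixel. Assumes even image dimensions."""
    h, w = p_g.shape
    return p_g.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
```

The same reshape-and-sum pattern generalizes to any block size, which is why the text speaks of adding pixel values "for each predetermined number of pixels".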
  • the case of pixel mixture readout of the image sensor 244B will be described. Note that, when the image sensor 244B performs pixel mixture readout, the light source device 3 emits Mg light. For this reason, the image sensor 244B images the subject irradiated with the Mg light.
  • the imaging control unit 246 inputs a transfer pulse to the signal line ΦT1 and the signal line ΦT4, and reads the signal charges of the Cy pixels in the first row and the signal charges of the Cy pixels in the second row. Subsequently, the imaging control unit 246 inputs a transfer pulse to the signal line ΦT2 and the signal line ΦT3, and reads the signal charge of the Ye pixels in the first row and the signal charge of the Ye pixels in the second row. As described above, the imaging control unit 246 reads out signal charges from pixels of the same color every two rows of the imaging element 244B. In this case, since the light source device 3 emits Mg light, the imaging element 244B can generate an image P Mg1 in which R pixels and B pixels form a checkered pattern.
  • the combining unit 513 combines two pixels of the same color that are diagonally adjacent to each other as one pixel. For example, as illustrated in FIG. 14, the combining unit 513 combines two diagonally adjacent R pixels as one R pixel PRA1 . Further, as illustrated in FIG. 15, the combining unit 513 combines two diagonally adjacent B pixels as one B pixel P BA1 . Thereby, it is possible to generate an R image and a B image with increased sensitivity.
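Mixing two diagonally adjacent same-color pixels of the checkerboard can be modelled the same way. A sketch under stated assumptions: R samples are taken to sit where (row + column) is even and B samples where it is odd, and each pair combines a pixel with its down-right (R) or down-left (B) neighbour; the hardware pairing follows the signal-line wiring and may differ.

```python
import numpy as np

def combine_diagonal_pairs(p_mg1):
    """Combine diagonally adjacent same-color pixels of the R/B checkerboard
    image P_Mg1, modelling the two-pixel mixture readout.

    Returns the summed R pairs (pixels P_RA1) and B pairs (pixels P_BA1),
    each at half resolution in both dimensions."""
    r_sum = p_mg1[0::2, 0::2] + p_mg1[1::2, 1::2]   # R pairs -> P_RA1
    b_sum = p_mg1[0::2, 1::2] + p_mg1[1::2, 0::2]   # B pairs -> P_BA1
    return r_sum, b_sum
```

Pairing along the diagonal keeps the combined sample centred between the two contributing pixels, so the sensitivity gain costs less spatial accuracy than pairing along a row or column would.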
  • when G light is emitted from the light source device 3, the imaging control unit 246 adds the pixel values of the G pixels constituting the imaging element 244B for each predetermined number of pixels and outputs the result.
  • when Mg light is emitted from the light source device 3, the imaging control unit 246 adds, for each predetermined number of pixels, the pixel values of the B pixels or the pixel values of the R pixels that can receive light in the same wavelength band among the pixels constituting the imaging element 244B, and outputs the result. As a result, it is possible to generate a white image and an NBI image with increased sensitivity.
  • alternatively, the imaging control unit 246 may add, in the charge conversion unit 303, the pixel values of the pixels of the imaging element 244B that can receive light in the same wavelength band for each predetermined number of pixels and output the converted pixel values, in which case the combining process by the combining unit 513 can be omitted.
  • Embodiment 3 Next, Embodiment 3 will be described.
  • the third embodiment is different in configuration from the color filter 2442 according to the first embodiment described above, and is different in the type of illumination light emitted from the light source device 3.
  • the configuration of the color filter according to Embodiment 3 will be described.
  • in the third embodiment, the same components as those of the first embodiment described above are denoted by the same reference symbols, and description thereof is omitted.
  • FIG. 16 is a diagram schematically illustrating a configuration of a color filter according to the third embodiment.
  • the color filter 2442B shown in FIG. 16 is configured using a magenta filter (hereinafter simply referred to as "Mg filter") that transmits light in the red wavelength band and light in the blue wavelength band, and a Cy filter.
  • the color filter 2442B is formed by a filter unit U3 using two Mg filters and two Cy filters, and the filter unit U3 is arranged in two rows and two columns (2 × 2).
  • the color filter 2442B is formed by arranging Mg filters and Cy filters in a checkered pattern. As the color filter 2442B, an Mg filter or a Cy filter is stacked on the light receiving surface of each pixel of the pixel portion 2441.
  • under the control of the processing device 5, the light source device 3 simultaneously emits G light and R light to supply the endoscope 2 with light including the red wavelength band and the green wavelength band (hereinafter simply referred to as "Ye light"), or supplies B light, and the imaging element 244 sequentially images the subject irradiated with the Ye light or the B light.
  • B light corresponds to light in the first wavelength band
  • Ye light corresponds to light in the second wavelength band.
  • the Mg filter transmits R light
  • the Cy filter transmits G light.
  • the image sensor 244 can generate an image P Ye in which the R pixel and the G pixel are in a checkered pattern.
  • the Mg filter and the Cy filter transmit B light.
  • the image sensor 244 can generate an image P B.
  • the image processing unit 51 performs the same processing as in the first embodiment. Thereby, when the narrow-band light observation method is performed, the resolution of the NBI image can be improved because B pixel values are obtained at every pixel.
  • the endoscope system 1 can generate a white image and an NBI image in two frames (two fields) by two irradiations of B light and Ye light.
  • the fourth embodiment differs from the first embodiment described above in the configuration of the light source device 3.
  • the configuration of the endoscope system according to the fourth embodiment will be described.
  • in the fourth embodiment, the same components as those of the first embodiment described above are denoted by the same reference symbols, and description thereof is omitted.
  • FIG. 19 is a block diagram illustrating a functional configuration of a main part of the endoscope system according to the fourth embodiment.
  • An endoscope system 1C shown in FIG. 19 includes a light source device 3C instead of the light source device 3 according to the first embodiment described above.
  • the light source device 3C includes a light source unit 31C instead of the light source unit 31 of the light source device 3 according to Embodiment 1 described above.
  • the light source unit 31C includes a fourth light source 315 and a fifth light source 316 in addition to the configuration of the light source unit 31 according to Embodiment 1 described above.
  • the fourth light source 315 is configured using a purple LED lamp.
  • the fourth light source 315 emits light in the violet wavelength band (400 nm to 435 nm) (hereinafter simply referred to as “V light”) based on the current supplied from the light source driver 32.
  • the fifth light source 316 is configured using an orange LED lamp.
  • the fifth light source 316 emits light in the orange (Amber) wavelength band (585 nm to 620 nm) (hereinafter simply referred to as “A light”) based on the current supplied from the light source driver 32.
  • FIG. 20 is a diagram schematically illustrating spectral characteristics of the color filter 2442 and each light emitted from the light source device 3C.
  • the horizontal axis indicates the wavelength
  • the right vertical axis indicates the sensitivity of each pixel
  • the left vertical axis indicates the intensity of each light.
  • a curve L Cy indicates the spectral sensitivity characteristic of the Cy pixel
  • a curve L Ye indicates the spectral sensitivity characteristic of the Ye pixel.
  • the curve L V represents the wavelength characteristic of the V light
  • the curve L B represents the wavelength characteristic of the B light
  • the curve L G represents the wavelength characteristic of the G light
  • the curve L A represents the wavelength characteristic of the A light
  • the curve L R represents the wavelength characteristic of the R light.
  • the Ye filter transmits R light, A light, and G light.
  • the Cy filter transmits V light, B light, and G light. Therefore, as shown in FIG. 21, when the light source device 3C emits G light, the imaging element 244 can generate an image P G in which all pixels are G pixels.
  • when the light source device 3C emits B light and R light, the imaging element 244 can generate an image P BR in which the R pixels and the B pixels form a checkered pattern. Furthermore, when the light source device 3C emits V light and A light, the imaging element 244 can generate an image P AV in which the A pixels and the V pixels form a checkered pattern. That is, according to the fourth embodiment, a plurality of images having different uses, for example, three or more types of images, can be generated in three frames.
  • FIG. 24 is a timing chart illustrating an outline of processing executed by the endoscope system 1C.
  • the light source device 3C performs three patterns of irradiation frames: a G light irradiation frame for irradiating G light, a B + R light irradiation frame for simultaneously irradiating B light and R light, and a V + A light irradiation frame for simultaneously irradiating V light and A light. In FIG. 24, the horizontal axis indicates time. Further, in FIG. 24, the time obtained by adding the irradiation time and the signal readout time constitutes one cycle, and the time of this one cycle is 1/90 seconds (90 fps).
  • the time of one cycle can be changed as appropriate, and may be 1/120 seconds or 1/240 seconds, for example, or can be changed as appropriate depending on the subject, the examination content, the site, and the like.
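At 90 fps with a three-frame cycle (G, then B + R, then V + A), each illumination pattern recurs every 1/30 second, so each image type refreshes at 30 Hz while the sensor runs at 90 fps. A small sketch of the schedule arithmetic; the pattern order follows the description of FIG. 24, and the helper function is illustrative, not from the patent.

```python
from fractions import Fraction

# Three-frame irradiation cycle (order as described for FIG. 24).
CYCLE = ["G", "B+R", "V+A"]
FRAME_PERIOD = Fraction(1, 90)   # irradiation + readout: 1/90 s (90 fps)

def pattern_at(frame_index):
    """Illumination pattern used for a given sensor frame number."""
    return CYCLE[frame_index % len(CYCLE)]

# Each pattern recurs every len(CYCLE) frames, so each image type is
# refreshed once per 1/30 s at the 90 fps sensor rate.
refresh_period = FRAME_PERIOD * len(CYCLE)
```

The same arithmetic explains the note about 1/120 s or 1/240 s cycles: shortening the frame period shortens the per-modality refresh period by the same factor.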
  • the light source device 3C supplies G light to the endoscope 2, so that the endoscope 2 irradiates the subject with G light.
  • the image sensor 244 images the subject irradiated with the G light.
  • the imaging control unit 246 reads a signal from the imaging element 244 and outputs the signal to the image processing unit 51.
  • the light source device 3C supplies the B light + R light to the endoscope 2 so that the endoscope 2 irradiates the subject with the B light + R light.
  • the image sensor 244 images the subject irradiated with the B light + R light.
  • the imaging control unit 246 reads a signal from the imaging element 244 and outputs the signal to the image processing unit 51.
  • the image processing unit 51 performs the same separation processing and interpolation processing as in the first embodiment described above on the image P BR to generate the image P B2 and the image P R2, and generates a white image by performing combining processing that combines the image P G, the image P B2, and the image P R2.
  • the light source device 3C supplies V light + A light to the endoscope 2 so that the endoscope 2 irradiates the subject with V light + A light.
  • the image sensor 244 images the subject irradiated with the V light + A light.
  • the imaging control unit 246 reads a signal from the imaging element 244 and outputs the signal to the image processing unit 51.
  • the light source device 3C supplies G light to the endoscope 2, so that the endoscope 2 irradiates the subject with G light.
  • the image sensor 244 images the subject irradiated with the G light.
  • the imaging control unit 246 reads a signal from the imaging element 244 and outputs the signal to the image processing unit 51.
  • the image processing unit 51 generates the image P V and the image P A by performing separation processing and interpolation processing on the image P VA, and generates an MBI (Multiband Imaging) image by combining the generated image P V and image P A with the image P G generated upon irradiation with the G light and with the image P B2 and the image P R2 generated upon irradiation with the B light + R light.
  • the image P A is used when an abnormal region having a predetermined feature quantity is extracted by learning (not shown).
  • examples of the abnormal region include a cancer region, a bleeding region, and a lesion region.
  • the endoscope system 1C can generate a plurality of types of images, including a white image and an MBI image, by repeating the one-cycle pattern of three frames consisting of the irradiation patterns of G light, B light + R light, and V light + A light.
  • that is, the illumination control unit 33 causes the light source device 3C to repeat the one-cycle irradiation pattern of G light, B light + R light, and V light + A light, whereby a plurality of types of images including a white image and an MBI image can be generated.
  • a learning device or AI that has learned the feature amount of an abnormal region (for example, cancer, bleeding, etc.) may extract the abnormal region.
  • the feature part or feature region extracted by the learning device or AI may be superimposed on the white image and displayed on the display device 4.
  • color information necessary for diagnosis can be generated from a small number of frames while achieving both high image quality and downsizing, and color misregistration can be suppressed.
  • the frame rate can be increased by sequentially irradiating V light, B light, A light, and R light, or by simultaneously irradiating only the V light and the B light.
  • the sensitivity may be increased by adding pixel values of pixels of the same color as in the third embodiment.
  • the control device and the light source device are separate, but they may be formed integrally.
  • in the embodiments described above, each light is emitted by an LED lamp.
  • the present invention is not limited to this.
  • each light may be emitted by using a laser light source.
  • each light may be emitted using a white light source and a filter that transmits the wavelength band of each light.
  • while an endoscope system has been described, the present disclosure can also be applied to, for example, a capsule endoscope, a video microscope that images a subject, a mobile phone having an imaging function, and a tablet terminal having an imaging function.
  • while an endoscope system including a flexible endoscope has been described, the present disclosure can also be applied to an endoscope system including a rigid endoscope and to an endoscope system including an industrial endoscope.
  • while an endoscope system including an endoscope that is inserted into a subject has been described, the present disclosure can also be applied to, for example, a sinus endoscope and to a system using an electric knife or a test probe.
  • the “unit” described above can be read as “means”, “circuit”, and the like.
  • the control unit can be read as control means or a control circuit.
  • the program executed by the endoscope system according to the first to fourth embodiments of the present disclosure is provided as file data in an installable format or an executable format recorded on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, a DVD (Digital Versatile Disk), a USB medium, or a flash memory.
  • the program executed by the endoscope system according to Embodiments 1 to 4 of the present disclosure may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network.
  • the program to be executed by the endoscope system according to the first to fourth embodiments of the present disclosure may be provided or distributed via a network such as the Internet.
  • in the embodiments described above, signals are transmitted from the endoscope to the processing device using the transmission cable, but the transmission need not be wired; for example, an image signal or the like may be transmitted from the endoscope to the processing device according to a predetermined wireless communication standard (for example, Wi-Fi (registered trademark) or Bluetooth (registered trademark)).
  • wireless communication may be performed according to other wireless communication standards.

Abstract

Provided are an endoscope system, an image processing method, and a program that are capable of obtaining high-quality, color drift-free color images and special images. The endoscope system 1 comprises: an imaging element 244 having a color filter 2442 laminated upon the light-reception surface of each pixel and capable of generating image data by capturing images of a subject; a light source 31 capable of irradiating light in a first wavelength band and light in a second wavelength band; an illumination control unit 33 that causes the light source 31 to alternately irradiate light in the first wavelength band and light in the second wavelength band; and an image processing unit 51 that synthesizes first image data generated by the imaging element 244 when the light source 31 irradiates light in the first wavelength band and second image data generated by the imaging element 244 when the light source 31 irradiates light in the second wavelength band.

Description

Endoscope system, image processing method and program
 The present disclosure relates to an endoscope system, an image processing method, and a program that are introduced into the living body of a subject and capture images of the inside of the living body.
 Conventionally, in an imaging apparatus such as a stomach camera, a technique is known in which white and green transmission filters are provided between a light source and a subject, yellow and cyan color separation filters are arranged in a mosaic pattern on the light-receiving surface of the imager, and color breaking is prevented by switching between the white and green transmission filters so that one high-quality color image is obtained in two fields, as opposed to the conventional case of obtaining one color image in three fields (see Patent Document 1).
 In recent years, as endoscope observation methods, a white light observation method (WLI: White Light Imaging) using white illumination light (white light) and special light observation methods such as a narrow band light observation method (NBI: Narrow Band Imaging) using illumination light (narrow band light) consisting of two narrow band lights included in the blue and green wavelength bands, respectively, have become widely known. In the white light observation method, a color image can be obtained by irradiating white light. In the narrow band light observation method, by irradiating narrow band light, a special image that highlights the capillaries and fine mucosal patterns existing on the mucosal surface layer of a living body can be obtained.
JP-A-6-225316
 In recent years, it has been desired for endoscopes to acquire both color images and special images. However, Patent Document 1 described above considers only color images and gives no consideration to obtaining special images. For this reason, a technique capable of obtaining both a color image and a special image with high image quality and without color misregistration has been desired.
 The present disclosure has been made in view of the above, and an object of the present disclosure is to provide an endoscope system, an image processing method, and a program capable of obtaining both a color image and a special image with high image quality and without color misregistration.
 In order to solve the above-described problems and achieve the object, an endoscope system according to the present disclosure includes: an imaging element in which a color filter, configured using two types of filters that transmit light in two mutually different wavelength bands among light in a red wavelength band, light in a green wavelength band, and light in a blue wavelength band, is stacked on the light receiving surface of each pixel, the imaging element being capable of generating image data by imaging a subject; a light source unit capable of irradiating light in a first wavelength band including light in at least one of the two wavelength bands and light in a second wavelength band including light in at least the other of the two wavelength bands; an illumination control unit that causes the light source unit to alternately irradiate the light in the first wavelength band and the light in the second wavelength band; and an image processing unit that synthesizes first image data generated by the imaging element when the light source unit irradiates the light in the first wavelength band and second image data generated by the imaging element when the light source unit irradiates the light in the second wavelength band.
 Further, in the endoscope system according to the present disclosure, the two types of filters are a cyan filter that transmits the light in the green wavelength band and the light in the blue wavelength band and a yellow filter that transmits the light in the red wavelength band and the light in the green wavelength band; the light source unit includes a first light source capable of irradiating the light in the red wavelength band, a second light source capable of irradiating the light in the green wavelength band, and a third light source capable of irradiating the light in the blue wavelength band; the illumination control unit causes the second light source to irradiate the light in the green wavelength band as the light in the first wavelength band, and causes the first light source to irradiate the light in the red wavelength band and the third light source to irradiate the light in the blue wavelength band simultaneously as the light in the second wavelength band; and the image processing unit includes a separation unit that generates red image data and blue image data by separating the pixel values of the light in the red wavelength band and the pixel values of the light in the blue wavelength band from a second image corresponding to the second image data, and a synthesis unit that generates white image data by synthesizing the red image data generated by the separation unit, the blue image data, and the first image data.
 In the endoscope system according to the present disclosure, the combining unit generates special image data by combining the blue image data generated by the separation unit with the first image data.
 In the endoscope system according to the present disclosure, the two types of filters are a cyan filter that transmits the light of the green wavelength band and the light of the blue wavelength band and a magenta filter that transmits the light of the red wavelength band and the light of the blue wavelength band; the light source unit includes a first light source capable of emitting the light of the red wavelength band, a second light source capable of emitting the light of the green wavelength band, and a third light source capable of emitting the light of the blue wavelength band; the illumination control unit causes the third light source to emit the light of the blue wavelength band as the light of the first wavelength band, and causes the first light source to emit the light of the red wavelength band and the second light source to emit the light of the green wavelength band simultaneously as the light of the second wavelength band; and the image processing unit includes a separation unit that generates red image data and green image data by separating pixel values of the light of the red wavelength band and pixel values of the light of the green wavelength band from a second image corresponding to the second image data, and a combining unit that generates white-light image data by combining the red image data and the green image data generated by the separation unit with the first image data.
 In the endoscope system according to the present disclosure, the combining unit generates special image data by combining the green image data separated by the separation unit with the first image data.
 The endoscope system according to the present disclosure further includes an imaging control unit that, when the light source unit emits the light of the first wavelength band, causes the pixel values of the pixels constituting the image sensor to be added and output for each predetermined number of pixels, and that, when the light source unit emits the light of the second wavelength band, causes the pixel values of those pixels of the image sensor that can receive light of the same wavelength band to be added and output for each predetermined number of pixels.
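The pixel-addition aspect above can be sketched as follows. The 2 × 2 block size and the placement of the cyan filters on the even-parity checkerboard sites are assumptions for illustration only; the disclosure specifies neither the block size nor the filter phase.

```python
import numpy as np

def bin_all(frame):
    """First-wavelength-band frame (e.g. G light): every pixel receives
    the same band, so pixel values are simply summed over each
    2 x 2 block (block size is an illustrative assumption)."""
    h, w = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

def bin_same_color(frame):
    """Second-wavelength-band frame (e.g. Mg light): only pixels behind
    the same filter type are added.  Assuming a Cy/Ye checkerboard
    with Cy at positions (0,0) and (1,1) of each 2 x 2 block, the two
    same-color diagonal pixels of each block are summed."""
    h, w = frame.shape
    blocks = frame.reshape(h // 2, 2, w // 2, 2)
    cy_sum = blocks[:, 0, :, 0] + blocks[:, 1, :, 1]  # assumed Cy sites
    ye_sum = blocks[:, 0, :, 1] + blocks[:, 1, :, 0]  # assumed Ye sites
    return cy_sum, ye_sum
```

The two modes differ only in which pixels may be mixed: under single-band illumination all pixels carry the same color, so unrestricted addition loses no color information.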
 In the endoscope system according to the present disclosure, the light source unit further includes a fourth light source capable of emitting light of a purple wavelength band and a fifth light source capable of emitting light of an orange wavelength band, and the illumination control unit, after causing the light source unit to emit the light of the second wavelength band, causes the fourth light source to emit the light of the purple wavelength band and the fifth light source to emit the light of the orange wavelength band simultaneously.
 An image processing method according to the present disclosure is a control method executed by an endoscope system that includes an image sensor in which a color filter, configured using two types of filters each transmitting light of two mutually different wavelength bands among light of a red wavelength band, light of a green wavelength band, and light of a blue wavelength band, is stacked on the light-receiving surface of each pixel, the image sensor being capable of generating image data by imaging a subject, and a light source unit capable of emitting light of a first wavelength band including light of at least one of the two wavelength bands and light of a second wavelength band including light of at least the other of the two wavelength bands. The method includes: an illumination control step of causing the light source unit to emit the light of the first wavelength band and the light of the second wavelength band alternately; and an image processing step of combining first image data generated by the image sensor when the light source unit emits the light of the first wavelength band with second image data generated by the image sensor when the light source unit emits the light of the second wavelength band.
 A program according to the present disclosure causes an endoscope system, which includes an image sensor in which a color filter, configured using two types of filters each transmitting light of two mutually different wavelength bands among light of a red wavelength band, light of a green wavelength band, and light of a blue wavelength band, is stacked on the light-receiving surface of each pixel, the image sensor being capable of generating image data by imaging a subject, and a light source unit capable of emitting light of a first wavelength band including light of at least one of the two wavelength bands and light of a second wavelength band including light of at least the other of the two wavelength bands, to execute: an illumination control step of causing the light source unit to emit the light of the first wavelength band and the light of the second wavelength band alternately; and an image processing step of combining first image data generated by the image sensor when the light source unit emits the light of the first wavelength band with second image data generated by the image sensor when the light source unit emits the light of the second wavelength band.
 According to the present disclosure, it is possible to obtain both a high-quality color image and a high-quality special image, each free of color misregistration.
FIG. 1 is a schematic configuration diagram of an endoscope system according to a first embodiment.
FIG. 2 is a block diagram illustrating the functional configuration of the main part of the endoscope system according to the first embodiment.
FIG. 3 is a diagram schematically illustrating the configuration of a color filter.
FIG. 4 is a diagram schematically illustrating the spectral characteristics of the color filter and of each light emitted by a light source device.
FIG. 5 is a diagram illustrating an example of an image generated by an image sensor when the light source device emits light of the green wavelength band.
FIG. 6 is a diagram illustrating an example of an image generated by the image sensor when the light source device simultaneously emits light of the red wavelength band and light of the blue wavelength band.
FIG. 7 is a flowchart illustrating an outline of processing executed by the endoscope system according to the first embodiment.
FIG. 8 is a diagram schematically illustrating images generated by an image processing unit.
FIG. 9 is a diagram schematically illustrating the configuration of a color filter according to a modification of the first embodiment of the present disclosure.
FIG. 10 is a diagram illustrating an example of an image generated by the image sensor when the light source device emits light of the green wavelength band.
FIG. 11 is a diagram illustrating an example of an image generated by the image sensor when the light source device simultaneously emits light of the red wavelength band and light of the blue wavelength band.
FIG. 12 is a circuit diagram schematically illustrating the configuration of an image sensor according to a second embodiment.
FIG. 13 is a diagram schematically illustrating a pixel-addition method of the image sensor.
FIG. 14 is a diagram schematically illustrating a pixel-addition method of the image sensor.
FIG. 15 is a diagram schematically illustrating a pixel-addition method of the image sensor.
FIG. 16 is a diagram schematically illustrating the configuration of a color filter according to a third embodiment.
FIG. 17 is a diagram illustrating an example of an image generated by the image sensor when the light source device simultaneously emits light of the green wavelength band and light of the red wavelength band.
FIG. 18 is a diagram illustrating an example of an image generated by the image sensor when the light source device emits light of the blue wavelength band.
FIG. 19 is a block diagram illustrating the functional configuration of the main part of an endoscope system according to a fourth embodiment.
FIG. 20 is a diagram schematically illustrating the spectral characteristics of a color filter 2442 and of each light emitted by a light source device 3C.
FIG. 21 is a diagram illustrating an example of an image generated by the image sensor when the light source device emits light of the green wavelength band.
FIG. 22 is a diagram illustrating an example of an image generated by the image sensor when the light source device simultaneously emits light of the blue wavelength band and light of the red wavelength band.
FIG. 23 is a diagram illustrating an example of an image generated by the image sensor when the light source device emits light of the purple wavelength band and light of the orange wavelength band.
FIG. 24 is a timing chart illustrating an outline of processing executed by the endoscope system according to the fourth embodiment.
 Hereinafter, modes for carrying out the present disclosure (hereinafter referred to as "embodiments") will be described. In the embodiments of the present disclosure, a medical endoscope system that captures and displays images of the body cavity of a subject such as a patient will be described. The present invention is not limited to these embodiments. In the description of the drawings, the same parts are denoted by the same reference signs.
(Embodiment 1)
 [Configuration of endoscope system]
 FIG. 1 is a schematic configuration diagram of the endoscope system according to the first embodiment. FIG. 2 is a block diagram illustrating the functional configuration of the main part of the endoscope system according to the first embodiment. The endoscope system 1 shown in FIGS. 1 and 2 inserts an endoscope into a subject such as a patient, images the inside of the subject's body, and outputs the captured image data to an external display device. A user such as a doctor examines the presence or absence of a bleeding site, a tumor site, or an abnormal site, each being a detection target, by observing the in-vivo images displayed on the display device. The endoscope system 1 includes an endoscope 2, a light source device 3, a display device 4, and a processing device 5 (processor).
 [Configuration of endoscope]
 First, the configuration of the endoscope 2 will be described.
 The endoscope 2 images the inside of the subject's body to generate image data (RAW data) and outputs the generated image data to the processing device 5. The endoscope 2 includes an insertion unit 21, an operation unit 22, and a universal cord 23.
 The insertion unit 21 has an elongated, flexible shape. The insertion unit 21 includes a distal end portion 24 incorporating an image sensor 244 described later, a bendable bending portion 25 composed of a plurality of bending pieces, and a long flexible tube portion 26 connected to the proximal end side of the bending portion 25.
 The distal end portion 24 includes a light guide 241 that is configured using glass fiber or the like and forms a light guide path for the light emitted by the light source device 3, an illumination lens 242 provided at the distal end of the light guide 241, an optical system 243 for condensing light, an image sensor 244 provided at the imaging position of the optical system 243 and having a two-dimensional array of pixels that receive the light condensed by the optical system 243 and photoelectrically convert it into electrical signals, an endoscope recording unit 245 that records various kinds of information on the endoscope 2, and an imaging control unit 246 that controls the image sensor 244.
 The image sensor 244 is configured using an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. Specifically, in the image sensor 244, a plurality of pixels that receive light and output electrical signals by photoelectric conversion are arranged two-dimensionally, and the image sensor 244 images a subject (body cavity) at a predetermined frame rate and outputs image data (RAW data). The image sensor 244 includes a pixel unit 2441 and a color filter 2442.
 The pixel unit 2441 is formed by arranging, in a two-dimensional matrix, a plurality of pixels each having a photodiode that accumulates charge according to the amount of received light and an amplifier that amplifies the charge accumulated by the photodiode.
 The color filter 2442 is configured using cyan filters (hereinafter simply "Cy filters") that transmit light of the green wavelength band (500 nm to 600 nm) and light of the blue wavelength band (390 nm to 500 nm), and yellow filters (hereinafter simply "Ye filters") that transmit light of the green wavelength band and light of the red wavelength band (600 nm to 700 nm). Specifically, as shown in FIG. 3, the color filter 2442 is formed of filter units U1, each consisting of two Cy filters and two Ye filters arranged 2 × 2, and these filter units tile the sensor. As a result, the Ye filters and the Cy filters are arranged in a checkerboard pattern, with a Cy filter or a Ye filter stacked on the light-receiving surface of each pixel of the pixel unit 2441. The spectral characteristics of the Cy and Ye filters will be described later.
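The checkerboard tiling described above can be sketched as follows. Which of Cy and Ye occupies the even-parity sites is an assumption for illustration; the actual phase is fixed by FIG. 3, which is not reproduced here.

```python
import numpy as np

def cy_ye_mosaic(height, width):
    """Build the Cy/Ye pattern of the color filter 2442 as an array of
    filter names.  Cy and Ye alternate in a checkerboard, so each
    2 x 2 filter unit U1 contains two Cy and two Ye filters."""
    mosaic = np.empty((height, width), dtype="<U2")
    for y in range(height):
        for x in range(width):
            # Assumed phase: Cy on even-parity sites, Ye on odd.
            mosaic[y, x] = "Cy" if (x + y) % 2 == 0 else "Ye"
    return mosaic

print(cy_ye_mosaic(4, 4))
```

Every 2 × 2 window of the resulting array contains exactly two Cy and two Ye entries, matching the filter unit U1.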
 The endoscope recording unit 245 records various kinds of information on the endoscope 2. For example, the endoscope recording unit 245 records identification information identifying the endoscope 2, identification information of the image sensor 244, and the like. The endoscope recording unit 245 is configured using a nonvolatile memory or the like.
 The imaging control unit 246 controls the operation of the image sensor 244 based on instruction information input from the processing device 5. Specifically, the imaging control unit 246 controls the frame rate and imaging timing of the image sensor 244 based on the instruction information input from the processing device 5. For example, the imaging control unit 246 causes the image sensor 244 to generate and output image data at 120 fps.
 The operation unit 22 includes a bending knob 221 that bends the bending portion 25 in the up-down and left-right directions, a treatment tool insertion portion 222 through which a treatment tool such as biopsy forceps, a laser scalpel, or an examination probe is inserted into the body cavity, and a plurality of switches 223 serving as operation input units for inputting operation instruction signals for the light source device 3, the processing device 5, and peripheral devices such as air supply, water supply, and gas supply means, as well as a pre-freeze signal instructing the image sensor 244 to capture a still image. The treatment tool inserted from the treatment tool insertion portion 222 emerges from an opening (not shown) via a treatment tool channel (not shown) of the distal end portion 24.
 The universal cord 23 incorporates at least the light guide 241 and a collective cable bundling one or more cables. The collective cable is a set of signal lines for transmitting and receiving signals between the endoscope 2, the light source device 3, and the processing device 5, and includes a signal line for transmitting and receiving setting data, a signal line for transmitting and receiving image data, and a signal line for transmitting and receiving a drive timing signal for driving the image sensor 244. The universal cord 23 has a connector portion 27 that is attachable to and detachable from the light source device 3. A coiled coil cable 27a extends from the connector portion 27, and a connector portion 28 attachable to and detachable from the processing device 5 is provided at the extended end of the coil cable 27a.
 [Configuration of light source device]
 Next, the configuration of the light source device 3 will be described.
 The light source device 3 supplies illumination light for irradiating the subject from the distal end portion 24 of the endoscope 2. The light source device 3 includes a light source unit 31, a light source driver 32, and an illumination control unit 33.
 The light source unit 31 emits illumination light for irradiating the subject. Specifically, the light source unit 31 emits light of a first wavelength band including light of at least one of two mutually different wavelength bands among the red, green, and blue wavelength bands, and light of a second wavelength band including light of at least the other of the two wavelength bands. The light source unit 31 includes a condenser lens 311, a first light source 312, a second light source 313, and a third light source 314.
 The condenser lens 311 is configured using one or more lenses. The condenser lens 311 condenses the illumination light emitted by each of the first light source 312, the second light source 313, and the third light source 314 and outputs it to the light guide 241.
 The first light source 312 is configured using a red LED (Light Emitting Diode) lamp. The first light source 312 emits light of the red wavelength band (hereinafter simply "R light") based on the current supplied from the light source driver 32.
 The second light source 313 is configured using a green LED lamp. The second light source 313 emits light of the green wavelength band (hereinafter simply "G light") based on the current supplied from the light source driver 32.
 The third light source 314 is configured using a blue LED lamp. The third light source 314 emits light of the blue wavelength band (hereinafter simply "B light") based on the current supplied from the light source driver 32.
 Under the control of the illumination control unit 33, the light source driver 32 supplies current to the first light source 312, the second light source 313, and the third light source 314 so as to cause the second light source 313 to emit G light alone, or to cause the first light source 312 and the third light source 314 to emit R light and B light simultaneously, thereby producing magenta light containing the R light and the B light (hereinafter simply "Mg light"). In the first embodiment, the G light corresponds to the light of the first wavelength band, and the Mg light corresponds to the light of the second wavelength band.
 The illumination control unit 33 controls the lighting timing of the light source unit 31 based on an instruction signal received from the processing device 5. Specifically, the illumination control unit 33 supplies G light to the endoscope 2 by causing the second light source 313 to emit G light at a predetermined cycle, and supplies Mg light to the endoscope 2 by causing the first light source 312 to emit R light and the third light source 314 to emit B light simultaneously. In this case, the illumination control unit 33 causes the light source unit 31 to emit the G light and the Mg light alternately and intermittently. The illumination control unit 33 is configured using a CPU (Central Processing Unit) or the like.
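The alternating lighting sequence above can be sketched as follows. Assigning G light to even frames and Mg light to odd frames is an assumption for illustration; the disclosure only requires that the two lights alternate frame by frame.

```python
def illumination_sequence(num_frames):
    """Return, per frame, which LEDs the illumination control unit 33
    turns on: G light alone on one frame, Mg light (R and B lit
    simultaneously) on the next, repeating alternately."""
    sequence = []
    for frame in range(num_frames):
        if frame % 2 == 0:
            sequence.append(("G",))      # second light source 313 only
        else:
            sequence.append(("R", "B"))  # first 312 and third 314 together
    return sequence

print(illumination_sequence(4))
```

Because the sensor runs at a fixed frame rate (e.g. 120 fps as noted above), each illumination state is paired with exactly one exposure, yielding alternating G frames and Mg frames.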
 [Configuration of display device]
 Next, the configuration of the display device 4 will be described.
 The display device 4 displays an image corresponding to the image data generated by the endoscope 2 and received from the processing device 5. The display device 4 also displays various kinds of information on the endoscope system 1. The display device 4 is configured using a display panel of liquid crystal, organic EL (Electro Luminescence), or the like.
 [Configuration of processing device]
 Next, the configuration of the processing device 5 will be described.
 The processing device 5 receives the image data generated by the endoscope 2, performs predetermined image processing on the received image data, and outputs the result to the display device 4. The processing device 5 also comprehensively controls the operation of the entire endoscope system 1. The processing device 5 includes an image processing unit 51, an input unit 52, a recording unit 53, and a processing control unit 54.
 Under the control of the processing control unit 54, the image processing unit 51 receives the image data generated by the endoscope 2, performs predetermined image processing on the received image data, and outputs the result to the display device 4. Specifically, the image processing unit 51 combines first image data generated by the image sensor 244 when the light source unit 31 emits G light with second image data generated by the image sensor 244 when the light source unit 31 emits Mg light. Here, the predetermined image processing includes separation processing, combining processing, interpolation processing, OB clamp processing, gain adjustment processing, format conversion processing, and the like. The image processing unit 51 is configured using a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), or the like. The image processing unit 51 includes at least a separation unit 511, an interpolation unit 512, and a combining unit 513.
 The separation unit 511 performs color separation on the image corresponding to the image data generated by the endoscope 2. Specifically, when the light source device 3 emits Mg light, the separation unit 511 separates the pixel values of the R pixels and the pixel values of the B pixels from the image corresponding to the image data generated by the image sensor 244.
 The interpolation unit 512 performs known interpolation processing using the pixel values of the R pixels that the separation unit 511 separated from the RB image, thereby generating R image data in which a pixel value of R is interpolated at every pixel. Likewise, the interpolation unit 512 performs known interpolation processing using the pixel values of the B pixels that the separation unit 511 separated from the RB image, thereby generating B image data having a pixel value of B at every pixel. Examples of the interpolation processing include bilinear interpolation processing and direction-discriminating interpolation processing. Of course, the interpolation unit 512 may generate the R image data and the B image data using other interpolation processing.
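A minimal sketch of the separation and interpolation described above: under Mg (R + B) illumination, Ye pixels respond only to R and Cy pixels only to B, so the Mg frame splits into two checkerboard-sampled planes, each filled in by interpolation. The filter phase (Cy where x + y is even) and the simple 4-neighbor averaging, standing in for bilinear interpolation, are assumptions for illustration.

```python
import numpy as np

def separate_and_interpolate(rb_image):
    """Split an Mg-frame mosaic into full-resolution R and B planes,
    as the separation unit 511 and interpolation unit 512 do."""
    h, w = rb_image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy_mask = (xx + yy) % 2 == 0   # assumed Cy sites: B samples
    ye_mask = ~cy_mask             # assumed Ye sites: R samples

    def fill(mask):
        # Keep the sampled sites, then fill each missing checkerboard
        # site with the average of its available 4-neighbours (a crude
        # stand-in for bilinear interpolation).
        plane = np.where(mask, rb_image, 0.0).astype(float)
        padded = np.pad(plane, 1)
        counts = np.pad(mask.astype(float), 1)
        nbr_sum = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                   padded[1:-1, :-2] + padded[1:-1, 2:])
        nbr_cnt = (counts[:-2, 1:-1] + counts[2:, 1:-1] +
                   counts[1:-1, :-2] + counts[1:-1, 2:])
        filled = plane.copy()
        missing = ~mask
        filled[missing] = nbr_sum[missing] / np.maximum(nbr_cnt[missing], 1)
        return filled

    r_plane = fill(ye_mask)
    b_plane = fill(cy_mask)
    return r_plane, b_plane
```

On the checkerboard, every missing site of one color is surrounded by sampled sites of that color, which is why a single neighbor-averaging pass suffices in this sketch.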
 When the endoscope system 1 performs white-light observation (normal observation), the combining unit 513 combines the R image data and the B image data generated by the interpolation unit 512 with the G image data generated by the image sensor 244 when the light source device 3 emits G light, thereby generating color image data, and outputs the generated color image data to the display device 4. When the endoscope system 1 performs NBI observation, the combining unit 513 combines the B image data generated by the interpolation unit 512 with the G image data generated by the image sensor 244 when the light source device 3 emits G light, thereby generating NBI image data, and outputs the generated NBI image data to the display device 4.
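The two combining paths above can be sketched as follows. The assignment of the G and B planes to display channels in the NBI case is an illustrative assumption; the disclosure states only that the two planes are combined into NBI image data.

```python
import numpy as np

def combine_white_light(r_plane, g_plane, b_plane):
    """White-light observation: stack the interpolated R and B planes
    with the G-frame plane into one RGB color image."""
    return np.stack([r_plane, g_plane, b_plane], axis=-1)

def combine_nbi(g_plane, b_plane):
    """NBI observation: build a pseudo-color image from the G frame and
    the interpolated B plane only (channel mapping assumed)."""
    return np.stack([g_plane, b_plane, b_plane], axis=-1)
```

Because the G frame and the Mg frame are captured on consecutive exposures of the alternating illumination, both outputs use every acquired frame, which is how the scheme avoids the color misregistration noted in the effect statement.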
 The input unit 52 receives input of instruction signals for instructing operations of the endoscope system 1 and outputs the received instruction signals to the processing control unit 54. For example, the input unit 52 receives input of an instruction signal designating either the white-light observation method or the NBI observation method, and outputs the received instruction signal to the processing control unit 54. The input unit 52 is configured using switches, buttons, a touch panel, and the like.
 記録部53は、内視鏡システム1が実行する各種プログラム、内視鏡システム1が実行中のデータおよび内視鏡2が生成した画像データを記録する。記録部53は、揮発性メモリ、不揮発性メモリおよびメモリカード等を用いて構成される。記録部53は、内視鏡システム1が実行する各種プログラムを記録するプログラム記録部531を有する。 The recording unit 53 records various programs executed by the endoscope system 1, data being executed by the endoscope system 1, and image data generated by the endoscope 2. The recording unit 53 is configured using a volatile memory, a nonvolatile memory, a memory card, and the like. The recording unit 53 includes a program recording unit 531 that records various programs executed by the endoscope system 1.
 処理制御部54は、CPUを用いて構成される。処理制御部54は、内視鏡システム1を構成する各部を制御する。例えば、処理制御部54は、入力部52から光源装置3が出射する照明光を切り替える指示信号が入力された場合、照明制御部33を制御することによって、光源装置3が出射する照明光を切り替える。 The processing control unit 54 is configured using a CPU. The processing control unit 54 controls each unit constituting the endoscope system 1. For example, when an instruction signal for switching the illumination light emitted from the light source device 3 is input from the input unit 52, the processing control unit 54 controls the illumination control unit 33 to switch the illumination light emitted from the light source device 3.
 〔カラーフィルタと照明光の分光特性〕
 次に、カラーフィルタ2442と光源装置3が出射する各光の分光特性について説明する。図4は、カラーフィルタ2442と光源装置3が出射する各光との分光特性を模式的に示す図である。図4において、横軸が波長を示し、右縦軸が各画素の感度(フィルタの透過率)を示し、左縦軸が各光の強度を示す。また、図4において、曲線LCyがCy画素の分光感度特性を示し、曲線LYeがYe画素の分光感度特性を示す。また、曲線LBがB光の波長特性を示し、曲線LGがG光の波長特性を示し、曲線LRがR光の波長特性を示す。
[Spectral characteristics of color filters and illumination light]
Next, the spectral characteristics of the color filter 2442 and each light emitted from the light source device 3 will be described. FIG. 4 is a diagram schematically showing the spectral characteristics of the color filter 2442 and each light emitted from the light source device 3. In FIG. 4, the horizontal axis indicates the wavelength, the right vertical axis indicates the sensitivity of each pixel (filter transmittance), and the left vertical axis indicates the intensity of each light. In FIG. 4, the curve L Cy indicates the spectral sensitivity characteristic of the Cy pixel, and the curve L Ye indicates the spectral sensitivity characteristic of the Ye pixel. The curve L B indicates the wavelength characteristic of the B light, the curve L G that of the G light, and the curve L R that of the R light.
 図4の曲線LYeに示すように、Ye画素は、R光およびG光に対して感度を有する(フィルタの透過率が高い)。また、Cy画素は、B光およびG光に対して感度を有する(フィルタの透過率が高い)。このため、図5に示すように、光源装置3がG光を出射した場合、撮像素子244は、全ての画素がG画素となる画像PGを生成することができる。また、図6に示すように、光源装置3がMg光を出射した場合、撮像素子244は、R画素とB画素とが市松状となる画像PMgを生成することができる。このように内視鏡システム1は、処理装置5の制御のもと、光源装置3がG光またはMg光を内視鏡2へ供給し、撮像素子244がG光またはMg光で照射された被検体を順次撮像する。 As indicated by the curve L Ye in FIG. 4, the Ye pixel has sensitivity to R light and G light (the transmittance of the filter is high). The Cy pixel has sensitivity to B light and G light (the transmittance of the filter is high). Therefore, as shown in FIG. 5, when the light source device 3 emits G light, the image sensor 244 can generate an image P G in which all the pixels are G pixels. In addition, as shown in FIG. 6, when the light source device 3 emits Mg light, the image sensor 244 can generate an image P Mg in which the R pixels and the B pixels form a checkered pattern. In this way, in the endoscope system 1, under the control of the processing device 5, the light source device 3 supplies G light or Mg light to the endoscope 2, and the image sensor 244 sequentially images the subject irradiated with the G light or the Mg light.
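The relationship between the Cy/Ye mosaic and the two illuminations can be sketched as a small truth table. The checkered Cy/Ye arrangement and the passbands (Cy passes B+G, Ye passes R+G) follow the description above, while the array shape and names are illustrative assumptions.

```python
import numpy as np

# Checkered Cy/Ye mosaic: 'C' = cyan filter, 'Y' = yellow filter
H, W = 4, 4
mosaic = np.where((np.indices((H, W)).sum(axis=0) % 2) == 0, "C", "Y")

PASSBAND = {"C": {"B", "G"}, "Y": {"R", "G"}}  # Cy passes B+G, Ye passes R+G

def recorded_channels(illumination):
    """Which color band each pixel records for a given set of emitted bands."""
    out = np.empty((H, W), dtype="U1")
    for i in range(H):
        for j in range(W):
            bands = PASSBAND[mosaic[i, j]] & illumination
            out[i, j] = "".join(sorted(bands))  # exactly one band here
    return out

print(recorded_channels({"G"}))       # G light: every pixel records G
print(recorded_channels({"R", "B"}))  # Mg light: R/B checkerboard
```

Running this reproduces the two frames of Figs. 5 and 6: an all-G frame under G light and a checkered R/B frame under Mg light.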
 〔内視鏡システムの処理〕
 次に、内視鏡システム1が実行する処理について説明する。図7は、内視鏡システム1が実行する処理の概要を示すフローチャートである。図8は、画像処理部51が生成する画像を模式的に示す図である。
[Endoscope system processing]
Next, processing executed by the endoscope system 1 will be described. FIG. 7 is a flowchart showing an outline of processing executed by the endoscope system 1. FIG. 8 is a diagram schematically illustrating an image generated by the image processing unit 51.
 図7に示すように、まず、処理制御部54は、光源装置3にG光を出射させ(ステップS101)、撮像素子244にG光で照射された被写体を撮像させる(ステップS102)。この場合、図8に示すように、撮像素子244は、G光が照射された被写体を撮像することによって画像データに対応する画像PGを生成する。 As shown in FIG. 7, first, the processing control unit 54 causes the light source device 3 to emit G light (step S101) and causes the image sensor 244 to image the subject irradiated with the G light (step S102). In this case, as shown in FIG. 8, the image sensor 244 generates an image P G corresponding to the image data by imaging the subject irradiated with the G light.
 続いて、処理制御部54は、光源装置3にMg光を出射させ(ステップS103)、撮像素子244にMg光で照射された被写体を撮像させる(ステップS104)。この場合、図8に示すように、撮像素子244は、Mg光が照射された被写体を撮像することによって画像データに対応する画像PMgを生成する。 Subsequently, the process control unit 54 causes the light source device 3 to emit Mg light (step S103), and causes the image sensor 244 to image the subject irradiated with the Mg light (step S104). In this case, as illustrated in FIG. 8, the imaging element 244 generates an image PMg corresponding to the image data by imaging the subject irradiated with the Mg light.
 その後、分離部511は、画像PMgに対して、R画素の画素値とB画素の画素値との色分離を行う(ステップS105)。この場合、図8に示すように、分離部511は、撮像素子244によって生成された画像データに対応する画像PMgに対してR画素の画素値を分離したR画素の画像PR1および画像PMgに対してB画素の画素値を分離したB画素の画像PB1を生成する。 Thereafter, the separation unit 511 performs color separation of the R pixel values and the B pixel values on the image P Mg (step S105). In this case, as shown in FIG. 8, the separation unit 511 generates, from the image P Mg corresponding to the image data generated by the image sensor 244, an R-pixel image P R1 in which the R pixel values are separated out and a B-pixel image P B1 in which the B pixel values are separated out.
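A sketch of the color separation of step S105, under the illustrative assumption that B samples fall where (row + col) is even (the Cy positions) and R samples where it is odd; the function and mask names are not from the publication.

```python
import numpy as np

def separate_rb(mg_image):
    """Split a checkered R/B frame (Mg illumination) into sparse
    R and B images; unsampled positions are left at zero."""
    rows, cols = np.indices(mg_image.shape)
    b_mask = (rows + cols) % 2 == 0   # assumed Cy positions record B
    r_mask = ~b_mask                  # assumed Ye positions record R
    r_img = np.where(r_mask, mg_image, 0.0)
    b_img = np.where(b_mask, mg_image, 0.0)
    return r_img, b_img, r_mask, b_mask
```

The two sparse outputs correspond to the images P R1 and P B1 above and would then be fed to the interpolation step.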
 続いて、補間部512は、画像PR1および画像PB1の各々に対して補間処理を行う(ステップS106)。具体的には、図8に示すように、補間部512は、画像PR1の画素値を用いて他のR画素の画素値を補間する補間処理を行うことによって全画素にR画素の画素値が補間された画像PR2を生成する。さらに、補間部512は、画像PB1の画素値を用いて他のB画素の画素値を補間する補間処理を行うことによって全画素にB画素の画素値が補間された画像PB2を生成する。 Subsequently, the interpolation unit 512 performs interpolation processing on each of the image P R1 and the image P B1 (step S106). Specifically, as shown in FIG. 8, the interpolation unit 512 generates an image P R2 in which the R pixel values are interpolated for all pixels by performing interpolation processing that interpolates the values of the other R pixels using the pixel values of the image P R1. Similarly, the interpolation unit 512 generates an image P B2 in which the B pixel values are interpolated for all pixels by performing interpolation processing that interpolates the values of the other B pixels using the pixel values of the image P B1.
 その後、内視鏡システム1に白色光観察方式が設定されている場合(ステップS107:Yes)、合成部513は、白色画像を生成する(ステップS108)。具体的には、図8に示すように、合成部513は、画像PR2、画像PB2および画像PGを合成することによって、白色画像PW(カラー画像)を生成する。この結果、内視鏡システム1は、G光およびMg光の2回の照射によって、2フレーム(2フィールド)で白色画像を生成することができる。 Thereafter, when the white light observation method is set in the endoscope system 1 (step S107: Yes), the combining unit 513 generates a white image (step S108). Specifically, as shown in FIG. 8, the combining unit 513 generates a white image P W (color image) by combining the image P R2, the image P B2, and the image P G. As a result, the endoscope system 1 can generate a white image in two frames (two fields) with two irradiations of G light and Mg light.
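The two-frame white image composition of step S108 then amounts to stacking the planes: the all-G frame from the G-light exposure and the interpolated R and B planes from the Mg-light exposure. The R, G, B channel order is a display convention assumed here.

```python
import numpy as np

def compose_white(g_frame, r_interp, b_interp):
    """Assemble the white (color) image from the full-resolution G frame
    and the interpolated R and B planes of the second frame."""
    return np.stack([r_interp, g_frame, b_interp], axis=-1)
```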
 続いて、操作部22または入力部52から終了を指示する指示信号が入力された場合(ステップS109:Yes)、内視鏡システム1は、本処理を終了する。これに対して、操作部22または入力部52から終了を指示する指示信号が入力されていない場合(ステップS109:No)、内視鏡システム1は、ステップS101へ戻る。 Subsequently, when an instruction signal instructing termination is input from the operation unit 22 or the input unit 52 (step S109: Yes), the endoscope system 1 ends this processing. On the other hand, when no instruction signal instructing termination is input from the operation unit 22 or the input unit 52 (step S109: No), the endoscope system 1 returns to step S101.
 ステップS107において、内視鏡システム1に白色観察方式が設定されていない場合(ステップS107:No)、合成部513は、NBI画像を生成する(ステップS110)。具体的には、図8に示すように、合成部513は、画像PGと画像PB2を合成することによってNBI画像PNBIを生成する。この結果、内視鏡システム1は、G光およびMg光の2回の照射によって、2フレーム(2フィールド)でNBI画像PNBIを生成することができる。なお、実施の形態1では、内視鏡システム1の設定に応じて、白色画像およびNBI画像のどちらか一方を生成していたが、これに限定されることなく、同時に白色画像およびNBI画像を生成してもよい。また、実施の形態1では、白色画像およびNBI画像を表示装置4に同時に表示させてもよいし、白色画像上に縮小したNBI画像を重畳して表示装置4に表示させてもよいし、NBI画像上に縮小した白色画像を重畳して表示装置4に表示させてもよい。ステップS110の後、内視鏡システム1は、ステップS109へ移行する。 In step S107, when the white light observation method is not set in the endoscope system 1 (step S107: No), the combining unit 513 generates an NBI image (step S110). Specifically, as shown in FIG. 8, the combining unit 513 generates an NBI image P NBI by combining the image P G and the image P B2. As a result, the endoscope system 1 can generate the NBI image P NBI in two frames (two fields) with two irradiations of G light and Mg light. In the first embodiment, either the white image or the NBI image is generated according to the setting of the endoscope system 1; however, the present invention is not limited to this, and the white image and the NBI image may be generated simultaneously. In the first embodiment, the white image and the NBI image may be displayed simultaneously on the display device 4, a reduced NBI image may be superimposed on the white image and displayed on the display device 4, or a reduced white image may be superimposed on the NBI image and displayed on the display device 4. After step S110, the endoscope system 1 proceeds to step S109.
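A sketch of the NBI composition of step S110. Mapping the narrow G band to the display red channel and the narrow B band to the green and blue channels is one common NBI display convention, assumed here for illustration; the publication only states that P G and P B2 are combined.

```python
import numpy as np

def compose_nbi(g_frame, b_interp):
    """Map the two narrow bands onto display RGB (assumed convention:
    G band -> red channel, B band -> green and blue channels)."""
    return np.stack([g_frame, b_interp, b_interp], axis=-1)
```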
 以上説明した実施の形態1によれば、画像処理部51が光源装置3によってG光が照射された際に撮像素子244によって生成された第1の画像データと、光源装置3によってMg光が照射された際に撮像素子244によって生成された第2の画像データと、を合成する。これにより、内視鏡システム1は、G光およびMg光の2回の照射によって、2フレーム(2フィールド)で白色画像およびNBI画像を生成することができる。 According to the first embodiment described above, the image processing unit 51 combines the first image data generated by the image sensor 244 when the light source device 3 emits G light with the second image data generated by the image sensor 244 when the light source device 3 emits Mg light. As a result, the endoscope system 1 can generate a white image and an NBI image in two frames (two fields) with two irradiations of G light and Mg light.
(実施の形態1の変形例)
 次に、実施の形態1の変形例について説明する。実施の形態1の変形例は、上述した実施の形態1に係るカラーフィルタ2442と構成が異なる。以下においては、実施の形態1の変形例に係るカラーフィルタの構成について説明する。なお、上述した実施の形態1に係る内視鏡システム1と同一の構成には同一の符号を付して詳細な説明を適宜省略する。
(Modification of Embodiment 1)
Next, a modification of the first embodiment will be described. The modification of the first embodiment differs from the first embodiment described above in the configuration of the color filter 2442. In the following, the configuration of the color filter according to the modification of the first embodiment will be described. The same components as those of the endoscope system 1 according to the first embodiment described above are denoted by the same reference signs, and detailed description thereof will be omitted as appropriate.
 〔カラーフィルタの構成〕
 図9は、実施の形態1の変形例に係るカラーフィルタの構成を模式的に示す図である。
[Color filter configuration]
FIG. 9 is a diagram schematically illustrating a configuration of a color filter according to a modification of the first embodiment.
 図9に示すカラーフィルタ2442Aは、1つのYeフィルタおよび3つのCyフィルタを用いたフィルタユニットU2によって形成され、このフィルタユニットU2が縦横2ユニット(2×2)によって形成される。カラーフィルタ2442Aは、Yeフィルタの数がCyフィルタの数より少なく配置される。カラーフィルタ2442Aは、YeフィルタおよびCyフィルタの各々が画素部2441の各画素の受光面に積層される。 The color filter 2442A shown in FIG. 9 is formed by a filter unit U2 using one Ye filter and three Cy filters, and the filter unit U2 is arranged in two units vertically and horizontally (2 × 2). In the color filter 2442A, the number of Ye filters arranged is smaller than the number of Cy filters. In the color filter 2442A, each Ye filter and Cy filter is stacked on the light receiving surface of a corresponding pixel of the pixel portion 2441.
 このように構成された内視鏡システム1は、処理装置5の制御のもと、光源装置3がG光またはMg光を内視鏡2へ供給し、撮像素子244がG光またはMg光で照射された被検体を順次撮像する。YeフィルタおよびCyフィルタは、G光を透過する。このため、図10に示すように、撮像素子244は、画像データに対応する画像PGを生成することができる。また、Yeフィルタは、R光を透過し、Cyフィルタは、B光を透過する。このため、図11に示すように、撮像素子244は、R画素とB画素とからなる画像データに対応する画像PMg2を生成することができる。R光は、空間周波数に高周波成分が少ない。このため、カラーフィルタ2442Aは、全体に対して1/4のYeフィルタを設けているので、色信号の解像度の最適化を図ることができる。さらに、カラーフィルタ2442Aは、全体に対して3/4のCyフィルタを設けているので、狭帯域光観察方式を行う場合、高解像度のNBI画像を得ることができる。 In the endoscope system 1 configured in this way, under the control of the processing device 5, the light source device 3 supplies G light or Mg light to the endoscope 2, and the image sensor 244 sequentially images the subject irradiated with the G light or the Mg light. The Ye filter and the Cy filter both transmit G light. Therefore, as shown in FIG. 10, the image sensor 244 can generate an image P G corresponding to the image data. The Ye filter transmits R light, and the Cy filter transmits B light. Therefore, as shown in FIG. 11, the image sensor 244 can generate an image P Mg2 corresponding to image data composed of R pixels and B pixels. R light has few high-frequency components in its spatial frequency. For this reason, since Ye filters occupy only 1/4 of the color filter 2442A, the resolution of the color signals can be optimized. Furthermore, since Cy filters occupy 3/4 of the color filter 2442A, a high-resolution NBI image can be obtained when narrow-band light observation is performed.
 以上説明した実施の形態1の変形例によれば、カラーフィルタ2442Aの全体に対して3/4のCyフィルタを設けているので、狭帯域光観察方式を行う場合、高解像度のNBI画像を得ることができる。 According to the modification of the first embodiment described above, since Cy filters occupy 3/4 of the entire color filter 2442A, a high-resolution NBI image can be obtained when narrow-band light observation is performed.
(実施の形態2)
 次に、実施の形態2について説明する。実施の形態2では、所定の数の同色画素を加算する画素加算によって解像度と感度を向上させる。以下においては、上述した実施の形態1に係る内視鏡システム1と同一の構成には同一の符号を付して説明を省略する。
(Embodiment 2)
Next, a second embodiment will be described. In the second embodiment, resolution and sensitivity are improved by pixel addition in which a predetermined number of same-color pixels are added. In the following, the same components as those of the endoscope system 1 according to the first embodiment described above are denoted by the same reference signs, and description thereof will be omitted.
 図12は、実施の形態2に係る撮像素子の構成を模式的に示す回路図である。図12に示す撮像素子244Bは、画素部2441Bと、カラーフィルタ2442と、を有する。なお、図12においては、カラーフィルタ2442を点線で示し、各画素の受光面に積層して配置している。 FIG. 12 is a circuit diagram schematically showing the configuration of the image sensor according to the second embodiment. The image sensor 244B shown in FIG. 12 includes a pixel portion 2441B and a color filter 2442. In FIG. 12, the color filter 2442 is indicated by dotted lines and is arranged so as to be stacked on the light receiving surface of each pixel.
 図12に示す画素部2441Bは、8画素で、電荷変換部、リセットトランジスタ、増幅トランジスタ、選択トランジスタを共有する。具体的には、図12に示す画素部2441Bは、8つの光電変換素子301(フォトダイオード)と、8つの光電変換素子301の各々に設けられた転送トランジスタ302と、電荷変換部303と、リセットトランジスタ304と、画素ソースフォロアトランジスタ305と、選択トランジスタ306と、垂直転送線307と、を有する。 In the pixel portion 2441B shown in FIG. 12, eight pixels share a charge conversion unit, a reset transistor, an amplification transistor, and a selection transistor. Specifically, the pixel portion 2441B shown in FIG. 12 includes eight photoelectric conversion elements 301 (photodiodes), transfer transistors 302 provided for each of the eight photoelectric conversion elements 301, a charge conversion unit 303, a reset transistor 304, a pixel source follower transistor 305, a selection transistor 306, and a vertical transfer line 307.
 光電変換素子301は、入射光をその光量に応じた信号電荷量に光電変換して蓄積する。光電変換素子301は、カソード側が転送トランジスタ302の一端側に接続され、アノード側がグランドに接続される。 The photoelectric conversion element 301 photoelectrically converts incident light into a signal charge amount corresponding to the amount of light and accumulates it. The photoelectric conversion element 301 has a cathode side connected to one end side of the transfer transistor 302 and an anode side connected to the ground.
 転送トランジスタ302は、光電変換素子301から電荷変換部303へ電荷を転送する。転送トランジスタ302は、ゲートに駆動信号ΦTn(n=1、2、3…,n-1,n)が供給される信号線が接続され、一端側に光電変換素子301が接続され、他端側に電荷変換部303が接続される。転送トランジスタ302は、図示しない垂直走査部から信号線を経由して駆動信号ΦTが供給されると、オン状態となり、光電変換素子301から電荷変換部303へ電荷を転送する。 The transfer transistor 302 transfers the charge from the photoelectric conversion element 301 to the charge conversion unit 303. The transfer transistor 302 has a gate connected to a signal line to which a drive signal ΦTn (n = 1, 2, 3, ..., n−1, n) is supplied, one end connected to the photoelectric conversion element 301, and the other end connected to the charge conversion unit 303. When the drive signal ΦT is supplied from a vertical scanning unit (not shown) via the signal line, the transfer transistor 302 is turned on and transfers the charge from the photoelectric conversion element 301 to the charge conversion unit 303.
 電荷変換部303は、浮遊拡散容量(FD)からなり、光電変換素子301で蓄積された電荷を電圧に変換する。 The charge conversion unit 303 includes a floating diffusion capacitor (FD), and converts the charge accumulated in the photoelectric conversion element 301 into a voltage.
 リセットトランジスタ304は、電荷変換部303を所定電位にリセットする。リセットトランジスタ304は、一端側が電源電圧VDDに接続され、他端側が電荷変換部303に接続され、ゲートに駆動信号ΦRが供給される信号線が接続される。リセットトランジスタ304は、図示しない垂直走査部から信号線を経由して駆動信号ΦRが供給されると、オン状態となり、電荷変換部303に蓄積された信号電荷を放出させ、電荷変換部303を所定電位にリセットする。 The reset transistor 304 resets the charge conversion unit 303 to a predetermined potential. The reset transistor 304 has one end connected to the power supply voltage VDD, the other end connected to the charge converter 303, and a gate connected to a signal line to which the drive signal ΦR is supplied. The reset transistor 304 is turned on when a drive signal ΦR is supplied from a vertical scanning unit (not shown) via a signal line, and releases the signal charge accumulated in the charge conversion unit 303, thereby causing the charge conversion unit 303 to be predetermined. Reset to potential.
 画素ソースフォロアトランジスタ305は、一端側が選択トランジスタ306に接続され、他端側が垂直転送線307に接続され、ゲートに電荷変換部303で電圧変換された信号(画像信号)を伝送する信号線が接続される。 The pixel source follower transistor 305 has one end connected to the selection transistor 306 and the other end connected to the vertical transfer line 307, and its gate is connected to a signal line that transmits the signal (image signal) voltage-converted by the charge conversion unit 303.
 選択トランジスタ306は、一端側が電源電圧VDDに接続され、他端側が画素ソースフォロアトランジスタ305に接続され、ゲートに行選択信号ΦXが供給される信号線が接続される。選択トランジスタ306は、行選択信号ΦXが供給されると、オン状態となり、画素ソースフォロアトランジスタ305から出力された信号(画像信号)を垂直転送線307へ転送する。 The selection transistor 306 has one end connected to the power supply voltage VDD, the other end connected to the pixel source follower transistor 305, and a gate connected to a signal line to which a row selection signal ΦX is supplied. The selection transistor 306 is turned on when the row selection signal ΦX is supplied, and transfers the signal (image signal) output from the pixel source follower transistor 305 to the vertical transfer line 307.
 〔撮像素子の駆動方法〕
 次に、上記の撮像素子244Bの駆動方法について説明する。
 まず、撮像素子244Bのプログレッシブ読み出し(順次走査)の場合について説明する。なお、撮像素子244Bがプログレッシブ読み出しを行う場合、光源装置3は、G光を出射する。このため、撮像素子244Bは、G光で照射された被写体を撮像する。
[Driving method of image sensor]
Next, a method for driving the image sensor 244B will be described.
First, the case of progressive reading (sequential scanning) of the image sensor 244B will be described. When the image sensor 244B performs progressive reading, the light source device 3 emits G light. For this reason, the image sensor 244B images the subject irradiated with the G light.
 図12に示すように、撮像制御部246は、信号線ΦT1に転送パルスを入力し、第1行のCy画素の信号電荷を読み出す。続いて、撮像制御部246は、信号線ΦT2に転送パルスを入力し、第1行のYe画素の信号電荷を読み出す。このように、撮像制御部246は、撮像素子244Bの行毎に信号電荷を順次読み出す。この場合において、図12に示すように、撮像素子244Bは、光源装置3がG光を出射しているので、全ての画素がG画素である画像PGを生成することができる。このとき、図13に示すように、合成部513は、4つのG画素を1つのG画素PGA1として合成する。これにより、感度を上げたG画像を生成することができる。 As shown in FIG. 12, the imaging control unit 246 inputs a transfer pulse to the signal line ΦT1 and reads out the signal charges of the Cy pixels in the first row. Subsequently, the imaging control unit 246 inputs a transfer pulse to the signal line ΦT2 and reads out the signal charges of the Ye pixels in the first row. In this manner, the imaging control unit 246 sequentially reads out the signal charges row by row from the image sensor 244B. In this case, as shown in FIG. 12, since the light source device 3 emits G light, the image sensor 244B can generate an image P G in which all the pixels are G pixels. At this time, as shown in FIG. 13, the combining unit 513 combines four G pixels into one G pixel P GA1. As a result, a G image with increased sensitivity can be generated.
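The four-pixel addition described above can be sketched as a 2 × 2 block sum over the all-G frame. Summing (rather than averaging) and the NumPy reshape layout are illustrative choices.

```python
import numpy as np

def bin_2x2(g_image):
    """Sum each 2x2 block of the all-G frame into one output pixel
    (four-pixel addition for higher sensitivity)."""
    h, w = g_image.shape
    assert h % 2 == 0 and w % 2 == 0
    # Group rows and columns in pairs, then sum within each 2x2 block
    return g_image.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
```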
 次に、撮像素子244Bの画素混合読み出しの場合について説明する。なお、撮像素子244Bが画素混合読み出しを行う場合、光源装置3は、Mg光を出射する。このため、撮像素子244Bは、Mg光で照射された被写体を撮像する。 Next, the case of pixel mixture readout of the image sensor 244B will be described. Note that, when the image sensor 244B performs pixel mixture readout, the light source device 3 emits Mg light. For this reason, the image sensor 244B images the subject irradiated with the Mg light.
 図12に示すように、撮像制御部246は、信号線ΦT1および信号線ΦT4に転送パルスを入力し、第1行のCy画素の信号電荷および第2行のCy画素の信号電荷を読み出す。続いて、撮像制御部246は、信号線ΦT2および信号線ΦT3に転送パルスを入力し、第1行のYe画素の信号電荷および第2行のYe画素の信号電荷を読み出す。このように、撮像制御部246は、撮像素子244Bの2行毎に同色の画素から信号電荷を読み出す。この場合において、図14および図15に示すように、撮像素子244Bは、光源装置3がMg光を出射しているので、R画素とB画素が市松状となる画像PMg1を生成することができる。このとき、図14および図15に示すように、合成部513は、斜めに隣接する同色の2つの画素を1つの画素として合成する。例えば、図14に示すように、合成部513は、斜めに隣接する2つのR画素を1つのR画素PRA1として合成する。さらに、図15に示すように、合成部513は、斜めに隣接する2つのB画素を1つのB画素PBA1として合成する。これにより、感度を上げたR画像およびB画像を生成することができる。 As shown in FIG. 12, the imaging control unit 246 inputs a transfer pulse to the signal line ΦT1 and the signal line ΦT4 and reads out the signal charges of the Cy pixels in the first and second rows. Subsequently, the imaging control unit 246 inputs a transfer pulse to the signal line ΦT2 and the signal line ΦT3 and reads out the signal charges of the Ye pixels in the first and second rows. In this manner, the imaging control unit 246 reads out signal charges from pixels of the same color every two rows of the image sensor 244B. In this case, as shown in FIGS. 14 and 15, since the light source device 3 emits Mg light, the image sensor 244B can generate an image P Mg1 in which the R pixels and the B pixels form a checkered pattern. At this time, as shown in FIGS. 14 and 15, the combining unit 513 combines two diagonally adjacent pixels of the same color into one pixel. For example, as shown in FIG. 14, the combining unit 513 combines two diagonally adjacent R pixels into one R pixel P RA1. Furthermore, as shown in FIG. 15, the combining unit 513 combines two diagonally adjacent B pixels into one B pixel P BA1. As a result, R and B images with increased sensitivity can be generated.
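The two-pixel diagonal addition can be sketched the same way. Anchoring the pairs on even-indexed 2 × 2 blocks is an illustrative choice; the description only requires that diagonally adjacent same-color pixels be added.

```python
import numpy as np

def bin_diagonal_pairs(mg_image):
    """Add the two same-color pixels on each diagonal of every 2x2
    block of the checkered R/B frame (two-pixel addition)."""
    h, w = mg_image.shape
    blocks = mg_image.reshape(h // 2, 2, w // 2, 2)
    # Within one 2x2 block, positions (0,0)/(1,1) share one color and
    # positions (0,1)/(1,0) share the other.
    color_a = blocks[:, 0, :, 0] + blocks[:, 1, :, 1]
    color_b = blocks[:, 0, :, 1] + blocks[:, 1, :, 0]
    return color_a, color_b
```

With the B-on-even-parity layout assumed earlier, `color_a` holds the summed B pairs and `color_b` the summed R pairs, each at half resolution.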
 以上説明した実施の形態2によれば、撮像制御部246が光源装置3によってG光が照射された際に撮像素子244Bを構成するG画素の画素値を所定の画素数毎に加算して出力させる。さらに、撮像制御部246が光源装置3によってMg光が照射された際に撮像素子244Bを構成する各画素のうち、同じ波長帯域の光を受光可能なB画素の画素値またはR画素の画素値を所定の画素数毎に加算して出力させる。この結果、感度の各々を上げた白色画像およびNBI画像を生成することができる。 According to the second embodiment described above, when the light source device 3 emits G light, the imaging control unit 246 causes the pixel values of the G pixels constituting the image sensor 244B to be added and output for each predetermined number of pixels. Furthermore, when the light source device 3 emits Mg light, the imaging control unit 246 causes the pixel values of the B pixels or the R pixels, which can receive light in the same wavelength band among the pixels constituting the image sensor 244B, to be added and output for each predetermined number of pixels. As a result, a white image and an NBI image each with increased sensitivity can be generated.
 なお、実施の形態2では、撮像制御部246が撮像素子244Bを構成する同じ波長帯域の光を受光可能な画素の画素値を所定の画素数毎に電荷変換部303で加算し、この加算した画素値を出力させることによって、合成部513による合成処理を省略させてもよい。 In the second embodiment, the imaging control unit 246 may cause the charge conversion unit 303 to add the pixel values of pixels of the image sensor 244B capable of receiving light in the same wavelength band for each predetermined number of pixels and output the added pixel value, whereby the combining processing by the combining unit 513 may be omitted.
(実施の形態3)
 次に、実施の形態3について説明する。実施の形態3は、上述した実施の形態1に係るカラーフィルタ2442と構成が異なるうえ、光源装置3が出射する照明光の種類が異なる。以下においては、実施の形態3に係るカラーフィルタの構成について説明する。なお、上述した実施の形態1に係る内視鏡システム1と同一の構成には同一の符号を付して詳細な説明を適宜省略する。
(Embodiment 3)
Next, a third embodiment will be described. The third embodiment differs from the first embodiment described above in the configuration of the color filter 2442 and in the type of illumination light emitted from the light source device 3. In the following, the configuration of the color filter according to the third embodiment will be described. The same components as those of the endoscope system 1 according to the first embodiment described above are denoted by the same reference signs, and detailed description thereof will be omitted as appropriate.
 〔カラーフィルタの構成〕
 図16は、実施の形態3に係るカラーフィルタの構成を模式的に示す図である。
[Color filter configuration]
FIG. 16 is a diagram schematically illustrating a configuration of a color filter according to the third embodiment.
 図16に示すカラーフィルタ2442Bは、赤色の波長帯域の光と青色の波長帯域の光を透過するマゼンタフィルタ(以下、単に「Mgフィルタ」という)と、Cyフィルタと、を用いて構成される。カラーフィルタ2442Bは、2つのMgフィルタと、2つのCyフィルタを用いたフィルタユニットU3で形成され、このフィルタユニットU3が縦横2ユニット(2×2)によって形成される。カラーフィルタ2442Bは、MgフィルタおよびCyフィルタが市松状に配置されて形成される。カラーフィルタ2442Bは、MgフィルタまたはCyフィルタが画素部2441の各画素の受光面に積層される。 The color filter 2442B shown in FIG. 16 is configured using magenta filters (hereinafter simply referred to as "Mg filters") that transmit light in the red wavelength band and light in the blue wavelength band, and Cy filters. The color filter 2442B is formed by a filter unit U3 using two Mg filters and two Cy filters, and the filter unit U3 is arranged in two units vertically and horizontally (2 × 2). The color filter 2442B is formed by arranging the Mg filters and the Cy filters in a checkered pattern. In the color filter 2442B, an Mg filter or a Cy filter is stacked on the light receiving surface of each pixel of the pixel portion 2441.
 このように構成された内視鏡システム1は、処理装置5の制御のもと、光源装置3がG光とR光を同時に出射することによって赤色の波長帯域の光と緑色の波長の光を含む光(以下、単に「Ye光」という)またはB光を内視鏡2へ供給し、撮像素子244がYe光またはB光で照射された被写体を順次撮像する。なお、実施の形態3では、B光が第1の波長帯域の光に該当し、Ye光が第2の波長帯域の光に該当する。 In the endoscope system 1 configured in this way, under the control of the processing device 5, the light source device 3 supplies to the endoscope 2 either light containing the red wavelength band and the green wavelength band (hereinafter simply referred to as "Ye light"), obtained by emitting G light and R light simultaneously, or B light, and the image sensor 244 sequentially images the subject irradiated with the Ye light or the B light. In the third embodiment, the B light corresponds to the light in the first wavelength band, and the Ye light corresponds to the light in the second wavelength band.
 Mgフィルタは、R光を透過し、Cyフィルタは、G光を透過する。このため、図17に示すように、撮像素子244は、R画素とG画素とが市松状となる画像PYeを生成することができる。また、MgフィルタおよびCyフィルタは、B光を透過する。このため、図18に示すように、撮像素子244は、画像PBを生成することができる。画像処理部51は、上述した実施の形態1と同様の処理を実行する。これにより、狭帯域観察方式を行う際に、B画素の画素値が多いので、NBI画像の解像感を向上させることができる。 The Mg filter transmits R light, and the Cy filter transmits G light. Therefore, as shown in FIG. 17, the image sensor 244 can generate an image P Ye in which the R pixels and the G pixels form a checkered pattern. In addition, both the Mg filter and the Cy filter transmit B light. Therefore, as shown in FIG. 18, the image sensor 244 can generate an image P B. The image processing unit 51 performs the same processing as in the first embodiment described above. As a result, since many B pixel values are obtained when the narrow-band observation method is performed, the resolution of the NBI image can be improved.
 以上説明した実施の形態3によれば、画像処理部51が光源装置3によってB光が照射された際に撮像素子244によって生成された第1の画像データと、光源装置3によってYe光が照射された際に撮像素子244によって生成された第2の画像データと、を合成する。これにより、内視鏡システム1は、B光およびYe光の2回の照射によって、2フレーム(2フィールド)で白色画像およびNBI画像を生成することができる。 According to the third embodiment described above, the image processing unit 51 combines the first image data generated by the image sensor 244 when the light source device 3 emits B light with the second image data generated by the image sensor 244 when the light source device 3 emits Ye light. As a result, the endoscope system 1 can generate a white image and an NBI image in two frames (two fields) with two irradiations of B light and Ye light.
(実施の形態4)
 次に、実施の形態4について説明する。本実施の形態4は、上述した実施の形態1に係る光源装置3が異なる。以下においては、実施の形態4に係る内視鏡システムの構成を説明する。なお、上述した実施の形態1に係る内視鏡システム1と同一の構成には同一の符号を付して詳細な説明を適宜省略する。
(Embodiment 4)
Next, a fourth embodiment will be described. The fourth embodiment differs from the first embodiment described above in the light source device 3. In the following, the configuration of the endoscope system according to the fourth embodiment will be described. The same components as those of the endoscope system 1 according to the first embodiment described above are denoted by the same reference signs, and detailed description thereof will be omitted as appropriate.
 〔内視鏡システムの構成〕
 図19は、実施の形態4に係る内視鏡システムの要部の機能構成を示すブロック図である。図19に示す内視鏡システム1Cは、上述した実施の形態1に係る光源装置3に換えて、光源装置3Cを備える。
[Configuration of endoscope system]
FIG. 19 is a block diagram illustrating a functional configuration of a main part of the endoscope system according to the fourth embodiment. An endoscope system 1C shown in FIG. 19 includes a light source device 3C instead of the light source device 3 according to the first embodiment described above.
 光源装置3Cは、上述した実施の形態1に係る光源装置3の光源部31に換えて、光源部31Cを備える。光源部31Cは、上述した実施の形態1に係る光源部31の構成に加えて、第4光源315および第5光源316を備える。 The light source device 3C includes a light source unit 31C instead of the light source unit 31 of the light source device 3 according to Embodiment 1 described above. The light source unit 31C includes a fourth light source 315 and a fifth light source 316 in addition to the configuration of the light source unit 31 according to Embodiment 1 described above.
 第4光源315は、紫色LEDランプを用いて構成される。第4光源315は、光源ドライバ32から供給される電流に基づいて、紫色(Violet)の波長帯域(400nm~435nm)の光(以下、単に「V光」という)を出射する。 The fourth light source 315 is configured using a purple LED lamp. The fourth light source 315 emits light in the violet wavelength band (400 nm to 435 nm) (hereinafter simply referred to as “V light”) based on the current supplied from the light source driver 32.
 第5光源316は、橙色LEDランプを用いて構成される。第5光源316は、光源ドライバ32から供給される電流に基づいて、橙色(Amber)の波長帯域(585nm~620nm)の光(以下、単に「A光」という)を出射する。 The fifth light source 316 is configured using an orange LED lamp. The fifth light source 316 emits light in the orange (Amber) wavelength band (585 nm to 620 nm) (hereinafter simply referred to as “A light”) based on the current supplied from the light source driver 32.
 〔カラーフィルタと照明光の分光特性〕
 次に、カラーフィルタ2442と光源装置3Cが出射する各光との分光特性について説明する。図20は、カラーフィルタ2442と光源装置3Cが出射する各光との分光特性を模式的に示す図である。図20において、横軸が波長を示し、右縦軸が各画素の感度を示し、左縦軸が各光の強度を示す。また、図20において、曲線LCyがCy画素の分光感度特性を示し、曲線LYeがYe画素の分光感度特性を示す。また、曲線LVがV光の波長特性を示し、曲線LBがB光の波長特性を示し、曲線LGがG光の波長特性を示し、曲線LAがA光の波長特性を示し、曲線LRがR光の波長特性を示す。
[Spectral characteristics of color filters and illumination light]
Next, the spectral characteristics of the color filter 2442 and each light emitted from the light source device 3C will be described. FIG. 20 is a diagram schematically showing the spectral characteristics of the color filter 2442 and each light emitted from the light source device 3C. In FIG. 20, the horizontal axis indicates the wavelength, the right vertical axis indicates the sensitivity of each pixel, and the left vertical axis indicates the intensity of each light. In FIG. 20, the curve L Cy indicates the spectral sensitivity characteristic of the Cy pixel, and the curve L Ye indicates the spectral sensitivity characteristic of the Ye pixel. The curve L V indicates the wavelength characteristic of the V light, the curve L B that of the B light, the curve L G that of the G light, the curve L A that of the A light, and the curve L R that of the R light.
 図20の曲線LYeに示すように、Yeフィルタは、R光、A光およびG光を透過する。また、Cyフィルタは、V光、B光およびG光を透過する。このため、図21に示すように、光源装置3CがG光を出射した場合、撮像素子244は、全ての画素がG画素となる画像PGを生成することができる。また、図22に示すように、光源装置3CがB光とR光を出射した場合、撮像素子244は、R画素とB画素とが市松状となる画像PBRを生成することができる。さらに、図23に示すように、光源装置3CがV光とA光を出射した場合、撮像素子244は、A画素とV画素とが市松状となる画像PAVを生成することができる。即ち、実施の形態4によれば、3フレームで、互いに用途が異なる複数の画像、例えば3種類以上の画像を生成することができる。 As indicated by the curve L Ye in FIG. 20, the Ye filter transmits R light, A light, and G light. The Cy filter transmits V light, B light, and G light. Therefore, as shown in FIG. 21, when the light source device 3C emits G light, the image sensor 244 can generate an image P G in which all the pixels are G pixels. In addition, as shown in FIG. 22, when the light source device 3C emits B light and R light, the image sensor 244 can generate an image P BR in which the R pixels and the B pixels form a checkered pattern. Furthermore, as shown in FIG. 23, when the light source device 3C emits V light and A light, the image sensor 244 can generate an image P AV in which the A pixels and the V pixels form a checkered pattern. That is, according to the fourth embodiment, a plurality of images having different uses, for example, three or more types of images, can be generated in three frames.
[Endoscope system processing]
Next, processing executed by the endoscope system 1C will be described. FIG. 24 is a timing chart illustrating an outline of the processing executed by the endoscope system 1C. In the following description, the light source device 3C repeats three patterns: a G light irradiation frame in which the G light is emitted, a B+R light irradiation frame in which the B light and the R light are emitted simultaneously, and a V+A light irradiation frame in which the V light and the A light are emitted simultaneously. In FIG. 24, the horizontal axis indicates time. In FIG. 24, one period is the sum of the irradiation time and the signal readout time, and the case where this period is 1/90 seconds (90 fps) will be described. Note that the period can be changed as appropriate, for example to 1/120 seconds or 1/240 seconds, and can also be changed as appropriate depending on the subject, the examination content, the site, and the like.
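As a quick check of the numbers above: with one illumination frame per 1/90 s and a three-frame cycle, a complete set of composite images refreshes every 3/90 s, i.e. at 30 Hz. A small sketch of that arithmetic (the function name is ours, not the patent's):

```python
# Effective refresh rate of the composite images: the per-frame rate
# divided by the number of illumination frames in one cycle.
from fractions import Fraction

def composite_rate_hz(frame_rate_hz, frames_per_cycle):
    """Rate at which a full illumination cycle (and thus a fresh
    composite image set) completes."""
    return Fraction(frame_rate_hz, frames_per_cycle)

print(composite_rate_hz(90, 3))   # 30 Hz for the 1/90 s case
print(composite_rate_hz(120, 3))  # 40 Hz for the 1/120 s case
print(composite_rate_hz(240, 3))  # 80 Hz for the 1/240 s case
```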
As shown in FIG. 24, first, the light source device 3C supplies the G light to the endoscope 2, so that the endoscope 2 irradiates the subject with the G light. In this case, the imaging element 244 images the subject irradiated with the G light. Then, the imaging control unit 246 reads a signal from the imaging element 244 and outputs it to the image processing unit 51.
Subsequently, the light source device 3C supplies the B light and the R light to the endoscope 2, so that the endoscope 2 irradiates the subject with the B light and the R light. In this case, the imaging element 244 images the subject irradiated with the B light and the R light. Then, the imaging control unit 246 reads a signal from the imaging element 244 and outputs it to the image processing unit 51. At this time, the image processing unit 51 performs separation processing and interpolation processing on the image P BR, in the same manner as in the first embodiment described above, to generate an image P B2 and an image P R2, and generates a white image by performing combining processing that combines the image P G, the image P B2, and the image P R2.
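The separation and interpolation step can be sketched as a simple nearest-neighbour demosaic of the checkered P BR. This is an illustrative stand-in under stated assumptions (4-neighbour averaging), not the first embodiment's exact algorithm:

```python
# Split a checkered two-channel mosaic into two full planes: channel 0
# sits on (x+y)-even sites, channel 1 on (x+y)-odd sites; each missing
# site is filled with the mean of its in-bounds 4-neighbours, which
# hold the wanted channel.
import numpy as np

def separate_and_interpolate(mosaic):
    h, w = mosaic.shape
    planes = []
    for parity in (0, 1):
        plane = np.zeros((h, w), dtype=float)
        for y in range(h):
            for x in range(w):
                if (x + y) % 2 == parity:
                    plane[y, x] = mosaic[y, x]  # measured site
                else:
                    nbrs = [mosaic[yy, xx]
                            for yy, xx in ((y - 1, x), (y + 1, x),
                                           (y, x - 1), (y, x + 1))
                            if 0 <= yy < h and 0 <= xx < w]
                    plane[y, x] = sum(nbrs) / len(nbrs)  # interpolated
        planes.append(plane)
    return planes[0], planes[1]

# B on even sites, R on odd sites, as in the checkered image P BR
p_br = np.array([[10., 20.], [20., 10.]])
p_b2, p_r2 = separate_and_interpolate(p_br)
print(p_b2)  # full B plane
print(p_r2)  # full R plane
# A white image would then stack the three planes, e.g.:
# white = np.dstack([p_r2, p_g, p_b2])  # p_g from the G light frame
```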
Thereafter, the light source device 3C supplies the V light and the A light to the endoscope 2, so that the endoscope 2 irradiates the subject with the V light and the A light. In this case, the imaging element 244 images the subject irradiated with the V light and the A light. Then, the imaging control unit 246 reads a signal from the imaging element 244 and outputs it to the image processing unit 51.
Subsequently, the light source device 3C again supplies the G light to the endoscope 2, so that the endoscope 2 irradiates the subject with the G light. In this case, the imaging element 244 images the subject irradiated with the G light. Then, the imaging control unit 246 reads a signal from the imaging element 244 and outputs it to the image processing unit 51. At this time, the image processing unit 51 generates an image P V and an image P A by performing separation processing and interpolation processing on the image P AV, and generates an MBI (Multiband Imaging) image by combining the generated image P V and image P A with the image P G, the image P B2, and the image P R2 generated during the irradiation with the G light and the B light + R light. Note that the image P A is used when an abnormal region having a predetermined feature amount is extracted by a learning device (not shown). Here, the abnormal region is a cancer region, a bleeding region, or a lesion region.
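The text does not specify how the five spectral planes are mixed into the MBI image, so the sketch below simply stacks them into a multiband cube ready for downstream processing such as a learned abnormal-region detector. The function name and stacking order are assumptions.

```python
# Assemble the five spectral planes (V, B, G, A, R) into an H x W x 5
# multiband cube; any actual MBI rendering would map these bands to
# display colours, which the text leaves unspecified.
import numpy as np

def assemble_multiband(p_v, p_b, p_g, p_a, p_r):
    planes = [np.asarray(p, dtype=float) for p in (p_v, p_b, p_g, p_a, p_r)]
    assert all(p.shape == planes[0].shape for p in planes), "planes must align"
    return np.stack(planes, axis=-1)

# Toy planes: plane i is filled with the constant value i
cube = assemble_multiband(*(np.ones((4, 4)) * i for i in range(5)))
print(cube.shape)  # (4, 4, 5)
```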
As described above, the endoscope system 1C can generate a plurality of types of images, including the white image and the MBI image, by repeating a pattern in which one period consists of three frames with the irradiation patterns of the G light, the B light + R light, and the V light + A light.
According to the fourth embodiment described above, the illumination control unit 33 causes the light source device 3C to repeat the pattern in which one period consists of three frames with the irradiation patterns of the G light, the B light + R light, and the V light + A light, so that a plurality of types of images, including the white image and the MBI image, can be generated. A learning device or AI that has learned the feature amounts of abnormal regions (for example, cancer, bleeding, and the like) may extract abnormal regions from these plural types of images. In this case, the feature portions or feature regions extracted by the learning device or AI may be superimposed on the white image and displayed on the display device 4.
Further, according to the fourth embodiment, color information necessary for diagnosis can be generated from a small number of frames while achieving both high image quality and miniaturization, and color misregistration can be suppressed.
Further, in the fourth embodiment, the frame rate can be increased by employing two-field sequential illumination, for example by simultaneously emitting the V light, the B light, the A light, and the R light, or by simultaneously emitting only the V light and the B light.
In the fourth embodiment, the sensitivity may be increased by adding the pixel values of pixels of the same color, as in the third embodiment described above.
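The same-colour addition mentioned above can be sketched as block binning: summing 2x2 same-colour sites yields roughly four times the signal per output sample at half the resolution. This is a generic illustration, not the patent's exact readout scheme.

```python
# Sum factor x factor blocks of a single-colour plane (pixel binning).
import numpy as np

def bin_same_color(plane, factor=2):
    h, w = plane.shape
    h2, w2 = h - h % factor, w - w % factor  # crop to a multiple of factor
    p = plane[:h2, :w2]
    return p.reshape(h2 // factor, factor, w2 // factor, factor).sum(axis=(1, 3))

binned = bin_same_color(np.ones((4, 4)))
print(binned)  # each output sample is the sum of a 2x2 block (4.0 here)
```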
(Other embodiments)
Various embodiments can be formed by appropriately combining the plurality of constituent elements disclosed in the endoscope systems according to the first to fourth embodiments of the present disclosure described above. For example, some constituent elements may be deleted from all the constituent elements described in the first to fourth embodiments. Furthermore, the constituent elements described in the first to fourth embodiments may be appropriately combined.
In the first to fourth embodiments of the present disclosure, the control device and the light source device are separate bodies, but they may be formed integrally.
In the first to fourth embodiments of the present disclosure, each light is emitted by an LED lamp; however, the present disclosure is not limited to this, and each light may be emitted by, for example, a laser light source. Of course, each light may also be emitted using a white light source and filters that transmit the wavelength bands of the respective lights.
Further, while the first to fourth embodiments of the present disclosure concern an endoscope system, the present disclosure can also be applied to, for example, a capsule endoscope, a video microscope that images a subject, a mobile phone having an imaging function, and a tablet terminal having an imaging function.
Further, while the first to fourth embodiments of the present disclosure concern an endoscope system including a flexible endoscope, the present disclosure can also be applied to an endoscope system including a rigid endoscope and to an endoscope system including an industrial endoscope.
Further, while the first to fourth embodiments of the present disclosure concern an endoscope system including an endoscope that is inserted into a subject, the present disclosure can also be applied to a sinus endoscope and to endoscope systems such as an electric knife or an examination probe.
In the first to fourth embodiments of the present disclosure, the "unit" described above can be read as "means", "circuit", and the like. For example, the control unit can be read as control means or a control circuit.
The program executed by the endoscope systems according to the first to fourth embodiments of the present disclosure is provided as file data in an installable or executable format recorded on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, a DVD (Digital Versatile Disk), a USB medium, or a flash memory.
The program executed by the endoscope systems according to the first to fourth embodiments of the present disclosure may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network. Furthermore, the program to be executed by the endoscope systems according to the first to fourth embodiments of the present disclosure may be provided or distributed via a network such as the Internet.
In the endoscope systems according to the first to fourth embodiments of the present disclosure, signals are transmitted from the endoscope to the processing device using a transmission cable; however, the connection need not be wired and may be wireless. In this case, an image signal and the like may be transmitted from the endoscope to the processing device according to a predetermined wireless communication standard (for example, Wi-Fi (registered trademark) or Bluetooth (registered trademark)). Of course, wireless communication may be performed according to other wireless communication standards.
In the description of the flowcharts in this specification, the order of processing between steps is indicated using expressions such as "first", "thereafter", and "subsequently"; however, the order of processing necessary for carrying out the present invention is not uniquely determined by those expressions. That is, the order of processing in the flowcharts described in this specification can be changed within a consistent range. Further, the processing is not limited to such simple branch processing; more determination items may be comprehensively evaluated for branching. In that case, an artificial intelligence technique that performs machine learning while prompting the user for manual operation and repeating learning may be used in combination. Further, operation patterns performed by many specialists may be learned, and deep learning incorporating more complicated conditions may be performed.
As described above, some of the embodiments of the present disclosure have been described in detail with reference to the drawings; however, these are merely examples, and the present invention can be carried out in other forms with various modifications and improvements based on the knowledge of those skilled in the art, including the aspects described in the disclosure section of the present invention.
1, 1C Endoscope system
2 Endoscope
3, 3C Light source device
4 Display device
5 Processing device
21 Insertion portion
22 Operation portion
23 Universal cord
24 Distal end portion
25 Bending portion
26 Flexible tube portion
31, 31C Light source unit
32 Light source driver
33 Illumination control unit
51 Image processing unit
52 Input unit
53 Recording unit
54 Processing control unit
241 Light guide
242 Illumination lens
243 Optical system
244, 244B Imaging element
245 Endoscope recording unit
246 Imaging control unit
301 Photoelectric conversion element
302 Transfer transistor
303 Charge conversion unit
304 Reset transistor
305 Pixel source follower transistor
306 Selection transistor
307 Vertical transfer line
311 Condenser lens
312 First light source
313 Second light source
314 Third light source
315 Fourth light source
316 Fifth light source
511 Separation unit
512 Interpolation unit
513 Combining unit
531 Program recording unit
2441, 2441B Pixel unit
2442, 2442A, 2442B Color filter

Claims (9)

  1.  An endoscope system comprising:
     an imaging element in which a color filter, configured using two types of filters each transmitting light of two mutually different wavelength bands among light of a red wavelength band, light of a green wavelength band, and light of a blue wavelength band, is stacked on a light receiving surface of each pixel, the imaging element being capable of generating image data by imaging a subject;
     a light source unit capable of emitting light of a first wavelength band including light of at least one of the two wavelength bands, and light of a second wavelength band including light of at least the other of the two wavelength bands;
     an illumination control unit that causes the light source unit to alternately emit the light of the first wavelength band and the light of the second wavelength band; and
     an image processing unit that combines first image data generated by the imaging element when the light source unit emits the light of the first wavelength band with second image data generated by the imaging element when the light source unit emits the light of the second wavelength band.
  2.  The endoscope system according to claim 1, wherein
     the two types of filters are a cyan filter that transmits the light of the green wavelength band and the light of the blue wavelength band, and a yellow filter that transmits the light of the red wavelength band and the light of the green wavelength band,
     the light source unit includes:
     a first light source capable of emitting the light of the red wavelength band;
     a second light source capable of emitting the light of the green wavelength band; and
     a third light source capable of emitting the light of the blue wavelength band,
     the illumination control unit causes the second light source to emit the light of the green wavelength band as the light of the first wavelength band, and causes the first light source to emit the light of the red wavelength band and the third light source to emit the light of the blue wavelength band simultaneously as the light of the second wavelength band, and
     the image processing unit includes:
     a separation unit that generates red image data and blue image data by separating pixel values of the light of the red wavelength band and pixel values of the light of the blue wavelength band from a second image corresponding to the second image data; and
     a combining unit that generates white image data by combining the red image data generated by the separation unit, the blue image data, and the first image data.
  3.  The endoscope system according to claim 2, wherein the combining unit generates special image data by combining the blue image data generated by the separation unit with the first image data.
  4.  The endoscope system according to claim 1, wherein
     the two types of filters are a cyan filter that transmits the light of the green wavelength band and the light of the blue wavelength band, and a magenta filter that transmits the light of the red wavelength band and the light of the blue wavelength band,
     the light source unit includes:
     a first light source capable of emitting the light of the red wavelength band;
     a second light source capable of emitting the light of the green wavelength band; and
     a third light source capable of emitting the light of the blue wavelength band,
     the illumination control unit causes the third light source to emit the light of the blue wavelength band as the light of the first wavelength band, and causes the first light source to emit the light of the red wavelength band and the second light source to emit the light of the green wavelength band simultaneously as the light of the second wavelength band, and
     the image processing unit includes:
     a separation unit that generates red image data and green image data by separating pixel values of the light of the red wavelength band and pixel values of the light of the green wavelength band from a second image corresponding to the second image data; and
     a combining unit that generates white image data by combining the red image data generated by the separation unit, the green image data, and the first image data.
  5.  The endoscope system according to claim 4, wherein the combining unit generates special image data by combining the green image data separated by the separation unit with the first image data.
  6.  The endoscope system according to any one of claims 1 to 5, further comprising an imaging control unit that, when the light source unit emits the light of the first wavelength band, causes the pixel values of the pixels constituting the imaging element to be added and output for each predetermined number of pixels, and that, when the light source unit emits the light of the second wavelength band, causes the pixel values of those pixels, among the pixels constituting the imaging element, that can receive light of the same wavelength band to be added and output for each predetermined number of pixels.
  7.  The endoscope system according to claim 2 or 3, wherein
     the light source unit further includes:
     a fourth light source capable of emitting light of a purple wavelength band; and
     a fifth light source capable of emitting light of an orange wavelength band, and
     the illumination control unit causes the fourth light source to emit the light of the purple wavelength band and the fifth light source to emit the light of the orange wavelength band simultaneously after causing the light source unit to emit the light of the second wavelength band.
  8.  An image processing method executed by an endoscope system including: an imaging element in which a color filter, configured using two types of filters each transmitting light of two mutually different wavelength bands among light of a red wavelength band, light of a green wavelength band, and light of a blue wavelength band, is stacked on a light receiving surface of each pixel, the imaging element being capable of generating image data by imaging a subject; and a light source unit capable of emitting light of a first wavelength band including light of at least one of the two wavelength bands, and light of a second wavelength band including light of at least the other of the two wavelength bands, the method comprising:
     an illumination control step of causing the light source unit to alternately emit the light of the first wavelength band and the light of the second wavelength band; and
     an image processing step of combining first image data generated by the imaging element when the light source unit emits the light of the first wavelength band with second image data generated by the imaging element when the light source unit emits the light of the second wavelength band.
  9.  A program causing an endoscope system including: an imaging element in which a color filter, configured using two types of filters each transmitting light of two mutually different wavelength bands among light of a red wavelength band, light of a green wavelength band, and light of a blue wavelength band, is stacked on a light receiving surface of each pixel, the imaging element being capable of generating image data by imaging a subject; and a light source unit capable of emitting light of a first wavelength band including light of at least one of the two wavelength bands, and light of a second wavelength band including light of at least the other of the two wavelength bands, to execute:
     an illumination control step of causing the light source unit to alternately emit the light of the first wavelength band and the light of the second wavelength band; and
     an image processing step of combining first image data generated by the imaging element when the light source unit emits the light of the first wavelength band with second image data generated by the imaging element when the light source unit emits the light of the second wavelength band.
PCT/JP2018/029313 2018-03-23 2018-08-03 Endoscope system, image processing method, and program WO2019180983A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-055586 2018-03-23
JP2018055586 2018-03-23

Publications (1)

Publication Number Publication Date
WO2019180983A1 (en) 2019-09-26

Family

ID=67986818

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/029313 WO2019180983A1 (en) 2018-03-23 2018-08-03 Endoscope system, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2019180983A1 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5846935A (en) * 1981-09-12 1983-03-18 富士写真光機株式会社 Endoscope apparatus using solid photographing element
JPS58103432A (en) * 1981-12-14 1983-06-20 富士写真フイルム株式会社 Endoscope apparatus using fixed photographing element
JPH03101397A (en) * 1989-09-13 1991-04-26 Olympus Optical Co Ltd Electronic endoscope
JPH06225316A (en) * 1993-01-28 1994-08-12 Canon Inc Color photoelectric conversion device
JP2014128423A (en) * 2012-12-28 2014-07-10 Olympus Medical Systems Corp Endoscope system
JP2015066062A (en) * 2013-09-27 2015-04-13 富士フイルム株式会社 Endoscope system, operation method for the same, and light source device for endoscope


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021149357A1 (en) * 2020-01-23 2021-07-29 富士フイルム株式会社 Endoscope system and method for operating same
JP7386266B2 (en) 2020-01-23 2023-11-24 富士フイルム株式会社 Endoscope system and its operating method


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18910852; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18910852; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)