WO2020026323A1 - Endoscope device, and endoscope device operating method and program

Info

Publication number
WO2020026323A1
Authority
WO
WIPO (PCT)
Prior art keywords
narrowband
pixel signal
light
narrow band
Application number
PCT/JP2018/028542
Other languages
French (fr)
Japanese (ja)
Inventor
順平 高橋
恵仁 森田
Original Assignee
オリンパス株式会社 (Olympus Corporation)
Application filed by オリンパス株式会社 (Olympus Corporation)
Priority to PCT/JP2018/028542 priority Critical patent/WO2020026323A1/en
Publication of WO2020026323A1 publication Critical patent/WO2020026323A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045 Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/12 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only

Definitions

  • the present invention relates to an endoscope apparatus, an operation method of the endoscope apparatus, a program, and the like.
  • endoscope devices are widely used for various examinations.
  • in the medical field, inserting a flexible, elongated insertion portion makes it possible to acquire an in-vivo image of a body cavity without incising the subject, which reduces the burden on the subject; for this reason, endoscope devices have come into wide use.
  • An imaging device having a plurality of pixels is provided at the tip of the insertion section, and an in-vivo image in a body cavity is taken by the imaging device.
  • a white light observation (WLI: White Light Imaging) mode using white illumination light and a narrow band light observation mode using narrow band illumination light are known.
  • an example of the narrow-band light observation mode is the NBI (Narrow Band Imaging) mode.
  • in the NBI mode, illumination light including narrow-band light within the wavelength band of blue light and narrow-band light within the wavelength band of green light is used.
  • a simultaneous single-chip image sensor is generally used as an image sensor for acquiring an image.
  • a plurality of color filters of different colors are arranged in the simultaneous single-chip image sensor. In an image captured by such an image sensor, only one color is obtained at each pixel and the other colors are missing, so interpolation processing is essential.
  • the interpolation processing is, for example, linear interpolation or bicubic interpolation, which estimates the missing pixels. The resolution is therefore reduced by the interpolation processing.
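As a concrete illustration of why the missing pixels must be estimated, the following minimal Python/NumPy sketch (illustrative only, not the patent's actual processing) fills the missing samples of one color plane by averaging the valid 4-neighbours:

```python
import numpy as np

def interpolate_missing(img, mask):
    """Fill pixels where mask is False by averaging the valid 4-neighbours."""
    out = img.astype(float).copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                continue  # this pixel was actually sampled in this colour
            acc, n = 0.0, 0
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                yy, xx = y + dy, x + dx
                if 0 <= yy < h and 0 <= xx < w and mask[yy, xx]:
                    acc += img[yy, xx]
                    n += 1
            out[y, x] = acc / n if n else 0.0
    return out

# Checkerboard sampling: only half of the pixels carry this colour.
mask = np.indices((4, 4)).sum(axis=0) % 2 == 0
img = np.where(mask, 100.0, 0.0)
filled = interpolate_missing(img, mask)
print(filled[0, 1])  # 100.0 -- a missing pixel estimated from its neighbours
```

Averaging neighbours in this way is exactly what blurs fine structure, which is why the interpolation reduces resolution.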
  • when an image sensor having complementary color filters is used, pixel signals in which the RGB color signals are mixed are obtained, so the color separation between R, G, and B is reduced.
  • a method for improving the sense of resolution when a simultaneous-type imaging device is used is disclosed in, for example, Patent Document 1.
  • in Patent Document 1, complementary color filters for yellow and cyan and two green primary color filters form a basic array of 2 rows and 2 columns, the basic array is arranged repeatedly, and the two green primary color filters in the basic array are arranged side by side in an oblique direction. Direction-discriminating interpolation processing is then applied to the G signal obtained from the pixels provided with the green primary color filters, which improves the resolution.
  • the G signal is input to the G channel of the display image.
  • with the technique of Patent Document 1, it is possible to improve the resolution of the G signal, in which blood vessels and gland duct structures of a living body are clearly depicted.
  • the G signal is not always input to the G channel of the display image.
  • a B signal is input to a G channel of a display image.
  • a change in color tone such as a blood vessel running pattern on the mucosal surface layer or a brownish area is observed.
  • the contrast or color change of blood vessels is observed.
  • in bladder observation, when distinguishing an inflammatory lesion from a flat cancer (CIS: Cancer In Situ), an improvement in resolution or color separation is desired.
  • embodiments of the present invention can provide an endoscope apparatus, an operating method of the endoscope apparatus, a program, and the like capable of improving resolution or color separation in the narrow-band light observation mode while maintaining image quality in the white light observation mode.
  • One embodiment of the present invention includes a light source that emits narrow-band illumination light having first narrow-band light and second narrow-band light, and a color filter in which each filter unit is provided for each of a plurality of pixels.
  • An image sensor that photoelectrically converts light received by the plurality of pixels, a first narrow band pixel signal having sensitivity to the first narrow band light among the pixel signals from the image sensor, A processing circuit for selecting a second narrow band pixel signal having sensitivity to the band light, wherein the color filter has a filter unit that transmits the first narrow band light at least half of the total number, and (2) a first interpolation process for interpolating the first narrowband pixel signal without using the second narrowband pixel signal, wherein the processing circuit has at least half of the total number of filter units transmitting the narrowband light; And the second narrow band image Signal, related to an endoscope apparatus which performs a second interpolation process of interpolating processing without using the first narrowband pixel signal.
  • another aspect of the present invention relates to an operating method of an endoscope apparatus that includes a light source emitting narrow-band illumination light having first narrow-band light and second narrow-band light, and an image sensor having a color filter in which a filter unit is provided for each of a plurality of pixels, at least half of the filter units transmitting the first narrow-band light and at least half transmitting the second narrow-band light. The method selects, from among the pixel signals from the image sensor, a first narrowband pixel signal having sensitivity to the first narrow-band light and a second narrowband pixel signal having sensitivity to the second narrow-band light, and performs a first interpolation process of interpolating the first narrowband pixel signal without using the second narrowband pixel signal and a second interpolation process of interpolating the second narrowband pixel signal without using the first narrowband pixel signal.
  • another aspect of the present invention relates to a program for processing an image captured by an image sensor having a color filter in which a filter unit is provided for each of a plurality of pixels, at least half of the filter units transmitting the first narrow-band light of narrow-band illumination light and at least half transmitting the second narrow-band light. The program selects, from among the pixel signals from the image sensor, a first narrowband pixel signal having sensitivity to the first narrow-band light and a second narrowband pixel signal having sensitivity to the second narrow-band light, and performs a first interpolation process of interpolating the first narrowband pixel signal without using the second narrowband pixel signal and a second interpolation process of interpolating the second narrowband pixel signal without using the first narrowband pixel signal.
  • a diagram illustrating an interlaced imaging operation using a complementary color image sensor.
  • a diagram illustrating a process performed by a pixel signal selection unit.
  • a diagram illustrating an example of the sensitivity of a pixel signal.
  • a diagram illustrating an example of a color filter that is not a complementary color filter.
  • a diagram illustrating a process performed by an interpolation processing unit.
  • diagrams illustrating color mixing in pixel signals.
  • diagrams illustrating a second example of the direction-discriminating interpolation processing.
  • FIG. 1 is a configuration example of an endoscope device.
  • the endoscope device includes an insertion unit 200, a control device 300, a display unit 400, an external I / F unit 500, and a light source 100.
  • the endoscope device may be, for example, a flexible endoscope used for the digestive tract and the like, or a rigid endoscope used for laparoscopy and the like.
  • the insertion section 200 is also called a scope.
  • Control device 300 is also called a main unit or a processing device.
  • the display unit 400 is also called a display device.
  • the external I / F unit 500 is also called an operation unit or an operation device.
  • the light source 100 is also called a lighting unit or a lighting device.
  • the endoscope device has a white light observation mode and a narrow band light observation mode.
  • the narrow-band light observation mode is the NBI mode
  • the narrow-band light observation mode is not limited to the NBI mode, and may be any observation mode that uses a plurality of narrow-band lights having different colors.
  • the insertion part 200 is a part to be inserted into the body.
  • the insertion section 200 includes a light guide 210, an illumination lens 220, an objective lens 230, an image sensor 240, and an A / D conversion circuit 250. Further, the insertion section 200 can include a memory 260.
  • the imaging element 240 is also called an image sensor.
  • the insertion section 200 has a connector (not shown), and is attached to and detached from the control device 300 by the connector.
  • the light source 100 includes a white light source 110, a filter 130, and a lens 120.
  • the white light source 110 is, for example, an LED (Light Emitting Diode) or a xenon lamp, and emits white illumination light.
  • in the white light observation mode, the filter 130 is removed from between the white light source 110 and the lens 120, and the white illumination light enters the lens 120 directly.
  • in the narrow-band light observation mode, the filter 130 is inserted between the white light source 110 and the lens 120; the white illumination light passes through the filter 130, and the light that has passed through the filter 130 enters the lens 120.
  • the lens 120 condenses the light that has entered the lens 120 and causes the light to enter the light guide 210.
  • the filter 130 passes a first narrow band centered on the wavelength 410 nm and a second narrow band centered on the wavelength 540 nm.
  • the first narrow band is, for example, 390 nm to 445 nm and belongs to the blue wavelength band of white light.
  • the second narrow band is, for example, 530 nm to 550 nm and belongs to the green wavelength band of white light.
  • the first narrowband light that has passed through the filter 130 is referred to as B narrowband light, and the second narrowband light that has passed through the filter 130 is referred to as G narrowband light.
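The two passbands described above can be summarised in a small table; a minimal sketch follows, where the band edges are the example values from the text and the helper name is hypothetical:

```python
# Example band edges from the description: the B narrow band is centred
# near 410 nm, the G narrow band near 540 nm.
NBI_BANDS = {
    "B_narrow": (390, 445),  # inside the blue wavelength band of white light
    "G_narrow": (530, 550),  # inside the green wavelength band of white light
}

def transmits(band, wavelength_nm):
    """Return True if the filter passband covers the given wavelength."""
    lo, hi = NBI_BANDS[band]
    return lo <= wavelength_nm <= hi

print(transmits("B_narrow", 410), transmits("G_narrow", 540))  # True True
```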
  • the configuration of the light source 100 is not limited to this.
  • the light source 100 may instead include a white light source and a narrow band light source, with the white light source emitting white light in the white light observation mode and the narrow band light source emitting the B narrowband light and the G narrowband light in the NBI mode.
  • in that case, the filter 130 is omitted.
  • Light guide 210 guides the illumination light from light source 100 to the tip of insertion section 200.
  • the illumination lens 220 irradiates the subject with the illumination light guided by the light guide 210.
  • the subject is a living body.
  • Light reflected from the subject enters the objective lens 230.
  • a subject image is formed by the objective lens 230, and the imaging element 240 captures the subject image.
  • the image sensor 240 includes a plurality of pixels for subjecting a subject image to photoelectric conversion, and acquires a pixel signal from the plurality of pixels.
  • the image sensor 240 may read out pixel signals for each pixel, or may obtain pixel signals by adding and reading out from a plurality of pixels.
  • the image sensor 240 is a simultaneous single-chip image sensor that can obtain pixel signals of a plurality of colors by one image pickup.
  • the image sensor 240 is, for example, a complementary color image sensor, but the image sensor 240 is not limited to a complementary color image sensor.
  • the A / D conversion circuit 250 A / D converts an analog pixel signal from the image sensor 240 into a digital pixel signal.
  • the A / D conversion circuit 250 may be built in the image sensor 240.
  • the control device 300 performs signal processing including image processing.
  • the control device 300 controls each part of the endoscope device.
  • the control device 300 includes a processing circuit 310 and a control circuit 320.
  • Each of the processing circuit 310 and the control circuit 320 is realized by, for example, an integrated circuit device or the like.
  • the control circuit 320 may be configured as a separate circuit from the processing circuit 310, or may be included in the processing circuit 310 as a control unit.
  • the processing circuit 310 may be constituted by, for example, a processor described later.
  • the control circuit 320 controls each section of the endoscope device. For example, the user operates the external I / F unit 500 to set the observation mode.
  • the control circuit 320 causes the light source 100 to emit white light, and causes the processing circuit 310 to execute image processing in the white light observation mode.
  • the control circuit 320 causes the light source 100 to emit B narrowband light and G narrowband light, and causes the processing circuit 310 to execute image processing in the NBI mode.
  • the memory 260 of the insertion unit 200 stores information on the insertion unit 200.
  • the control circuit 320 controls each unit of the endoscope apparatus based on the information read from the memory 260.
  • the memory 260 stores information about the image sensor 240.
  • the information on the image sensor 240 is, for example, information on the arrangement of color filters.
  • the control circuit 320 causes the processing circuit 310 to perform corresponding image processing based on the information on the image sensor 240 read from the memory 260.
  • the processing circuit 310 generates a display image by performing image processing based on the pixel signal from the A / D conversion circuit 250, and outputs the display image to the display unit 400.
  • the display unit 400 is, for example, a liquid crystal display device or the like, and displays a display image from the processing circuit 310.
  • the processing circuit 310 includes a pixel signal selection unit 311 and an interpolation processing unit 312.
  • in the white light observation mode, the interpolation processing unit 312 generates a white light image by performing interpolation processing on the pixel signals, and outputs the white light image as the display image.
  • the pixel signal selection unit 311 selects a pixel signal of a part of the plurality of pixel signals as a luminance component pixel signal, and selects a remaining color pixel signal as a non-luminance component pixel signal.
  • the luminance component pixel signal is a pixel signal having sensitivity to the B narrow band light.
  • the non-luminance component pixel signal is a pixel signal having sensitivity to G narrow band light.
  • the interpolation processing unit 312 generates a luminance component image by performing a first interpolation process on the luminance component pixel signal, and generates a non-luminance component image by performing a second interpolation process on the non-luminance component pixel signal.
  • the processing circuit 310 independently performs the first interpolation process and the second interpolation process as separate processes.
  • the first interpolation processing and the second interpolation processing may be the same algorithm, but are independent as processing. For example, when performing direction discrimination-type interpolation processing, the processing circuit 310 performs direction discrimination based on the luminance component pixel signal in the first interpolation processing, and interpolates the luminance component pixel signal based on the discrimination result.
  • a non-luminance component pixel signal is not used.
  • the processing circuit 310 performs the direction determination based on the non-luminance component pixel signal in the second interpolation processing, and interpolates the non-luminance component pixel signal based on the determination result.
  • a luminance component pixel signal is not used.
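The independent, direction-discriminating interpolation described above can be sketched as follows for a single narrowband plane, using only that plane's own samples (illustrative code with an assumed checkerboard sampling pattern, not the patent's implementation):

```python
import numpy as np

def directional_interp(chan, mask):
    """Direction-discriminating interpolation using ONLY this channel's
    samples: pick the axis with the smaller gradient and average along it.
    Boundary pixels are left untouched for brevity."""
    out = chan.astype(float).copy()
    h, w = chan.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if mask[y, x]:
                continue  # already sampled
            horiz = abs(chan[y, x - 1] - chan[y, x + 1])
            vert = abs(chan[y - 1, x] - chan[y + 1, x])
            if horiz <= vert:
                out[y, x] = (chan[y, x - 1] + chan[y, x + 1]) / 2
            else:
                out[y, x] = (chan[y - 1, x] + chan[y + 1, x]) / 2
    return out

# A vertical edge: left half dark, right half bright.
mask = np.indices((4, 4)).sum(axis=0) % 2 == 0
truth = np.tile([0.0, 0.0, 200.0, 200.0], (4, 1))
chan = np.where(mask, truth, 0.0)
out = directional_interp(chan, mask)
print(out[1, 2])  # 200.0: interpolated along the edge, not across it
```

Because the direction is judged and the averaging performed within the same plane, no pixel of the other narrowband signal ever leaks in, which is the point of keeping the two interpolations independent.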
  • the processing circuit 310 inputs the luminance component image to the G channel and the B channel of the display image, and inputs the non-luminance component image to the R channel of the display image.
  • the NBI image is output as a display image.
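The channel assignment just described (luminance component image into the G and B channels, non-luminance component image into the R channel) can be sketched as; the function name is hypothetical:

```python
import numpy as np

def compose_nbi(b_narrow_img, g_narrow_img):
    """Assemble an NBI display image: the B-narrowband (luminance) image
    drives the G and B display channels, the G-narrowband image the R
    channel. Returns an (H, W, 3) array in R, G, B order."""
    return np.stack([g_narrow_img, b_narrow_img, b_narrow_img], axis=-1)

b_img = np.full((2, 2), 0.8)  # interpolated B-narrowband plane
g_img = np.full((2, 2), 0.3)  # interpolated G-narrowband plane
rgb = compose_nbi(b_img, g_img)
print(rgb[0, 0])  # [0.3 0.8 0.8]
```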
  • the display image may be either a moving image or a still image.
  • the external I / F unit 500 is an interface for performing input from the user to the endoscope apparatus and the like. That is, it is an interface for operating the endoscope apparatus, an interface for setting operation of the endoscope apparatus, or the like. For example, a button, a dial, a lever, and the like for selecting an observation mode are included.
  • the endoscope apparatus includes the light source 100, the image sensor 240, and the processing circuit 310.
  • the light source 100 emits narrow-band illumination light having first narrow-band light and second narrow-band light.
  • the image sensor 240 has a color filter in which each filter unit is provided for each of the plurality of pixels, and photoelectrically converts light received by the plurality of pixels.
  • the processing circuit 310 selects, from among the pixel signals from the image sensor 240, a first narrowband pixel signal having sensitivity to the first narrowband light and a second narrowband pixel signal having sensitivity to the second narrowband light.
  • in the color filter, at least half of the total number of filter units transmit the first narrowband light, and at least half of the total number transmit the second narrowband light.
  • the processing circuit 310 performs a first interpolation process of interpolating the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process of interpolating the second narrowband pixel signal without using the first narrowband pixel signal.
  • the processing circuit 310 selects the B narrowband signal as the first narrowband pixel signal, and selects the G narrowband signal as the second narrowband pixel signal.
  • when the color filter is a complementary color filter, the Mg and Cy filter units transmit the B narrowband light, and the Mg, Cy, Ye, and G filter units transmit the G narrowband light; in each case this is at least half of the total number of filter units.
  • the processing circuit 310 independently performs the first interpolation process of interpolating the B narrowband signal and the second interpolation process of interpolating the G narrowband signal.
  • the B narrow band signal is a luminance component pixel signal.
  • the first narrowband signal is not limited to a luminance component pixel signal.
  • a luminance component of a display image may be generated based on both the first narrowband pixel signal and the second narrowband pixel signal.
  • the narrow-band light observation mode is not limited to the NBI mode.
  • the narrow-band observation mode may be any mode in which illumination light including a plurality of narrow-band lights is used.
  • conventionally, an image captured by a complementary color image sensor is subjected to interpolation processing involving pixel addition or the like, which may reduce the image resolution or color separation.
  • in the present embodiment, the contrast of blood vessels is high. Here, the contrast of a blood vessel refers to the resolution or color separation of the blood vessel.
  • the first interpolation processing for interpolating the first narrowband pixel signal and the second interpolation processing for interpolating the second narrowband pixel signal are performed independently. That is, an image obtained by each narrow-band illumination light is interpolated independently without adding pixels. Since pixel addition or the like is unnecessary, the contrast of blood vessels can be improved.
  • more than half of the filter units that transmit the first narrow-band light are provided in the image sensor 240, and more than half of the filter units that transmit the second narrow-band light are provided. Then, a first narrowband pixel signal having sensitivity to the first narrowband light and a second narrowband pixel signal having sensitivity to the second narrowband light are selected. As a result, of all the pixels forming one image, half of the pixels are pixels of the first narrow band, and the remaining half are pixels of the second narrow band. By performing the first interpolation processing and the second interpolation processing on such an image, an image having a high resolution is obtained.
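The coverage condition above can be checked mechanically. The following sketch uses a transmission table taken from the complementary-filter example in this description (which filter units pass which narrow band); the 2x2 tile layout itself is an assumption for illustration:

```python
# Hypothetical 2x2 repeating unit of a complementary CFA (Mg, Cy, Ye, G).
CFA_TILE = [["Mg", "Cy"], ["Ye", "G"]]
PASSES_B = {"Mg", "Cy"}              # units transmitting the B narrow band
PASSES_G = {"Mg", "Cy", "Ye", "G"}   # units transmitting the G narrow band

units = [f for row in CFA_TILE for f in row]
frac_b = sum(f in PASSES_B for f in units) / len(units)
frac_g = sum(f in PASSES_G for f in units) / len(units)
print(frac_b, frac_g)  # 0.5 1.0 -> each narrow band covers at least half
```

With at least half of the pixels sensitive to each narrow band, each of the two interpolation processes starts from a sampling density no worse than a checkerboard, which is what preserves resolution.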
  • the processing circuit 310 performs a color separation process between the first narrowband image obtained by the first interpolation process and the second narrowband image obtained by the second interpolation process.
  • there are cases where the first narrowband pixel signal is also sensitive to the second narrowband light and the second narrowband pixel signal is also sensitive to the first narrowband light; that is, the first narrowband pixel signal and the second narrowband pixel signal contain mixed colors.
  • the color separation process can reduce this color mixture, and reducing the color mixture improves the color separation in the narrow-band light observation mode.
  • the processing circuit 310 obtains, based on the brightness of the first narrowband pixel signal and the brightness of the second narrowband pixel signal, an index value RLB(x, y) indicating the reliability of the direction discrimination in the direction-discriminating interpolation processing.
  • the processing circuit 310 performs a direction determination type interpolation process based on the index value RLB (x, y) in the first interpolation process and the second interpolation process.
  • the index value RLB (x, y) indicates the degree of variation in the brightness of the first narrowband pixel signal and the second narrowband pixel signal around the position (x, y).
  • the processing circuit 310 performs the interpolation processing such that the lower the reliability, the smaller the weight of the interpolation processing result in the edge direction. This can reduce the occurrence of artifacts due to the interpolation processing.
  • the processing circuit 310 blends the edge-direction interpolation result and the non-direction-discriminating interpolation result with a blend ratio based on the index value RLB(x, y).
  • the processing circuit 310 increases the blend ratio of the non-direction discrimination type interpolation processing result as the reliability is lower.
  • the blend ratio of the result of the non-direction discrimination-type interpolation processing is (1-RLB (x, y)).
  • the lower the reliability, the higher the blend ratio of the non-direction-discriminating interpolation result. In other words, the greater the variation in brightness between the first narrowband pixel signal and the second narrowband pixel signal, the larger the blend ratio of the non-direction-discriminating interpolation result and the smaller the blend ratio of the direction-discriminating interpolation result.
  • the processing circuit 310 obtains the index value RLB(x, y) based on the ratio between the brightness of the first narrowband pixel signal and the brightness of the second narrowband pixel signal.
  • the ratio of the brightness of the first narrowband pixel signal to the brightness of the second narrowband pixel signal changes according to the degree of variation in the brightness of the first narrowband pixel signal and the second narrowband pixel signal. This makes it possible to control the blending ratio of the non-direction discrimination-type interpolation processing according to the degree of variation in brightness between the first narrowband pixel signal and the second narrowband pixel signal.
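The blending rule can be sketched as below. The text only states that RLB(x, y) comes from the brightness ratio and that the non-directional result receives weight (1 - RLB), so the concrete `reliability` formula here is an assumed stand-in, not the patent's formula:

```python
import numpy as np

def reliability(b1, b2, eps=1e-6):
    """Assumed stand-in for RLB(x, y): the brightness ratio of the two
    narrowband signals, mapped so similar brightness gives a value near 1."""
    return min(b1, b2) / (max(b1, b2) + eps)

def blend(dir_result, plain_result, rlb):
    """Weight the directional result by RLB and the non-directional
    result by (1 - RLB), as described in the text."""
    rlb = float(np.clip(rlb, 0.0, 1.0))
    return rlb * dir_result + (1.0 - rlb) * plain_result

print(blend(10.0, 20.0, reliability(50.0, 50.0)))   # ~10.0: direction trusted
print(blend(10.0, 20.0, reliability(10.0, 100.0)))  # closer to 20.0: fallback
```

Falling back toward the plain (non-directional) result when the two signals disagree strongly is what suppresses artifacts from a mistaken edge-direction decision.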
  • the first narrow-band pixel signal is composed of the first pixel signal and the second pixel signal having higher sensitivity to the first narrow-band light than the first pixel signal.
  • the second narrowband pixel signal includes a third pixel signal and a fourth pixel signal having a higher sensitivity to the second narrowband light than the third pixel signal.
  • the processing circuit 310 makes the weight of the second pixel signal larger than the weight of the first pixel signal in the first interpolation processing, and makes the weight of the fourth pixel signal larger than the weight of the third pixel signal in the second interpolation processing.
  • the first pixel signal is an MgYe signal
  • the second pixel signal is an MgCy signal
  • the third pixel signal is a GCy signal
  • the fourth pixel signal is a GYe signal.
  • in the first interpolation processing, the weight of the pixel signal having high sensitivity to the first narrowband light is increased, which realizes a first interpolation result with a higher sense of resolution.
  • likewise, in the second interpolation processing, the weight of the pixel signal having high sensitivity to the second narrowband light is increased, which realizes a second interpolation result with a higher sense of resolution. The sense of resolution of the display image obtained by combining these is therefore improved.
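A minimal sketch of this sensitivity-based weighting; the 0.75 weight is an arbitrary assumed value, since the text does not give concrete weights:

```python
def weighted_value(low_sens, high_sens, w_high=0.75):
    """Combine two same-band pixel signals, giving the signal with higher
    sensitivity to the narrow band the larger weight (w_high is assumed)."""
    return (1.0 - w_high) * low_sens + w_high * high_sens

# e.g. the MgCy signal (higher B-narrowband sensitivity) outweighs MgYe
print(weighted_value(40.0, 80.0))  # 70.0
```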
  • the first narrowband pixel signal includes a first pixel signal and a second pixel signal corresponding to different colors.
  • the second narrowband pixel signal includes a third pixel signal and a fourth pixel signal corresponding to different colors.
  • the processing circuit 310 performs brightness correction between the first pixel signal and the second pixel signal in the first interpolation, based on the average value of the first pixel signal and the average value of the second pixel signal in the pixels surrounding the first target pixel.
  • similarly, the processing circuit 310 performs brightness correction between the third pixel signal and the fourth pixel signal in the second interpolation, based on the average value of the third pixel signal and the average value of the fourth pixel signal in the pixels surrounding the second target pixel.
  • the B narrowband signal which is the first narrowband pixel signal, includes the MgYe signal as the first pixel signal, and includes the MgCy signal as the second pixel signal.
  • the G narrow-band signal which is the second narrow-band pixel signal, includes the GCy signal as the third pixel signal, and includes the GYe signal as the fourth pixel signal.
  • the processing circuit 310 multiplies the MgYe signal by (average value of the MgCy signal) / (average value of the MgYe signal), and multiplies the GCy signal by (average value of the GYe signal) / (average value of the GCy signal). That is, the processing circuit 310 performs brightness correction based on the average values of the surrounding same-color pixels. This brightness correction is performed, for example, before the interpolation processing.
  • this allows the interpolation processing to be executed appropriately. For example, in the direction-discriminating interpolation processing, the variation in brightness between the signals is reduced, so the accuracy of the edge direction determination improves.
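The ratio-of-averages brightness correction described above can be sketched as follows; the arrays are illustrative values standing in for the two same-band signals gathered from the surrounding pixels:

```python
import numpy as np

def brightness_correct(sig_a, sig_b):
    """Scale sig_a so its average matches sig_b's, per the description:
    sig_a * (average of sig_b) / (average of sig_a)."""
    gain = np.mean(sig_b) / np.mean(sig_a)
    return sig_a * gain

mg_ye = np.array([10.0, 20.0, 30.0])  # e.g. MgYe samples around the target
mg_cy = np.array([20.0, 40.0, 60.0])  # e.g. MgCy samples around the target
corrected = brightness_correct(mg_ye, mg_cy)
print(corrected)  # [20. 40. 60.] -- the two signals' averages now match
```

The same multiply-by-ratio idea also applies to the brightness level adjustment between the two narrowband signals mentioned later.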
  • the processing circuit 310 selects, as the first narrowband pixel signal, a pixel signal whose sensitivity to the first narrowband light is equal to or higher than its sensitivity to the second narrowband light, and selects, as the second narrowband pixel signal, a pixel signal whose sensitivity to the second narrowband light is greater than its sensitivity to the first narrowband light.
  • since the first narrowband pixel signal and the second narrowband pixel signal are selected according to their sensitivity to the narrowband lights, the pixel signals can be separated into the first narrowband pixel signal and the second narrowband pixel signal. By separating the pixel signals in this way and performing independent interpolation processing on each, a display image with high contrast can be obtained.
  • the processing circuit 310 adjusts the brightness level between the first narrowband pixel signal and the second narrowband pixel signal with reference to whichever of the two narrowband pixel signals has the smaller color mixture.
  • the processing circuit 310 multiplies the first narrowband pixel signal by the average of the second narrowband pixel signal / the average of the first narrowband pixel signal. This brightness level adjustment is performed, for example, before or after the interpolation processing.
  • the color filter is a complementary color filter.
  • the color filter includes magenta, yellow, cyan, and green filter units.
  • conventionally, an image captured by a complementary color image sensor is subjected to interpolation processing involving pixel addition or the like.
  • in the present embodiment, the first interpolation process of interpolating the first narrowband pixel signal and the second interpolation process of interpolating the second narrowband pixel signal in the narrow-band light observation mode are performed independently. This eliminates the need for pixel addition and the like, and thus improves the contrast of blood vessels.
  • the color filter may be a color filter in which a red filter unit is replaced with a magenta filter unit in a primary color Bayer array color filter.
  • the present invention can also be applied when an image sensor provided with such a color filter is used. In this case as well, in the narrow-band light observation mode, the first interpolation process that interpolates the first narrow-band pixel signal and the second interpolation process that interpolates the second narrow-band pixel signal are performed independently, so the contrast of blood vessels can be improved.
  • the processing circuit 310 generates a display image by inputting the first narrowband image obtained by the first interpolation process to the G channel of the display image.
  • the first narrowband image is a luminance component image. That is, the processing circuit 310 selects the first narrowband pixel signal as the luminance component pixel signal.
  • the first narrowband light belongs to the blue wavelength band of white light, and has a wavelength band narrower than the blue wavelength band.
  • the second narrowband light belongs to the green wavelength band of the white light, and has a wavelength band narrower than the green wavelength band.
  • the processing circuit 310 inputs the first narrowband image to the G and B channels of the display image, and inputs the second narrowband image obtained by the second interpolation process to the R channel of the display image.
  • the NBI mode can be realized as the narrow-band light observation mode.
  • blood vessels near the surface layer of the mucous membrane can be photographed with high contrast.
  • a lesion in which blood vessels are densely located near the surface of the mucous membrane can be observed as a brownish area.
  • the brownish area is an area that looks brown.
  • a blood vessel can be photographed with high contrast in the NBI mode. For example, although both CIS and inflammation appear as brownish areas, the blood vessels are rendered with high contrast, so it is expected that CIS and inflammation can be distinguished more easily.
  • the light source 100 emits white illumination light in the white light observation mode, and emits narrow band illumination light in the narrow band light observation mode.
  • the color filter of the image sensor 240 has a plurality of filter units for capturing a white light image when white illumination light is emitted.
  • the number of filter units transmitting the first narrowband light is at least half of the plurality of filter units, and the number of filter units transmitting the second narrowband light is at least half of the plurality of filter units.
  • the processing circuit 310 selects the first narrow-band pixel signal and the second narrow-band pixel signal, and performs the first interpolation process and the second interpolation process.
  • the complementary color filter has Mg, Cy, Ye, and G filter units as shown in FIG. These four filter units are for capturing a white light image.
  • a white light image is obtained by the interpolation processing of the following equations (1) to (4).
  • the Mg and Cy filter units transmit the B narrow band light
  • the Mg, Cy, Ye, and G filter units transmit the G narrow band light. In each case, this is at least two of the four filter units, i.e., at least half.
  • pixel addition or the like is performed as in the following equations (1) to (4).
  • a high-contrast narrow-band light image can be obtained by independently interpolating the first narrow-band pixel signal and the second narrow-band pixel signal in the narrow-band light observation mode.
  • the control device 300 of the present embodiment may be configured as follows. That is, the control device 300 of the present embodiment includes a memory that stores information, and a processor that operates based on the information stored in the memory.
  • the information is, for example, a program and various data.
  • the processor includes hardware.
  • when the narrow-band illumination light is emitted, the processor selects, from among the pixel signals from the image sensor 240, a first narrow-band pixel signal having sensitivity to the first narrow-band light and a second narrow-band pixel signal having sensitivity to the second narrow-band light.
  • the processor performs a first interpolation process that interpolates the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process that interpolates the second narrowband pixel signal without using the first narrowband pixel signal.
  • each unit may be realized by individual hardware, or the function of each unit may be realized by integrated hardware.
  • a processor includes hardware, and the hardware can include at least one of a circuit that processes digital signals and a circuit that processes analog signals.
  • the processor can be configured with one or a plurality of circuit devices mounted on a circuit board or one or a plurality of circuit elements.
  • the one or more circuit devices are, for example, ICs.
  • the one or more circuit elements are, for example, resistors, capacitors, and the like.
  • the processor may be, for example, a CPU (Central Processing Unit).
  • the processor is not limited to the CPU, and various processors such as a GPU (Graphics Processing Unit) or a DSP (Digital Signal Processor) can be used. Further, the processor may be a hardware circuit based on an ASIC. Further, the processor may include an amplifier circuit and a filter circuit for processing an analog signal.
  • the memory may be a semiconductor memory such as an SRAM or a DRAM, a register, a magnetic storage device such as a hard disk device, or an optical storage device such as an optical disk device. For example, the memory stores computer-readable instructions, and when the instructions are executed by the processor, the functions of the units of the control device 300 are realized as processes.
  • the instruction here may be an instruction of an instruction set constituting a program or an instruction for instructing a hardware circuit of a processor to operate.
  • the processor realizes the function of the processing circuit 310 in FIG.
  • the processor realizes the functions of the processing circuit 310 and the control circuit 320 in FIG.
  • Each part of the endoscope apparatus may be realized as a module of a program that operates on a processor.
  • the program may include a pixel signal selection module realizing the pixel signal selection unit 311 and an interpolation processing module realizing the interpolation processing unit 312.
  • when the narrow-band illumination light is emitted, the pixel signal selection module selects, from among the pixel signals from the image sensor 240, a first narrow-band pixel signal having sensitivity to the first narrow-band light and a second narrow-band pixel signal having sensitivity to the second narrow-band light.
  • the interpolation processing module performs a first interpolation process that interpolates the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process that interpolates the second narrowband pixel signal without using the first narrowband pixel signal.
  • the program that implements the processing performed by each unit of the control device 300 of the present embodiment can be stored in, for example, an information storage medium that is a computer-readable medium.
  • the information storage medium can be realized by, for example, an optical disk, a memory card, an HDD, or a semiconductor memory.
  • the semiconductor memory is, for example, a ROM.
  • the processing circuit 310 and the control circuit 320 perform various processes of the present embodiment based on programs and data stored in the information storage medium. That is, the information storage medium stores a program for causing a computer to function as each unit of the endoscope apparatus according to the present embodiment.
  • the computer is a device including an input device, a processing unit, a storage unit, and an output unit.
  • the program is a program for causing a computer to execute the processing of each unit.
  • the program is recorded on an information storage medium.
  • various computer-readable recording media can be assumed, such as optical disks (e.g., a DVD or a CD), magneto-optical disks, hard disks, and memories such as a nonvolatile memory and a RAM.
  • FIG. 2 is a view for explaining an interlaced imaging operation using a complementary color image sensor.
  • X indicates a horizontal scanning direction in the image sensor 240
  • Y indicates a vertical scanning direction in the image sensor 240.
  • a frame is composed of an even field and an odd field, and the processing circuit 310 generates a frame image from a pixel signal obtained in the even field and a pixel signal obtained in the odd field.
  • pixels are arranged in a grid on the image sensor 240, and one filter unit is provided for each pixel.
  • Four color filter units of magenta, green, cyan, and yellow are arranged in two rows and two columns.
  • Mg indicates magenta
  • G indicates green
  • Cy indicates cyan
  • Ye indicates yellow.
  • colors will be described using these codes.
  • a pixel provided with an Mg filter unit is referred to as an Mg pixel.
  • the image sensor 240 adds and reads pixel signals from two pixels arranged in the Y direction. That is, in the even field, the image sensor 240 performs addition readout of the Mg pixels in the first row and the Cy pixels in the second row to obtain the MgCy signal, and performs addition readout of the G pixels in the first row and the Ye pixels in the second row to obtain the GYe signal.
  • a first horizontal line is configured by the MgCy signal and the GYe signal.
  • the image sensor 240 obtains the GCy signal and the MgYe signal constituting the third horizontal line by performing addition readout from the pixels in the third and fourth rows.
  • in the odd field, the image sensor 240 obtains the GCy signal and the MgYe signal constituting the second horizontal line by performing addition readout from the pixels in the second and third rows. Further, the image sensor 240 obtains the MgCy signal and the GYe signal constituting the fourth horizontal line by performing addition readout from the pixels in the fourth and fifth rows.
  • the first to fourth horizontal lines constitute an image of one frame.
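The field-addition readout and frame assembly described above can be sketched as follows. The sketch simplifies the vertical two-pixel addition to whole-row sums of raw pixel values (which models the per-column pairing); variable names are illustrative:

```python
import numpy as np

def field_addition_readout(pixels):
    """Vertical two-pixel addition readout (sketch). The even field adds
    row pairs (1,2), (3,4), ...; the odd field adds row pairs (2,3),
    (4,5), ...  `pixels` is an H x W array with H even and H >= 4."""
    even_field = pixels[0::2] + pixels[1::2]
    odd_field = pixels[1:-1:2] + pixels[2::2]
    return even_field, odd_field

def assemble_frame(even_field, odd_field):
    """Interleave even-field lines (horizontal lines 1, 3, ...) with
    odd-field lines (horizontal lines 2, 4, ...) into one frame image."""
    height = even_field.shape[0] + odd_field.shape[0]
    frame = np.empty((height, even_field.shape[1]), even_field.dtype)
    frame[0::2] = even_field[: (height + 1) // 2]
    frame[1::2] = odd_field
    return frame

pixels = np.arange(16.0).reshape(4, 4)
even, odd = field_addition_readout(pixels)
frame = assemble_frame(even, odd)
```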
  • the interpolation processing unit 312 converts a pixel signal into a YCrCb signal according to the following equations (1) to (4).
  • MgCy means the signal value of the MgCy pixel signal.
  • the interpolation processing unit 312 generates a display image by interpolating the YCrCb signal to generate a YCrCb signal for all pixels, and converting the YCrCb signal to an RGB signal.
  • FIG. 3 is a diagram illustrating a process performed by the pixel signal selection unit 311.
  • x is a position in the horizontal scanning direction
  • y is a position in the vertical scanning direction.
  • the positions x and y mean pixel positions on the image when the pixel signals are arranged on the image.
  • the positions x and y are each an integer.
  • the horizontal scanning direction is also referred to as an x direction
  • the vertical scanning direction is also referred to as a y direction.
  • the MgCy signal, MgYe signal, GCy signal, and GYe signal acquired in FIG. 2 are arranged as shown in FIG.
  • the pixel signal selection unit 311 selects the MgCy signal and the MgYe signal as the B narrowband signal, and selects the GCy signal and the GYe signal as the G narrowband signal.
  • a pixel in which a B narrowband signal is arranged is referred to as a B narrowband pixel.
  • each of the B narrowband pixel and the G narrowband pixel occupies half of the whole.
  • B narrowband pixels and G narrowband pixels are alternately arranged.
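A sketch of the resulting checkerboard separation, with an assumed phase (B narrowband at positions where x + y is even); missing positions are marked NaN so the later interpolation step knows which pixels to fill:

```python
import numpy as np

def split_checkerboard(selected):
    """Separate the selected pixel signals into a B narrowband plane and a
    G narrowband plane. Per FIG. 3, B narrowband pixels (MgCy, MgYe) and
    G narrowband pixels (GCy, GYe) alternate in a checkerboard; the phase
    (B at even x + y) is an assumption of this sketch. Missing positions
    are marked NaN."""
    h, w = selected.shape
    yy, xx = np.mgrid[0:h, 0:w]
    b_mask = (xx + yy) % 2 == 0
    b_plane = np.where(b_mask, selected, np.nan)
    g_plane = np.where(~b_mask, selected, np.nan)
    return b_plane, g_plane

signals = np.arange(16.0).reshape(4, 4)
b_plane, g_plane = split_checkerboard(signals)
```

Each plane ends up holding exactly half of the pixels, matching the statement that the B and G narrowband pixels each occupy half of the whole.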
  • FIG. 4 is an example of the sensitivity of a pixel signal.
  • the sensitivity to the G narrow band light and the sensitivity to the B narrow band light are represented by relative numerical values.
  • the pixel signal selection unit 311 selects the MgCy signal and the MgYe signal whose sensitivity to the B narrow band light is equal to or higher than the sensitivity to the G narrow band light as the B narrow band signal.
  • the pixel signal selection unit 311 selects the GCy signal and the GYe signal whose sensitivity to the G narrow band light is equal to or higher than the sensitivity to the B narrow band light, as the G narrow band signal.
  • the color filter of the image sensor 240 is not limited to a complementary color filter.
  • FIG. 5 shows an example of a color filter that is not a complementary color filter.
  • the 2 ⁇ 2 pixels are repeatedly arranged on the image sensor 240.
  • the image sensor 240 reads out the G signal, the B signal, and the Mg signal from the G pixel, the B pixel, and the Mg pixel, respectively.
  • the B filter and the Mg filter transmit the B narrow band light
  • the G filter transmits the G narrow band light.
  • the pixel signal selection unit 311 selects the B signal and the Mg signal as the B narrowband signal, and selects the G signal as the G narrowband signal.
  • the arrangement of the B narrow band pixel and the G narrow band pixel is the same as that of the complementary color filter.
  • the following interpolation processing can be applied in the same manner as in the case of the complementary color filter.
  • FIG. 6 is a diagram illustrating the processing performed by the interpolation processing unit 312.
  • the interpolation processing unit 312 independently performs interpolation processing on the B narrowband signal and the G narrowband signal.
  • the pixel signal input from the pixel signal selection unit 311 to the interpolation processing unit 312 is a pixel signal of one color per pixel. That is, the B narrow band signal is missing in half of the pixels, and the G narrow band signal is missing in the remaining half of the pixels.
  • the interpolation processing unit 312 performs interpolation processing on the B narrowband signal input from the pixel signal selection unit 311.
  • the interpolation processing is, for example, a smoothing filter processing of 5 ⁇ 5 pixels. Note that the smoothing range of the smoothing filter is not limited to 5 ⁇ 5 pixels.
  • the B narrowband signal is interpolated with respect to the pixel where the B narrowband signal is missing.
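A minimal stand-in for this interpolation step: instead of the 5 × 5 smoothing filter, the sketch fills each missing position with the average of its available 4-neighbors, which is the simplest interpolation consistent with the checkerboard sampling:

```python
import numpy as np

def interpolate_missing(plane):
    """Fill the NaN (missing) positions of a checkerboard-sampled plane
    with the average of the available 4-neighbors. This is a minimal
    stand-in for the 5x5 smoothing-filter interpolation in the text."""
    h, w = plane.shape
    out = plane.copy()
    padded = np.pad(plane, 1, constant_values=np.nan)
    for y in range(h):
        for x in range(w):
            if np.isnan(out[y, x]):
                # padded[y + 1, x + 1] is the center pixel.
                neighbors = [padded[y, x + 1], padded[y + 2, x + 1],
                             padded[y + 1, x], padded[y + 1, x + 2]]
                values = [v for v in neighbors if not np.isnan(v)]
                out[y, x] = sum(values) / len(values)
    return out

yy, xx = np.mgrid[0:4, 0:4]
sparse = np.where((xx + yy) % 2 == 0, 5.0, np.nan)  # checkerboard of 5s
dense = interpolate_missing(sparse)
```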
  • the image constituted by the B narrowband signal is hereinafter referred to as a B narrowband image.
  • the interpolation processing unit 312 performs an edge enhancement process on the B narrowband image.
  • the interpolation processing unit 312 extracts a high-frequency component from the B narrow-band image and adds the high-frequency component to the B narrow-band image.
  • the interpolation processing unit 312 extracts a high-frequency component by performing, for example, an adaptive low-pass filter process of 7 ⁇ 7 pixels on the B narrowband image, and then performing a high-pass filter process of 7 ⁇ 7 pixels on the result.
  • the interpolation processing unit 312 outputs the B narrowband signal after the edge enhancement processing as a final B narrowband signal.
  • the content of the edge enhancement processing is not limited to the above. Further, the edge enhancement processing may be omitted.
  • the interpolation processing unit 312 performs interpolation processing on the G narrow band signal input from the pixel signal selection unit 311.
  • the interpolation filter is, for example, a 5 ⁇ 5 pixel smoothing filter. Note that the smoothing range of the smoothing filter is not limited to 5 ⁇ 5 pixels.
  • the G narrow band signal is interpolated with respect to the pixel where the G narrow band signal is missing.
  • the image constituted by the G narrow band signal is also called a G narrow band image.
  • the interpolation processing unit 312 outputs the G narrow band signal after the interpolation processing.
  • the interpolation processing unit 312 sets the B narrowband image as the G channel and the B channel of the display image, and sets the G narrowband image as the R channel of the display image. Thus, a display image in the NBI mode is generated.
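The channel assignment for the NBI display image can be sketched directly:

```python
import numpy as np

def compose_nbi_display(b_narrowband_image, g_narrowband_image):
    """NBI display composition: the B narrowband image is input to the
    G and B channels, and the G narrowband image to the R channel."""
    return np.stack([g_narrowband_image,   # R channel
                     b_narrowband_image,   # G channel
                     b_narrowband_image],  # B channel
                    axis=-1)

b_img = np.full((2, 2), 0.2)
g_img = np.full((2, 2), 0.8)
display = compose_nbi_display(b_img, g_img)
```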
  • FIGS. 7 and 8 are diagrams illustrating color mixing in pixel signals. In FIGS. 7 and 8, BN indicates the spectrum of the B narrow band light, and GN indicates the spectrum of the G narrow band light.
  • FIG. 7 shows the spectra of the MgCy signal and the MgYe signal selected as the B narrowband signal. These signals are sensitive not only to B narrowband light but also to G narrowband light.
  • FIG. 8 shows the spectra of the GCy signal and the GYe signal selected as the G narrow-band signal. These signals are sensitive not only to the G narrowband light but also to the B narrowband light. Since the B narrow-band light and the G narrow-band light are simultaneously irradiated on the subject, the G narrow-band component is mixed into the B narrow-band signal, and the B narrow-band component is mixed into the G narrow-band signal.
  • the interpolation processing unit 312 reduces the above-described color mixture by performing a color separation process represented by the following equation (5).
  • B (x, y) is a B narrowband signal at a position (x, y)
  • G (x, y) is a G narrowband signal at a position (x, y).
  • the interpolation processing unit 312 performs color separation processing of the B narrowband signal between the interpolation processing and the edge enhancement processing in FIG. 6, for example.
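Equation (5) itself is not reproduced in this excerpt, so the following sketch assumes a generic linear 2 × 2 mixing model and inverts it; the coefficients are hypothetical and would in practice follow from the spectral sensitivities shown in FIGS. 7 and 8:

```python
# Hypothetical leakage coefficients; the actual coefficients of equation
# (5) follow from the spectral sensitivities in FIGS. 7 and 8.
ALPHA = 0.20   # fraction of the G narrowband component in the B signal
BETA = 0.15    # fraction of the B narrowband component in the G signal

def color_separate(b_mixed, g_mixed, alpha=ALPHA, beta=BETA):
    """Invert the assumed 2x2 mixing model
        b_mixed = b + alpha * g
        g_mixed = g + beta  * b
    to recover color-separated B and G narrowband components."""
    det = 1.0 - alpha * beta
    b = (b_mixed - alpha * g_mixed) / det
    g = (g_mixed - beta * b_mixed) / det
    return b, g

# Mix known components, then check that the separation recovers them.
b_sep, g_sep = color_separate(10.0 + ALPHA * 20.0, 20.0 + BETA * 10.0)
```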
  • the interpolation processing unit 312 performs a direction discrimination type interpolation process. That is, the interpolation processing of FIG. 6 is a direction determination type interpolation processing.
  • the interpolation process for the B narrowband signal will be described as an example, but the same applies to the interpolation process for the G narrowband signal.
  • interpolation processing is performed using a range of 3 ⁇ 3 pixels.
  • the interpolation processing unit 312 determines the edge direction at (x, y), and selects one of the following equations (6) to (8) based on the determination result.
  • (x, y) is the position where the B narrowband signal is missing.
  • when it is determined that the edge direction is the y direction, the interpolation processing unit 312 obtains the B narrowband signal by the following equation (6).
  • when it is determined that the edge direction is the x direction, the interpolation processing unit 312 obtains the B narrowband signal by the following equation (7).
  • when it is determined that there is no edge, the interpolation processing unit 312 obtains the B narrowband signal by the following equation (8).
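Equations (6) to (8) are not reproduced in this excerpt; the sketch below assumes a common concrete form for a checkerboard pattern, where the four direct neighbors carry the B narrowband signal: vertical average for a y-direction edge, horizontal average for an x-direction edge, and the 4-neighbor average when no edge is found. The edge-direction criterion (comparing neighbor differences) is also an assumption:

```python
import numpy as np

def directional_interpolate(plane, x, y):
    """Interpolate the missing B narrowband signal at (x, y) from its four
    direct neighbors, which carry the B narrowband signal in the
    checkerboard arrangement. The edge direction is judged from the
    neighbor differences (an assumed criterion)."""
    up, down = plane[y - 1, x], plane[y + 1, x]
    left, right = plane[y, x - 1], plane[y, x + 1]
    dv, dh = abs(up - down), abs(left - right)
    if dv < dh:                                  # edge along the y direction
        return (up + down) / 2.0                 # cf. equation (6)
    if dh < dv:                                  # edge along the x direction
        return (left + right) / 2.0              # cf. equation (7)
    return (up + down + left + right) / 4.0      # no edge, cf. equation (8)

# A vertical edge: the up/down neighbors agree, the left/right ones differ.
plane = np.array([[0.0, 5.0, 0.0],
                  [1.0, np.nan, 3.0],
                  [0.0, 5.0, 0.0]])
```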
  • FIGS. 9 to 11 are diagrams for explaining a second example of the direction discrimination-type interpolation processing.
  • in FIGS. 9 to 11, the range of pixels used for the interpolation processing is indicated by a dotted line. The bold square in the center indicates the pixel to be interpolated.
  • the interpolation processing unit 312 determines the edge direction at (x, y), and selects one of the following equations (9) to (11) based on the determination result. When it is determined that the edge direction is the y direction, the interpolation processing unit 312 obtains a B narrowband signal by using the pixels in the range shown in FIG. When it is determined that the edge direction is the x direction, the interpolation processing unit 312 obtains a B narrowband signal by using the pixels in the range shown in FIG. If it is determined that there is no edge, the interpolation processing unit 312 obtains a B narrowband signal using the pixels in the range shown in FIG.
  • the result of the interpolation processing in the edge direction and the result of the interpolation processing without using the direction determination are blended based on the reliability.
  • the interpolation process for the B narrowband signal will be described as an example, but the same applies to the interpolation process for the G narrowband signal.
  • the interpolation processing unit 312 obtains the reliability index value RLB (x, y) by the following equation (12).
  • ave_MgCy is the average value of the MgCy signal at (x, y), obtained from the MgCy signals of the pixels surrounding (x, y). The same applies to ave_MgYe.
  • the index value RLB (x, y) is a numerical value in the range of 0 ⁇ RLB (x, y) ⁇ 1. The larger the index value RLB (x, y), the higher the reliability.
  • the interpolation processing unit 312 obtains a B narrowband signal at (x, y) by the following equation (13).
  • Bedge (x, y) is the result of the interpolation processing along the edge direction, i.e., the calculation result of the above equation (6) or (7).
  • Bave (x, y) is a result of interpolation processing without using direction discrimination, and is, for example, a calculation result of the above equation (8).
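Equations (12) and (13) are only partially described here; the sketch below assumes a simple normalized-similarity form for the reliability index (larger when the local MgCy and MgYe averages agree) and applies the blend of equation (13) as a weighted sum of the two interpolation results:

```python
def blend_by_reliability(ave_mgcy, ave_mgye, b_edge, b_ave):
    """Blend the directional interpolation result b_edge with the
    direction-free result b_ave. The reliability index rl is sketched as
    the normalized similarity of the local MgCy and MgYe averages; it
    lies in [0, 1], and a larger value means higher reliability."""
    denom = max(ave_mgcy, ave_mgye)
    rl = 1.0 - abs(ave_mgcy - ave_mgye) / denom if denom > 0 else 0.0
    # Blend in the manner of equation (13): a weighted sum of both results.
    return rl * b_edge + (1.0 - rl) * b_ave, rl
```

When the two local averages agree, the directional result is used as-is; when they disagree completely, the blend falls back to the direction-free result.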
  • the weight of the pixel signal in the interpolation processing is set based on the sensitivity of the pixel signal to narrowband light.
  • the interpolation process for the B narrowband signal will be described as an example, but the same applies to the interpolation process for the G narrowband signal.
  • the weight of the MgCy signal is set higher than the weight of the MgYe signal.
  • in the interpolation formula, the term of the MgCy signal is multiplied by the weight W_MgCy, and the term of the MgYe signal is multiplied by the weight W_MgYe.
  • W_MgCy and W_MgYe are set in advance based on the spectrum of the pixel signal and the like.
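A sketch of this sensitivity-weighted interpolation; the weight values W_MGCY and W_MGYE are hypothetical, standing in for weights derived in advance from the pixel-signal spectra:

```python
# Hypothetical weights: the MgCy signal is assumed more sensitive to the
# B narrowband light than the MgYe signal, so it gets the larger weight.
W_MGCY = 0.7
W_MGYE = 0.3

def weighted_b_interpolation(mgcy_samples, mgye_samples,
                             w_mgcy=W_MGCY, w_mgye=W_MGYE):
    """Weighted average of the surrounding MgCy and MgYe samples; the
    weight of each kind of sample is set from its sensitivity to the
    B narrowband light."""
    mgcy_mean = sum(mgcy_samples) / len(mgcy_samples)
    mgye_mean = sum(mgye_samples) / len(mgye_samples)
    return (w_mgcy * mgcy_mean + w_mgye * mgye_mean) / (w_mgcy + w_mgye)

value = weighted_b_interpolation([10.0, 12.0], [20.0, 22.0])
```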

Abstract

This endoscope device includes a light source 100, an imaging element 240, and a processing circuit 310. The imaging element 240 has a color filter. The processing circuit 310 selects a signal from first narrow-band pixels having sensitivity to a first narrow-band light, and a signal from second narrow-band pixels having sensitivity to a second narrow-band light. Of the plurality of filter units in the color filter, half or more transmit the first narrow-band light, and half or more transmit the second narrow-band light. The processing circuit 310 then executes first interpolation processing in which the signal from the first narrow-band pixels is interpolated without using the signal from the second narrow-band pixels, and second interpolation processing in which the signal from the second narrow-band pixels is interpolated without using the signal from the first narrow-band pixels.

Description

Endoscope apparatus, method of operating endoscope apparatus, and program
The present invention relates to an endoscope apparatus, a method of operating the endoscope apparatus, a program, and the like.
In the medical and industrial fields, endoscope apparatuses are widely used for various examinations. In the medical field in particular, inserting a flexible insertion section having an elongated shape makes it possible to acquire in-vivo images of a body cavity without incising the subject, which reduces the burden on the subject. For this reason, endoscope apparatuses have become widespread. An image sensor having a plurality of pixels is provided at the distal end of the insertion section, and in-vivo images of the body cavity are captured by the image sensor.
Known observation modes of such an endoscope apparatus include a white light imaging (WLI) mode using white illumination light and a narrow-band light observation mode using narrow-band illumination light. An example of the narrow-band light observation mode is the NBI (Narrow Band Imaging) mode. In the NBI mode, illumination light consisting of narrow-band light within the wavelength band of blue light and narrow-band light within the wavelength band of green light is used.
Meanwhile, a simultaneous single-chip image sensor is generally used as the image sensor for acquiring images. In a simultaneous single-chip image sensor, a plurality of color filters of mutually different colors are arranged. In an image captured by such an image sensor, each pixel acquires only a single color, and the other colors are missing. Interpolation processing is therefore essential.
The interpolation processing is, for example, linear interpolation or bicubic interpolation, and interpolates the missing pixels. The interpolation therefore lowers the resolution. Furthermore, when an image sensor with complementary color filters is used, the pixel signals are mixtures of the RGB color signals, so the color separability between R, G, and B decreases.
A technique for improving the sense of resolution when a simultaneous image sensor is used is disclosed in, for example, Patent Document 1. In Patent Document 1, yellow and cyan complementary color filters and two green primary color filters form a basic array of 2 rows and 2 columns that is repeatedly arranged, with the two green primary color filters placed diagonally within the basic array. Direction-discriminating interpolation processing is then performed on the G signal obtained from the pixels provided with the green primary color filters, thereby improving the sense of resolution.
JP 2003-087804 A
In the white light observation mode, the G signal is input to the G channel of the display image. The technique of Patent Document 1 can improve the resolution of the G signal, in which the blood vessels and gland duct structures of living tissue are clearly depicted. In the narrow-band light observation mode, however, the G signal is not necessarily input to the G channel of the display image. In the NBI mode, for example, the B signal is input to the G channel of the display image. Therefore, the technique of Patent Document 1, which improves only the resolution of the G signal, cannot be expected to improve the resolution or the color separability in the narrow-band light observation mode. In the NBI mode, the blood vessel running pattern on the mucosal surface layer, or color tone changes such as a brownish area, are observed, so improved resolution or color separability is desired. For example, when distinguishing cancer from inflammation, the contrast or color tone change of blood vessels is observed. In bladder observation, for example, improved resolution or color separability is desired when distinguishing flat cancer (CIS: Cancer In Situ) from inflammatory lesions.
According to some aspects of the present invention, it is possible to provide an endoscope apparatus, a method of operating the endoscope apparatus, a program, and the like that can improve the resolution or the color separability in the narrow-band light observation mode while maintaining the image quality in the white light observation mode.
One aspect of the present invention relates to an endoscope apparatus including: a light source that emits narrow-band illumination light having a first narrow-band light and a second narrow-band light; an image sensor that has a color filter in which a filter unit is provided for each of a plurality of pixels, and that photoelectrically converts the light received by the plurality of pixels; and a processing circuit that selects, from among the pixel signals from the image sensor, a first narrow-band pixel signal having sensitivity to the first narrow-band light and a second narrow-band pixel signal having sensitivity to the second narrow-band light. The color filter has filter units that transmit the first narrow-band light in at least half of the total number of filter units, and filter units that transmit the second narrow-band light in at least half of the total number. The processing circuit performs a first interpolation process that interpolates the first narrow-band pixel signal without using the second narrow-band pixel signal, and a second interpolation process that interpolates the second narrow-band pixel signal without using the first narrow-band pixel signal.
Another aspect of the present invention relates to a method of operating an endoscope apparatus that includes a light source that emits narrow-band illumination light having a first narrow-band light and a second narrow-band light, and an image sensor having a color filter in which a filter unit is provided for each of a plurality of pixels. When the color filter has filter units that transmit the first narrow-band light in at least half of its filter units, and filter units that transmit the second narrow-band light in at least half of its filter units, the method selects, from among the pixel signals from the image sensor, a first narrow-band pixel signal having sensitivity to the first narrow-band light and a second narrow-band pixel signal having sensitivity to the second narrow-band light, and performs a first interpolation process that interpolates the first narrow-band pixel signal without using the second narrow-band pixel signal, and a second interpolation process that interpolates the second narrow-band pixel signal without using the first narrow-band pixel signal.
Another aspect of the present invention relates to a program for processing an image captured by an image sensor having a color filter in which a filter unit is provided for each of a plurality of pixels. When the color filter has filter units that transmit a first narrow-band light of narrow-band illumination light in at least half of its filter units, and filter units that transmit a second narrow-band light of the narrow-band illumination light in at least half of its filter units, the program causes a computer to execute the steps of: selecting, from among the pixel signals from the image sensor, a first narrow-band pixel signal having sensitivity to the first narrow-band light and a second narrow-band pixel signal having sensitivity to the second narrow-band light; and performing a first interpolation process that interpolates the first narrow-band pixel signal without using the second narrow-band pixel signal, and a second interpolation process that interpolates the second narrow-band pixel signal without using the first narrow-band pixel signal.
FIG. 1 illustrates a configuration example of an endoscope apparatus.
FIG. 2 illustrates an interlaced imaging operation using a complementary-color image sensor.
FIG. 3 illustrates processing performed by a pixel signal selection unit.
FIG. 4 illustrates an example of the sensitivity of pixel signals.
FIG. 5 illustrates an example of a color filter that is not a complementary-color filter.
FIG. 6 illustrates processing performed by an interpolation processing unit.
FIG. 7 illustrates color mixing in pixel signals.
FIG. 8 illustrates color mixing in pixel signals.
FIG. 9 illustrates a second example of direction-discriminating interpolation processing.
FIG. 10 illustrates a second example of direction-discriminating interpolation processing.
FIG. 11 illustrates a second example of direction-discriminating interpolation processing.
 The present embodiment will now be described. Note that the embodiment described below does not unduly limit the content of the present invention set forth in the claims, and not all of the configurations described in the embodiment are necessarily essential components of the invention.
 1. Endoscope Apparatus
 FIG. 1 is a configuration example of an endoscope apparatus. The endoscope apparatus includes an insertion section 200, a control device 300, a display section 400, an external I/F section 500, and a light source 100. The endoscope apparatus may be, for example, a flexible endoscope used for the digestive tract or the like, or a rigid endoscope used for laparoscopy or the like. The insertion section 200 is also referred to as a scope; the control device 300 as a main body or a processing device; the display section 400 as a display device; the external I/F section 500 as an operation section or an operation device; and the light source 100 as an illumination section or an illumination device.
 The endoscope apparatus has a white light observation mode and a narrowband light observation mode. The following description takes the case where the narrowband light observation mode is the NBI mode as an example. However, the narrowband light observation mode is not limited to the NBI mode and may be any observation mode that uses a plurality of narrowband lights of mutually different colors.
 The insertion section 200 is the part inserted into the body. The insertion section 200 includes a light guide 210, an illumination lens 220, an objective lens 230, an imaging element 240, and an A/D conversion circuit 250. The insertion section 200 may further include a memory 260. The imaging element 240 is also referred to as an image sensor. The insertion section 200 has a connector (not illustrated) and is attached to and detached from the control device 300 by that connector.
 The light source 100 includes a white light source 110, a filter 130, and a lens 120. The white light source 110 is, for example, an LED (Light Emitting Diode) or a xenon lamp and emits white illumination light. In the white light observation mode, the filter 130 is withdrawn from between the white light source 110 and the lens 120, and the white illumination light enters the lens 120 directly. In the NBI mode, the filter 130 is inserted between the white light source 110 and the lens 120; the white illumination light passes through the filter 130, and the light that has passed through the filter 130 enters the lens 120. The lens 120 condenses the incident light into the light guide 210.
 The filter 130 passes a first narrow band centered on a wavelength of 410 nm and a second narrow band centered on a wavelength of 540 nm. The first narrow band is, for example, 390 nm to 445 nm and belongs to the blue wavelength band of white light. The second narrow band is, for example, 530 nm to 550 nm and belongs to the green wavelength band of white light. Hereinafter, light in the first narrow band that has passed through the filter 130 is referred to as B narrowband light, and light in the second narrow band that has passed through the filter 130 is referred to as G narrowband light. Both narrowband lights are readily absorbed by hemoglobin in blood.
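As a minimal sketch of the two passbands (using the example band edges given above; the function name and the hard band edges are illustrative assumptions, not a specification of the filter 130):

```python
def passes_filter130(wavelength_nm):
    """True if the wavelength falls in either assumed passband:
    B narrow band 390-445 nm (centered near 410 nm) or
    G narrow band 530-550 nm (centered near 540 nm)."""
    return 390 <= wavelength_nm <= 445 or 530 <= wavelength_nm <= 550

print(passes_filter130(410), passes_filter130(480), passes_filter130(540))
# True False True
```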
 The configuration of the light source 100 is not limited to the above. For example, the light source 100 may include a white light source and a narrowband light source, with the white light source emitting white light in the white light observation mode and the narrowband light source emitting the B narrowband light and the G narrowband light in the NBI mode. In that case, the filter 130 is omitted.
 The light guide 210 guides the illumination light from the light source 100 to the distal end of the insertion section 200. The illumination lens 220 irradiates the subject with the illumination light guided by the light guide 210. In the present embodiment, the subject is a living body. Light reflected from the subject enters the objective lens 230, which forms a subject image, and the imaging element 240 captures that subject image.
 The imaging element 240 includes a plurality of pixels that photoelectrically convert the subject image, and acquires pixel signals from the plurality of pixels. The imaging element 240 may read out a pixel signal for each individual pixel, or may acquire pixel signals by addition readout from a plurality of pixels. The imaging element 240 is a simultaneous single-chip imaging element that obtains pixel signals of a plurality of colors in one image capture. The imaging element 240 is, for example, a complementary-color image sensor, but is not limited to a complementary-color image sensor.
 The A/D conversion circuit 250 converts the analog pixel signals from the imaging element 240 into digital pixel signals. The A/D conversion circuit 250 may be built into the imaging element 240.
 The control device 300 performs signal processing including image processing and controls each part of the endoscope apparatus. The control device 300 includes a processing circuit 310 and a control circuit 320, each of which is realized by, for example, an integrated circuit device. The control circuit 320 may be configured as a circuit separate from the processing circuit 310, or may be included in the processing circuit 310 as a control unit. The processing circuit 310 may be constituted by, for example, a processor described later.
 The control circuit 320 controls each part of the endoscope apparatus. For example, the user operates the external I/F section 500 to set the observation mode. When the white light observation mode is set, the control circuit 320 causes the light source 100 to emit white light and causes the processing circuit 310 to execute the image processing for the white light observation mode. When the NBI mode is set, the control circuit 320 causes the light source 100 to emit the B narrowband light and the G narrowband light and causes the processing circuit 310 to execute the image processing for the NBI mode.
 The memory 260 of the insertion section 200 stores information about the insertion section 200. The control circuit 320 controls each part of the endoscope apparatus based on the information read from the memory 260. For example, the memory 260 stores information about the imaging element 240, such as information about the arrangement of its color filter. Based on the information about the imaging element 240 read from the memory 260, the control circuit 320 causes the processing circuit 310 to perform the image processing corresponding to that information.
 The processing circuit 310 performs image processing based on the pixel signals from the A/D conversion circuit 250 to generate a display image, and outputs the display image to the display section 400. The display section 400 is, for example, a liquid crystal display device and displays the display image from the processing circuit 310. The processing circuit 310 includes a pixel signal selection unit 311 and an interpolation processing unit 312.
 In the white light observation mode, the interpolation processing unit 312 generates a white light image by interpolating the pixel signals and outputs the white light image as the display image. In the NBI mode, the pixel signal selection unit 311 selects the pixel signals of some of the colors as luminance component pixel signals and the pixel signals of the remaining colors as non-luminance component pixel signals. The luminance component pixel signals are pixel signals having sensitivity to the B narrowband light; the non-luminance component pixel signals are pixel signals having sensitivity to the G narrowband light. The interpolation processing unit 312 generates a luminance component image by performing a first interpolation process on the luminance component pixel signals and generates a non-luminance component image by performing a second interpolation process on the non-luminance component pixel signals. The processing circuit 310 performs the first interpolation process and the second interpolation process independently, as separate processes. The first and second interpolation processes may use the same algorithm, but they are independent as processes. For example, when direction-discriminating interpolation is performed, the processing circuit 310 discriminates the edge direction based on the luminance component pixel signals in the first interpolation process and interpolates the luminance component pixel signals based on the discrimination result; the non-luminance component pixel signals are not used in the first interpolation process. Likewise, the processing circuit 310 discriminates the edge direction based on the non-luminance component pixel signals in the second interpolation process and interpolates the non-luminance component pixel signals based on the discrimination result; the luminance component pixel signals are not used in the second interpolation process. The processing circuit 310 inputs the luminance component image to the G channel and the B channel of the display image and inputs the non-luminance component image to the R channel of the display image. An NBI image is thereby output as the display image. The display image may be either a moving image or a still image.
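The independence of the two interpolation processes and the NBI channel assignment can be sketched as follows. This is not the patent's implementation: the checkerboard sample layout and the simple 4-neighbor averaging are illustrative assumptions standing in for the actual mosaic and interpolation algorithm.

```python
import numpy as np

def interpolate_checkerboard(mosaic, mask):
    """Fill missing samples (mask == False) with the average of the
    available 4-neighbors, using only samples of the same signal."""
    h, w = mosaic.shape
    out = mosaic.astype(float).copy()
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                continue  # measured sample, keep as-is
            neighbors = [mosaic[ny, nx]
                         for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                         if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]]
            out[y, x] = sum(neighbors) / len(neighbors)
    return out

# Assumed layout: B-narrowband samples where (x + y) is even,
# G-narrowband samples where (x + y) is odd.
h, w = 4, 4
yy, xx = np.mgrid[0:h, 0:w]
b_mask = (xx + yy) % 2 == 0
raw = np.arange(h * w, dtype=float).reshape(h, w)

# First interpolation: B-narrowband samples only (G samples never used).
b_plane = interpolate_checkerboard(np.where(b_mask, raw, 0.0), b_mask)
# Second interpolation: G-narrowband samples only (B samples never used).
g_plane = interpolate_checkerboard(np.where(~b_mask, raw, 0.0), ~b_mask)

# NBI channel assignment: luminance image feeds G and B, the other feeds R.
display = np.stack([g_plane, b_plane, b_plane], axis=-1)  # R, G, B
print(display.shape)  # (4, 4, 3)
```

Because each plane is filled only from its own samples, no pixel addition across the two narrowband signals occurs, which is the property the embodiment relies on.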
 The external I/F section 500 is an interface for receiving input from the user to the endoscope apparatus, that is, an interface for operating the endoscope apparatus or for configuring its operation settings. For example, it includes a button, dial, lever, or the like for selecting the observation mode.
 The operation and processing of the endoscope apparatus according to the present embodiment are described below. Details are given later, and the later-described embodiments are cited here where appropriate.
 The endoscope apparatus includes the light source 100, the imaging element 240, and the processing circuit 310. The light source 100 emits narrowband illumination light having first narrowband light and second narrowband light. The imaging element 240 has a color filter in which a filter unit is provided for each of a plurality of pixels, and photoelectrically converts the light received by the plurality of pixels. The processing circuit 310 selects, from among the pixel signals from the imaging element 240, a first narrowband pixel signal having sensitivity to the first narrowband light and a second narrowband pixel signal having sensitivity to the second narrowband light. The color filter has filter units that transmit the first narrowband light in at least half of the total number, and filter units that transmit the second narrowband light in at least half of the total number. The processing circuit 310 then performs a first interpolation process that interpolates the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process that interpolates the second narrowband pixel signal without using the first narrowband pixel signal.
 When the narrowband light observation mode is the NBI mode, the first narrowband light is the B narrowband light and the second narrowband light is the G narrowband light. As described with reference to FIGS. 3 and 4, the processing circuit 310 selects the B narrowband signal as the first narrowband pixel signal and the G narrowband signal as the second narrowband pixel signal. As described with reference to FIG. 2, when the color filter is a complementary-color filter, the Mg and Cy filter units transmit the B narrowband light, and the Mg, Cy, Ye, and G filter units transmit the G narrowband light; in each case this is at least half of the total. As described with reference to FIG. 6, the processing circuit 310 independently performs the first interpolation process, which interpolates the B narrowband signal, and the second interpolation process, which interpolates the G narrowband signal. In the NBI mode, the G channel of the display image is generated based on the B narrowband signal, so the B narrowband signal serves as the luminance component pixel signal. However, the first narrowband pixel signal is not limited to a luminance component pixel signal; for example, the luminance component of the display image may be generated based on both the first narrowband pixel signal and the second narrowband pixel signal. Further, the narrowband light observation mode is not limited to the NBI mode and may be any mode in which illumination light including a plurality of narrowband lights is used.
 As described with reference to the following equations (1) to (4), an image captured by a complementary-color image sensor has conventionally been interpolated using pixel addition or the like, which can degrade the resolution or color separation of the image. In the narrowband light observation mode, on the other hand, high blood vessel contrast, that is, high resolution and good color separation of blood vessels, is desirable. According to the present embodiment, the first interpolation process on the first narrowband pixel signal and the second interpolation process on the second narrowband pixel signal are performed independently; that is, the image obtained from each narrowband illumination light is interpolated independently, without pixel addition. Because pixel addition and the like are unnecessary, the contrast of blood vessels can be improved; for example, CIS and inflammation are expected to be easier to distinguish in bladder observation. Moreover, in Patent Document 1 mentioned above, only the G signal attains high resolution, whereas in the present embodiment both narrowband images attain high resolution, so high-resolution imaging can be realized in the narrowband light observation mode.
 Further, according to the present embodiment, at least half of the filter units provided in the imaging element 240 transmit the first narrowband light and at least half transmit the second narrowband light, and the first narrowband pixel signal having sensitivity to the first narrowband light and the second narrowband pixel signal having sensitivity to the second narrowband light are selected. Consequently, of all the pixels constituting one image, half are pixels of the first narrow band and the remaining half are pixels of the second narrow band. Performing the first and second interpolation processes on such an image yields an image with a high sense of resolution.
 Further, as described with reference to the following equation (5) and the first modification, the processing circuit 310 performs color separation processing between a first narrowband image obtained by the first interpolation process and a second narrowband image obtained by the second interpolation process, based on those two images.
 As described with reference to FIGS. 7 and 8, the first narrowband pixel signal also has sensitivity to the second narrowband light, and the second narrowband pixel signal also has sensitivity to the first narrowband light; that is, color mixing exists in the first and second narrowband pixel signals. According to the present embodiment, color separation processing is performed between the first narrowband image and the second narrowband image, so this color mixing can be reduced. Reducing the color mixing improves the color separation in the narrowband light observation mode.
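Equation (5) is not reproduced in this excerpt, but one common way to realize such color separation is to invert a 2x2 crosstalk matrix per pixel. The matrix coefficients below are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Assumed crosstalk model: measured = M @ true, where the off-diagonal
# terms represent the color mixing between the two narrowband signals.
M = np.array([[1.0, 0.2],   # band-1 signal picks up 20% of band 2
              [0.3, 1.0]])  # band-2 signal picks up 30% of band 1

def color_separate(nb1, nb2, mix=M):
    """Undo the mixing per pixel: [true1, true2] = mix^-1 @ [nb1, nb2]."""
    inv = np.linalg.inv(mix)
    stacked = np.stack([nb1, nb2], axis=0).reshape(2, -1)
    true1, true2 = (inv @ stacked).reshape(2, *nb1.shape)
    return true1, true2

# A pure band-1 scene should come out with (nearly) zero band-2 content.
nb1_true = np.full((2, 2), 100.0)
nb2_true = np.zeros((2, 2))
nb1_meas = M[0, 0] * nb1_true + M[0, 1] * nb2_true
nb2_meas = M[1, 0] * nb1_true + M[1, 1] * nb2_true
sep1, sep2 = color_separate(nb1_meas, nb2_meas)
print(np.allclose(sep1, 100.0), np.allclose(sep2, 0.0))  # True True
```

In practice the mixing coefficients would be derived from the spectral sensitivities shown in the figures, not hard-coded.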
 Further, as described with reference to the following equations (12) and (13) and the third modification, the processing circuit 310 obtains, based on the brightness of the first narrowband pixel signal and the brightness of the second narrowband pixel signal, an index value RLB(x, y) indicating the reliability of direction discrimination in direction-discriminating interpolation processing. In the first and second interpolation processes, the processing circuit 310 performs direction-discriminating interpolation based on the index value RLB(x, y).
 Specifically, the index value RLB(x, y) indicates the degree of variation in brightness of the first and second narrowband pixel signals around the position (x, y). When this degree of variation is large, the reliability of the edge direction determination decreases. According to the present embodiment, the processing circuit 310 performs the interpolation such that the lower the reliability, the smaller the weight of the interpolation result along the edge direction. This reduces artifacts caused by the interpolation processing.
 Further, in the present embodiment, in the direction-discriminating interpolation processing, the processing circuit 310 blends the interpolation result along the edge direction with the non-direction-discriminating interpolation result at a blend ratio based on the index value RLB(x, y). The lower the reliability, the larger the blend ratio the processing circuit 310 assigns to the non-direction-discriminating interpolation result.
 In the following equation (13), the blend ratio of the non-direction-discriminating interpolation result is (1 - RLB(x, y)).
 According to the present embodiment, the lower the reliability, the larger the blend ratio of the non-direction-discriminating interpolation result. Thus, the larger the degree of variation in brightness of the first and second narrowband pixel signals, the larger the blend ratio of the non-direction-discriminating interpolation result and the smaller the blend ratio of the direction-discriminating interpolation result.
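The blend structure with ratio (1 - RLB(x, y)) can be sketched as follows. Only the blend itself is shown; how RLB is computed (equation (12)) is not reproduced here, and the function name is an assumption:

```python
import numpy as np

def blend_interpolation(directional, nondirectional, rlb):
    """Blend the edge-direction interpolation result with the
    non-directional result. rlb in [0, 1] is the reliability of the
    direction decision; low reliability shifts weight to the
    non-directional result with blend ratio (1 - rlb)."""
    rlb = np.clip(rlb, 0.0, 1.0)
    return rlb * directional + (1.0 - rlb) * nondirectional

directional = np.array([10.0, 10.0, 10.0])
nondirectional = np.array([20.0, 20.0, 20.0])
rlb = np.array([1.0, 0.5, 0.0])  # high, medium, low reliability
print(blend_interpolation(directional, nondirectional, rlb))  # [10. 15. 20.]
```

At full reliability the directional result is used as-is; at zero reliability the output falls back entirely to the non-directional result, which is the artifact-suppression behavior described above.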
 Further, as described with reference to the following equation (12), the processing circuit 310 obtains the index value RLB(x, y) based on the ratio between the brightness of the first narrowband pixel signal and the brightness of the second narrowband pixel signal.
 The ratio between the brightness of the first narrowband pixel signal and the brightness of the second narrowband pixel signal changes according to the degree of variation in brightness of the two signals. The blend ratio of the non-direction-discriminating interpolation result can therefore be controlled according to the degree of variation in brightness of the first and second narrowband pixel signals.
 Further, as described with reference to FIGS. 7 and 8 and the fourth modification, the first narrowband pixel signal includes a first pixel signal and a second pixel signal whose sensitivity to the first narrowband light is higher than that of the first pixel signal. The second narrowband pixel signal includes a third pixel signal and a fourth pixel signal whose sensitivity to the second narrowband light is higher than that of the third pixel signal. In the first interpolation process, the processing circuit 310 makes the weight of the second pixel signal larger than the weight of the first pixel signal; in the second interpolation process, it makes the weight of the fourth pixel signal larger than the weight of the third pixel signal.
 In FIGS. 7 and 8, the first pixel signal is the MgYe signal and the second pixel signal is the MgCy signal; the third pixel signal is the GCy signal and the fourth pixel signal is the GYe signal.
 According to the present embodiment, when interpolating the first narrowband image, increasing the weight of the pixel signals with higher sensitivity to the first narrowband light realizes a first interpolation process with a higher sense of resolution. Likewise, when interpolating the second narrowband image, increasing the weight of the pixel signals with higher sensitivity to the second narrowband light realizes a second interpolation process with a higher sense of resolution. The sense of resolution of the display image obtained by combining these images is therefore also improved.
 Further, in the present embodiment, the first narrowband pixel signal includes a first pixel signal and a second pixel signal corresponding to mutually different colors, and the second narrowband pixel signal includes a third pixel signal and a fourth pixel signal corresponding to mutually different colors. In the first interpolation process, the processing circuit 310 performs brightness correction between the first and second pixel signals based on the average value of the first pixel signal in the pixels surrounding a first target pixel and the average value of the second pixel signal in the pixels surrounding the first target pixel. In the second interpolation process, the processing circuit 310 performs brightness correction between the third and fourth pixel signals based on the average value of the third pixel signal in the pixels surrounding a second target pixel and the average value of the fourth pixel signal in the pixels surrounding the second target pixel.
 As illustrated in FIG. 3, the B narrowband signal, which is the first narrowband pixel signal, includes the MgYe signal as the first pixel signal and the MgCy signal as the second pixel signal. The G narrowband signal, which is the second narrowband pixel signal, includes the GCy signal as the third pixel signal and the GYe signal as the fourth pixel signal. For example, the processing circuit 310 multiplies the MgYe signal by (average of the MgCy signal) / (average of the MgYe signal), and multiplies the GCy signal by (average of the GYe signal) / (average of the GCy signal). That is, the processing circuit 310 performs brightness correction based on the average values of surrounding same-color pixels. This brightness correction is performed, for example, before the interpolation processing.
 According to the present embodiment, correction is performed so that the brightness levels of the first and second pixel signals match within the first narrowband image, and so that the brightness levels of the third and fourth pixel signals match within the second narrowband image. The interpolation processing can thereby be executed appropriately. For example, in direction-discriminating interpolation, the degree of brightness variation between the signals is reduced, so the accuracy of edge direction discrimination improves.
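The average-ratio brightness correction described above (e.g. scaling MgYe by avg(MgCy)/avg(MgYe)) can be sketched as follows. The function name, the tiny 2x2 patch, and the use of the whole patch as the "surrounding pixels" window are illustrative assumptions:

```python
import numpy as np

def level_match(sig_a, sig_b, mask_a, mask_b, eps=1e-6):
    """Scale the signal-A samples so that their average matches the
    average of the signal-B samples (same-color surrounding pixels)."""
    avg_a = sig_a[mask_a].mean()
    avg_b = sig_b[mask_b].mean()
    gain = avg_b / max(avg_a, eps)
    out = sig_a.copy()
    out[mask_a] *= gain
    return out, gain

# Hypothetical MgYe / MgCy samples on alternating rows of a small patch.
patch = np.array([[10.0, 12.0],    # MgYe row (assumed darker response)
                  [20.0, 22.0]])   # MgCy row
mgye_mask = np.array([[True, True], [False, False]])
mgcy_mask = ~mgye_mask
corrected, gain = level_match(patch, patch, mgye_mask, mgcy_mask)
print(round(gain, 3))  # avg(MgCy)/avg(MgYe) = 21/11 ≈ 1.909
```

After this step the MgYe and MgCy samples sit at a common brightness level, so a subsequent direction-discriminating pass sees edges rather than the checkered level offset between the two colors.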
 As described with reference to FIG. 4, the processing circuit 310 selects, as the first narrowband pixel signal, a pixel signal whose sensitivity to the first narrowband light is equal to or higher than its sensitivity to the second narrowband light, and selects, as the second narrowband pixel signal, a pixel signal whose sensitivity to the second narrowband light is higher than its sensitivity to the first narrowband light.
 According to the present embodiment, the first and second narrowband pixel signals are selected according to their sensitivities to the narrowband lights, so that the pixel signals can be separated into the first narrowband pixel signal and the second narrowband pixel signal. By separating the signals in this way and performing independent interpolation on each, a high-contrast display image is obtained.
 Also in the present embodiment, the processing circuit 310 adjusts the brightness level between the first narrowband pixel signal and the second narrowband pixel signal with reference to whichever of the two has the smaller color mixture.
 For example, suppose the second narrowband pixel signal has the smaller color mixture. In this case, the processing circuit 310 multiplies the first narrowband pixel signal by (average value of the second narrowband pixel signal) / (average value of the first narrowband pixel signal). This brightness level adjustment is performed, for example, before or after the interpolation process.
 As described with reference to FIG. 2, the color filter is a complementary color filter. Specifically, the color filter includes magenta, yellow, cyan, and green filter units.
 As described above, in the related art, an image captured by a complementary color image sensor is interpolated using pixel addition or the like. In contrast, according to the present embodiment, in the narrowband light observation mode, the first interpolation process that interpolates the first narrowband pixel signal and the second interpolation process that interpolates the second narrowband pixel signal are performed independently. Since this eliminates the need for pixel addition and the like, the contrast of blood vessels can be improved.
 As described with reference to FIG. 5, the color filter may be a color filter obtained by replacing the red filter units of a primary-color Bayer array with magenta filter units.
 The present invention can also be applied when an image sensor provided with such a color filter is used. In this case as well, the first interpolation process that interpolates the first narrowband pixel signal and the second interpolation process that interpolates the second narrowband pixel signal are performed independently in the narrowband light observation mode, so that the contrast of blood vessels can be improved.
 Also in the present embodiment, the processing circuit 310 generates the display image by inputting the first narrowband image obtained by the first interpolation process to the G channel of the display image.
 In this case, the first narrowband image is a luminance component image. That is, the processing circuit 310 selects the first narrowband pixel signal as the luminance component pixel signal.
 Also in the present embodiment, the first narrowband light belongs to the blue wavelength band of white light and has a wavelength band narrower than the blue wavelength band. The second narrowband light belongs to the green wavelength band of white light and has a wavelength band narrower than the green wavelength band. The processing circuit 310 inputs the first narrowband image to the G and B channels of the display image, and inputs the second narrowband image obtained by the second interpolation process to the R channel of the display image.
 According to the present embodiment, the NBI mode can be realized as the narrowband light observation mode. In the NBI mode, blood vessels near the mucosal surface can be imaged with high contrast. In addition, a lesion in which blood vessels are dense near the mucosal surface can be observed as a brownish area, that is, a region that appears brown. In the present embodiment, blood vessels can be imaged with high contrast in the NBI mode. For example, although both CIS and inflammation appear as brownish areas, the blood vessels are rendered with high contrast, so it is expected that CIS and inflammation become easier to distinguish.
 Also in the present embodiment, the light source 100 emits white illumination light in the white light observation mode, and emits narrowband illumination light in the narrowband light observation mode. The color filter of the image sensor 240 has a plurality of filter units for capturing a white light image when the white illumination light is emitted. The filter units that transmit the first narrowband light account for at least half of the plurality of filter units, and the filter units that transmit the second narrowband light also account for at least half of the plurality of filter units. In the narrowband light observation mode, the processing circuit 310 selects the first narrowband pixel signal and the second narrowband pixel signal, and performs the first interpolation process and the second interpolation process.
 When the image sensor 240 is a complementary color image sensor, the complementary color filter has Mg, Cy, Ye, and G filter units as shown in FIG. 2. These four filter units are intended for capturing a white light image; specifically, a white light image is obtained by the interpolation processing of equations (1) to (4) below. Here, the Mg and Cy filter units transmit the B narrowband light, and the Mg, Cy, Ye, and G filter units transmit the G narrowband light; in both cases this is at least two of the four units, that is, at least half. In the related art, pixel addition or the like, as in equations (1) to (4) below, is performed when capturing a white light image. According to the present embodiment, a high-contrast narrowband light image is obtained by independently interpolating the first narrowband pixel signal and the second narrowband pixel signal in the narrowband light observation mode.
 Note that the control device 300 of the present embodiment may be configured as follows. That is, the control device 300 of the present embodiment includes a memory that stores information and a processor that operates based on the information stored in the memory. The information is, for example, a program and various data. The processor includes hardware. When the narrowband illumination light is emitted, the processor selects, from the pixel signals from the image sensor 240, a first narrowband pixel signal having sensitivity to the first narrowband light and a second narrowband pixel signal having sensitivity to the second narrowband light. The processor performs a first interpolation process that interpolates the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process that interpolates the second narrowband pixel signal without using the first narrowband pixel signal.
 In the processor, the function of each unit may be realized by separate hardware, or the functions of the units may be realized by integrated hardware. For example, the processor includes hardware, and that hardware can include at least one of a circuit that processes digital signals and a circuit that processes analog signals. For example, the processor can be configured from one or more circuit devices mounted on a circuit board, or from one or more circuit elements. The one or more circuit devices are, for example, ICs. The one or more circuit elements are, for example, resistors and capacitors. The processor may be, for example, a CPU (Central Processing Unit). However, the processor is not limited to a CPU, and various processors such as a GPU (Graphics Processing Unit) or a DSP (Digital Signal Processor) can be used. The processor may also be a hardware circuit based on an ASIC. The processor may further include an amplifier circuit, a filter circuit, or the like for processing analog signals.
 The memory may be a semiconductor memory such as an SRAM or DRAM, a register, a magnetic storage device such as a hard disk device, or an optical storage device such as an optical disk device. For example, the memory stores computer-readable instructions, and when the instructions are executed by the processor, the functions of the units of the control device 300 are realized as processes. The instructions here may be the instructions of an instruction set constituting a program, or instructions that direct the operation of the processor's hardware circuits. For example, the processor realizes the function of the processing circuit 310 in FIG. 1. Alternatively, the processor realizes the functions of the processing circuit 310 and the control circuit 320 in FIG. 1.
 Each part of the endoscope apparatus of the present embodiment may also be realized as a module of a program that runs on the processor. For example, the program may include a pixel signal selection module that realizes the pixel signal selection unit 311 and an interpolation processing module that realizes the interpolation processing unit 312. When the narrowband illumination light is emitted, the pixel signal selection module selects, from the pixel signals from the image sensor 240, a first narrowband pixel signal having sensitivity to the first narrowband light and a second narrowband pixel signal having sensitivity to the second narrowband light. The interpolation processing module performs a first interpolation process that interpolates the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process that interpolates the second narrowband pixel signal without using the first narrowband pixel signal.
 The program that implements the processing performed by each unit of the control device 300 of the present embodiment can be stored in an information storage medium, which is a computer-readable medium. The information storage medium can be realized by, for example, an optical disk, a memory card, an HDD, or a semiconductor memory. The semiconductor memory is, for example, a ROM. The processing circuit 310 and the control circuit 320 perform the various processes of the present embodiment based on the program and data stored in the information storage medium. That is, the information storage medium stores a program for causing a computer to function as each unit of the endoscope apparatus of the present embodiment. The computer is a device including an input device, a processing unit, a storage unit, and an output unit. The program causes the computer to execute the processing of each unit and is recorded on the information storage medium. Here, various recording media readable by an optical detection system can be assumed as the information storage medium, such as optical disks (DVD, CD, etc.), magneto-optical disks, hard disks, and memories such as nonvolatile memory and RAM.
 2. Operation and Processing
 The detailed operation and processing of the endoscope apparatus will now be described.
 FIG. 2 is a diagram illustrating an interlaced imaging operation using a complementary color image sensor. In FIG. 2, X indicates the horizontal scanning direction of the image sensor 240, and Y indicates the vertical scanning direction of the image sensor 240.
 In the interlaced method, imaging of even fields and imaging of odd fields are performed alternately. A frame consists of an even field and an odd field, and the processing circuit 310 generates a frame image from the pixel signals obtained in the even field and the pixel signals obtained in the odd field.
 As shown in FIG. 2, pixels are arranged in a grid on the image sensor 240, and one filter unit is provided for each pixel. Filter units of four colors, magenta, green, cyan, and yellow, are arranged in two rows and two columns. In FIG. 2, Mg indicates magenta, G indicates green, Cy indicates cyan, and Ye indicates yellow; these symbols are used to denote the colors below. For example, a pixel provided with an Mg filter unit is referred to as an Mg pixel.
 In the interlaced method, the image sensor 240 reads out pixel signals by adding them over two pixels arranged in the Y direction. That is, in the even field, the image sensor 240 performs addition readout of the Mg pixels in the first row and the Cy pixels in the second row to obtain the MgCy signal, and performs addition readout of the G pixels in the first row and the Ye pixels in the second row to obtain the GYe signal. The MgCy and GYe signals constitute the first horizontal line. Similarly, the image sensor 240 performs addition readout of the pixels in the third and fourth rows to obtain the GCy and MgYe signals constituting the third horizontal line. In the odd field, the image sensor 240 performs addition readout of the pixels in the second and third rows to read out the GCy and MgYe signals constituting the second horizontal line, and performs addition readout of the pixels in the fourth and fifth rows to read out the MgCy and GYe signals constituting the fourth horizontal line. The first to fourth horizontal lines constitute an image of one frame.
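The field-by-field addition readout described above can be illustrated with a small simulation. The color layout below is inferred from the readout pairs named in the text (FIG. 2 itself is not reproduced here), so treat it as an assumption:

```python
# Assumed complementary mosaic consistent with the readout pairs in the text:
# rows alternate Mg/G and Cy/Ye, with Mg and G swapping every two rows.
rows = [["Mg", "G"], ["Cy", "Ye"], ["G", "Mg"], ["Cy", "Ye"]]
cfa = [[rows[y % 4][x % 2] for x in range(4)] for y in range(5)]

def add_readout(top_row):
    """Vertically add two adjacent rows: return the summed color name per column."""
    order = ["Mg", "G", "Cy", "Ye"]
    return ["".join(sorted({cfa[top_row][x], cfa[top_row + 1][x]},
                           key=order.index))
            for x in range(4)]

even_field = [add_readout(0), add_readout(2)]  # horizontal lines 1 and 3
odd_field  = [add_readout(1), add_readout(3)]  # horizontal lines 2 and 4
```

Each field thus yields alternating lines of MgCy/GYe and GCy/MgYe signals, matching the four horizontal lines described above.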
 First, the image generation processing in the white light observation mode will be described. The interpolation processing unit 312 converts the pixel signals into YCrCb signals according to equations (1) to (4) below, in which, for example, MgCy denotes the signal value of the MgCy pixel signal. The interpolation processing unit 312 interpolates the YCrCb signals to obtain a YCrCb signal at every pixel, and converts the YCrCb signals into RGB signals to generate the display image.
Figure JPOXMLDOC01-appb-M000001
Figure JPOXMLDOC01-appb-M000002
Figure JPOXMLDOC01-appb-M000003
Figure JPOXMLDOC01-appb-M000004
 Next, the image generation processing in the NBI mode will be described. FIG. 3 is a diagram illustrating the processing performed by the pixel signal selection unit 311. In FIG. 3, x is the position in the horizontal scanning direction, and y is the position in the vertical scanning direction. The positions x and y denote the pixel position on the image when the pixel signals are arranged on the image, and each is an integer. For example, the pixel in the first row and first column is represented as (x, y) = (1, 1). In the following, the horizontal scanning direction is also called the x direction, and the vertical scanning direction is also called the y direction.
 The MgCy, MgYe, GCy, and GYe signals acquired in FIG. 2 are arranged as shown in FIG. 3. The pixel signal selection unit 311 selects the MgCy and MgYe signals as the B narrowband signal, and selects the GCy and GYe signals as the G narrowband signal. In the NBI mode, the G channel of the display image is generated based on the B narrowband signal, so the B narrowband signal is the luminance component pixel signal. Focusing on a 2 × 2 pixel region, the B narrowband signals are arranged at (x, y) = (0, 1) and (1, 0), and the G narrowband signals are arranged at (x, y) = (0, 0) and (1, 1). A pixel in which a B narrowband signal is arranged is called a B narrowband pixel. As shown in FIG. 3, the B narrowband pixels and the G narrowband pixels each occupy half of all pixels, and in both the x and y directions, B narrowband pixels and G narrowband pixels alternate.
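The selection rule can be sketched as a simple classification of the four summed signals. The line pattern below follows the readout order derived from FIG. 2 and yields the checkerboard of B and G narrowband pixels described in the text:

```python
# Selection rule from the text: MgCy/MgYe -> B narrowband, GCy/GYe -> G narrowband.
B_NARROW = {"MgCy", "MgYe"}

def select(signal_name):
    """Classify one summed pixel signal as a B or G narrowband signal."""
    return "B" if signal_name in B_NARROW else "G"

# The four horizontal lines of one frame, per the readout order of FIG. 2
# (the 2-column pattern repeats across each line):
frame = [["MgCy", "GYe"], ["GCy", "MgYe"], ["MgCy", "GYe"], ["GCy", "MgYe"]]
labels = [[select(s) for s in row] for row in frame]
```

The resulting label map alternates B and G in both directions, so each narrowband signal covers exactly half of the pixels.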
 FIG. 4 shows an example of the sensitivities of the pixel signals. In FIG. 4, the sensitivity to the G narrowband light and the sensitivity to the B narrowband light are expressed as relative numerical values.
 The pixel signal selection unit 311 selects, as the B narrowband signal, the MgCy and MgYe signals, whose sensitivity to the B narrowband light is equal to or higher than their sensitivity to the G narrowband light. Likewise, the pixel signal selection unit 311 selects, as the G narrowband signal, the GCy and GYe signals, whose sensitivity to the G narrowband light is equal to or higher than their sensitivity to the B narrowband light. As a result, B narrowband pixels and G narrowband pixels arranged as shown in FIG. 3 are obtained.
 Note that the color filter of the image sensor 240 is not limited to a complementary color filter. FIG. 5 shows an example of a color filter that is not a complementary color filter.
 In FIG. 5, the R filters of the Bayer array are replaced with Mg filters. That is, G filters are arranged at (x, y) = (0, 0) and (1, 1), a B filter is arranged at (x, y) = (1, 0), and an Mg filter is arranged at (x, y) = (0, 1). This 2 × 2 pixel pattern is repeated across the image sensor 240. The image sensor 240 reads out a G signal, a B signal, and an Mg signal from the G, B, and Mg pixels, respectively. The B and Mg filters transmit the B narrowband light, and the G filters transmit the G narrowband light. The pixel signal selection unit 311 therefore selects the B and Mg signals as the B narrowband signal and the G signals as the G narrowband signal. This yields the same arrangement of B narrowband pixels and G narrowband pixels as with the complementary color filter, and the subsequent interpolation processing and the like can be applied in the same way.
 FIG. 6 is a diagram illustrating the processing performed by the interpolation processing unit 312. The interpolation processing unit 312 interpolates the B narrowband signal and the G narrowband signal independently of each other.
 The pixel signals input from the pixel signal selection unit 311 to the interpolation processing unit 312 are pixel signals of one color per pixel. That is, the B narrowband signal is missing in half of the pixels, and the G narrowband signal is missing in the remaining half.
 The interpolation processing unit 312 performs interpolation on the B narrowband signal from the pixel signal selection unit 311. The interpolation is, for example, a 5 × 5 pixel smoothing filter process, although the smoothing range is not limited to 5 × 5 pixels. By this interpolation, the B narrowband signal is filled in at the pixels where it is missing. The image constituted by the B narrowband signal is hereinafter called the B narrowband image. The interpolation processing unit 312 then performs edge enhancement on the B narrowband image. That is, the interpolation processing unit 312 extracts high-frequency components from the B narrowband image and adds them to the B narrowband image. For example, the interpolation processing unit 312 extracts the high-frequency components by applying a 7 × 7 pixel adaptive low-pass filter to the B narrowband image and then applying a 7 × 7 pixel high-pass filter to the result. The interpolation processing unit 312 outputs the edge-enhanced B narrowband signal as the final B narrowband signal. The content of the edge enhancement is not limited to the above, and the edge enhancement may be omitted.
 Similarly, the interpolation processing unit 312 performs interpolation on the G narrowband signal from the pixel signal selection unit 311. The interpolation filter is, for example, a 5 × 5 pixel smoothing filter, although the smoothing range is not limited to 5 × 5 pixels. By this interpolation, the G narrowband signal is filled in at the pixels where it is missing. The image constituted by the G narrowband signal is also called the G narrowband image. The interpolation processing unit 312 outputs the interpolated G narrowband signal.
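A minimal sketch of one such independent interpolation pass, assuming a plain 5 × 5 local-mean filter over the known samples (the patent's actual smoothing kernel and edge enhancement are not specified here):

```python
def interpolate_missing(mosaic, mask, win=5):
    """Fill positions where mask is False with the local mean of known samples.
    The B and G narrowband images are each produced by one independent pass,
    so neither channel's values leak into the other."""
    pad = win // 2
    h, w = len(mosaic), len(mosaic[0])
    out = [row[:] for row in mosaic]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                continue  # known sample: keep as-is
            vals = [mosaic[j][i]
                    for j in range(max(0, y - pad), min(h, y + pad + 1))
                    for i in range(max(0, x - pad), min(w, x + pad + 1))
                    if mask[j][i]]
            out[y][x] = sum(vals) / len(vals)
    return out

# B samples on one checkerboard phase; the other phase is missing.
b_mask = [[(y + x) % 2 == 1 for x in range(6)] for y in range(6)]
b_mosaic = [[3.0 if b_mask[y][x] else 0.0 for x in range(6)] for y in range(6)]
b_image = interpolate_missing(b_mosaic, b_mask)
```

Running the same function separately on the G narrowband checkerboard produces the G narrowband image without mixing the two signals.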
 The interpolation processing unit 312 assigns the B narrowband image to the G and B channels of the display image, and assigns the G narrowband image to the R channel of the display image. A display image in the NBI mode is thereby generated.
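The channel assignment can be sketched directly (a per-pixel RGB tuple order is assumed for the display image):

```python
def compose_nbi(b_image, g_image):
    """Assemble the NBI display image: the B narrowband image goes to the
    G and B channels, the G narrowband image to the R channel (RGB order)."""
    return [[(g_image[y][x], b_image[y][x], b_image[y][x])
             for x in range(len(b_image[0]))]
            for y in range(len(b_image))]

b_img = [[0.8, 0.8], [0.8, 0.8]]   # interpolated B narrowband image
g_img = [[0.3, 0.3], [0.3, 0.3]]   # interpolated G narrowband image
rgb = compose_nbi(b_img, g_img)
```

Because the B narrowband image drives both the luminance-dominant G channel and the B channel, surface vessels absorbing the B narrowband light appear dark against a pseudo-color background.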
 3. Modifications
 Various modifications will be described below. The following modifications can be combined as appropriate.
 In the first modification, the interpolation processing unit 312 performs color separation processing after the interpolation processing. FIGS. 7 and 8 are diagrams illustrating color mixture in the pixel signals. In FIGS. 7 and 8, BN indicates the spectrum of the B narrowband light, and GN indicates the spectrum of the G narrowband light.
 FIG. 7 shows the spectra of the MgCy and MgYe signals selected as the B narrowband signal; these signals have sensitivity not only to the B narrowband light but also to the G narrowband light. FIG. 8 shows the spectra of the GCy and GYe signals selected as the G narrowband signal; these signals have sensitivity not only to the G narrowband light but also to the B narrowband light. Since the B narrowband light and the G narrowband light illuminate the subject simultaneously, G narrowband components mix into the B narrowband signal, and B narrowband components mix into the G narrowband signal.
 The interpolation processing unit 312 reduces this color mixture by performing the color separation processing shown in equation (5) below. In equation (5), B(x, y) is the B narrowband signal at position (x, y), and G(x, y) is the G narrowband signal at position (x, y). The interpolation processing unit 312 performs the color separation processing of the B narrowband signal, for example, between the interpolation processing and the edge enhancement processing of FIG. 6.
Figure JPOXMLDOC01-appb-M000005
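Equation (5) appears only as an image in the publication. The sketch below shows a generic per-pixel 2 × 2 linear unmixing of the kind described; the mixing coefficients ALPHA and BETA are hypothetical placeholders, not the patent's values:

```python
# Hypothetical mixing fractions (assumptions, not from the patent): the share
# of G narrowband light seen by the B signal, and of B light seen by the G signal.
ALPHA = 0.3
BETA = 0.2

def color_separate(b, g, alpha=ALPHA, beta=BETA):
    """Per-pixel linear unmixing: solve b = b_true + alpha * g_true and
    g = g_true + beta * b_true for the pure narrowband components."""
    det = 1.0 - alpha * beta
    b_true = (b - alpha * g) / det
    g_true = (g - beta * b) / det
    return b_true, g_true

# Synthetic mixed samples built from b_true = 1.0 and g_true = 2.0:
b_mixed = 1.0 + ALPHA * 2.0
g_mixed = 2.0 + BETA * 1.0
b_t, g_t = color_separate(b_mixed, g_mixed)
```

With coefficients matched to the actual spectral sensitivities of FIG. 7 and FIG. 8, such an unmixing recovers the pure B and G narrowband components at every pixel.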
 In the second modification, the interpolation processing unit 312 performs direction-discriminating interpolation; that is, the interpolation processing of FIG. 6 becomes direction-discriminating interpolation. The interpolation of the B narrowband signal is described below as an example, but the interpolation of the G narrowband signal is performed in the same way.
 まず方向判別型の補間処理の第1例について説明する。この例では3×3画素の範囲を用いて補間処理を行う。補間処理部312は、(x,y)におけるエッジ方向の判定を行い、その判定結果に基づいて下式(6)~(8)のいずれかを選択する。(x,y)は、B狭帯域信号が欠落した位置である。エッジ方向がy方向と判定された場合、補間処理部312は下式(6)によりB狭帯域信号を求める。エッジ方向がx方向と判定された場合、補間処理部312は下式(7)によりB狭帯域信号を求める。エッジがないと判定された場合、補間処理部312は下式(8)によりB狭帯域信号を求める。
Figure JPOXMLDOC01-appb-M000006
Figure JPOXMLDOC01-appb-M000007
Figure JPOXMLDOC01-appb-M000008
First, a first example of the direction determination type interpolation processing will be described. In this example, interpolation processing is performed using a range of 3 × 3 pixels. The interpolation processing unit 312 determines the edge direction at (x, y), and selects one of the following equations (6) to (8) based on the determination result. (X, y) is the position where the B narrowband signal is missing. When it is determined that the edge direction is the y direction, the interpolation processing unit 312 obtains a B narrowband signal by the following equation (6). When it is determined that the edge direction is the x direction, the interpolation processing unit 312 obtains a B narrowband signal by the following equation (7). When it is determined that there is no edge, the interpolation processing unit 312 obtains a B narrowband signal by the following equation (8).
Figure JPOXMLDOC01-appb-M000006
Figure JPOXMLDOC01-appb-M000007
Figure JPOXMLDOC01-appb-M000008
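The three-way selection above can be sketched as follows. Equations (6) to (8) appear only as images in the publication, so the gradient-based edge test, the neighbor pattern, and the threshold below are illustrative stand-ins, not the patented formulas.

```python
import numpy as np

def interpolate_b(img, x, y, edge_thresh=10.0):
    """Direction-discriminating interpolation at a missing-B position (x, y).

    img holds the B narrowband samples on the surrounding known pixels.
    A large horizontal difference means the edge runs along y, so we
    interpolate along the edge (vertically), and vice versa.
    """
    up, down = img[y - 1, x], img[y + 1, x]
    left, right = img[y, x - 1], img[y, x + 1]
    grad_x = abs(left - right)  # large -> edge runs in the y direction
    grad_y = abs(up - down)     # large -> edge runs in the x direction
    if grad_x > grad_y + edge_thresh:        # edge in y direction: cf. eq. (6)
        return (up + down) / 2.0
    if grad_y > grad_x + edge_thresh:        # edge in x direction: cf. eq. (7)
        return (left + right) / 2.0
    return (up + down + left + right) / 4.0  # no edge: cf. eq. (8)
```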
 Next, a second example of the direction-discriminating interpolation process will be described. In this example, the interpolation is performed using a range wider than 3 × 3 pixels. FIGS. 9 to 11 illustrate the second example of the direction-discriminating interpolation process; in FIGS. 9 to 11, the range of pixels used for the interpolation is indicated by a dotted line, and the bold square at the center indicates the pixel to be interpolated.
 The interpolation processing unit 312 determines the edge direction at (x, y) and selects one of the following equations (9) to (11) based on the determination result. When the edge direction is determined to be the y direction, the interpolation processing unit 312 obtains the B narrowband signal by equation (9) using the pixels in the range shown in FIG. 9. When the edge direction is determined to be the x direction, the interpolation processing unit 312 obtains the B narrowband signal by equation (10) using the pixels in the range shown in FIG. 10. When it is determined that there is no edge, the interpolation processing unit 312 obtains the B narrowband signal by equation (11) using the pixels in the range shown in FIG. 11.
Figure JPOXMLDOC01-appb-M000009
Figure JPOXMLDOC01-appb-M000010
Figure JPOXMLDOC01-appb-M000011
 In the third modification, the result of interpolation along the edge direction and the result of interpolation without direction discrimination are blended based on a reliability. The interpolation process for the B narrowband signal is described below as an example, but the interpolation process for the G narrowband signal is performed in the same way.
 The interpolation processing unit 312 obtains a reliability index value RLB(x, y) by the following equation (12). ave_MgCy is the average value of the MgCy signal at (x, y), obtained from the MgCy signals of the pixels surrounding (x, y); ave_MgYe is defined in the same way. The index value RLB(x, y) is a value in the range 0 < RLB(x, y) ≤ 1, and the larger the index value RLB(x, y), the higher the reliability.
Figure JPOXMLDOC01-appb-M000012
 The interpolation processing unit 312 obtains the B narrowband signal at (x, y) by the following equation (13). Bedge(x, y) is the result of interpolation along the edge direction; when the direction-discriminating interpolation of equations (6) to (8) is used, Bedge(x, y) is the calculation result of equation (6) or (7). Bave(x, y) is the result of interpolation without direction discrimination, for example the calculation result of equation (8).
Figure JPOXMLDOC01-appb-M000013
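The reliability-weighted blend can be sketched as follows. Equations (12) and (13) appear only as images in the publication, so the concrete form of RLB used here, the min/max brightness ratio of the two signals clipped to (0, 1], is an illustrative assumption; only the blending structure (higher reliability gives more weight to the edge-directed result) follows the text.

```python
import numpy as np

def blend_interpolation(mgcy, mgye, x, y, b_edge, b_ave, radius=1):
    """Blend edge-directed and non-directional interpolation results.

    mgcy / mgye: 2-D arrays of the MgCy and MgYe pixel signals.
    b_edge / b_ave: the two interpolation results at (x, y).
    """
    ys = slice(y - radius, y + radius + 1)
    xs = slice(x - radius, x + radius + 1)
    ave_mgcy = float(np.mean(mgcy[ys, xs]))
    ave_mgye = float(np.mean(mgye[ys, xs]))
    # Assumed reliability index: similar local brightness -> high reliability.
    rlb = min(ave_mgcy, ave_mgye) / max(ave_mgcy, ave_mgye)  # 0 < RLB <= 1
    # Blend in the spirit of equation (13): low reliability favors b_ave.
    return rlb * b_edge + (1.0 - rlb) * b_ave
```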
 In the fourth modification, the weights of the pixel signals in the interpolation process are set based on the sensitivity of each pixel signal to the narrowband light. The interpolation process for the B narrowband signal is described below as an example, but the interpolation process for the G narrowband signal is performed in the same way.
 As shown in FIG. 7, comparing the MgCy signal and the MgYe signal, the MgCy signal has the higher sensitivity to the B narrowband light. For this reason, in the interpolation process for the B narrowband signal, the weight of the MgCy signal is set higher than the weight of the MgYe signal. For example, in equations (9) to (11), the MgCy terms are multiplied by a weight W_MgCy and the MgYe terms are multiplied by a weight W_MgYe, where W_MgCy > W_MgYe. W_MgCy and W_MgYe are set in advance based on, for example, the spectral characteristics of the pixel signals.
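A sensitivity-weighted average in this spirit can be sketched as follows. The numeric weights and the plain weighted mean are illustrative assumptions; the publication only requires W_MgCy > W_MgYe, with the actual values preset from the pixel spectra.

```python
import numpy as np

W_MGCY = 0.7  # assumed weight; in practice preset from the pixel spectra
W_MGYE = 0.3  # W_MgCy > W_MgYe because MgCy is more sensitive to B light

def weighted_b_average(mgcy_vals, mgye_vals):
    """Sensitivity-weighted average for B narrowband interpolation.

    mgcy_vals / mgye_vals: the MgCy and MgYe neighbor samples that a
    formula such as equations (9) to (11) would combine.
    """
    mgcy_vals = np.asarray(mgcy_vals, dtype=float)
    mgye_vals = np.asarray(mgye_vals, dtype=float)
    num = W_MGCY * mgcy_vals.sum() + W_MGYE * mgye_vals.sum()
    den = W_MGCY * mgcy_vals.size + W_MGYE * mgye_vals.size
    return num / den
```

With equal weights this reduces to a plain mean; with W_MgCy > W_MgYe the result is pulled toward the MgCy samples, which carry the stronger B narrowband component.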
 The embodiments to which the present invention is applied and their modifications have been described above, but the present invention is not limited to the embodiments and modifications as they are; in the implementation stage, the constituent elements can be modified and embodied without departing from the gist of the invention. Various inventions can be formed by appropriately combining a plurality of constituent elements disclosed in the above embodiments and modifications. For example, some constituent elements may be deleted from all the constituent elements described in the embodiments and modifications, and constituent elements described in different embodiments and modifications may be appropriately combined. Various modifications and applications are thus possible without departing from the gist of the invention. Further, a term that appears at least once in the specification or drawings together with a different term that is broader or synonymous can be replaced with that different term anywhere in the specification or drawings.
100 light source, 110 white light source, 120 lens, 130 filter, 200 insertion section, 210 light guide, 220 illumination lens, 230 objective lens, 240 image sensor, 250 A/D conversion circuit, 260 memory, 300 control device, 310 processing circuit, 311 pixel signal selection section, 312 interpolation processing section, 320 control circuit, 400 display section, 500 external I/F section, Mg, Cy, Ye, G filter units, MgCy, MgYe, GCy, GYe pixel signals

Claims (17)

  1.  An endoscope apparatus comprising:
     a light source that emits narrowband illumination light including first narrowband light and second narrowband light;
     an image sensor that has a color filter in which a filter unit is provided for each of a plurality of pixels, and that photoelectrically converts light received by the plurality of pixels; and
     a processing circuit that selects, from among pixel signals output from the image sensor, a first narrowband pixel signal having sensitivity to the first narrowband light and a second narrowband pixel signal having sensitivity to the second narrowband light,
     wherein the color filter includes filter units that transmit the first narrowband light in a number that is at least half of the total number of filter units, and filter units that transmit the second narrowband light in a number that is at least half of the total number, and
     wherein the processing circuit performs a first interpolation process that interpolates the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process that interpolates the second narrowband pixel signal without using the first narrowband pixel signal.
  2.  The endoscope apparatus as defined in claim 1,
     wherein the processing circuit performs a color separation process between a first narrowband image obtained by the first interpolation process and a second narrowband image obtained by the second interpolation process, based on the first narrowband image and the second narrowband image.
  3.  The endoscope apparatus as defined in claim 1,
     wherein the processing circuit obtains, based on brightness of the first narrowband pixel signal and brightness of the second narrowband pixel signal, an index value indicating reliability of direction determination in direction-discriminating interpolation, and
     performs the direction-discriminating interpolation based on the index value in the first interpolation process and the second interpolation process.
  4.  The endoscope apparatus as defined in claim 3,
     wherein the processing circuit blends, in the direction-discriminating interpolation, a result of interpolation along an edge direction and a result of non-direction-discriminating interpolation at a blend ratio based on the index value, and
     increases the blend ratio of the result of the non-direction-discriminating interpolation as the reliability becomes lower.
  5.  The endoscope apparatus as defined in claim 3,
     wherein the processing circuit obtains the index value based on a ratio between the brightness of the first narrowband pixel signal and the brightness of the second narrowband pixel signal.
  6.  The endoscope apparatus as defined in claim 1,
     wherein the first narrowband pixel signal includes a first pixel signal and a second pixel signal having a higher sensitivity to the first narrowband light than the first pixel signal,
     the second narrowband pixel signal includes a third pixel signal and a fourth pixel signal having a higher sensitivity to the second narrowband light than the third pixel signal, and
     the processing circuit makes a weight of the second pixel signal larger than a weight of the first pixel signal in the first interpolation process, and makes a weight of the fourth pixel signal larger than a weight of the third pixel signal in the second interpolation process.
  7.  The endoscope apparatus as defined in claim 1,
     wherein the first narrowband pixel signal includes a first pixel signal and a second pixel signal corresponding to mutually different colors,
     the second narrowband pixel signal includes a third pixel signal and a fourth pixel signal corresponding to mutually different colors, and
     the processing circuit performs, in the first interpolation process, brightness correction between the first pixel signal and the second pixel signal based on an average value of the first pixel signal in pixels surrounding a first target pixel and an average value of the second pixel signal in the pixels surrounding the first target pixel, and
     performs, in the second interpolation process, brightness correction between the third pixel signal and the fourth pixel signal based on an average value of the third pixel signal in pixels surrounding a second target pixel and an average value of the fourth pixel signal in the pixels surrounding the second target pixel.
  8.  The endoscope apparatus as defined in claim 1,
     wherein the processing circuit selects, as the first narrowband pixel signal, a pixel signal whose sensitivity to the first narrowband light is equal to or higher than its sensitivity to the second narrowband light, and selects, as the second narrowband pixel signal, a pixel signal whose sensitivity to the second narrowband light is higher than its sensitivity to the first narrowband light.
  9.  The endoscope apparatus as defined in claim 1,
     wherein the processing circuit adjusts a brightness level between the first narrowband pixel signal and the second narrowband pixel signal with reference to whichever of the first narrowband pixel signal and the second narrowband pixel signal has the smaller color mixture.
  10.  The endoscope apparatus as defined in claim 1,
     wherein the color filter is a complementary color filter.
  11.  The endoscope apparatus as defined in claim 10,
     wherein the color filter includes magenta, yellow, cyan, and green filter units.
  12.  The endoscope apparatus as defined in claim 1,
     wherein the color filter is a color filter in which the red filter units of a primary color Bayer array color filter are replaced with magenta filter units.
  13.  The endoscope apparatus as defined in claim 1,
     wherein the processing circuit generates a display image by inputting a first narrowband image obtained by the first interpolation process to a G channel of the display image.
  14.  The endoscope apparatus as defined in claim 13,
     wherein the first narrowband light belongs to a blue wavelength band of white light and has a wavelength band narrower than the blue wavelength band,
     the second narrowband light belongs to a green wavelength band of white light and has a wavelength band narrower than the green wavelength band, and
     the processing circuit inputs the first narrowband image to the G channel and a B channel of the display image, and inputs a second narrowband image obtained by the second interpolation process to an R channel of the display image.
  15.  The endoscope apparatus as defined in claim 1,
     wherein the light source emits white illumination light in a white light observation mode and emits the narrowband illumination light in a narrowband light observation mode,
     the color filter has a plurality of filter units for capturing a white light image when the white illumination light is emitted,
     the filter units that transmit the first narrowband light number at least half of the plurality of filter units, and the filter units that transmit the second narrowband light number at least half of the plurality of filter units, and
     the processing circuit selects the first narrowband pixel signal and the second narrowband pixel signal and performs the first interpolation process and the second interpolation process in the narrowband light observation mode.
  16.  A method of operating an endoscope apparatus that includes a light source that emits narrowband illumination light including first narrowband light and second narrowband light, and an image sensor having a color filter in which a filter unit is provided for each of a plurality of pixels,
     the color filter having filter units that transmit the first narrowband light in a number that is at least half of the total number of filter units, and filter units that transmit the second narrowband light in a number that is at least half of the total number, the method comprising:
     selecting, from among pixel signals output from the image sensor, a first narrowband pixel signal having sensitivity to the first narrowband light and a second narrowband pixel signal having sensitivity to the second narrowband light; and
     performing a first interpolation process that interpolates the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process that interpolates the second narrowband pixel signal without using the first narrowband pixel signal.
  17.  A program for processing an image captured by an image sensor having a color filter in which a filter unit is provided for each of a plurality of pixels,
     the color filter having filter units that transmit first narrowband light of narrowband illumination light in a number that is at least half of the total number of filter units, and filter units that transmit second narrowband light of the narrowband illumination light in a number that is at least half of the total number,
     the program causing a computer to execute steps of:
     selecting, from among pixel signals output from the image sensor, a first narrowband pixel signal having sensitivity to the first narrowband light and a second narrowband pixel signal having sensitivity to the second narrowband light; and
     performing a first interpolation process that interpolates the first narrowband pixel signal without using the second narrowband pixel signal, and a second interpolation process that interpolates the second narrowband pixel signal without using the first narrowband pixel signal.
PCT/JP2018/028542 2018-07-31 2018-07-31 Endoscope device, and endoscope device operating method and program WO2020026323A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/028542 WO2020026323A1 (en) 2018-07-31 2018-07-31 Endoscope device, and endoscope device operating method and program


Publications (1)

Publication Number Publication Date
WO2020026323A1 true WO2020026323A1 (en) 2020-02-06

Family

ID=69230866

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/028542 WO2020026323A1 (en) 2018-07-31 2018-07-31 Endoscope device, and endoscope device operating method and program

Country Status (1)

Country Link
WO (1) WO2020026323A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013066146A (en) * 2011-08-31 2013-04-11 Sony Corp Image processing device, image processing method, and program
JP2015066132A (en) * 2013-09-27 2015-04-13 富士フイルム株式会社 Endoscope system and operation method thereof
JP2015116328A (en) * 2013-12-18 2015-06-25 オリンパス株式会社 Endoscope apparatus
WO2015093295A1 (en) * 2013-12-20 2015-06-25 オリンパス株式会社 Endoscopic device
WO2016084257A1 (en) * 2014-11-28 2016-06-02 オリンパス株式会社 Endoscope apparatus


Similar Documents

Publication Publication Date Title
US7944466B2 (en) Endoscope apparatus
US10159404B2 (en) Endoscope apparatus
US9582878B2 (en) Image processing device and operation method therefor
KR101009559B1 (en) Living body observation equipmnet
JP4996773B2 (en) Endoscope device
JP4847250B2 (en) Endoscope device
US10039439B2 (en) Endoscope system and method for operating the same
US10070771B2 (en) Image processing apparatus, method for operating image processing apparatus, computer-readable recording medium, and endoscope device
US10194849B2 (en) Endoscope system and method for operating the same
US10765295B2 (en) Image processing apparatus for detecting motion between two generated motion detection images based on images captured at different times, method for operating such image processing apparatus, computer-readable recording medium, and endoscope device
JP5808031B2 (en) Endoscope system
US20160089010A1 (en) Endoscope system, processor device, and method for operating endoscope system
US11045079B2 (en) Endoscope device, image processing apparatus, image processing method, and program
JP2006061620A (en) Video signal processor for endoscope
US20200169686A1 (en) Simultaneous Display of Two or More Different Sequentially Processed Images
WO2016079831A1 (en) Image processing device, image processing method, image processing program and endoscopic device
US20170251915A1 (en) Endoscope apparatus
JP2016015995A (en) Electronic endoscope system, and processor for electronic endoscope
US11596293B2 (en) Endoscope system and operation method therefor
JP6556076B2 (en) Endoscopic image signal processing apparatus and method, and program
JP2010200883A (en) Device, method, and program for processing endoscopic image
WO2020026323A1 (en) Endoscope device, and endoscope device operating method and program
JP7163386B2 (en) Endoscope device, method for operating endoscope device, and program for operating endoscope device
WO2017170232A1 (en) Endoscope image signal processing device, method, and program
WO2021152704A1 (en) Endoscope device, operating method for endoscope device, and image processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18928152

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18928152

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP