WO2017046876A1 - Endoscope system, image processing apparatus, and image processing method


Info

Publication number
WO2017046876A1
Authority
WIPO (PCT)
Prior art keywords
light
pixel
filter
wavelength band
unit
Application number
PCT/JP2015/076200
Other languages
French (fr)
Japanese (ja)
Inventor
Toshiaki Mikami (三上 俊彰)
Original Assignee
Olympus Corporation (オリンパス株式会社)
Application filed by Olympus Corporation
Publication of WO2017046876A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor

Definitions

  • The present invention relates to an endoscope system, an image processing apparatus, and an image processing method that generate an enhanced image, in which capillaries on the mucosal surface layer of a subject are emphasized, from image data output by an endoscope that is introduced into a living body and acquires in-vivo images.
  • A technique is known in which narrow-band illumination light, narrower than the wavelength band of light emitted from a white light source and included in the blue and green wavelength bands, is irradiated onto an observation site of a subject, and the capillaries and fine mucosal patterns on the mucosal surface are highlighted by imaging the light reflected at the observation site.
  • In this technique, the filter that transmits the green component (G filter) is given, in addition to a main sensitivity region sensitive to green light, a sub-sensitivity region sensitive to the blue narrow-band illumination light.
  • A correlation calculation is performed between the pixel values of the pixel that receives light transmitted through the G filter (G pixel) and the pixel that receives light transmitted through the filter that transmits the red component (R filter).
  • The component of the sub-sensitivity region is thereby extracted from the pixel value of the G pixel and is used together with the pixel value of the pixel (B pixel) that receives light transmitted through the filter that transmits the blue component (B filter).
  • In Patent Document 1, when the subject is irradiated with narrow-band illumination light, the pixel value of the R pixel is very small, so the gain applied to the pixel value of the R pixel is increased. At this time, the noise component contained in the pixel value of the R pixel is amplified as the sensitivity difference between the R pixel and the G pixel for the green narrow-band illumination light grows, so there is a problem that the SN ratio of the enhanced image obtained by the correlation calculation is lowered.
  • The present invention has been made in view of the above, and an object thereof is to provide an endoscope system, an image processing apparatus, and an image processing method capable of obtaining a high-resolution enhanced image without amplifying noise even when the subject is irradiated with narrow-band illumination light.
  • To solve the problems described above and achieve the object, an endoscope system according to the present invention includes: a light source device that emits narrow-band light composed of at least narrow-band first light included in the blue wavelength band and narrow-band second light included in the green wavelength band; an imaging element that generates an electrical signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional grid; a color filter; and an extraction unit.
  • The color filter is formed by arranging, in correspondence with the plurality of pixels, either first filter units, each comprising a plurality of filters including a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue and green wavelength bands, and a third filter that transmits at least light in the red wavelength band, or second filter units, each comprising a plurality of filters including the first filter, a fourth filter that transmits light in the blue and red wavelength bands, and a fifth filter that transmits at least light in the green wavelength band. In each filter unit, the number of filters that transmit light in the green wavelength band is at least half of the total number of filters, and the number of filters that transmit light in the blue wavelength band is greater than or equal to the number of filters that transmit light in the green wavelength band.
  • The extraction unit performs arithmetic processing for extracting the narrow-band light components from the pixel values generated by the plurality of pixels: by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels having sensitivity to the first light, the product of the pixel value of the pixel having the highest sensitivity to the first light and a first coefficient whose absolute value is 1 or less, it extracts at least the narrow-band light component obtained by the second light.
  • In the endoscope system according to the present invention, the extraction unit further extracts the narrow-band light component obtained by the first light by subtracting, from the pixel value of the pixel having sensitivity to at least light in the red wavelength band, the product of the pixel value of the pixel having the highest sensitivity to the second light among the plurality of pixels and a second coefficient whose absolute value is 1 or less.
  • In the endoscope system according to the present invention, the extraction unit extracts the narrow-band light component obtained by the second light and then extracts the narrow-band light component obtained by the first light.
  • The endoscope system according to the present invention may further include an interpolation unit that generates an interpolation value of a channel for each color by performing interpolation processing on the pixel values generated by the plurality of pixels; the extraction unit then extracts the narrow-band light components from the interpolation values generated by the interpolation unit.
  • the pixel in which the first filter is arranged has the highest sensitivity to the first light among the plurality of pixels.
  • In the endoscope system according to the present invention, the pixel in which the second filter is arranged or the pixel in which the fifth filter is arranged has the highest sensitivity to the second light among the plurality of pixels.
  • The endoscope system according to the present invention further includes a display image generation unit that generates a display image signal based on the narrow-band light components extracted by the extraction unit.
  • An image processing apparatus according to the present invention performs image processing on image data generated by an endoscope including an imaging element that generates an electrical signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional grid.
  • The apparatus includes an extraction unit that performs arithmetic processing for extracting, from the pixel values generated by the plurality of pixels, the components of the narrow-band light emitted by a light source device, composed of narrow-band first light included in the blue wavelength band and narrow-band second light included in the green wavelength band: by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels having sensitivity to the first light, the product of the pixel value of the pixel having the highest sensitivity to the first light and a first coefficient whose absolute value is 1 or less, the extraction unit extracts at least the narrow-band light component obtained by the second light.
  • The endoscope includes a color filter formed by arranging, in correspondence with the plurality of pixels, either first filter units, each comprising a plurality of filters including a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue and green wavelength bands, and a third filter that transmits at least light in the red wavelength band, or second filter units, each comprising a plurality of filters including the first filter, a fourth filter that transmits light in the blue and red wavelength bands, and a fifth filter that transmits at least light in the green wavelength band; in each filter unit, the number of filters that transmit light in the green wavelength band is at least half of the total number of filters, and the number of filters that transmit light in the blue wavelength band is greater than or equal to the number of filters that transmit light in the green wavelength band.
  • An image processing method according to the present invention performs image processing on image data generated by an endoscope including an imaging element that generates an electrical signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional grid.
  • The method includes an extraction step of performing arithmetic processing that extracts at least the narrow-band light component obtained by the second light by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels, the product of the pixel value of the pixel having the highest sensitivity to the first light and a first coefficient whose absolute value is 1 or less.
  • The endoscope includes a color filter formed by arranging, in correspondence with the plurality of pixels, either first filter units, each comprising a plurality of filters including a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue and green wavelength bands, and a third filter that transmits at least light in the red wavelength band, or second filter units, each comprising a plurality of filters including the first filter, a fourth filter that transmits light in the blue and red wavelength bands, and a fifth filter that transmits at least light in the green wavelength band; in each filter unit, the number of filters that transmit light in the green wavelength band is at least half of the total number of filters, and the number of filters that transmit light in the blue wavelength band is greater than or equal to the number of filters that transmit light in the green wavelength band.
  • According to the present invention, it is possible to obtain a high-resolution enhanced image without amplifying noise even when the subject is irradiated with narrow-band illumination light.
  • FIG. 1 is a schematic diagram showing a schematic configuration of an endoscope system according to Embodiment 1 of the present invention.
  • FIG. 2 is a schematic diagram showing an example of the configuration of the color filter according to Embodiment 1 of the present invention.
  • FIG. 3 is a diagram showing the relationship between the wavelength and the sensitivity characteristic of each of the B pixel, Cy pixel, and Mg pixel according to Embodiment 1 of the present invention.
  • FIG. 4 is a diagram showing sensitivity characteristics between the narrowband light and each pixel according to the first embodiment of the present invention.
  • FIG. 5 is a schematic diagram showing the structure in the layer direction of the biological tissue observed by the endoscope system according to Embodiment 1 of the present invention.
  • FIG. 6A is a diagram illustrating an example of interpolation processing performed on Cy pixels by the interpolation unit according to Embodiment 1 of the present invention.
  • FIG. 6B is a diagram illustrating an example of interpolation processing performed on the Cy pixels by the interpolation unit according to Embodiment 1 of the present invention.
  • FIG. 7A is a diagram illustrating an example of an interpolation process performed on a B pixel or an Mg pixel by the interpolation unit according to Embodiment 1 of the present invention.
  • FIG. 7B is a diagram illustrating an example of interpolation processing performed on the B pixel or the Mg pixel by the interpolation unit according to Embodiment 1 of the present invention.
  • FIG. 7C is a diagram illustrating an example of interpolation processing performed on the B pixel or the Mg pixel by the interpolation unit according to Embodiment 1 of the present invention.
  • FIG. 8 is a diagram showing an outline of the process by which the extraction unit according to Embodiment 1 of the present invention extracts the signal value generated by each narrow-band light from the Cy channel or the Mg channel.
  • FIG. 9A is a diagram illustrating the area of the B pixel sensitivity curve in the narrow band T_B.
  • FIG. 9B is a diagram illustrating the area of the Cy pixel sensitivity curve in the narrow band T_B.
  • FIG. 9C is a diagram illustrating the areas of the B pixel and Cy pixel sensitivity curves in the narrow band T_B.
  • FIG. 10 is a schematic diagram showing an example of the configuration of the color filter according to Embodiment 2 of the present invention.
  • FIG. 11 is a diagram showing the relationship between the wavelength and the sensitivity characteristic of each of the B pixel, the G pixel, and the Mg pixel according to Embodiment 2 of the present invention.
  • FIG. 12 is a diagram showing sensitivity characteristics of each narrowband light and each pixel according to the second embodiment of the present invention.
  • FIG. 13 is a diagram showing an outline of the process by which the extraction unit according to Embodiment 2 of the present invention extracts the signal value generated by each narrow-band light from the Mg channel.
  • FIG. 14 is a schematic diagram showing an example of the configuration of the color filter according to Embodiment 3 of the present invention.
  • FIG. 15 is a diagram showing the relationship between the wavelength and the sensitivity characteristic of each of the B pixel, the Cy pixel, and the R pixel according to the third embodiment of the present invention.
  • FIG. 16 is a diagram showing sensitivity characteristics of each narrowband light and each pixel according to the third embodiment of the present invention.
  • FIG. 17 is a diagram illustrating an outline of the process by which the extraction unit according to Embodiment 3 of the present invention extracts the signal generated by each narrow-band light from the Cy channel.
  • FIG. 18 is a diagram showing sensitivity characteristics of each narrowband light and each pixel according to the fourth embodiment of the present invention.
  • FIG. 19 is a diagram showing an outline of the process by which the extraction unit according to Embodiment 4 of the present invention extracts the signal generated by each narrow-band light from the Cy channel and the Mg channel.
  • FIG. 1 is a schematic diagram showing a schematic configuration of an endoscope system according to Embodiment 1 of the present invention.
  • An endoscope system 1 shown in FIG. 1 includes: an endoscope 2 that is inserted into a subject and acquires an image signal inside the body cavity; a light source device 3 that generates the illumination light emitted from the distal end of the endoscope 2; a processor unit 4 that performs predetermined image processing on the image signal acquired by the endoscope 2 and controls the overall operation of the endoscope system 1; and a display unit 5 that displays an image corresponding to the display image signal input from the processor unit 4.
  • the endoscope 2 includes an operation unit 200, an imaging lens 201, an imaging unit 202, a light guide 203, an illumination lens 204, an A / D conversion unit 205, and an imaging information storage unit 206.
  • The operation unit 200 receives inputs of an instruction signal for instructing the light source device 3 to switch the illumination light, an instruction signal for changing the control contents of the processor unit 4, and instruction signals for operating peripheral devices such as an air supply unit, a water supply unit, and a gas supply unit.
  • the imaging lens 201 is provided at the distal end of the endoscope 2 and collects at least reflected light from the subject.
  • the imaging lens 201 is configured using one or a plurality of lenses and a prism.
  • the imaging unit 202 receives the light collected by the imaging lens 201 and performs photoelectric conversion, thereby generating an image signal of the subject, and outputs the image signal to the A / D conversion unit 205.
  • the imaging unit 202 includes an imaging element 202a and a color filter 202b (first color filter).
  • the imaging element 202a is realized by using an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • the image sensor 202a has a plurality of pixels arranged in a two-dimensional lattice.
  • The color filter 202b has a plurality of transmission filters, each of which transmits light in an individually set wavelength band. Specifically, filter units each comprising a plurality of filters, including a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue and green wavelength bands, and a third filter that transmits at least light in the red wavelength band, are arranged in correspondence with the plurality of pixels of the imaging element 202a. In each filter unit, the number of filters that transmit light in the green wavelength band is at least half of the total number of filters, and the number of filters that transmit light in the blue wavelength band is greater than or equal to the number of filters that transmit light in the green wavelength band.
  • FIG. 2 is a schematic diagram illustrating an example of the configuration of the color filter 202b.
  • The color filter 202b includes a filter B (first filter) that transmits light in the blue wavelength band, a filter Cy (second filter) that transmits light in the blue and green wavelength bands, and a filter Mg (third filter) that transmits light in the blue and red wavelength bands.
  • The arrangement pattern of the color filter 202b is based on a 2×2 pixel unit, with filter B at the upper left, filter Cy at the lower left and upper right, and filter Mg at the lower right.
  • The color filter 202b is arranged over all the pixels of the imaging element 202a according to the arrangement pattern described above. In the following, pixels in which the filter B, the filter Cy, and the filter Mg are arranged are denoted as B pixels, Cy pixels, and Mg pixels, respectively; a small sketch of the tiling follows.
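  • For illustration only, the following minimal sketch (not from the patent text; the function name mosaic_pattern and the NumPy string-label representation are assumptions of this note) tiles the 2×2 unit of FIG. 2 over a sensor:

```python
import numpy as np

def mosaic_pattern(height: int, width: int) -> np.ndarray:
    """Tile the 2x2 unit of FIG. 2: B upper-left, Cy lower-left and
    upper-right, Mg lower-right."""
    unit = np.array([["B", "Cy"],
                     ["Cy", "Mg"]])
    reps = (height // 2 + 1, width // 2 + 1)
    return np.tile(unit, reps)[:height, :width]

print(mosaic_pattern(4, 4))
# [['B' 'Cy' 'B' 'Cy']
#  ['Cy' 'Mg' 'Cy' 'Mg']
#  ['B' 'Cy' 'B' 'Cy']
#  ['Cy' 'Mg' 'Cy' 'Mg']]
```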
  • FIG. 3 is a diagram illustrating the sensitivity characteristics of the B pixel, the Cy pixel, and the Mg pixel with respect to the wavelength of light. In FIG. 3, the vertical axis represents sensitivity and the horizontal axis represents wavelength. The curve L_B (solid line), the curve L_Cy (one-dot chain line), and the curve L_Mg (dotted line) indicate the sensitivity characteristics of the B pixel, the Cy pixel, and the Mg pixel, respectively.
  • The B pixel has its first sensitivity peak in the wavelength band of blue light (400 nm to 470 nm).
  • The Cy pixel has its first sensitivity peak for green light (500 nm to 550 nm) and a second sensitivity peak for blue light (400 nm to 470 nm).
  • The Mg pixel has its first sensitivity peak for red light (600 nm to 650 nm) and a second sensitivity peak for blue light (400 nm to 470 nm).
  • the light guide 203 is configured using glass fiber or the like, and guides light emitted from the light source device 3 described later.
  • the illumination lens 204 is provided at the tip of the light guide 203, diffuses the light guided by the light guide 203, and emits the light from the tip of the endoscope 2 to the outside.
  • The A/D conversion unit 205 performs A/D conversion on the analog image signal generated by the imaging element 202a to generate a digital image signal, and outputs the image signal to the processor unit 4.
  • the imaging information storage unit 206 records data including various programs for operating the endoscope 2, various parameters necessary for the operation of the endoscope 2, identification information of the endoscope 2, and the like.
  • the imaging information storage unit 206 includes an identification information storage unit 206a that stores identification information.
  • The identification information includes unique information (ID) of the endoscope 2, the year, specification information, the transmission method, arrangement information of the color filter 202b, and the like.
  • the imaging information storage unit 206 is realized using a flash memory or the like.
  • the light source device 3 includes an illumination unit 31 and an illumination control unit 32.
  • the illumination unit 31 switches and emits a plurality of illumination lights having different wavelength bands under the control of the illumination control unit 32.
  • the illumination unit 31 includes a light source unit 311, a light source driver 312, a switching filter 313, a drive unit 314, a drive driver 315, and a condenser lens 316.
  • the light source unit 311 emits white light including red, green, and blue light. White light emitted from the light source unit 311 is emitted to the outside from the distal end of the endoscope 2 via the switching filter 313, the condenser lens 316, and the light guide 203.
  • the light source unit 311 is realized using an LED (Light Emitting Diode), a xenon lamp, or the like.
  • Under the control of the illumination control unit 32, the light source driver 312 causes the light source unit 311 to emit white light by supplying power to the light source unit 311.
  • the switching filter 313 is detachably disposed on the optical path of white light emitted from the light source unit 311.
  • the switching filter 313 transmits only the blue narrow-band light and the green narrow-band light among the white light emitted from the light source unit 311.
  • Specifically, the switching filter 313 transmits narrow-band light composed of light in a narrow band T_B (e.g., 390 nm to 445 nm) and light in a narrow band T_G (e.g., 530 nm to 550 nm) included in the white light.
  • The light in each of the narrow bands T_B and T_G lies in the blue and green wavelength bands, which are easily absorbed by hemoglobin in the blood.
  • FIG. 4 is a diagram showing the sensitivity characteristics of each pixel with respect to the narrow-band light; the vertical axis indicates sensitivity and the horizontal axis indicates wavelength.
  • As shown in FIG. 4, the Cy pixel has the highest sensitivity in the narrow band T_G. The Cy pixel has the lowest sensitivity in the narrow band T_B, and the B pixel and the Mg pixel are designed so that no sensitivity difference occurs between them in the narrow band T_B.
  • The drive unit 314 inserts the switching filter 313 into, or retracts it from, the optical path of the white light emitted from the light source unit 311.
  • the drive unit 314 is realized using a stepping motor, a DC motor, or the like.
  • the drive driver 315 supplies predetermined power to the drive unit 314 under the control of the illumination control unit 32.
  • the condensing lens 316 collects the white light emitted from the light source unit 311 or the narrowband light transmitted through the switching filter 313 and emits the light to the outside of the light source device 3 (light guide 203).
  • The illumination control unit 32 controls the light source driver 312 to turn the light source unit 311 on and off. The illumination control unit 32 also controls the drive driver 315 to insert and remove the switching filter 313 with respect to the optical path of the white light emitted from the light source unit 311, thereby switching the illumination light emitted from the illumination unit 31 between white light and narrow-band light composed of light in the narrow bands T_B and T_G.
  • the processor unit 4 includes an image processing unit 41, a display image generation unit 42, an input unit 43, a storage unit 44, and a control unit 45.
  • the processor unit 4 functions as an image processing apparatus.
  • the image processing unit 41 performs predetermined image processing on the image signal input from the A / D conversion unit 205 and outputs the image signal to the display image generation unit 42.
  • the image processing unit 41 includes an interpolation unit 411 and an extraction unit 412.
  • The interpolation unit 411 performs interpolation processing on the image signal input from the A/D conversion unit 205. Specifically, the interpolation unit 411 performs interpolation processing based on direction information (for example, edges) on each of the B pixels, Cy pixels, and Mg pixels, thereby generating the B channel, the Cy channel, and the Mg channel, and outputs the interpolation values of the B channel, the Cy channel, and the Mg channel to the extraction unit 412. Under narrow-band light observation, the interpolation unit 411 generates high-resolution B and Mg channels by using the direction information of predetermined pixels for each of the B pixels, Cy pixels, and Mg pixels.
  • Under narrow-band light observation, the extraction unit 412 performs processing for extracting the narrow-band light components from the pixel values generated by the plurality of pixels constituting the imaging element 202a: by subtracting, from the pixel value of the pixel having the lowest sensitivity to light in the narrow band T_B among the plurality of pixels, the product of the pixel value of the pixel having the highest sensitivity to light in the narrow band T_B and a coefficient whose absolute value is 1 or less, it performs a calculation that extracts at least the narrow-band light component obtained by the narrow band T_G, and outputs the calculation result to the display image generation unit 42.
  • Specifically, the extraction unit 412 extracts predetermined signal values from the interpolation values of the B channel, the Cy channel, and the Mg channel input from the interpolation unit 411, and outputs the extracted signal values to the display image generation unit 42.
  • Under narrow-band light observation, the extraction unit 412 extracts the signal values of the narrow bands T_B and T_G that constitute the narrow-band light by arithmetic processing between the channels, and outputs the extracted signal values to the display image generation unit 42.
  • Under white-light observation, the extraction unit 412 extracts the signal values of the B, G, and R channels by applying a color conversion matrix or a conversion table to the interpolation values of the B channel, the Cy channel, and the Mg channel.
  • The display image generation unit 42 generates a color image or a pseudo-color image based on the signal values input from the image processing unit 41 and outputs it to the display unit 5. Specifically, under narrow-band light, the display image generation unit 42 generates a pseudo-color image based on the signal values of the narrow bands T_B and T_G input from the image processing unit 41 and the color filter information input from the control unit 45. For example, the display image generation unit 42 assigns the signal value of the narrow band T_B to each of the B and G channels of the display image signal, and assigns the signal value of the narrow band T_G to the R channel of the display image signal.
  • Under white light, the display image generation unit 42 generates a color image using the B, G, and R channel signals output from the image processing unit 41. The display image generation unit 42 further performs gradation conversion processing, enlargement processing, structure enhancement processing, and the like on the pseudo-color or color image to generate a display image, and outputs it to the display unit 5.
  • the input unit 43 receives input of instruction signals for instructing various operations related to the endoscope system 1. For example, the input unit 43 receives an input of an instruction signal that instructs illumination light emitted from the light source device 3.
  • the storage unit 44 stores various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1.
  • the storage unit 44 stores identification information of the endoscope 2, for example, a relationship table between unique information of the image sensor 202a and information related to the filter arrangement of the color filter 202b.
  • the storage unit 44 is realized using a semiconductor memory such as a flash memory or a DRAM.
  • The control unit 45 performs drive control of each component including the endoscope 2 and the light source device 3, and controls the input and output of information to and from each component. The control unit 45 also acquires setting data for imaging control, control data related to imaging, and control data related to illumination stored in the storage unit 44, and outputs control signals corresponding to the acquired setting data and control data to the endoscope 2 and the light source device 3 via predetermined signal lines or a bus. In addition, the control unit 45 outputs the color filter information acquired via the imaging information storage unit 206 to each of the image processing unit 41 and the display image generation unit 42.
  • the control unit 45 is configured using a CPU (Central Processing Unit) or the like.
  • the display unit 5 displays an image corresponding to the display image signal input from the processor unit 4.
  • the display unit 5 is realized using liquid crystal, organic EL (Electro Luminescence), or the like.
  • FIG. 5 is a schematic diagram showing a layered structure of a biological tissue observed by the endoscope system 1.
  • The living tissue 71 as the subject has blood vessels at different depths. Specifically, many capillaries 72 are distributed mainly in the vicinity of the mucosal surface layer of the living tissue 71. In the middle layer, deeper than the mucosal surface layer of the living tissue 71, blood vessels 73 thicker than the capillaries 72 are distributed in addition to the capillaries 72. Furthermore, blood vessels 74 thicker than the middle-layer blood vessels 73 are distributed in the deep layer of the living tissue 71.
  • the depth of light reaching the living tissue 71 depends on the wavelength.
  • Light in the narrow band T_B is reflected in the vicinity of the mucosal surface layer, whereas light in the narrow band T_G reaches positions deeper than the mucosal surface layer.
  • When the endoscope system 1 irradiates the living tissue 71 with narrow-band light, it can acquire information on the capillaries 72 in the surface layer of the living tissue 71 (hereinafter, "surface layer information") by imaging (receiving) the light of the narrow band T_B reflected by the living tissue 71, and information on the thick blood vessels 73 in the middle layer and the blood vessels 74 in the deep layer (hereinafter, "mid-deep layer information") by imaging (receiving) the light of the narrow band T_G reflected by the living tissue 71.
  • As shown in FIG. 4, each of the Cy pixel and the Mg pixel has sensitivity to light in both the narrow band T_B and the narrow band T_G.
  • Accordingly, each of the pixel value of the Cy pixel and the pixel value of the Mg pixel contains both surface layer information and mid-deep layer information.
  • Therefore, the interpolation unit 411 performs interpolation processing and outputs the interpolation values of the B channel, the Cy channel, and the Mg channel to the extraction unit 412, and the extraction unit 412 extracts the signal values obtained by the light of the narrow band T_B and the narrow band T_G from the Cy channel or the Mg channel and outputs them to the display image generation unit 42.
  • FIGS. 6A and 6B are diagrams illustrating an example of the interpolation processing performed by the interpolation unit 411 on the Cy pixels.
  • FIGS. 7A to 7C are diagrams illustrating an example of the interpolation processing performed by the interpolation unit 411 on the B pixels or the Mg pixels.
  • As shown in FIGS. 6A and 6B, the interpolation unit 411 interpolates the Cy channel by calculating the differences between the pixel values of the Cy pixels near the interpolation target pixel P1 (target pixel) in the vertical and horizontal directions (see arrow A1) and performing a weighted average that gives priority to the pixel values in the direction in which the difference is smaller (FIG. 6A to FIG. 6B).
  • As shown in FIGS. 7A to 7C, the interpolation unit 411 first calculates the differences between the pixel values of the B pixels near the interpolation target pixel P1 (target pixel) in the diagonal directions (see arrow A2) and performs a weighted average that gives priority to the pixel values in the direction in which the difference is smaller, thereby creating a B channel with values in a checkered pattern (FIG. 7A to FIG. 7B).
  • Next, the interpolation unit 411 calculates the differences between the pixel values in the vertical and horizontal directions (see arrow A3) for the B pixels near the interpolation target pixel (target pixel) and interpolates the B channel signal values by a weighted average that gives priority to the pixel values in the direction in which the difference is smaller (FIG. 7B to FIG. 7C).
  • The interpolation unit 411 interpolates the Mg channel by the same interpolation processing as for the B pixels.
  • In addition to the interpolation processing described above, since the B pixels, Cy pixels, and Mg pixels all have sensitivity to the narrow band T_B and are therefore highly correlated with one another, the interpolation unit 411 may interpolate the B pixels and Mg pixels based on the direction information of the Cy pixels.
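  • As a rough illustration of the direction-based interpolation described above, the following sketch (a simplified stand-in, not the patent's exact algorithm; all names are illustrative) fills the non-Cy positions from their four Cy neighbours, weighting the direction whose pixel-value difference is smaller:

```python
import numpy as np

def interpolate_cy(raw: np.ndarray, cy_mask: np.ndarray) -> np.ndarray:
    """Direction-weighted fill of non-Cy positions from the four Cy
    neighbours; border pixels are left untouched for brevity."""
    out = raw.astype(float)
    h, w = raw.shape
    eps = 1e-6  # avoids division by zero in perfectly flat regions
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if cy_mask[y, x]:
                continue
            up, down = float(raw[y - 1, x]), float(raw[y + 1, x])
            left, right = float(raw[y, x - 1]), float(raw[y, x + 1])
            wv = 1.0 / (abs(up - down) + eps)     # weight for the vertical pair
            wh = 1.0 / (abs(left - right) + eps)  # weight for the horizontal pair
            out[y, x] = (wv * (up + down) / 2.0
                         + wh * (left + right) / 2.0) / (wv + wh)
    return out
```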
  • FIG. 8 is a diagram illustrating an outline of processing performed by the extraction unit 412 to extract a signal value generated by each narrow-band light from the Cy channel or the Mg channel.
  • In the following, the interpolation values of the B channel, the Cy channel, and the Mg channel are denoted as P_B, P_Cy, and P_Mg.
  • The signal value of the narrow band T_B contained in P_B is denoted b; the signal values of the narrow bands T_B and T_G contained in P_Cy are denoted b′ and g′, respectively; and the signal values of the narrow bands T_B and T_G contained in P_Mg are denoted b″ and g″, respectively.
  • The interpolation values P_B, P_Cy, and P_Mg of each channel at the coordinates (x, y) (target pixel) of the imaging element 202a are expressed by the following equations (1-1) to (1-3):
  • P_B(x, y) = b(x, y) (1-1)
  • P_Cy(x, y) = g′(x, y) + b′(x, y) (1-2)
  • P_Mg(x, y) = g″(x, y) + b″(x, y) (1-3)
  • Here, b′ is expressed by the following equation (1-4) using the coefficient α:
  • b′(x, y) = α × b(x, y) (1-4)
  • FIG. 9A is a diagram showing the area of the B pixel sensitivity curve (curve L_B) in the narrow band T_B, FIG. 9B is a diagram showing the area of the Cy pixel sensitivity curve (curve L_Cy) in the narrow band T_B, and FIG. 9C is a diagram illustrating both areas together.
  • As shown in FIGS. 9A to 9C, let S_B be the area of the curve L_B in the narrow band T_B and S_Cy be the area of the curve L_Cy in the narrow band T_B.
  • The coefficient α is the ratio of the area S_Cy of the curve L_Cy to the area S_B of the curve L_B in the narrow band T_B, as described by equation (1-5):
  • α = S_Cy / S_B (1-5)
  • Similarly, the coefficient β is the ratio of the area S_Mg′ of the curve L_Mg to the area S_Cy′ of the curve L_Cy in the narrow band T_G, as expressed by the following equation (1-9):
  • β = S_Mg′ / S_Cy′ (1-9)
  • The extraction unit 412 extracts the surface layer information of the living tissue by obtaining the signal values b and b″ at all pixels of the imaging element 202a using equations (1-1) and (1-11). Further, the extraction unit 412 extracts the mid-deep layer information by obtaining the signal value g′ at all pixels of the imaging element 202a using equation (1-7). The extraction unit 412 outputs the signal value b of the narrow band T_B of the B pixels, the signal value b″ of the narrow band T_B of the Mg pixels, and the signal value g′ of the narrow band T_G of the Cy pixels to the display image generation unit 42.
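  • Equations (1-7) and (1-11) are not reproduced in this excerpt; from equations (1-1) to (1-5) and (1-9) they can plausibly be read as g′ = P_Cy − α·P_B and b″ = P_Mg − β·(P_Cy − α·P_B). A minimal sketch under that assumption (function and variable names are illustrative):

```python
import numpy as np

def extract_narrowband_e1(p_b, p_cy, p_mg, alpha, beta):
    """Embodiment-1 extraction: only subtractions with coefficients of
    absolute value at most 1, so noise is never gained up.

    p_b, p_cy, p_mg : interpolated B, Cy and Mg channels (float arrays)
    alpha : area ratio S_Cy / S_B in the narrow band T_B, eq. (1-5)
    beta  : area ratio S_Mg' / S_Cy' in the narrow band T_G, eq. (1-9)
    """
    b = np.asarray(p_b, dtype=float)                  # (1-1): T_B signal at B pixels
    g1 = np.asarray(p_cy, dtype=float) - alpha * b    # g' : assumed eq. (1-7)
    b2 = np.asarray(p_mg, dtype=float) - beta * g1    # b'': assumed eq. (1-11)
    return b, g1, b2
```

  • Because |α| ≤ 1 and |β| ≤ 1, every output is formed by subtracting attenuated channels rather than applying gain, which is the stated reason the SN ratio is preserved.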
  • The display image generation unit 42 generates a pseudo-color image using the signal values b, b″, and g′ output from the extraction unit 412.
  • The pseudo-color image is composed of three channels: R, G, and B.
  • The display image generation unit 42 may first generate the B channel and then assign the same information to the G channel.
  • Specifically, based on the color filter information input from the control unit 45, the display image generation unit 42 assigns the signal value b and the signal value b″ input from the extraction unit 412 to the B pixel positions and the Mg pixel positions on the B channel, respectively. In this case, the display image generation unit 42 multiplies the signal value b″ by a coefficient k to make the brightness of the signal value b and the signal value b″ uniform.
  • The coefficient k, which is based on the sensitivity ratio between the B pixel and the Mg pixel, preferably satisfies the following condition (1-13):
  • 1 ≤ k ≤ 2 (1-13)
  • Furthermore, it is preferable that the following condition (1-14) be satisfied:
  • k ≤ 1 / {α(1 + β)} (1-14)
  • Furthermore, it is preferable that the following condition (1-15) be satisfied:
  • α(1 + β) ≤ 1 (1-15)
  • Here, α is the value of equation (1-5) and β is the value of equation (1-9).
  • The display image generation unit 42 generates the surface layer information image at the positions of the Cy pixels on the B channel using the signal value b at neighboring B pixel positions and the signal value b″ at neighboring Mg pixel positions.
  • For this, interpolation processing using direction information may be used, or linear interpolation processing may be used.
  • Alternatively, a priority order may be set without using interpolation processing, and either of the signal values b and b″ may be used.
  • After generating the B channel, the display image generation unit 42 assigns the same information as the B channel to the G channel.
  • Further, the display image generation unit 42 generates the mid-deep layer information image by assigning the signal value g′ input from the extraction unit 412 to each pixel position on the R channel.
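  • The assembly of the pseudo-color image described above can be sketched as follows, assuming boolean masks marking the B and Mg positions; the simple neighbour-mean fill stands in for the direction-based or linear interpolation mentioned in the text, and all names are illustrative:

```python
import numpy as np

def fill_from_neighbours(img: np.ndarray, known: np.ndarray) -> np.ndarray:
    """Stand-in for the neighbour interpolation in the text: each unknown
    position takes the mean of its known 4-neighbours."""
    out = img.copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            if known[y, x]:
                continue
            vals = [img[j, i]
                    for j, i in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= j < h and 0 <= i < w and known[j, i]]
            if vals:
                out[y, x] = sum(vals) / len(vals)
    return out

def pseudo_color(b, b2, g1, b_mask, mg_mask, k):
    """Assign b (B positions) and k * b'' (Mg positions) to the B channel,
    copy it to the G channel, and assign g' to the R channel."""
    surface = np.zeros_like(b, dtype=float)
    surface[b_mask] = b[b_mask]
    surface[mg_mask] = k * b2[mg_mask]   # k equalizes B/Mg brightness
    surface = fill_from_neighbours(surface, b_mask | mg_mask)
    return np.stack([g1, surface, surface], axis=-1)  # R, G, B order
```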
  • As described above, by performing arithmetic processing on the B channel, the Cy channel, and the Mg channel that avoids amplifying noise as far as possible, the extraction unit 412 can extract high-quality narrow-band signals from the interpolation values P_Cy and P_Mg, and high-resolution images of the mucosal surface layer, the middle layer, and the deep layer at the observation site of the subject under narrow-band light can be acquired while suppressing noise amplification.
  • Further, since the extraction unit 412 further extracts the narrow-band light component obtained by the light of the narrow band T_B by subtracting, from the interpolation value of the Mg channel, the product of β and the interpolation value of the Cy channel, whose pixels have the highest sensitivity to light in the narrow band T_G among the plurality of pixels constituting the imaging element 202a, high-resolution images of the mucosal surface layer, the middle layer, and the deep layer of the subject under narrow-band light observation can be acquired while suppressing noise amplification.
  • Further, since the extraction unit 412 extracts the narrow-band light component obtained by the light of the narrow band T_G and then extracts the narrow-band light component obtained by the light of the narrow band T_B, a high-resolution image can be acquired.
  • The endoscope system according to the second embodiment differs from the first embodiment described above in the configuration of the color filter 202b and in the processing performed by the extraction unit and the display image generation unit.
  • In the following, the processing executed by the extraction unit and the display image generation unit according to the second embodiment will be described. The same components as those of the endoscope system 1 according to the first embodiment are denoted by the same reference signs, and description thereof is omitted.
  • FIG. 10 is a schematic diagram showing an example of the configuration of the color filter according to Embodiment 2 of the present invention.
  • The color filter 202c (second color filter) illustrated in FIG. 10 includes a filter B (first filter), a filter G (fifth filter) that transmits light in the green wavelength band, and a filter Mg (fourth filter) that transmits light in the blue and red wavelength bands.
  • The arrangement pattern of the color filter 202c is based on a 2×2 pixel unit, with filter B at the upper left, filter G at the lower left and upper right, and filter Mg at the lower right.
  • the color filter 202c is arranged over all the pixels of the image sensor 202a according to the arrangement pattern described above.
  • the pixels in which each of the filter B, the filter G, and the filter Mg are arranged are denoted as a B pixel, a G pixel, and an Mg pixel.
  • FIG. 11 is a diagram illustrating the sensitivity characteristics of the B pixel, the G pixel, and the Mg pixel with respect to the wavelength of light. In FIG. 11, the vertical axis indicates sensitivity and the horizontal axis indicates wavelength. The curve L_B (solid line), the curve L_G (dashed line), and the curve L_Mg (dotted line) indicate the sensitivity characteristics of the B pixel, the G pixel, and the Mg pixel, respectively.
  • The B pixel has its first sensitivity peak in the wavelength band of blue light (400 nm to 470 nm).
  • The G pixel has its first sensitivity peak for green light (500 nm to 550 nm).
  • The Mg pixel has its first sensitivity peak for red light (600 nm to 650 nm) and a second sensitivity peak for blue light (400 nm to 470 nm).
  • FIG. 12 is a diagram showing the sensitivity characteristics of each pixel with respect to the narrow-band light; the horizontal axis indicates wavelength and the vertical axis indicates sensitivity.
  • As shown in FIG. 12, the G pixel has the highest sensitivity at each wavelength in the narrow band T_G. For the sensitivity at each wavelength in the narrow band T_B, the pixels are designed so that no difference occurs between the B pixel and the Mg pixel.
  • The interpolation unit 411 performs interpolation processing on the image data generated by the imaging element 202a in which the color filter 202c configured as described above is arranged, using the pixel values of the B pixels, G pixels, and Mg pixels.
  • Specifically, the interpolation unit 411 performs predetermined interpolation, such as linear interpolation, using the pixel values of the B pixels, G pixels, and Mg pixels.
  • The interpolation unit 411 outputs the interpolation values of the B channel, the G channel, and the Mg channel generated from the pixel values of the B pixels, G pixels, and Mg pixels to the extraction unit 412.
  • FIG. 13 is a diagram illustrating an outline of processing performed by the extraction unit 412 to extract a signal value generated by each narrowband light from the Mg channel.
  • In the following, the interpolation values of the B channel, the G channel, and the Mg channel are denoted as P_B, P_G, and P_Mg.
  • The signal value of the narrow band T_B contained in P_B is denoted b, the signal value of the narrow band T_G contained in P_G is denoted g′, and the signal values of the narrow bands T_B and T_G contained in P_Mg are denoted b″ and g″, respectively.
  • P_B, P_G, and P_Mg at the coordinates (x, y) (target pixel) of the imaging element 202a are expressed by the following equations (2-1) to (2-3):
  • P_B(x, y) = b(x, y) (2-1)
  • P_G(x, y) = g′(x, y) (2-2)
  • P_Mg(x, y) = g″(x, y) + b″(x, y) (2-3)
  • Here, g″ is expressed by the following equation (2-4) using a coefficient β obtained from the sensitivities of the G pixel and the Mg pixel at each wavelength in the narrow band T_G:
  • g″(x, y) = β × g′(x, y) (2-4)
  • As shown in FIG. 12, the sensitivity of the G pixel is always higher than the sensitivity of the Mg pixel at each wavelength component included in the narrow band T_G, so the coefficient β satisfies 0 < β ≤ 1. Therefore, b″ contained in P_Mg can be obtained by the following equation (2-5) without amplifying the noise contained in P_Mg:
  • b″(x, y) = P_Mg(x, y) − β × P_G(x, y) (2-5)
  • The extraction unit 412 extracts the surface layer information of the living tissue by obtaining the signal values b and b″ at all pixels of the imaging element 202a using equations (2-1) and (2-5). Further, the extraction unit 412 extracts the mid-deep layer information by obtaining the signal value g′ at all pixels of the imaging element 202a using equation (2-2). The extraction unit 412 outputs the signal value b of the narrow band T_B of the B pixels, the signal value b″ of the narrow band T_B of the Mg pixels, and the signal value g′ of the narrow band T_G of the G pixels to the display image generation unit 42.
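  • Under the reconstructed reading of equations (2-4) and (2-5) above, the Embodiment-2 arithmetic reduces to a single subtraction; a minimal sketch (names are illustrative):

```python
def extract_narrowband_e2(p_b, p_g, p_mg, beta):
    """Embodiment-2 extraction; since 0 < beta <= 1, b'' is obtained
    without gaining up noise."""
    b = p_b                   # (2-1): T_B signal at B pixels
    g1 = p_g                  # (2-2): the G channel carries only T_G light
    b2 = p_mg - beta * p_g    # (2-5): T_B component of the Mg channel
    return b, g1, b2
```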
  • the display image generation unit 42 generates a pseudo color image using the signal values b, b ′′, g ′ output from the extraction unit 412.
  • the pseudo color image is composed of three channels of R, B, and G.
  • the display image generation unit 42 may generate the B channel and assign the same information to the G channel.
  • Specifically, based on the color filter information input from the control unit 45, the display image generation unit 42 assigns the signal value b and the signal value b″ input from the extraction unit 412 to the B pixel positions and the Mg pixel positions on the B channel, respectively. In this case, the display image generation unit 42 multiplies b″ by the coefficient k to make the brightness of the signal value b and the signal value b″ uniform.
  • the coefficient k can be expressed by the above equation (1-12) based on the sensitivity ratio between the B pixel and the Mg pixel.
  • The display image generation unit 42 generates the surface layer information image at the positions of the G pixels using the signal value b at neighboring B pixel positions and the signal value b″ at neighboring Mg pixel positions.
  • interpolation processing using direction information may be used, or linear interpolation processing may be used.
  • Alternatively, a priority order may be set without using interpolation, and either of the signal values b and b″ may be used.
  • After generating the B channel, the display image generation unit 42 assigns the same information as the B channel to the G channel. Further, the display image generation unit 42 generates the mid-deep layer information image by assigning the signal value g′ input from the extraction unit 412 to each pixel position on the R channel.
  • As described above, by performing arithmetic processing on the B channel, the G channel, and the Mg channel that avoids amplifying noise as far as possible, high-quality narrow-band signals can be extracted from P_G and P_Mg. Thereby, high-resolution images of the mucosal surface layer, the middle layer, and the deep layer at the observation site of the subject under narrow-band light can be acquired without amplifying noise.
  • The endoscope system according to the third embodiment differs from the first embodiment described above in the configuration of the color filter 202b and in the processing performed by the extraction unit and the display image generation unit.
  • In the following, the processing executed by the extraction unit and the display image generation unit according to the third embodiment will be described. The same components as those of the endoscope system 1 according to the first embodiment are denoted by the same reference signs, and description thereof is omitted.
  • FIG. 14 is a schematic diagram showing an example of the configuration of a color filter according to Embodiment 3 of the present invention.
  • The color filter 202d illustrated in FIG. 14 includes a filter B (first filter), a filter Cy (second filter), and a filter R (third filter) that transmits light in the red wavelength band.
  • The arrangement pattern of the color filter 202d is based on a 2×2 pixel unit, with filter B at the upper left, filter Cy at the lower left and upper right, and filter R at the lower right.
  • the color filter 202d is arranged over all the pixels of the image sensor 202a according to the arrangement pattern described above.
  • a pixel in which each of the filter B, the filter Cy, and the filter R is arranged is denoted as a B pixel, a Cy pixel, and an R pixel.
  • FIG. 15 is a diagram illustrating the sensitivity characteristics of the B pixel, the Cy pixel, and the R pixel with respect to the wavelength of light. In FIG. 15, the vertical axis indicates sensitivity and the horizontal axis indicates wavelength. The curve L_B (solid line), the curve L_Cy (one-dot chain line), and the curve L_R (dotted line) indicate the sensitivity characteristics of the B pixel, the Cy pixel, and the R pixel, respectively.
  • The B pixel has its first sensitivity peak in the wavelength band of blue light (400 nm to 470 nm).
  • The Cy pixel has its first sensitivity peak for green light (500 nm to 550 nm) and a second sensitivity peak for blue light (400 nm to 470 nm).
  • The R pixel has its first sensitivity peak for red light (600 nm to 650 nm).
  • FIG. 16 is a diagram showing the sensitivity characteristics of each pixel with respect to the narrow-band light; the horizontal axis indicates wavelength and the vertical axis indicates sensitivity.
  • As shown in FIG. 16, the Cy pixel has the highest sensitivity at each wavelength in the narrow band T_G, and the B pixel has the highest sensitivity at each wavelength in the narrow band T_B.
  • The interpolation unit 411 performs interpolation processing based on the direction information of each of the B pixels, Cy pixels, and R pixels on the image data generated by the imaging element 202a in which the color filter 202d configured as described above is arranged.
  • The interpolation unit 411 may also perform the interpolation processing using the direction information of pixels other than the Cy pixels.
  • FIG. 17 is a diagram illustrating an outline of processing performed by the extraction unit 412 to extract the signal value generated by each narrowband light from the Cy channel.
  • In the following, the interpolation values of the B channel and the Cy channel are denoted as P_B and P_Cy.
  • The signal value of the narrow band T_B contained in P_B is denoted b, and the signal values of the narrow bands T_B and T_G contained in P_Cy are denoted b′ and g′, respectively.
  • P_B and P_Cy at the coordinates (x, y) (target pixel) of the imaging element 202a are expressed by the following equations (3-1) and (3-2):
  • P_B(x, y) = b(x, y) (3-1)
  • P_Cy(x, y) = g′(x, y) + b′(x, y) (3-2)
  • Here, b′ is expressed by the following equation (3-3) using a coefficient α obtained from the sensitivity ratio of the B pixel and the Cy pixel at each wavelength in the narrow band T_B:
  • b′(x, y) = α × b(x, y) (3-3)
  • The extraction unit 412 extracts the surface layer information of the living tissue by obtaining the signal value b at all pixels of the imaging element 202a using equations (3-1) and (3-3). Further, the extraction unit 412 extracts the mid-deep layer information by obtaining the signal value g′ at all pixels of the imaging element 202a using equation (3-4). The extraction unit 412 outputs the signal value b of the narrow band T_B of the B pixels and the signal value g′ of the narrow band T_G of the Cy pixels to the display image generation unit 42.
  • the display image generation unit 42 generates a pseudo color image using the signal values b and g ′ output from the extraction unit 412.
  • the pseudo color image is composed of three channels of R, B, and G.
  • the B channel may be generated first, and similar information may be assigned to the G channel.
  • the display image generation unit 42 assigns the signal value b input from the extraction unit 412 to each pixel position on the B channel and the G channel based on the color filter information input from the control unit 45, thereby A tissue surface information image is generated. Further, the display image generation unit 42 generates a mid-depth information image by assigning the signal value g ′ input from the extraction unit 412 to each pixel position on the R channel.
  • As described above, by performing arithmetic processing on the B channel and the Cy channel that avoids amplifying noise as far as possible, high-quality narrow-band signals can be extracted from P_B and P_Cy.
  • Thereby, high-resolution images of the mucosal surface layer, the middle layer, and the deep layer at the observation site of the subject under narrow-band light can be acquired while suppressing noise amplification.
  • FIG. 18 is a diagram showing the sensitivity characteristics of each pixel with respect to the narrow-band light according to Embodiment 4 of the present invention; the horizontal axis indicates wavelength and the vertical axis indicates sensitivity.
  • The curve L_B (solid line), the curve L_Cy (one-dot chain line), and the curve L_Mg (dotted line) indicate the sensitivity characteristics of the B pixel, the Cy pixel, and the Mg pixel, respectively.
  • As shown in FIG. 18, the sensitivity of the Mg pixel at each wavelength in the narrow band T_G is small enough to be ignored.
  • FIG. 19 is a diagram illustrating an outline of processing performed by the extraction unit 412 to extract the signal value generated by each narrow band light from the Cy channel and the Mg channel.
  • In the following, the interpolation values of the B channel, the Cy channel, and the Mg channel are denoted as P_B, P_Cy, and P_Mg.
  • The signal value of the narrow band T_B contained in P_B is denoted b, the signal values of the narrow bands T_B and T_G contained in P_Cy are denoted b′ and g′, respectively, and the signal value of the narrow band T_B contained in P_Mg is denoted b″.
  • P_B, P_Cy, and P_Mg at the coordinates (x, y) (target pixel) of the imaging element 202a are expressed by the following equations (4-1) to (4-3):
  • P_B(x, y) = b(x, y) (4-1)
  • P_Cy(x, y) = g′(x, y) + b′(x, y) (4-2)
  • P_Mg(x, y) = b″(x, y) (4-3)
  • Here, b′ is expressed by the following equation (4-4) using a coefficient α obtained based on the sensitivity ratio of the B pixel and the Cy pixel at each wavelength in the narrow band T_B:
  • b′(x, y) = α × b(x, y) (4-4)
  • Accordingly, g′ contained in P_Cy can be obtained by the above-described equations (4-5) and (4-6) without amplifying the noise contained in P_B. Further, as shown in FIG. 18, since the Mg pixel has no sensitivity to light in the narrow band T_G, its pixel value under narrow-band light can be treated as a value generated solely by the light of the narrow band T_B.
  • The extraction unit 412 extracts the surface layer information of the living tissue by obtaining the signal values b and b″ at all pixels of the imaging element 202a using equations (4-1) and (4-5). Further, the extraction unit 412 extracts the mid-deep layer information by obtaining the signal value g′ at all pixels of the imaging element 202a using equation (4-6). The extraction unit 412 outputs the signal value b of the narrow band T_B of the B pixels, the signal value b″ of the narrow band T_B of the Mg pixels, and the signal value g′ of the narrow band T_G of the Cy pixels to the display image generation unit 42.
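  • Embodiment 4 is the simplest case: because the Mg pixel is blind to the narrow band T_G, P_Mg is the T_B signal directly, and only g′ requires a subtraction. A sketch with equations (4-5) and (4-6) reconstructed as b″ = P_Mg and g′ = P_Cy − α·P_B (names are illustrative):

```python
def extract_narrowband_e4(p_b, p_cy, p_mg, alpha):
    """Embodiment-4 extraction."""
    b = p_b                    # (4-1): T_B signal at B pixels
    g1 = p_cy - alpha * p_b    # assumed (4-6), from (4-2) and (4-4)
    b2 = p_mg                  # (4-3): Mg has no T_G sensitivity here
    return b, g1, b2
```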
  • the display image generation unit 42 generates a pseudo color image using the signal values b, b′′, and g′ output from the extraction unit 412.
  • the pseudo color image is composed of three channels of R, B, and G.
  • the display image generation unit 42 may generate the B channel first and then assign the same information to the G channel.
  • based on the color filter information input from the control unit 45, the display image generation unit 42 assigns the signal value b and the signal value b′′ input from the extraction unit 412 to the position of each B pixel and the position of each Mg pixel, respectively. In doing so, the display image generation unit 42 multiplies b′′ by a coefficient k to make the brightness of the signal value b and the signal value b′′ uniform.
  • the coefficient k can be expressed by the above equation (1-12) based on the sensitivity ratio between the B pixel and the Mg pixel.
  • at each remaining pixel position, the display image generation unit 42 generates the surface layer information image using the signal value b at the positions of neighboring B pixels and the signal value b′′ at the positions of neighboring Mg pixels.
  • for this interpolation, the display image generation unit 42 may use an interpolation process based on direction information or a linear interpolation process. Alternatively, the display image generation unit 42 may set a priority order and use either of the signal values b and b′′ without interpolation (see the channel-assembly sketch after this list).
  • in this way, the display image generation unit 42 generates the B channel.
  • further, the display image generation unit 42 generates the mid-depth information image by assigning the signal value g′ input from the extraction unit 412 to the position of each pixel on the R channel.
  • high-quality narrow-band signals can be extracted from P_Cy and P_Mg by performing arithmetic processing on the B channel, the Cy channel, and the Mg channel that amplifies noise as little as possible.
  • high-resolution images of the mucosal surface layer, middle layer, and deep layer at the observation site of the subject under narrow band light can be acquired without amplifying noise.
  • the color filter according to the present invention can be appropriately changed as long as the arrangement satisfies the above conditions.
  • the color filter having a plurality of transmission filters that each transmit light in a predetermined wavelength band is provided on the light receiving surface of the image sensor; however, each transmission filter may instead be provided individually for each pixel of the image sensor.
  • a 2 × 2 filter unit has been described, but an n × m filter unit (n and m integers), for example a 4 × 4 filter unit, can also be applied.
  • the endoscope according to the present invention can also be applied to an ultrasonic endoscope in which an imaging element and an ultrasonic transducer are built into the tip, and to a capsule endoscope that can be introduced into a subject.
  • in the case of a capsule endoscope that switches between two kinds of illumination, white illumination light and narrow-band illumination light, for example, the light source, the color filter, and the image sensor may be provided in the capsule housing.
  • the present invention can include various embodiments not described herein, and various design changes can be made within the scope of the technical idea specified by the claims.
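As an illustration of the extraction arithmetic in equations (4-1) to (4-4) above, the following is a minimal Python/NumPy sketch, assuming full-resolution interpolated channel arrays P_B, P_Cy, and P_Mg and a precomputed coefficient alpha; the function and array names are illustrative, not part of the embodiment, and the recovery of g′ is written by analogy with equation (1-7) of Embodiment 1, since equations (4-5) and (4-6) are not reproduced above.

```python
def extract_embodiment4(P_B, P_Cy, P_Mg, alpha):
    """Extract the narrow-band signal values from the interpolated channels.

    P_B, P_Cy, P_Mg: 2-D arrays of interpolated B, Cy, and Mg channel values.
    alpha: sensitivity-ratio coefficient with 0 < alpha <= 1, so the
    subtraction below never amplifies the noise contained in P_B.
    """
    b = P_B                  # eq. (4-1): P_B carries only the T_B signal
    b2 = P_Mg                # eq. (4-3): the Mg pixel has no T_G sensitivity
    b_dash = alpha * P_B     # eq. (4-4) combined with eq. (4-1)
    g_dash = P_Cy - b_dash   # T_G component left in the Cy channel, cf. eq. (1-7)
    return b, b2, g_dash     # surface layer: b, b2; mid-deep layer: g_dash
```

The assembly of the pseudo color B channel described above can likewise be sketched as follows, assuming the 2 × 2 arrangement of Embodiment 1 (B at the upper left, Cy at the lower left and upper right, Mg at the lower right), full-size maps b and b2 (for b′′) holding valid values at their own pixel positions, and a plain average of the known neighbours at the remaining positions; a direction-aware interpolation could be substituted, as the embodiment allows.

```python
import numpy as np

def build_b_channel(b, b2, k):
    """Assemble the B channel: b at B pixel positions, k * b'' at Mg pixel
    positions (k equalizes their brightness), and the mean of the known
    4-neighbours at the remaining positions."""
    h, w = b.shape
    chan = np.zeros((h, w))
    known = np.zeros((h, w), dtype=bool)
    chan[0::2, 0::2] = b[0::2, 0::2]        # B pixel positions
    chan[1::2, 1::2] = k * b2[1::2, 1::2]   # Mg pixel positions
    known[0::2, 0::2] = True
    known[1::2, 1::2] = True
    for y, x in zip(*np.nonzero(~known)):   # remaining pixel positions
        nbrs = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
        vals = [chan[j, i] for j, i in nbrs
                if 0 <= j < h and 0 <= i < w and known[j, i]]
        chan[y, x] = np.mean(vals) if vals else 0.0
    return chan
```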


Abstract

Provided are an endoscope system, an image processing apparatus, and an image processing method with which a high-resolution enhanced image can be obtained without amplifying noise even when a subject is irradiated with narrow-band illumination light. The endoscope system 1 includes an extraction unit 412 that extracts, from the pixel values generated by a plurality of pixels, the component of the narrow-band light obtained at least from its green light. The extraction is carried out by subtracting, from the pixel value of the pixel with the lowest sensitivity to the blue light constituting the narrow-band light among the pixels sensitive to blue light, the pixel value of a pixel with high sensitivity to blue light multiplied by a coefficient whose absolute value is 1 or less.

Description

JP 2012-170639 A (Patent Document 1)
The present invention has been made in view of the above, and an object thereof is to provide an endoscope system, an image processing apparatus, and an image processing method that can obtain a high-resolution enhanced image without amplifying noise even when a subject is irradiated with narrow-band illumination light.
In order to solve the above-described problems and achieve the object, an endoscope system according to the present invention includes: a light source device that emits narrow-band light composed of at least a narrow-band first light included in a blue wavelength band and a narrow-band second light included in a green wavelength band; an image sensor that generates an electrical signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional grid; a color filter in which filter units are arranged in correspondence with the plurality of pixels, each filter unit being either a first filter unit composed of a plurality of filters including a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue wavelength band and the green wavelength band, and a third filter that transmits at least light in the red wavelength band, or a second filter unit composed of a plurality of filters including the first filter that transmits light in the blue wavelength band, a fourth filter that transmits light in the blue wavelength band and light in the red wavelength band, and a fifth filter that transmits at least light in the green wavelength band, wherein the number of filters that transmit light in the green wavelength band is at least half of the total number of filters and the number of filters that transmit light in the blue wavelength band is at least the number of filters that transmit light in the green wavelength band; and an extraction unit that performs arithmetic processing for extracting the component of the narrow-band light from the pixel values generated by the plurality of pixels, the arithmetic processing extracting at least the component of the narrow-band light obtained by the second light by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels having sensitivity to the first light, the product of the pixel value of the pixel having the highest sensitivity to the first light among the plurality of pixels and a first coefficient whose absolute value is 1 or less.
In the endoscope system according to the present invention, in the above invention, the extraction unit further extracts the component of the narrow-band light obtained by the first light by subtracting, from the pixel value of a pixel having sensitivity to at least light in the red wavelength band, the product of the pixel value of the pixel having the highest sensitivity to the second light among the plurality of pixels and a second coefficient whose absolute value is 1 or less.
In the endoscope system according to the present invention, in the above invention, the extraction unit extracts the component of the narrow-band light obtained by the first light after extracting the component of the narrow-band light obtained by the second light.
The endoscope system according to the present invention, in the above invention, further includes an interpolation unit that generates an interpolated value of a channel for each color by performing interpolation processing on the pixel values generated by the plurality of pixels, and the extraction unit extracts the component of the narrow-band light from the interpolated values generated by the interpolation unit.
In the endoscope system according to the present invention, in the above invention, the pixel in which the first filter is arranged has the highest sensitivity to the first light among the plurality of pixels, and the pixel in which the second filter is arranged or the pixel in which the fifth filter is arranged has the highest sensitivity to the second light among the plurality of pixels.
The endoscope system according to the present invention, in the above invention, further includes a display image generation unit that generates a display image signal based on the component of the narrow-band light extracted by the extraction unit.
An image processing apparatus according to the present invention performs image processing on image data generated by an endoscope including an image sensor that generates an electrical signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional grid. The image processing apparatus includes an extraction unit that performs arithmetic processing for extracting, from the pixel values generated by the plurality of pixels, the component of narrow-band light composed of a narrow-band first light included in a blue wavelength band and a narrow-band second light included in a green wavelength band emitted by a light source device, the arithmetic processing extracting at least the component of the narrow-band light obtained by the second light by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels having sensitivity to the first light, the product of the pixel value of the pixel having the highest sensitivity to the first light among the plurality of pixels and a first coefficient whose absolute value is 1 or less. The endoscope includes a color filter in which the first filter units or second filter units described above are arranged in correspondence with the plurality of pixels, wherein the number of filters that transmit light in the green wavelength band is at least half of the total number of filters and the number of filters that transmit light in the blue wavelength band is at least the number of filters that transmit light in the green wavelength band.
An image processing method according to the present invention is executed by an image processing apparatus that performs image processing on image data generated by an endoscope including an image sensor that generates an electrical signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional grid. The method includes an extraction step of performing arithmetic processing for extracting, from the pixel values generated by the plurality of pixels, the component of narrow-band light composed of a narrow-band first light included in a blue wavelength band and a narrow-band second light included in a green wavelength band emitted by a light source device, the arithmetic processing extracting at least the component of the narrow-band light obtained by the second light by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels having sensitivity to the first light, the product of the pixel value of the pixel having the highest sensitivity to the first light among the plurality of pixels and a first coefficient whose absolute value is 1 or less. The endoscope includes a color filter in which the first filter units or second filter units described above are arranged in correspondence with the plurality of pixels.
According to the present invention, it is possible to obtain a high-resolution enhanced image without amplifying noise even when the subject is irradiated with narrow-band illumination light.
FIG. 1 is a schematic diagram showing a schematic configuration of the endoscope system according to Embodiment 1 of the present invention.
FIG. 2 is a schematic diagram showing an example of the configuration of the color filter according to Embodiment 1.
FIG. 3 is a diagram showing the relationship between wavelength and the sensitivity characteristic of each of the B pixel, Cy pixel, and Mg pixel according to Embodiment 1.
FIG. 4 is a diagram showing the sensitivity characteristics of the narrow-band light and each pixel according to Embodiment 1.
FIG. 5 is a schematic diagram showing the structure in the layer direction of the biological tissue observed by the endoscope system according to Embodiment 1.
FIGS. 6A and 6B are diagrams showing an example of the interpolation processing performed on the Cy pixels by the interpolation unit according to Embodiment 1.
FIGS. 7A to 7C are diagrams showing an example of the interpolation processing performed on the B pixels or Mg pixels by the interpolation unit according to Embodiment 1.
FIG. 8 is a diagram showing an outline of the processing in which the extraction unit according to Embodiment 1 extracts the signal values generated by each narrow-band light from the Cy channel or the Mg channel.
FIG. 9A is a diagram showing the area of the B pixel in the narrow band; FIG. 9B is a diagram showing the area of the Cy pixel in the narrow band; FIG. 9C is a diagram showing the areas of both the B pixel and the Cy pixel in the narrow band.
FIG. 10 is a schematic diagram showing an example of the configuration of the color filter according to Embodiment 2.
FIG. 11 is a diagram showing the relationship between wavelength and the sensitivity characteristic of each of the B pixel, G pixel, and Mg pixel according to Embodiment 2.
FIG. 12 is a diagram showing the sensitivity characteristics of each narrow-band light and each pixel according to Embodiment 2.
FIG. 13 is a diagram showing an outline of the processing in which the extraction unit according to Embodiment 2 extracts the signal values generated by each narrow-band light from the Mg channel.
FIG. 14 is a schematic diagram showing an example of the configuration of the color filter according to Embodiment 3.
FIG. 15 is a diagram showing the relationship between wavelength and the sensitivity characteristic of each of the B pixel, Cy pixel, and R pixel according to Embodiment 3.
FIG. 16 is a diagram showing the sensitivity characteristics of each narrow-band light and each pixel according to Embodiment 3.
FIG. 17 is a diagram showing an outline of the processing in which the extraction unit according to Embodiment 3 extracts the signals generated by each narrow-band light from the Cy channel.
FIG. 18 is a diagram showing the sensitivity characteristics of each narrow-band light and each pixel according to Embodiment 4.
FIG. 19 is a diagram showing an outline of the processing in which the extraction unit according to Embodiment 4 extracts the signals generated by each narrow-band light from the Cy channel and the Mg channel.
Hereinafter, modes for carrying out the present invention (hereinafter referred to as "embodiments") will be described. In the embodiments, a medical endoscope system that captures and displays an image of the inside of a body cavity of a subject such as a patient will be described. The present invention is not limited by these embodiments. Furthermore, in the description of the drawings, the same portions are denoted by the same reference numerals.
(Embodiment 1)
FIG. 1 is a schematic diagram showing a schematic configuration of an endoscope system according to Embodiment 1 of the present invention. The endoscope system 1 shown in FIG. 1 includes: an endoscope 2 that acquires an image signal of the inside of a body cavity by being inserted into a subject; a light source device 3 that generates the illumination light emitted from the distal end of the endoscope 2; a processor unit 4 that performs predetermined image processing on the image signal acquired by the endoscope 2 and comprehensively controls the operation of the entire endoscope system 1; and a display unit 5 that displays an image corresponding to the image signal input from the processor unit 4.
[Configuration of endoscope]
First, the configuration of the endoscope 2 will be described.
The endoscope 2 includes an operation unit 200, an imaging lens 201, an imaging unit 202, a light guide 203, an illumination lens 204, an A/D conversion unit 205, and an imaging information storage unit 206.
The operation unit 200 receives input of instruction signals for instructing the light source device 3 to switch the illumination light and for changing the control contents of the processor unit 4, as well as instruction signals for instructing the operation of peripheral devices such as air supply means, water supply means, and gas supply means.
The imaging lens 201 is provided at the distal end of the endoscope 2 and collects at least reflected light from the subject. The imaging lens 201 is configured using one or more lenses, a prism, and the like.
The imaging unit 202 receives the light collected by the imaging lens 201 and performs photoelectric conversion to generate an image signal of the subject, and outputs this image signal to the A/D conversion unit 205. The imaging unit 202 includes an imaging element 202a and a color filter 202b (first color filter).
The imaging element 202a is realized using an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. In the imaging element 202a, a plurality of pixels are arranged in a two-dimensional grid.
The color filter 202b has a plurality of transmission filters, each of which transmits light in an individually set wavelength band. Specifically, the color filter 202b is formed by arranging, in correspondence with the plurality of pixels of the imaging element 202a, filter units each composed of a plurality of filters including a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue wavelength band and the green wavelength band, and a third filter that transmits at least light in the red wavelength band, wherein the number of filters that transmit light in the green wavelength band is at least half of the total number of filters and the number of filters that transmit light in the blue wavelength band is at least the number of filters that transmit light in the green wavelength band. Here, the configuration of the color filter 202b will be described. FIG. 2 is a schematic diagram showing an example of the configuration of the color filter 202b.
As shown in FIG. 2, the color filter 202b includes a filter B (first filter) that transmits light in the blue wavelength band, a filter Cy (second filter) that transmits light in the blue wavelength band and light in the green wavelength band, and a filter Mg (third filter) that transmits light in the red wavelength band and light in the blue wavelength band. The arrangement pattern of the color filter 202b is based on 2 × 2 pixels, with the filter B at the upper left, the filter Cy at the lower left and the upper right, and the filter Mg at the lower right. The color filter 202b is arranged over all the pixels of the imaging element 202a in accordance with this arrangement pattern. In the following, the pixels in which the filter B, the filter Cy, and the filter Mg are arranged are referred to as the B pixel, the Cy pixel, and the Mg pixel, respectively.
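For reference, the 2 × 2 arrangement pattern described above can be rendered as a small sketch that tiles a label map over the sensor. This is only an illustrative encoding of FIG. 2; the labels and the function are not part of the embodiment.

```python
import numpy as np

# Symbolic labels for the three filters of the 2x2 unit.
B, CY, MG = 0, 1, 2
# Filter unit: B upper left, Cy lower left and upper right, Mg lower right.
UNIT = np.array([[B,  CY],
                 [CY, MG]])

def filter_pattern(height, width):
    """Tile the 2x2 filter unit over a height x width pixel grid."""
    reps = ((height + 1) // 2, (width + 1) // 2)
    return np.tile(UNIT, reps)[:height, :width]

# filter_pattern(4, 4) reproduces the pattern of FIG. 2 over 4 x 4 pixels.
```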
The sensitivity characteristics, with respect to the wavelength of light, of the B pixel, Cy pixel, and Mg pixel configured as described above will now be described. FIG. 3 is a diagram showing the relationship between wavelength and the sensitivity characteristic of each of the B pixel, Cy pixel, and Mg pixel. In FIG. 3, the vertical axis represents sensitivity and the horizontal axis represents wavelength. In FIG. 3, the curve L_B (solid line), the curve L_Cy (one-dot chain line), and the curve L_Mg (dotted line) indicate the sensitivity characteristics of the B pixel, the Cy pixel, and the Mg pixel, respectively.
As shown by the curve L_B in FIG. 3, the B pixel has a first sensitivity peak in the wavelength band of blue light (400 nm to 470 nm). As shown by the curve L_Cy, the Cy pixel has a first sensitivity peak for green light (500 nm to 550 nm) and a second sensitivity peak for blue light (400 nm to 470 nm). As shown by the curve L_Mg, the Mg pixel has a first sensitivity peak for red light (600 nm to 650 nm) and a second sensitivity peak for blue light (400 nm to 470 nm).
Returning to FIG. 1, the description of the configuration of the endoscope 2 will be continued.
The light guide 203 is configured using glass fiber or the like, and guides the light emitted from the light source device 3, which will be described later.
The illumination lens 204 is provided at the distal end of the light guide 203, diffuses the light guided by the light guide 203, and emits it from the distal end of the endoscope 2 to the outside.
The A/D conversion unit 205 performs A/D conversion on the analog image signal generated by the imaging element 202a to generate a digital image signal, and outputs this image signal to the processor unit 4.
The imaging information storage unit 206 records data including various programs for operating the endoscope 2, various parameters necessary for the operation of the endoscope 2, identification information of the endoscope 2, and the like. The imaging information storage unit 206 also includes an identification information storage unit 206a that stores the identification information. The identification information includes unique information (ID) of the endoscope 2, model year, specification information, transmission method, filter arrangement information of the color filter 202b, and the like. The imaging information storage unit 206 is realized using a flash memory or the like.
[Configuration of light source device]
Next, the configuration of the light source device 3 will be described.
The light source device 3 includes an illumination unit 31 and an illumination control unit 32.
Under the control of the illumination control unit 32, the illumination unit 31 switches among a plurality of illumination lights having different wavelength bands and emits them. The illumination unit 31 includes a light source unit 311, a light source driver 312, a switching filter 313, a drive unit 314, a drive driver 315, and a condenser lens 316.
The light source unit 311 emits white light including red, green, and blue light. The white light emitted from the light source unit 311 is emitted from the distal end of the endoscope 2 to the outside via the switching filter 313, the condenser lens 316, and the light guide 203. The light source unit 311 is realized using an LED (Light Emitting Diode), a xenon lamp, or the like.
Under the control of the illumination control unit 32, the light source driver 312 supplies power to the light source unit 311, thereby causing the light source unit 311 to emit white light.
The switching filter 313 is removably disposed on the optical path of the white light emitted from the light source unit 311. The switching filter 313 transmits only blue narrow-band light and green narrow-band light out of the white light emitted from the light source unit 311. Specifically, the switching filter 313 transmits narrow-band light composed of light in a narrow band T_B (for example, 390 nm to 445 nm) and light in a narrow band T_G (for example, 530 nm to 550 nm) included in the white light. The narrow bands T_B and T_G are blue and green wavelength bands in which light is easily absorbed by hemoglobin in the blood.
FIG. 4 is a diagram showing the sensitivity characteristics of each pixel with respect to the narrow-band light. In FIG. 4, the vertical axis indicates sensitivity and the horizontal axis indicates wavelength.
As shown in FIG. 4, the Cy pixel has the highest sensitivity in the narrow band T_G. The sensitivities in the narrow band T_B are set such that the Cy pixel has the lowest sensitivity and there is no difference between the B pixel and the Mg pixel.
Returning to FIG. 1, the description of the configuration of the light source device 3 will be continued.
The drive unit 314 inserts the switching filter 313 into, or retracts it from, the optical path of the white light emitted from the light source unit 311. The drive unit 314 is realized using a stepping motor, a DC motor, or the like.
Under the control of the illumination control unit 32, the drive driver 315 supplies predetermined power to the drive unit 314.
The condenser lens 316 condenses the white light emitted from the light source unit 311 or the narrow-band light transmitted through the switching filter 313, and emits it to the outside of the light source device 3 (to the light guide 203).
The illumination control unit 32 controls the light source driver 312 to turn the light source unit 311 on and off. The illumination control unit 32 also controls the drive driver 315 to insert and remove the switching filter 313 with respect to the optical path of the white light emitted from the light source unit 311, thereby switching the illumination light emitted from the illumination unit 31 between white light and narrow-band light composed of light in the narrow bands T_B and T_G.
[Processor configuration]
Next, the configuration of the processor unit 4 will be described.
The processor unit 4 includes an image processing unit 41, a display image generation unit 42, an input unit 43, a storage unit 44, and a control unit 45. In Embodiment 1, the processor unit 4 functions as an image processing apparatus.
The image processing unit 41 performs predetermined image processing on the image signal input from the A/D conversion unit 205 and outputs the result to the display image generation unit 42. The image processing unit 41 includes an interpolation unit 411 and an extraction unit 412.
The interpolation unit 411 performs interpolation processing on the image signal input from the A/D conversion unit 205. Specifically, the interpolation unit 411 performs interpolation processing based on direction information (for example, edges) on each of the B pixels, Cy pixels, and Mg pixels, thereby generating interpolated values for each of the B channel, Cy channel, and Mg channel, and outputs these interpolated values to the extraction unit 412. Under narrow-band light observation, the interpolation unit 411 generates the B channel and the Mg channel at high resolution by using the direction information of predetermined pixels for each of the B pixels, Cy pixels, and Mg pixels.
Under narrow-band light observation, the extraction unit 412 performs arithmetic processing for extracting the component of the narrow-band light from the pixel values generated by the plurality of pixels constituting the imaging element 202a: it extracts at least the component of the narrow-band light obtained by the narrow band T_G by subtracting, from the pixel value of the pixel having the lowest sensitivity to light in the narrow band T_B among the plurality of pixels having sensitivity to light in the narrow band T_B, the product of the pixel value of the pixel having the highest sensitivity to light in the narrow band T_B among the plurality of pixels and a first coefficient whose absolute value is 1 or less, and outputs the result of this arithmetic processing to the display image generation unit 42. Specifically, the extraction unit 412 extracts predetermined signal values from the interpolated values of the B channel, Cy channel, and Mg channel input from the interpolation unit 411, and outputs the extracted signal values to the display image generation unit 42. For example, the extraction unit 412 extracts the signal values of the narrow band T_B and the narrow band T_G constituting the narrow-band light by arithmetic processing between the channels, and outputs the extracted signal values to the display image generation unit 42. Under white light observation, the extraction unit 412 extracts the B, G, and R signal values from the interpolated values of the B channel, Cy channel, and Mg channel by using a color conversion matrix, a conversion table, or the like.
The display image generation unit 42 generates a color image or a pseudo color image based on the signal values input from the image processing unit 41, and outputs it to the display unit 5. Specifically, under narrow-band light, the display image generation unit 42 generates a pseudo color image based on the signal values of the narrow bands T_B and T_G constituting the narrow-band light input from the image processing unit 41 and the color filter information input from the control unit 45. For example, the display image generation unit 42 assigns the signal value of the narrow band T_B to each of the B channel and G channel of the display image signal, and assigns the signal value of the narrow band T_G to the R channel of the display image signal. Under white light, the display image generation unit 42 generates a color image using the B channel, G channel, and R channel signals output from the image processing unit 41. Further, the display image generation unit 42 performs gradation conversion processing, enlargement processing, structure enhancement processing, and the like on the pseudo color image or color image to generate a display image, and outputs it to the display unit 5.
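As a minimal sketch of the channel assignment just described, assuming full-resolution signal maps for the narrow bands T_B and T_G have already been extracted (the names are illustrative, and the gradation conversion, enlargement, and structure enhancement steps are omitted):

```python
import numpy as np

def pseudo_color(tb, tg):
    """Compose the pseudo color display image under narrow-band light:
    the T_B signal drives the B and G channels, the T_G signal the R
    channel. tb and tg are 2-D float arrays normalized to [0, 1]."""
    return np.dstack([tg, tb, tb])  # channels in (R, G, B) order
```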
The input unit 43 receives input of instruction signals for instructing various operations related to the endoscope system 1. For example, the input unit 43 receives input of an instruction signal specifying the illumination light to be emitted by the light source device 3.
The storage unit 44 stores various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1. The storage unit 44 also stores identification information of the endoscope 2, for example, a relation table between the unique information of the imaging element 202a and information on the filter arrangement of the color filter 202b. The storage unit 44 is realized using a semiconductor memory such as a flash memory or a DRAM.
The control unit 45 performs drive control of each component, including the endoscope 2 and the light source device 3, and input/output control of information for each component. The control unit 45 acquires the setting data for imaging control, the control data related to imaging, and the control data related to illumination stored in the storage unit 44, and outputs control signals corresponding to the acquired setting data and control data to the endoscope 2 and the light source device 3 via predetermined signal lines or a bus. The control unit 45 also outputs the color filter information acquired via the imaging information storage unit 206 to each of the image processing unit 41 and the display image generation unit 42. The control unit 45 is configured using a CPU (Central Processing Unit) or the like.
[Configuration of display section]
Next, the display unit 5 will be described.
The display unit 5 displays an image corresponding to the display image signal input from the processor unit 4. The display unit 5 is realized using a liquid crystal display, an organic EL (Electro Luminescence) display, or the like.
In the endoscope system 1 having the above configuration, an outline of the high-resolution processing by the extraction unit 412 of the image processing unit 41 will be described in detail. FIG. 5 is a schematic diagram showing the structure in the layer direction of the biological tissue observed by the endoscope system 1.
As shown in FIG. 5, the biological tissue 71 as the subject has blood vessels at different depths. Specifically, many capillaries 72 are distributed mainly near the mucosal surface layer of the biological tissue 71. In the middle layer, deeper than the mucosal surface layer, blood vessels 73 thicker than the capillaries 72 are distributed in addition to the capillaries 72. Furthermore, in the deep layer of the biological tissue 71, blood vessels 74 even thicker than the thick blood vessels 73 of the middle layer are distributed.
The depth to which light penetrates the biological tissue 71 depends on the wavelength. For example, light in the narrow band T_B is reflected near the mucosal surface layer, while light in the narrow band T_G reaches positions deeper than the mucosal surface layer. That is, by irradiating the biological tissue 71 with the narrow-band light and imaging (receiving) the light in the narrow band T_B reflected by the biological tissue 71, the endoscope system 1 can acquire information on the surface-layer capillaries 72 of the biological tissue 71 (hereinafter referred to as "surface layer information"); and by imaging (receiving) the light in the narrow band T_G reflected by the biological tissue 71, it can acquire information on the thick blood vessels 73 of the middle layer and the thick blood vessels 74 of the deep layer (hereinafter referred to as "mid-deep layer information").
Under narrow-band light observation, the B pixel has sensitivity in the narrow band T_B, so the pixel value of the B pixel contains the surface layer information. Furthermore, each of the Cy pixel and the Mg pixel has sensitivity to light in both the narrow band T_B and the narrow band T_G, so the pixel values of the Cy pixel and the Mg pixel each contain both the surface layer information and the mid-deep layer information.
As described above, in Embodiment 1, the interpolation unit 411 outputs the interpolated values of the B channel, Cy channel, and Mg channel obtained by the interpolation processing to the extraction unit 412, and the extraction unit 412 extracts the signal values obtained by the light in each of the narrow bands T_B and T_G from the Cy channel or the Mg channel and outputs them to the display image generation unit 42.
[Interpolation processing by the interpolation unit]
Next, the interpolation processing performed by the interpolation unit 411 will be described. FIGS. 6A and 6B are diagrams showing an example of the interpolation processing performed by the interpolation unit 411 on the Cy pixels. FIGS. 7A to 7C are diagrams showing an example of the interpolation processing performed by the interpolation unit 411 on the B pixels or Mg pixels.
As shown in FIGS. 6A and 6B, the interpolation unit 411 first obtains, for the Cy pixels in the vicinity of the interpolation target pixel P1 (target pixel), the differences between the pixel values in the vertical direction and in the horizontal direction (see arrow A1), and interpolates the Cy channel by performing a weighted average that gives priority to the pixel values in the direction with the smaller difference (FIG. 6A → FIG. 6B).
As shown in FIGS. 7A to 7C, the interpolation unit 411 first obtains, for the B pixels in the vicinity of the interpolation target pixel P1 (target pixel), the differences between the pixel values in the diagonal directions (see arrow A2), and creates a B channel having values in a checkered pattern by performing a weighted average that gives priority to the pixel values in the direction with the smaller difference (FIG. 7A → FIG. 7B).
Subsequently, the interpolation unit 411 obtains, for the B pixels in the vicinity of the interpolation target pixel (target pixel), the differences between the pixel values in the vertical and horizontal directions (see arrow A3), and interpolates the signal values of the B channel by performing a weighted average that gives priority to the pixel values in the direction with the smaller difference. The interpolation unit 411 interpolates the Mg channel by the same interpolation processing as for the B pixels (FIG. 7B → FIG. 7C). In addition to the interpolation processing described above, since the B pixels, Cy pixels, and Mg pixels all have sensitivity in the narrow band T_B and are highly correlated with one another, the interpolation unit 411 may interpolate the B pixels and Mg pixels based on the direction information of the Cy pixels.
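The direction-discriminating step can be sketched as follows for a single Cy position, assuming a full-size array cy that holds valid Cy values directly above, below, left, and right of the target pixel; the inverse-difference weighting is one plausible choice and is not specified by the embodiment.

```python
import numpy as np

def interp_cy_at(cy, y, x, eps=1e-6):
    """Interpolate the Cy value at (y, x) from its vertical and horizontal
    neighbours, giving priority to the direction with the smaller
    pixel-value difference (i.e. the direction running along an edge)."""
    v_pair = (cy[y - 1, x], cy[y + 1, x])        # vertical neighbours
    h_pair = (cy[y, x - 1], cy[y, x + 1])        # horizontal neighbours
    dv = abs(v_pair[0] - v_pair[1])              # vertical difference
    dh = abs(h_pair[0] - h_pair[1])              # horizontal difference
    wv, wh = 1.0 / (dv + eps), 1.0 / (dh + eps)  # smaller diff -> larger weight
    return (wv * np.mean(v_pair) + wh * np.mean(h_pair)) / (wv + wh)
```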
[Processing of the extraction unit]
Next, the processing in which the extraction unit 412 extracts the signal values generated by the light in the narrow band T_B and the light in the narrow band T_G from the interpolated values of the Cy channel and the Mg channel will be described. FIG. 8 is a diagram showing an outline of this processing. In FIG. 8, the interpolated values of the B channel, Cy channel, and Mg channel are denoted as P_B, P_Cy, and P_Mg. The signal value of the narrow band T_B contained in P_B is denoted b; the signal values of the narrow band T_B and the narrow band T_G contained in P_Cy are denoted b′ and g′, respectively; and the signal values of the narrow band T_B and the narrow band T_G contained in P_Mg are denoted b′′ and g′′, respectively.
As shown in FIG. 8, the interpolated values P_B, P_Cy, and P_Mg of each channel at the coordinates (x, y) (target pixel) of the imaging element 202a are expressed by the following equations (1-1) to (1-3):
 P_B(x, y) = b(x, y)   (1-1)
 P_Cy(x, y) = g′(x, y) + b′(x, y)   (1-2)
 P_Mg(x, y) = g′′(x, y) + b′′(x, y)   (1-3)
Further, b′ is expressed by the following equation (1-4) using a coefficient α:
 b′(x, y) = αb(x, y)   (1-4)
 Here, the coefficient α will be described with reference to FIGS. 9A to 9C. FIG. 9A shows the area of the B pixel (curve L_B) in the narrow band T_B. FIG. 9B shows the area of the Cy pixel (curve L_Cy) in the narrow band T_B. FIG. 9C shows the areas of both the B pixel (curve L_B) and the Cy pixel (curve L_Cy) in the narrow band T_B. In FIGS. 9A to 9C, the area of the curve L_B in the narrow band T_B is denoted S_B, and the area of the curve L_Cy in the narrow band T_B is denoted S_Cy.
 As shown in FIGS. 9A to 9C, the coefficient α is the ratio of the area S_Cy of the curve L_Cy to the area S_B of the curve L_B in the narrow band T_B, and is expressed by the following equation (1-5):
 α = S_Cy / S_B                               ... (1-5)
As shown in FIG. 4 described above, the sensitivity of the B pixel in the narrow band T_B is always higher than that of the Cy pixel, so the coefficient α satisfies 0 < α ≦ 1. Accordingly, from equations (1-1) and (1-4),
 b'(x, y) = αP_B(x, y)                        ... (1-6)
Since 0 < α ≦ 1, b' can be obtained from equation (1-6) without amplifying the noise contained in P_B. Further, from equations (1-2) and (1-6),
 g'(x, y) = P_Cy(x, y) − αP_B(x, y)           ... (1-7)
and g' contained in P_Cy can be obtained.
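 As a sketch of how this coefficient could be obtained in practice, α can be computed numerically as the ratio of the areas under the two sensitivity curves within the narrow band T_B, after which g' follows directly from equation (1-7). The sampled sensitivity arrays and the common wavelength grid are assumptions made for this illustration.

```python
import numpy as np

def area_ratio(sens_num, sens_den, wavelengths, band):
    """Ratio of the areas under two sampled sensitivity curves inside a
    narrow band, integrated with the trapezoidal rule."""
    lo, hi = band
    sel = (wavelengths >= lo) & (wavelengths <= hi)
    return (np.trapz(sens_num[sel], wavelengths[sel])
            / np.trapz(sens_den[sel], wavelengths[sel]))

# alpha = S_Cy / S_B over T_B (eq. 1-5); 0 < alpha <= 1 because the B pixel
# is the more sensitive of the two inside T_B.
def extract_g_dash(P_Cy, P_B, alpha):
    return P_Cy - alpha * P_B   # eq. (1-7): P_B is scaled down, not gained up
```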
 The sensitivity ratio of the Cy pixel and the Mg pixel in the narrow band T_G is expressed, using the coefficient β, by the following equation (1-8):
 g''(x, y) = βg'(x, y)                        ... (1-8)
Here, the coefficient β is the ratio of the area S_Mg' of the curve L_Mg to the area S_Cy' of the curve L_Cy in the narrow band T_G, and is expressed by the following equation (1-9):
 β = S_Mg' / S_Cy'                            ... (1-9)
As shown in FIG. 4 described above, the sensitivity of the Cy pixel in the narrow band T_G is always higher than that of the Mg pixel, so the coefficient β satisfies 0 < β ≦ 1. Accordingly, from equations (1-7) and (1-8),
 g''(x, y) = β(P_Cy(x, y) − αP_B(x, y))       ... (1-10)
and from equations (1-3) and (1-10),
 b''(x, y) = P_Mg(x, y) − β(P_Cy(x, y) − αP_B(x, y))
                                              ... (1-11)
Since 0 < α ≦ 1 and 0 < β ≦ 1, b'' contained in P_Mg can be obtained without amplifying the noise contained in P_B or P_Cy.
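 Putting equations (1-1) to (1-11) together, the whole separation step for this embodiment reduces to a few array operations. The following sketch assumes the interpolated channels are given as same-shaped arrays; function and variable names are illustrative.

```python
def extract_embodiment1(P_B, P_Cy, P_Mg, alpha, beta):
    """Separate the narrow-band components per equations (1-1) to (1-11).
    Since 0 < alpha <= 1 and 0 < beta <= 1, every term subtracted below is
    scaled down, so the noise in P_B and P_Cy is never amplified."""
    b = P_B                         # (1-1): T_B component at B pixels
    g_dash = P_Cy - alpha * P_B     # (1-7): T_G component at Cy pixels
    b_ddash = P_Mg - beta * g_dash  # (1-10), (1-11): T_B component at Mg pixels
    return b, g_dash, b_ddash
```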
 In this way, the extraction unit 412 extracts the surface-layer information of the living tissue by obtaining the signal values b and b'' for all the pixels of the image sensor 202a using equations (1-1) and (1-11). Further, the extraction unit 412 extracts the mid-to-deep-layer information by obtaining the signal value g' for all the pixels of the image sensor 202a using equation (1-7). The extraction unit 412 outputs the signal values b and b'' of the narrow band T_B for the B pixel and the Mg pixel, and the signal value g' of the narrow band T_G for the Cy pixel, to the display image generation unit 42.
 [Processing of the display image generation unit]
 Next, the processing performed by the display image generation unit 42 will be described.
 The display image generation unit 42 generates a pseudo-color image using the signal values b, b'', and g' output from the extraction unit 412. The pseudo-color image is composed of three channels: R, G, and B. In generating the B channel and the G channel, the display image generation unit 42 may first generate the B channel and then assign the same information to the G channel.
 Specifically, based on the color filter information input from the control unit 45, the display image generation unit 42 first assigns the signal value b and the signal value b'' input from the extraction unit 412 to the positions of the B pixels and the Mg pixels on the B channel, respectively. In doing so, the display image generation unit 42 multiplies the signal value b'' by a coefficient k to match the brightness of the signal value b'' to that of the signal value b. The coefficient k is based on the sensitivity ratio between the B pixel and the Mg pixel and can be expressed by the following equation (1-12):
 k = S_B / S_Mg                               ... (1-12)
Here, S_B and S_Mg are the areas of the curve L_B and the curve L_Mg in the narrow band T_B. Since the sensitivity difference between the B pixel and the Mg pixel in the narrow band T_B is small, multiplication by the coefficient k increases the noise in the signal value b'' only slightly.
 The coefficient k of the sensitivity ratio between the B pixel and the Mg pixel preferably satisfies the following condition (1-13):
 1 ≦ k ≦ 2                                    ... (1-13)
More preferably, the following condition (1-14) is satisfied:
 k ≦ 1/(β(1 + α))                             ... (1-14)
More preferably still, the following condition (1-15) is satisfied:
 β(1 + α) ≦ 1                                 ... (1-15)
Here, α is the value given by equation (1-5), and β is the value given by equation (1-9).
 Thereafter, for the positions of the Cy pixels on the B channel, the display image generation unit 42 generates the surface-layer information image using the signal values b at the positions of neighboring B pixels and the signal values b'' at the positions of neighboring Mg pixels. Here, interpolation using direction information or linear interpolation may be used to generate the surface-layer information image. Alternatively, instead of interpolation, a priority order may be set and either of the signal values b and b'' used. After generating the B channel, the display image generation unit 42 assigns the same information as the B channel to the G channel.
 Further, the display image generation unit 42 generates the mid-to-deep-layer information image by assigning the signal value g' input from the extraction unit 412 to the position of each pixel on the R channel.
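 A minimal sketch of the pseudo-color assembly described above. The boolean position masks, the plain four-neighbor fill for the Cy positions, and the channel stacking order are assumptions made for the example; direction-weighted interpolation could be substituted for the fill step.

```python
import numpy as np

def build_pseudo_color(b, b_ddash, g_dash, b_mask, mg_mask, k):
    """Assemble the R/G/B display channels from the extracted signal maps."""
    blue = np.zeros_like(b, dtype=float)
    blue[b_mask] = b[b_mask]               # b at B pixel positions
    blue[mg_mask] = k * b_ddash[mg_mask]   # brightness-matched b'' at Mg positions
    hole = ~(b_mask | mg_mask)             # remaining (Cy) positions
    padded = np.pad(blue, 1, mode="edge")
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1]
             + padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    blue[hole] = neigh[hole]               # simple four-neighbor average
    green = blue.copy()                    # G carries the same surface-layer info
    red = np.asarray(g_dash, dtype=float)  # mid-to-deep-layer info on R
    return np.stack([red, green, blue], axis=-1)
```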
 According to the first embodiment of the present invention described above, the extraction unit 412 performs arithmetic processing on the B channel, the Cy channel, and the Mg channel that amplifies noise as little as possible, so that high-quality narrow-band signals can be extracted from the interpolated values P_Cy and P_Mg. As a result, high-resolution images of the mucosal surface layer, middle layer, and deep layer at the observation site of the subject under narrow-band light can be acquired while suppressing noise amplification.
 Further, according to the first embodiment of the present invention, the extraction unit 412 subtracts, from at least the interpolated value of the Mg channel, the product of β and the interpolated value of the Cy channel, which has the highest sensitivity to the light of the narrow band T_G among the plurality of pixels constituting the image sensor 202a, thereby further extracting the narrow-band light component obtained by the light of the narrow band T_B. High-resolution images of the mucosal surface layer, middle layer, and deep layer at the observation site of the subject under narrow-band light can thus be acquired while suppressing noise amplification.
 Furthermore, according to the first embodiment of the present invention, the extraction unit 412 extracts the narrow-band light component obtained by the light of the narrow band T_B after extracting the narrow-band light component obtained by the light of the narrow band T_G, so that a high-resolution image can be acquired.
 (Embodiment 2)
 Next, a second embodiment of the present invention will be described. The endoscope system according to the second embodiment differs from the first embodiment in the configuration of the color filter 202b and in the processing performed by the extraction unit and the display image generation unit. In the following, the configuration of the color filter according to the second embodiment is described first, followed by the processing executed by the extraction unit and the display image generation unit according to the second embodiment. The same components as those of the endoscope system 1 according to the first embodiment are denoted by the same reference signs, and their description is omitted.
 FIG. 10 is a schematic diagram showing an example of the configuration of the color filter according to the second embodiment of the present invention.
 The color filter 202c (second color filter) shown in FIG. 10 has a filter B (first filter), a filter G (fifth filter) that transmits light in the green wavelength band, and a filter Mg (fourth filter). The arrangement pattern of the color filter 202c is based on 2 × 2 pixels, with the filter B at the upper left, the filters G at the lower left and upper right, and the filter Mg at the lower right. The color filter 202c is arranged over all the pixels of the image sensor 202a according to this arrangement pattern. In the following, the pixels at which the filter B, the filter G, and the filter Mg are arranged are denoted as the B pixel, the G pixel, and the Mg pixel.
 The sensitivity characteristics with respect to the wavelength of light of the B pixel, the G pixel, and the Mg pixel configured in this way will now be described. FIG. 11 is a diagram showing the relationship between wavelength and the sensitivity characteristics of the B pixel, the G pixel, and the Mg pixel. In FIG. 11, the vertical axis indicates sensitivity and the horizontal axis indicates wavelength, and the curves L_B (solid line), L_G (dash-dotted line), and L_Mg (dotted line) indicate the sensitivity characteristics of the B pixel, the G pixel, and the Mg pixel, respectively.
 As shown by the curve L_B in FIG. 11, the B pixel has a first sensitivity peak in the wavelength band of blue light (400 nm to 470 nm). As shown by the curve L_G, the G pixel has a first sensitivity peak for green light (500 nm to 550 nm). As shown by the curve L_Mg, the Mg pixel has a first sensitivity peak for red light (600 nm to 650 nm) and a second sensitivity peak for blue light (400 nm to 470 nm).
 FIG. 12 is a diagram showing the narrow-band light and the sensitivity characteristics of each pixel. In FIG. 12, the horizontal axis indicates wavelength and the vertical axis indicates sensitivity.
 As shown in FIG. 12, the G pixel has the highest sensitivity at each wavelength in the narrow band T_G. The sensitivities at each wavelength in the narrow band T_B are designed so that no difference arises between the B pixel and the Mg pixel.
 The interpolation unit 411 performs interpolation processing on the image data generated by the image sensor 202a on which the color filter 202c configured in this way is arranged, using the pixel values of the B pixels, the G pixels, and the Mg pixels. In this case, the interpolation unit 411 performs predetermined interpolation, such as linear interpolation, using the pixel values of the B pixels, the G pixels, and the Mg pixels. The interpolation unit 411 outputs the interpolated values of the B channel, the G channel, and the Mg channel generated from these pixel values to the extraction unit 412.
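 For this embodiment the fill-in is plain linear interpolation rather than direction-weighted interpolation. A sketch using SciPy's griddata is shown below; the sparse-channel representation and the zero fill value outside the sample hull are assumptions made for the example.

```python
import numpy as np
from scipy.interpolate import griddata

def linear_interp_channel(values: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Linearly interpolate one sparse color channel over the full grid.
    values holds raw samples, valid only where mask is True."""
    h, w = values.shape
    ys, xs = np.nonzero(mask)
    grid_y, grid_x = np.mgrid[0:h, 0:w]
    return griddata((ys, xs), values[mask], (grid_y, grid_x),
                    method="linear", fill_value=0.0)
```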
 [Processing of the extraction unit]
 Next, the processing performed by the extraction unit 412 will be described. FIG. 13 is a diagram illustrating an outline of the processing performed by the extraction unit 412 to extract the signal values generated by each narrow-band light from the Mg channel. In FIG. 13, the interpolated values of the B channel, the G channel, and the Mg channel are denoted P_B, P_G, and P_Mg. The signal value of the narrow band T_B contained in P_B is denoted b; the signal value of the narrow band T_G contained in P_G is denoted g'; and the signal values of the narrow band T_B and the narrow band T_G contained in P_Mg are denoted b'' and g''.
 As shown in FIG. 13, P_B, P_G, and P_Mg at the coordinates (x, y) (pixel of interest) of the image sensor 202a are expressed by the following equations (2-1) to (2-3):
 P_B(x, y) = b(x, y)                          ... (2-1)
 P_G(x, y) = g'(x, y)                         ... (2-2)
 P_Mg(x, y) = g''(x, y) + b''(x, y)           ... (2-3)
Further, g'' is expressed by the following equation (2-4) using a coefficient γ based on the sensitivity of the G pixel and the Mg pixel at each wavelength in the narrow band T_G:
 g''(x, y) = γP_G(x, y)                       ... (2-4)
From equations (2-3) and (2-4),
 b''(x, y) = P_Mg(x, y) − γP_G(x, y)          ... (2-5)
 Here, as shown in FIG. 12, the sensitivity of the G pixel is always higher than that of the Mg pixel at each wavelength component included in the narrow band T_G, so the coefficient γ satisfies 0 < γ ≦ 1. Accordingly, b'' contained in P_Mg can be obtained by equation (2-5) without amplifying noise.
 In this way, the extraction unit 412 extracts the surface-layer information of the living tissue by obtaining the signal values b and b'' for all the pixels of the image sensor 202a using equations (2-1) and (2-5). Further, the extraction unit 412 extracts the mid-to-deep-layer information by obtaining the signal value g' for all the pixels of the image sensor 202a using equation (2-2). The extraction unit 412 outputs the signal values b and b'' of the narrow band T_B for the B pixel and the Mg pixel, and the signal value g' of the narrow band T_G for the G pixel, to the display image generation unit 42.
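 Under these definitions, the second embodiment's separation is a single subtraction. A sketch, with illustrative names:

```python
def extract_embodiment2(P_B, P_G, P_Mg, gamma):
    """Separate the narrow-band components per equations (2-1) to (2-5).
    gamma is the Mg-to-G sensitivity ratio in T_G; since 0 < gamma <= 1,
    the subtraction in (2-5) does not amplify noise."""
    b = P_B                        # (2-1): T_B component at B pixels
    g_dash = P_G                   # (2-2): T_G component at G pixels
    b_ddash = P_Mg - gamma * P_G   # (2-5): T_B component at Mg pixels
    return b, g_dash, b_ddash
```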
 [Processing of the display image generation unit]
 Next, the processing of the display image generation unit 42 will be described.
 The display image generation unit 42 generates a pseudo-color image using the signal values b, b'', and g' output from the extraction unit 412. The pseudo-color image is composed of three channels: R, G, and B. In generating the B channel and the G channel, the display image generation unit 42 may first generate the B channel and then assign the same information to the G channel.
 Based on the color filter information input from the control unit 45, the display image generation unit 42 assigns the signal value b and the signal value b'' input from the extraction unit 412 to the positions of the B pixels and the Mg pixels on the B channel, respectively. In doing so, the display image generation unit 42 multiplies b'' by the coefficient k to match the brightness of the signal values b and b''. The coefficient k is based on the sensitivity ratio between the B pixel and the Mg pixel and can be expressed by equation (1-12) above. Thereafter, for the positions of the G pixels, the display image generation unit 42 generates the surface-layer information image using the signal values b at the positions of neighboring B pixels and the signal values b'' at the positions of neighboring Mg pixels. Here, interpolation using direction information or linear interpolation may be used to generate the surface-layer information image. Alternatively, instead of interpolation, a priority order may be set and either of the signal values b and b'' used. After generating the B channel, the display image generation unit 42 assigns the same information as the B channel to the G channel. Further, the display image generation unit 42 generates the mid-to-deep-layer information image by assigning the signal value g' input from the extraction unit 412 to the position of each pixel on the R channel.
 According to the second embodiment of the present invention described above, arithmetic processing that amplifies noise as little as possible is performed on the B channel, the G channel, and the Mg channel, so that high-quality narrow-band signals can be extracted from P_G and P_Mg. As a result, high-resolution images of the mucosal surface layer, middle layer, and deep layer at the observation site of the subject under narrow-band light can be acquired without amplifying noise.
 (Embodiment 3)
 Next, a third embodiment of the present invention will be described. The endoscope system according to the third embodiment differs from the first embodiment in the configuration of the color filter 202b and in the processing performed by the extraction unit and the display image generation unit. In the following, the configuration of the color filter according to the third embodiment is described first, followed by the processing executed by the extraction unit and the display image generation unit according to the third embodiment. The same components as those of the endoscope system 1 according to the first embodiment are denoted by the same reference signs, and their description is omitted.
 FIG. 14 is a schematic diagram showing an example of the configuration of the color filter according to the third embodiment of the present invention.
 The color filter 202d shown in FIG. 14 has a filter B (first filter), a filter Cy (second filter), and a filter R (third filter) that transmits light in the red wavelength band. The arrangement pattern of the color filter 202d is based on 2 × 2 pixels, with the filter B at the upper left, the filters Cy at the lower left and upper right, and the filter R at the lower right. The color filter 202d is arranged over all the pixels of the image sensor 202a according to this arrangement pattern. In the following, the pixels at which the filter B, the filter Cy, and the filter R are arranged are denoted as the B pixel, the Cy pixel, and the R pixel.
 The sensitivity characteristics with respect to the wavelength of light of the B pixel, the Cy pixel, and the R pixel configured in this way will now be described. FIG. 15 is a diagram showing the relationship between wavelength and the sensitivity characteristics of the B pixel, the Cy pixel, and the R pixel. In FIG. 15, the vertical axis indicates sensitivity and the horizontal axis indicates wavelength, and the curves L_B (solid line), L_Cy (dash-dotted line), and L_R (dotted line) indicate the sensitivity characteristics of the B pixel, the Cy pixel, and the R pixel, respectively.
 As shown by the curve L_B in FIG. 15, the B pixel has a first sensitivity peak in the wavelength band of blue light (400 nm to 470 nm). As shown by the curve L_Cy, the Cy pixel has a first sensitivity peak for green light (500 nm to 550 nm) and a second sensitivity peak for blue light (400 nm to 470 nm). As shown by the curve L_R, the R pixel has a first sensitivity peak for red light (600 nm to 650 nm).
 FIG. 16 is a diagram showing the narrow-band light and the sensitivity characteristics of each pixel. In FIG. 16, the horizontal axis indicates wavelength and the vertical axis indicates sensitivity.
 As shown in FIG. 16, the Cy pixel has the highest sensitivity at each wavelength in the narrow band T_G, and the B pixel has the highest sensitivity at each wavelength in the narrow band T_B.
 The interpolation unit 411 performs interpolation processing based on the direction information of each of the B pixels, the Cy pixels, and the R pixels on the image data generated by the image sensor 202a on which the color filter 202d configured in this way is arranged. Under narrow-band light observation, the B pixels and the Cy pixels both have sensitivity in the narrow band T_B, so the B channel can be generated at high resolution by using the direction information of the Cy pixels. The interpolation unit 411 may also perform the interpolation using the direction information of each pixel instead of the direction information of the Cy pixels.
 [Processing of the extraction unit]
 Next, the processing performed by the extraction unit 412 will be described.
 FIG. 17 is a diagram illustrating an outline of the processing performed by the extraction unit 412 to extract the signal values generated by each narrow-band light from the Cy channel. In FIG. 17, the interpolated values of the B channel and the Cy channel are denoted P_B and P_Cy. The signal value of the narrow band T_B contained in P_B is denoted b, and the signal values of the narrow band T_B and the narrow band T_G contained in P_Cy are denoted b' and g'.
 As shown in FIG. 17, P_B and P_Cy at the coordinates (x, y) (pixel of interest) of the image sensor 202a are expressed by the following equations (3-1) and (3-2):
 P_B(x, y) = b(x, y)                          ... (3-1)
 P_Cy(x, y) = g'(x, y) + b'(x, y)             ... (3-2)
Further, b' is expressed by the following equation (3-3) using the coefficient α obtained from the sensitivity ratio of the B pixel and the Cy pixel at each wavelength in the narrow band T_B:
 b'(x, y) = αb(x, y)                          ... (3-3)
From equations (3-2) and (3-3),
 g'(x, y) = P_Cy(x, y) − αP_B(x, y)           ... (3-4)
 Here, as shown in FIG. 16, the sensitivity of the B pixel is always higher than that of the Cy pixel at each wavelength component included in the narrow band T_B, so the coefficient α satisfies 0 < α ≦ 1. Accordingly, g' contained in P_Cy can be obtained by equation (3-4) without amplifying the noise contained in P_B.
 In this way, the extraction unit 412 extracts the surface-layer information of the living tissue by obtaining the signal value b for all the pixels of the image sensor 202a using equations (3-1) and (3-3). Further, the extraction unit 412 extracts the mid-to-deep-layer information by obtaining the signal value g' for all the pixels of the image sensor 202a using equation (3-4). The extraction unit 412 outputs the signal value b of the narrow band T_B for the B pixel and the Cy pixel, and the signal value g' of the narrow band T_G for the Cy pixel, to the display image generation unit 42.
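 The third embodiment's extraction is correspondingly simpler, since only the Cy channel mixes the two bands. A sketch, with illustrative names:

```python
def extract_embodiment3(P_B, P_Cy, alpha):
    """Separate the narrow-band components per equations (3-1) to (3-4).
    alpha is the Cy-to-B sensitivity ratio in T_B; since 0 < alpha <= 1,
    the subtraction in (3-4) does not amplify the noise in P_B."""
    b = P_B                        # (3-1): T_B component
    g_dash = P_Cy - alpha * P_B    # (3-4): T_G component
    return b, g_dash
```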
 [Processing of the display image generation unit]
 Next, the processing of the display image generation unit 42 will be described.
 The display image generation unit 42 generates a pseudo-color image using the signal values b and g' output from the extraction unit 412. The pseudo-color image is composed of three channels: R, G, and B. In generating the B channel and the G channel, the B channel may be generated first and the same information assigned to the G channel.
 Based on the color filter information input from the control unit 45, the display image generation unit 42 generates the surface-layer information image of the living tissue by assigning the signal value b input from the extraction unit 412 to each pixel position on the B channel and the G channel. Further, the display image generation unit 42 generates the mid-to-deep-layer information image by assigning the signal value g' input from the extraction unit 412 to each pixel position on the R channel.
 According to the third embodiment of the present invention described above, arithmetic processing that amplifies noise as little as possible is performed on the B channel and the Cy channel, so that high-quality narrow-band signals can be extracted from P_B and P_Cy. As a result, high-resolution images of the mucosal surface layer, middle layer, and deep layer at the observation site of the subject under narrow-band light can be acquired while suppressing noise amplification.
 (Embodiment 4)
 Next, a fourth embodiment of the present invention will be described. The fourth embodiment differs from the first embodiment in the sensitivity characteristics of the filters constituting the color filter 202b and in the processing performed by the extraction unit and the display image generation unit. In the following, the sensitivity characteristics of each pixel according to the fourth embodiment are described first, followed by the processing executed by the extraction unit and the display image generation unit according to the fourth embodiment. The same components as those of the endoscope system 1 according to the first embodiment are denoted by the same reference signs, and their description is omitted.
 FIG. 18 is a diagram showing the narrow-band light and the sensitivity characteristics of each pixel according to the fourth embodiment of the present invention. In FIG. 18, the horizontal axis indicates wavelength and the vertical axis indicates sensitivity, and the curves L_B (solid line), L_Cy (dash-dotted line), and L_Mg (dotted line) indicate the sensitivity characteristics of the B pixel, the Cy pixel, and the Mg pixel, respectively.
 As shown in FIG. 18, the sensitivity of the Mg pixel at each wavelength in the narrow band T_G is negligibly small.
 [Processing of the extraction unit]
 Next, the processing performed by the extraction unit 412 will be described.
 FIG. 19 is a diagram illustrating an outline of the processing performed by the extraction unit 412 to extract the signal values generated by each narrow-band light from the Cy channel and the Mg channel. In FIG. 19, the interpolated values of the B channel, the Cy channel, and the Mg channel are denoted P_B, P_Cy, and P_Mg. The signal value of the narrow band T_B contained in P_B is denoted b; the signal values of the narrow band T_B and the narrow band T_G contained in P_Cy are denoted b' and g'; and the signal value of the narrow band T_B contained in P_Mg is denoted b''.
 As shown in FIG. 19, P_B, P_Cy, and P_Mg at the coordinates (x, y) (pixel of interest) of the image sensor 202a are expressed by the following equations (4-1) to (4-3):
 P_B(x, y) = b(x, y)                          ... (4-1)
 P_Cy(x, y) = g'(x, y) + b'(x, y)             ... (4-2)
 P_Mg(x, y) = b''(x, y)                       ... (4-3)
Further, b' is expressed by the following equation (4-4) using the coefficient α obtained from the sensitivity ratio of the B pixel and the Cy pixel at each wavelength in the narrow band T_B:
 b'(x, y) = αb(x, y)                          ... (4-4)
 Here, as shown in FIG. 18, the sensitivity of the B pixel is always higher than that of the Cy pixel at each wavelength component included in the narrow band T_B, so the coefficient α satisfies 0 < α ≦ 1. Accordingly, from equations (4-1) and (4-4),
 b'(x, y) = αP_B(x, y)                        ... (4-5)
and from equations (4-2) and (4-5),
 g'(x, y) = P_Cy(x, y) − αP_B(x, y)           ... (4-6)
 Accordingly, g' contained in P_Cy can be obtained by equations (4-5) and (4-6) without amplifying the noise contained in P_B. Further, as shown in FIG. 18, the Mg pixel has no sensitivity to the light of the narrow band T_G, so under narrow-band light its pixel value can be treated directly as the value generated by the light of the narrow band T_B.
 In this way, the extraction unit 412 extracts the surface-layer information of the living tissue by obtaining the signal values b and b'' for all the pixels of the image sensor 202a using equations (4-1) and (4-3). Further, the extraction unit 412 extracts the mid-to-deep-layer information by obtaining the signal value g' for all the pixels of the image sensor 202a using equation (4-6). The extraction unit 412 outputs the signal values b and b'' of the narrow band T_B for the B pixel and the Mg pixel, and the signal value g' of the narrow band T_G for the Cy pixel, to the display image generation unit 42.
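 Because the Mg pixel is insensitive to T_G in this embodiment, its channel needs no correction at all. A sketch, with illustrative names:

```python
def extract_embodiment4(P_B, P_Cy, P_Mg, alpha):
    """Separate the narrow-band components per equations (4-1) to (4-6).
    P_Mg is already a pure T_B signal (eq. 4-3), so only the Cy channel
    requires a (non-amplifying) subtraction."""
    b = P_B                        # (4-1): T_B component at B pixels
    g_dash = P_Cy - alpha * P_B    # (4-6): T_G component at Cy pixels
    b_ddash = P_Mg                 # (4-3): T_B component at Mg pixels, as is
    return b, g_dash, b_ddash
```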
 [Processing of the display image generation unit]
 Next, the processing of the display image generation unit 42 will be described.
 The display image generation unit 42 generates a pseudo-color image using the signal values b, b'', and g' output from the extraction unit 412. The pseudo-color image is composed of three channels: R, G, and B. In generating the B channel and the G channel, the display image generation unit 42 may first generate the B channel and then assign the same information to the G channel.
 Based on the color filter information input from the control unit 45, the display image generation unit 42 assigns the signal value b and the signal value b'' input from the extraction unit 412 to the positions of the B pixels and the Mg pixels, respectively. In doing so, the display image generation unit 42 multiplies b'' by the coefficient k to match the brightness of the signal values b and b''. The coefficient k is based on the sensitivity ratio between the B pixel and the Mg pixel and can be expressed by equation (1-12) above. Thereafter, for the positions of the Cy pixels, the display image generation unit 42 generates the surface-layer information image using the signal values b at the positions of neighboring B pixels and the signal values b'' at the positions of neighboring Mg pixels. Here, the display image generation unit 42 may use interpolation based on direction information or linear interpolation. Alternatively, it may set a priority order and use either of the signal values b and b'' without interpolation. After generating the B channel, the display image generation unit 42 assigns the same information as the B channel to the G channel. Further, it generates the mid-to-deep-layer information image by assigning the signal value g' input from the extraction unit 412 to the position of each pixel on the R channel.
 According to the fourth embodiment of the present invention described above, arithmetic processing that amplifies noise as little as possible is performed on the B channel, the Cy channel, and the Mg channel, so that high-quality narrow-band signals can be extracted from P_Cy and P_Mg. As a result, high-resolution images of the mucosal surface layer, middle layer, and deep layer at the observation site of the subject under narrow-band light can be acquired without amplifying noise.
 (Other embodiments)
 The color filter according to the present invention is not limited to the arrangements described above and can be changed as appropriate as long as the arrangement satisfies the above conditions.
 In the above description, a color filter having a plurality of transmission filters, each transmitting light in a predetermined wavelength band, is provided on the light-receiving surface of the image sensor; however, each transmission filter may instead be provided individually for each pixel of the image sensor. Further, although the present invention has been described with a 2 × 2 filter unit, an n × m filter unit (n and m being integers), for example a 4 × 4 filter unit, is also applicable.
 The endoscope according to the present invention is also applicable to an ultrasonic endoscope in which an image sensor and an ultrasonic transducer are built into the distal end, and to a capsule endoscope that can be introduced into a subject. When applied to a capsule endoscope that switches between two light sources to emit either white illumination light or narrow-band illumination light, the light source unit, the color filter, and the image sensor may be provided, for example, inside a capsule-shaped housing.
 In the description of the processing in this specification, the sequence is made explicit using expressions such as "first", "thereafter", and "subsequently"; however, the order of the processing necessary to implement the present invention is not uniquely determined by these expressions. That is, the order of the processing described in this specification can be changed within a consistent range.
 As described above, the present invention can include various embodiments not described herein, and various design changes and the like can be made within the scope of the technical idea specified by the claims.
 Description of symbols
 1 Endoscope system
 2 Endoscope
 3 Light source device
 4 Processor unit
 5 Display unit
 31 Illumination unit
 32 Illumination control unit
 41 Image processing unit
 42 Display image generation unit
 43 Input unit
 44 Storage unit
 45 Control unit
 71 Living tissue
 72 Capillaries
 73, 74 Thick blood vessels
 200 Operation unit
 201 Imaging lens
 202 Imaging unit
 202a Image sensor
 202b, 202c, 202d Color filters
 203 Light guide
 204 Illumination lens
 205 A/D conversion unit
 206 Imaging information storage unit
 206a Identification information storage unit
 311 Light source unit
 312 Light source driver
 313 Switching filter
 314 Drive unit
 315 Drive driver
 316 Condenser lens
 411 Interpolation unit
 412 Extraction unit

Claims (8)

  1.  An endoscope system comprising:
      a light source device that emits narrow-band light consisting of at least a narrow-band first light included in a blue wavelength band and a narrow-band second light included in a green wavelength band;
      an image sensor that generates an electric signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional lattice;
      a color filter in which a first filter unit or a second filter unit is arranged in correspondence with the plurality of pixels, the first filter unit being composed of a plurality of filters having a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue wavelength band and the green wavelength band, and a third filter that transmits light in at least a red wavelength band, and the second filter unit being composed of a plurality of filters having the first filter that transmits light in the blue wavelength band, a fourth filter that transmits light in the blue wavelength band and light in the red wavelength band, and a fifth filter that transmits light in at least the green wavelength band, wherein the number of filters that transmit light in the green wavelength band is at least half of the total number of filters, and the number of filters that transmit light in the blue wavelength band is equal to or greater than the number of filters that transmit light in the green wavelength band; and
      an extraction unit that performs arithmetic processing for extracting a component of the narrow-band light from the pixel values generated by the plurality of pixels, the arithmetic processing extracting at least the component of the narrow-band light obtained by the second light by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels having sensitivity to the first light, the product of the pixel value of the pixel having the highest sensitivity to the first light among the plurality of pixels and a first coefficient whose absolute value is 1 or less.
  2.  The endoscope system according to claim 1, wherein the extraction unit further extracts the component of the narrow-band light obtained by the first light by subtracting, from the pixel value of the pixel having sensitivity to light in at least the red wavelength band, the product of the pixel value of the pixel having the highest sensitivity to the second light among the plurality of pixels and a second coefficient whose absolute value is 1 or less.
  3.  The endoscope system according to claim 2, wherein the extraction unit extracts the component of the narrow-band light obtained by the first light after extracting the component of the narrow-band light obtained by the second light.
  4.  The endoscope system according to any one of claims 1 to 3, further comprising an interpolation unit that generates an interpolated value of a channel for each color by performing interpolation processing on the pixel values generated by the plurality of pixels,
      wherein the extraction unit extracts the component of the narrow-band light from the interpolated values generated by the interpolation unit through the interpolation processing.
  5.  The endoscope system according to any one of claims 1 to 4, wherein the pixel at which the first filter is arranged has the highest sensitivity to the first light among the plurality of pixels, and
      the pixel at which the second filter is arranged or the pixel at which the fifth filter is arranged has the highest sensitivity to the second light among the plurality of pixels.
  6.  The endoscope system according to any one of claims 1 to 5, further comprising a display image generation unit that generates a display image signal based on the component of the narrow-band light extracted by the extraction unit.
  7.  An image processing apparatus that performs image processing on image data generated by an endoscope including an image sensor that generates an electric signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional lattice, the image processing apparatus comprising:
      an extraction unit that performs arithmetic processing for extracting, from the pixel values generated by the plurality of pixels, a component of narrow-band light consisting of a narrow-band first light included in a blue wavelength band and a narrow-band second light included in a green wavelength band emitted by a light source device, the arithmetic processing extracting at least the component of the narrow-band light obtained by the second light by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels having sensitivity to the first light, the product of the pixel value of the pixel having the highest sensitivity to the first light among the plurality of pixels and a first coefficient whose absolute value is 1 or less,
      wherein the endoscope includes a color filter in which a first filter unit or a second filter unit is arranged in correspondence with the plurality of pixels, the first filter unit being composed of a plurality of filters having a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue wavelength band and the green wavelength band, and a third filter that transmits light in at least a red wavelength band, and the second filter unit being composed of a plurality of filters having the first filter that transmits light in the blue wavelength band, a fourth filter that transmits light in the blue wavelength band and light in the red wavelength band, and a fifth filter that transmits light in at least the green wavelength band, and wherein the number of filters that transmit light in the green wavelength band is at least half of the total number of filters, and the number of filters that transmit light in the blue wavelength band is equal to or greater than the number of filters that transmit light in the green wavelength band.
  8.  An image processing method executed by an image processing apparatus that performs image processing on image data generated by an endoscope including an image sensor that generates an electric signal by photoelectrically converting the light received by each of a plurality of pixels arranged in a two-dimensional lattice, the method comprising:
      an extraction step of performing arithmetic processing for extracting, from the pixel values generated by the plurality of pixels, a component of narrow-band light consisting of a narrow-band first light included in a blue wavelength band and a narrow-band second light included in a green wavelength band emitted by a light source device, the arithmetic processing extracting at least the component of the narrow-band light obtained by the second light by subtracting, from the pixel value of the pixel having the lowest sensitivity to the first light among the plurality of pixels having sensitivity to the first light, the product of the pixel value of the pixel having the highest sensitivity to the first light among the plurality of pixels and a first coefficient whose absolute value is 1 or less,
      wherein the endoscope includes a color filter in which a first filter unit or a second filter unit is arranged in correspondence with the plurality of pixels, the first filter unit being composed of a plurality of filters having a first filter that transmits light in the blue wavelength band, a second filter that transmits light in the blue wavelength band and the green wavelength band, and a third filter that transmits light in at least a red wavelength band, and the second filter unit being composed of a plurality of filters having the first filter that transmits light in the blue wavelength band, a fourth filter that transmits light in the blue wavelength band and light in the red wavelength band, and a fifth filter that transmits light in at least the green wavelength band, and wherein the number of filters that transmit light in the green wavelength band is at least half of the total number of filters, and the number of filters that transmit light in the blue wavelength band is equal to or greater than the number of filters that transmit light in the green wavelength band.
PCT/JP2015/076200 2015-09-15 2015-09-15 Endoscope system, image processing apparatus, and image processing method WO2017046876A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/076200 WO2017046876A1 (en) 2015-09-15 2015-09-15 Endoscope system, image processing apparatus, and image processing method


Publications (1)

Publication Number Publication Date
WO2017046876A1

Family

ID=58288290

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/076200 WO2017046876A1 (en) 2015-09-15 2015-09-15 Endoscope system, image processing apparatus, and image processing method

Country Status (1)

Country Link
WO (1) WO2017046876A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62118686A (en) * 1985-11-19 1987-05-30 Toshiba Corp Single board type solid-state image pickup device
JP2012170639A (en) * 2011-02-22 2012-09-10 Fujifilm Corp Endoscope system, and method for displaying emphasized image of capillary of mucous membrane surface layer

Similar Documents

Publication Publication Date Title
JP5670264B2 (en) Endoscope system and method for operating endoscope system
CN106388756B (en) Image processing apparatus, method of operating the same, and endoscope system
EP2468186B1 (en) Endoscopic diagnosis system
JP5303012B2 (en) Endoscope system, processor device for endoscope system, and method for operating endoscope system
JP5496075B2 (en) Endoscopic diagnosis device
US20150216400A1 (en) Endoscopic device
US9629527B2 (en) Endoscope system, processor device of endoscope system, and image processing method
JP5159904B2 (en) Endoscopic diagnosis device
JP6471173B2 (en) Image processing apparatus, operation method of endoscope apparatus, image processing program, and endoscope apparatus
WO2015093295A1 (en) Endoscopic device
US10070771B2 (en) Image processing apparatus, method for operating image processing apparatus, computer-readable recording medium, and endoscope device
US20140340497A1 (en) Processor device, endoscope system, and operation method of endoscope system
JP2010069063A (en) Method and apparatus for capturing image
US20150173595A1 (en) Imaging apparatus
US11571111B2 (en) Endoscope scope, endoscope processor, and endoscope adaptor
JPWO2016104386A1 (en) Endoscope system
JP5558331B2 (en) Endoscope system, processor device for endoscope system, and method for operating endoscope system
US10863149B2 (en) Image processing apparatus, image processing method, and computer readable recording medium
US10901199B2 (en) Endoscope system having variable focal length lens that switches between two or more values
CN111712178A (en) Endoscope system and method for operating same
WO2017203996A1 (en) Image signal processing device, image signal processing method, and image signal processing program
WO2017046876A1 (en) Endoscope system, image processing apparatus, and image processing method
JP7454417B2 (en) Medical control device and medical observation system
JP7224963B2 (en) Medical controller and medical observation system
JP2012217485A (en) Endoscope system and driving method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15904065

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15904065

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP